WorldWideScience

Sample records for repeated emotional faces

  1. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    Science.gov (United States)

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  2. Emotion Words: Adding Face Value.

    Science.gov (United States)

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces and altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
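The sensitivity (d') and response-bias (c) measures reported in the record above come from standard signal detection theory. A minimal sketch of how they are computed from hit and false-alarm rates, using the textbook formulas rather than anything taken from the study's own materials:

```python
from statistics import NormalDist

def dprime_and_c(hit_rate, fa_rate):
    """Signal-detection sensitivity (d') and response bias (c)
    from hit and false-alarm rates (rates must be in (0, 1))."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, c

# More false alarms at a constant hit rate lowers d' and pushes c
# negative (a more liberal tendency to answer "yes").
print(dprime_and_c(0.80, 0.20))
print(dprime_and_c(0.80, 0.40))
```

A negative c corresponds to the liberal "yes" bias the abstract describes for emotion-word primes.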

  3. Steroids facing emotions

    NARCIS (Netherlands)

    Putman, P.L.J.

    2006-01-01

    The studies reported in this thesis have been performed to gain a better understanding about motivational mediators of selective attention and memory for emotionally relevant stimuli, and about the roles that some steroid hormones play in regulation of human motivation and emotion. The stimuli used

  4. Emotional facial expressions reduce neural adaptation to face identity.

    Science.gov (United States)

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the steady-state visual evoked potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to same-identity as compared with different-identity faces. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.

  5. Emotional Labor, Face and Guan xi

    Institute of Scientific and Technical Information of China (English)

    Tianwenling

    2017-01-01

Emotional Labor, Face and Guan xi are all relevant to performance, appearance, and emotional feelings, which are essential elements in the workplace. In other words, not only front-line workers but all employees in an organization are faced with the three…

  6. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  7. Detecting and categorizing fleeting emotions in faces.

    Science.gov (United States)

    Sweeny, Timothy D; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A

    2013-02-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d' analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  8. Detecting and Categorizing Fleeting Emotions in Faces

    Science.gov (United States)

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885

  9. Matching faces with emotional expressions

    Directory of Open Access Journals (Sweden)

Wenfeng Chen

    2011-08-01

There is some evidence that faces with a happy expression are recognized better than faces with other expressions. However, little is known about whether this happy face advantage also applies to perceptual face matching, and whether similar differences exist among other expressions. Using a sequential matching paradigm, we systematically compared the effects of seven basic facial expressions on identity recognition. Identity matching was quickest when a pair of faces had an identical happy/sad/neutral expression, poorer when they had a fearful/surprise/angry expression, and poorest when they had a disgust expression. Faces with a happy/sad/fear/surprise expression were matched faster than those with an anger/disgust expression when the second face in a pair had a neutral expression. These results demonstrate that effects of facial expression on identity recognition are not limited to happy faces when a learned face is immediately tested. The results suggest different influences of expression in perceptual matching and long-term recognition memory.

  10. Processing of emotional faces in social phobia

    Directory of Open Access Journals (Sweden)

    Nicole Kristjansen Rosenberg

    2011-02-01

Previous research has found that individuals with social phobia differ from controls in their processing of emotional faces. For instance, people with social phobia show increased attention to briefly presented threatening faces. However, when exposure times are increased, the direction of this attentional bias is more unclear. Studies investigating eye movements have found both increased as well as decreased attention to threatening faces in socially anxious participants. The current study investigated eye movements to emotional faces in eight patients with social phobia and 34 controls. Three different tasks with different exposure durations were used, which allowed for an investigation of the time course of attention. At the early time interval, patients showed a complex pattern of both vigilance and avoidance of threatening faces. At the longest time interval, patients avoided the eyes of sad, disgust, and neutral faces more than controls, whereas there were no group differences for angry faces.

  11. Emotional faces and the default mode network.

    Science.gov (United States)

    Sreenivas, S; Boehm, S G; Linden, D E J

    2012-01-11

    The default-mode network (DMN) of the human brain has become a central topic of cognitive neuroscience research. Although alterations in its resting state activity and in its recruitment during tasks have been reported for several mental and neurodegenerative disorders, its role in emotion processing has received relatively little attention. We investigated brain responses to different categories of emotional faces with functional magnetic resonance imaging (fMRI) and found deactivation in ventromedial prefrontal cortex (VMPFC), posterior cingulate gyrus (PC) and cuneus. This deactivation was modulated by emotional category and was less prominent for happy than for sad faces. These deactivated areas along the midline conformed to areas of the DMN. We also observed emotion-dependent deactivation of the left middle frontal gyrus, which is not a classical component of the DMN. Conversely, several areas in a fronto-parietal network commonly linked with attention were differentially activated by emotion categories. Functional connectivity patterns, as obtained by correlation of activation levels, also varied between emotions. VMPFC, PC or cuneus served as hubs between the DMN-type areas and the fronto-parietal network. These data support recent suggestions that the DMN is not a unitary system but differentiates according to task and even type of stimulus. The emotion-specific differential pattern of DMN deactivation may be explored further in patients with mood disorder, where the quest for biological markers of emotional biases is still ongoing. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  12. Serotonergic modulation of face-emotion recognition

    Directory of Open Access Journals (Sweden)

    C.M. Del-Ben

    2008-04-01

Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  13. Digitizing the moving face: asymmetries of emotion and gender

    Directory of Open Access Journals (Sweden)

    Ashish Desai

    2009-04-01

In a previous study with dextral males, Richardson and Bowers (1999) digitized real-time video signals and found movement asymmetries over the left lower face for emotional, but not non-emotional, expressions. These findings correspond to observations, based on subjective ratings of static pictures, that the left side of the face is more intensely expressive than the right (Sackeim, 1978). From a neuropsychological perspective, one possible interpretation of these findings is that emotional priming of the right hemisphere of the brain results in more muscular activity over the contralateral left than the ipsilateral right side of the lower face. The purpose of the present study was to use computer-imaging methodology to determine whether there were gender differences in movement asymmetries across the face. We hypothesized that females would show less evidence of facial movement asymmetries during the expression of emotion. This hypothesis was based on findings of gender differences in the degree to which specific cognitive functions may be lateralized in the brain (i.e., females less lateralized than males). Forty-eight normal dextral college students (25 females, 23 males) were videotaped while they displayed voluntary emotional expressions. A quantitative measure of movement change (called entropy) was computed by subtracting the values of corresponding pixel intensities between adjacent frames and summing their differences. The upper and lower hemiface regions were examined separately due to differences in the cortical innervation of facial muscles in the upper (bilateral) versus lower (contralateral) face. Repeated-measures ANOVAs were used to analyze the amount of overall facial movement and facial asymmetries. Certain emotions were associated with significantly greater overall facial movement than others: fear > (angry = sad) > neutral. Both males and females showed this same pattern, with no gender differences in the total amount of facial…
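The inter-frame differencing measure described in the record above (pixel-wise subtraction of adjacent video frames, with the differences summed) can be sketched as follows. The array layout and the row used to split the upper from the lower hemiface are illustrative assumptions, not details taken from the study:

```python
import numpy as np

def movement_entropy(frames):
    """Sum of absolute pixel-intensity differences between adjacent
    video frames, as a crude index of facial movement."""
    frames = np.asarray(frames, dtype=float)
    return np.abs(np.diff(frames, axis=0)).sum()

def hemiface_movement(frames, split_row):
    """Score the upper and lower face regions separately, mirroring
    the study's upper/lower hemiface split."""
    frames = np.asarray(frames, dtype=float)
    upper = movement_entropy(frames[:, :split_row, :])
    lower = movement_entropy(frames[:, split_row:, :])
    return upper, lower

# Two toy 4x4 frames where only the lower half changes:
a = np.zeros((2, 4, 4))
a[1, 2:, :] = 5
print(hemiface_movement(a, split_row=2))  # → (0.0, 40.0)
```

The same per-region scoring could be applied to left/right halves to quantify the lateral asymmetries the study investigated.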

  14. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Neonates' responses to repeated exposure to a still face.

    Science.gov (United States)

    Nagy, Emese; Pilling, Karen; Watt, Rachel; Pal, Attila; Orvos, Hajnalka

    2017-01-01

The main aims of the study were to examine whether human neonates' responses to communication disturbance modelled by the still-face paradigm were stable and whether their responses were affected by their previous experience with the still-face paradigm. The still-face procedure, as a laboratory model of interpersonal stress, was administered repeatedly, twice, to 84 neonates (0- to 4-day-olds), with a delay of an average of 1.25 days. Frame-by-frame analysis of the frequency and duration of gaze, distressed face, crying, sleeping and sucking behaviours showed that the procedure was stressful to them both times, that is, the still-face effect was stable after repeated administration and newborns consistently responded to such nonverbal violation of communication. They averted their gaze, showed distress and cried more during the still-face phase in both the first and the second administration. They also showed a carry-over effect in that they continued to avert their gaze and displayed increased distress and crying in the first reunion period, but their gaze behaviour changed with experience in the second administration. While in the first administration the babies continued averting their gaze even after the stressful still-face phase was over, this carry-over effect disappeared in the second administration, and the babies significantly increased their gaze following the still-face phase. After excluding explanations of fatigue, habituation and random effects, a self-other regulatory model is discussed as a possible explanation for this pattern.

  16. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Directory of Open Access Journals (Sweden)

    Martin Wegrzyn

Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.
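One plausible way to score each tile's contribution to successful recognition in the masking paradigm above is the proportion of correct responses among trials on which that tile was uncovered when the observer stopped the sequence. The data layout below is an assumption made for illustration, not the authors' actual analysis:

```python
from collections import defaultdict

def tile_contributions(trials, n_tiles=48):
    """Estimate each tile's diagnostic value as the proportion of
    correct responses among trials where it was uncovered at the
    moment the observer stopped the sequence.

    `trials` is a list of (visible_tiles, correct) pairs -- an
    assumed format, not the study's actual data layout."""
    seen = defaultdict(int)
    hits = defaultdict(int)
    for visible, correct in trials:
        for t in visible:
            seen[t] += 1
            hits[t] += int(correct)
    return {t: hits[t] / seen[t] for t in range(n_tiles) if seen[t]}

# Toy example with a 4-tile mask:
trials = [({0, 1}, True), ({1, 2}, False), ({1, 3}, True)]
print(tile_contributions(trials, n_tiles=4))
```

Mapping these per-tile scores back onto the face image would produce the kind of importance visualization the abstract describes.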

  17. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921

  18. Modulation of the composite face effect by unintended emotion cues.

    Science.gov (United States)

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grows stronger as the strength of distractor emotion increases. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions were rated high and low for perceived emotion, respectively. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  19. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and permanent deficits in many areas of interpersonal relations, such as communication, social interaction and emotional responsiveness. Patients with autism have deficits in face recognition, eye contact and recognition of emotional expressions. Both face recognition and the recognition of emotional facial expressions rely on face processing. Structural and functional impairment in the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions leads to deficits in the recognition of faces and facial emotion. Studies therefore suggest that face processing deficits result in problems in the areas of social interaction and emotion in autism. Studies have revealed that children with autism have problems in the recognition of facial expressions and use the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotion. In autism, deficits at various stages of face processing, such as detection of gaze, face identity and recognition of emotional expression, have been identified so far. Social interaction impairments in autistic spectrum disorders originate from face processing deficits during infancy, childhood and adolescence. Recognition of faces and expression of facial emotion could be affected either automatically, by orienting towards faces after birth, or by "learning" processes in developmental periods, such as identity and emotion processing. This article aims to review the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  20. Neonates’ responses to repeated exposure to a still face

    Science.gov (United States)

Nagy, Emese; Pilling, Karen; Watt, Rachel; Pal, Attila; Orvos, Hajnalka

    2017-01-01

Aim: The main aims of the study were to examine whether human neonates' responses to communication disturbance modelled by the still-face paradigm were stable and whether their responses were affected by their previous experience with the still-face paradigm. Methods: The still-face procedure, as a laboratory model of interpersonal stress, was administered repeatedly, twice, to 84 neonates (0- to 4-day-olds), with a delay of an average of 1.25 days. Results: Frame-by-frame analysis of the frequency and duration of gaze, distressed face, crying, sleeping and sucking behaviours showed that the procedure was stressful to them both times, that is, the still-face effect was stable after repeated administration and newborns consistently responded to such nonverbal violation of communication. They averted their gaze, showed distress and cried more during the still-face phase in both the first and the second administration. They also showed a carry-over effect in that they continued to avert their gaze and displayed increased distress and crying in the first reunion period, but their gaze behaviour changed with experience in the second administration. While in the first administration the babies continued averting their gaze even after the stressful still-face phase was over, this carry-over effect disappeared in the second administration, and the babies significantly increased their gaze following the still-face phase. Conclusion: After excluding explanations of fatigue, habituation and random effects, a self-other regulatory model is discussed as a possible explanation for this pattern. PMID:28771555

  1. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    OpenAIRE

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

People are able to simultaneously process multiple dimensions of facial properties. Facial processing models are based on the processing of facial properties. This paper examined the processing of facial emotion, face race and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interfered with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks.

  2. Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.

    Science.gov (United States)

    Ruys, Kirsten I; Stapel, Diederik A

    2008-06-01

    Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.

  3. 5-HTTLPR differentially predicts brain network responses to emotional faces

    DEFF Research Database (Denmark)

    Fisher, Patrick M; Grady, Cheryl L; Madsen, Martin K

    2015-01-01

The effects of the 5-HTTLPR polymorphism on neural responses to emotionally salient faces have been studied extensively, focusing on amygdala reactivity and amygdala-prefrontal interactions. Despite compelling evidence that emotional face paradigms engage a distributed network of brain regions involved in emotion, cognitive and visual processing, less is known about 5-HTTLPR effects on broader network responses. To address this, we evaluated 5-HTTLPR differences in the whole-brain response to an emotional faces paradigm including neutral, angry and fearful faces using functional magnetic… to fearful faces was significantly greater in S' carriers compared to LA LA individuals. These findings provide novel evidence for emotion-specific 5-HTTLPR effects on the response of a distributed set of brain regions including areas responsive to emotionally salient stimuli and critical components…

  4. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  6. Emotion categorization does not depend on explicit face categorization

    NARCIS (Netherlands)

    Seirafi, M.; de Weerd, P.; de Gelder, B.

    2013-01-01

    Face perception and emotion recognition have been extensively studied in the past decade; however, the relation between them is still poorly understood. A traditional view is that successful emotional categorization requires categorization of the stimulus as a ‘face', at least at the basic level.

  7. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    Science.gov (United States)

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. State anxiety and emotional face recognition in healthy volunteers

    OpenAIRE

    Attwood, Angela S.; Easey, Kayleigh E.; Dalili, Michael N.; Skinner, Andrew L.; Woods, Andy; Crick, Lana; Ilett, Elizabeth; Penton-Voak, Ian S.; Munafò, Marcus R.

    2017-01-01

    High trait anxiety has been associated with detriments in emotional face processing. By contrast, relatively little is known about the effects of state anxiety on emotional face processing. We investigated the effects of state anxiety on recognition of emotional expressions (anger, sadness, surprise, disgust, fear and happiness) experimentally, using the 7.5% carbon dioxide (CO2) model to induce state anxiety, and in a large observational study. The experimental studies indicated reduced glob...

  9. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    Science.gov (United States)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

    A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon-face-based model was developed and used to evaluate the emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered related to emotion were extracted, and new cartoon faces based on these parameters were generated. The subjects evaluated the emotion of these cartoon faces again, confirming that the parameters were suitable. To confirm how the parameters could be applied to real faces, subjects were asked to express the same emotions, which were captured electronically. Simple image processing techniques were also developed to extract the features from real faces, which were then compared with the cartoon face parameters. The cartoon face demonstrates that emotions can be expressed with very small amounts of information, and that real and cartoon faces correspond to each other. Emotion could also be extracted from still and dynamic real face images using these cartoon-based features.
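    The analysis named in this abstract (principal component analysis followed by Mahalanobis-distance scoring of feature parameters) can be sketched roughly as below; the data, dimensionality, and class split are invented for illustration and do not come from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical feature parameters measured from 60 face samples
    # (columns might be eyebrow slope, mouth curvature, eye openness, ...).
    samples = rng.normal(size=(60, 6))

    # Principal component analysis via SVD of the mean-centered data.
    centered = samples - samples.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ vt[:2].T  # project onto the first two components

    # Mahalanobis distance of one face's projected features from the
    # distribution of an (invented) emotion class: the first 30 samples.
    emotion_class = scores[:30]
    mu = emotion_class.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(emotion_class, rowvar=False))
    diff = scores[45] - mu
    d_mahal = float(np.sqrt(diff @ cov_inv @ diff))
    print(round(d_mahal, 2))
    ```

    A small distance would suggest the sample's features resemble the class distribution; in the study this scoring was applied to parameters extracted from cartoon and real faces alike.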

  10. Poignancy: Mixed Emotional Experience in the Face of Meaningful Endings

    Science.gov (United States)

    Ersner-Hershfield, Hal; Mikels, Joseph A.; Sullivan, Sarah J.; Carstensen, Laura L.

    2009-01-01

    The experience of mixed emotions increases with age. Socioemotional selectivity theory suggests that mixed emotions are associated with shifting time horizons. Theoretically, perceived constraints on future time increase appreciation for life, which, in turn, elicits positive emotions such as happiness. Yet, the very same temporal constraints heighten awareness that these positive experiences come to an end, thus yielding mixed emotional states. In 2 studies, the authors examined the link between the awareness of anticipated endings and mixed emotional experience. In Study 1, participants repeatedly imagined being in a meaningful location. Participants in the experimental condition imagined being in the meaningful location for the final time. Only participants who imagined “last times” at meaningful locations experienced more mixed emotions. In Study 2, college seniors reported their emotions on graduation day. Mixed emotions were higher when participants were reminded of the ending that they were experiencing. Findings suggest that poignancy is an emotional experience associated with meaningful endings. PMID:18179325

  11. Alcoholism and dampened temporal limbic activation to emotional faces.

    Science.gov (United States)

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  12. Task-irrelevant emotion facilitates face discrimination learning.

    Science.gov (United States)

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

    We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Emotionally anesthetized: media violence induces neural changes during emotional face processing

    OpenAIRE

    Stockdale, Laura A.; Morrison, Robert G.; Kmiecik, Matthew J.; Garbarino, James; Silton, Rebecca L.

    2015-01-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others’ emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five particip...

  14. The complex duration perception of emotional faces: Effects of face direction

    Directory of Open Access Journals (Sweden)

    Katrin Martina Kliegl

    2015-03-01

    Full Text Available The perceived duration of emotional face stimuli strongly depends on the expressed emotion. However, emotional faces also differ regarding a number of other features, like gaze, face direction, or sex. Usually, these features have been controlled by only using pictures of female models with straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented towards the observer. We aimed at replicating this effect for face direction. Moreover, we explored the effect of face direction on the duration perception of sad faces. Controlling for the sex of the face model and the participant, female and male participants rated the duration of neutral, angry and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters like induced arousal, social relevance and an evolutionary context.
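    In a temporal bisection task like the one used here, each stimulus duration yields a proportion of "long" responses, and a condition's bisection point (the duration judged "short" and "long" equally often) summarises perceived duration: a lower bisection point implies overestimation. A minimal sketch with invented response proportions (not data from the study):

    ```python
    # Hypothetical stimulus durations (ms) and observed proportions of
    # "long" responses for one condition (e.g., angry faces).
    durations = [400, 600, 800, 1000, 1200, 1400, 1600]
    p_long = [0.02, 0.10, 0.30, 0.55, 0.80, 0.93, 0.99]

    def bisection_point(xs, ps):
        """Linearly interpolate the duration at which p("long") crosses 0.5."""
        for x0, p0, x1, p1 in zip(xs, ps, xs[1:], ps[1:]):
            if p0 <= 0.5 <= p1:
                return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
        raise ValueError("0.5 is not bracketed by the data")

    bp = bisection_point(durations, p_long)
    print(round(bp))  # → 960
    ```

    Comparing bisection points between conditions (e.g., angry vs. neutral faces) is one common way such overestimation effects are quantified; fitting a sigmoid psychometric function is a more robust alternative to linear interpolation.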

  15. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    Science.gov (United States)

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 are more affected by the emotional value of the music administered in the emotional go/no-go task, and that this bias is also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  17. The contribution of emotional empathy to approachability judgements assigned to emotional faces is context specific

    Directory of Open Access Journals (Sweden)

    Megan L Willis

    2015-08-01

    Full Text Available Previous research on approachability judgements has indicated that facial expressions modulate how these judgements are made, but the relationship between emotional empathy and context in this decision-making process has not yet been examined. This study examined the contribution of emotional empathy to approachability judgements assigned to emotional faces in different contexts. One hundred and twenty female participants completed the Questionnaire Measure of Emotional Empathy. Participants provided approachability judgements to faces displaying angry, disgusted, fearful, happy, neutral and sad expressions, in three different contexts: when evaluating whether they would approach another individual to (1) receive help or (2) give help, or (3) when no contextual information was provided. In addition, participants were also required to rate perceived threat and emotional intensity, and to label the facial expressions. Emotional empathy significantly predicted approachability ratings for specific emotions in each context, over and above the contribution of perceived threat and intensity, which were associated with emotional empathy. Higher emotional empathy predicted less willingness to approach people with angry and disgusted faces to receive help, and a greater willingness to approach people with happy faces to receive help. Higher emotional empathy also predicted a greater willingness to approach people with sad faces to offer help, and more willingness to approach people with happy faces when no contextual information was provided. These results highlight the important contribution of individual differences in emotional empathy in predicting how approachability judgements are assigned to facial expressions in context.

  18. Men appear more lateralized when noticing emotion in male faces.

    Science.gov (United States)

    Rahman, Qazi; Anchassi, Tarek

    2012-02-01

    Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it. PsycINFO Database Record (c) 2012 APA, all rights reserved

  19. Visual Afterimages of Emotional Faces in High Functioning Autism

    Science.gov (United States)

    Rutherford, M. D.; Troubridge, Erin K.; Walsh, Jennifer

    2012-01-01

    Fixating an emotional facial expression can create afterimages, such that subsequent faces are seen as having the opposite expression of that fixated. Visual afterimages have been used to map the relationships among emotion categories, and this method was used here to compare ASD and matched control participants. Participants adapted to a facial…

  20. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Science.gov (United States)

    Rigoulot, Simon; Pell, Marc D.

    2012-01-01

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454
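    The windowed analysis in this abstract (totalling looks in [0–1250 ms], [1250–2500 ms], and [2500–5000 ms]) amounts to binning fixations by onset time and summing their durations per face; a minimal sketch over hypothetical fixation records (the tuples below are invented, not the study's data):

    ```python
    # Hypothetical fixations: (onset_ms, duration_ms, face_fixated).
    fixations = [(100, 400, "fear"), (600, 700, "happy"),
                 (1400, 900, "fear"), (2600, 1200, "fear")]

    WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]

    def looks_per_window(fixes, face):
        """Total looking time (ms) at `face` in each window, binned by onset."""
        return [sum(dur for onset, dur, f in fixes
                    if f == face and lo <= onset < hi)
                for lo, hi in WINDOWS]

    print(looks_per_window(fixations, "fear"))  # → [400, 900, 1200]
    ```

    Frequencies per window follow the same pattern with `sum(1 for ...)` in place of summing durations; comparing these per-window totals across congruent and incongruent prosody conditions is what yields the emotion congruency effect described above.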

  3. State-dependent alteration in face emotion recognition in depression.

    Science.gov (United States)

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. The aim was to compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data on a computerised face emotion recognition task administered after standardised assessment of diagnosis and mood symptoms. In the control group, women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness, but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having results similar to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.
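    This abstract separates overall accuracy into discrimination and response bias; in signal detection terms these are conventionally computed as d′ and the criterion c from hit and false-alarm rates. A hedged illustration with invented rates (not the study's data), using only the Python standard library:

    ```python
    from statistics import NormalDist

    z = NormalDist().inv_cdf  # inverse of the standard normal CDF

    def sdt_measures(hit_rate, fa_rate):
        """Discrimination (d-prime) and response bias (criterion c)."""
        d_prime = z(hit_rate) - z(fa_rate)
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))
        return d_prime, criterion

    # Invented example: a remission-like pattern -- more "yes" responses
    # (a more liberal criterion) with discrimination roughly unchanged.
    d_ctrl, c_ctrl = sdt_measures(0.80, 0.25)  # control-like responding
    d_rem, c_rem = sdt_measures(0.90, 0.40)    # more liberal responding
    print(round(d_ctrl, 2), round(c_ctrl, 2))  # → 1.52 -0.08
    print(round(d_rem, 2), round(c_rem, 2))    # → 1.53 -0.51
    ```

    A more negative c indicates a stronger tendency to answer "yes"; the remission group's pattern of identifying more emotions through increased bias rather than better discrimination corresponds to a shift in c with little change in d′.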

  4. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities; however, they indicated only partial generalisation across emotions. These findings suggest that effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  5. Short-term memory for emotional faces in dysphoria.

    Science.gov (United States)

    Noreen, Saima; Ridout, Nathan

    2010-07-01

    The study aimed to determine if the memory bias for negative faces previously demonstrated in depression and dysphoria generalises from long- to short-term memory. A total of 29 dysphoric (DP) and 22 non-dysphoric (ND) participants were presented with a series of faces and asked to identify the emotion portrayed (happiness, sadness, anger, or neutral affect). Following a delay, four faces were presented (the original plus three distractors) and participants were asked to identify the target face. Half of the trials assessed memory for facial emotion, and the remaining trials examined memory for facial identity. At encoding, no group differences were apparent. At memory testing, relative to ND participants, DP participants exhibited impaired memory for all types of facial emotion and for facial identity when the faces featured happiness, anger, or neutral affect, but not sadness. DP participants exhibited impaired identity memory for happy faces relative to angry, sad, and neutral, whereas ND participants exhibited enhanced facial identity memory when faces were angry. In general, memory for faces was not related to performance at encoding. However, in DP participants only, memory for sad faces was related to sadness recognition at encoding. The results suggest that the negative memory bias for faces in dysphoria does not generalise from long- to short-term memory.

  6. Visual attention to emotional face in schizophrenia: an eye tracking study.

    Directory of Open Access Journals (Sweden)

    Mania Asgharpour

    2015-03-01

    Deficits in the processing of facial emotions have been reported extensively in patients with schizophrenia. To explore whether restricted attention is the cause of impaired emotion processing in these patients, we examined visual attention by tracking eye movements in response to emotional and neutral face stimuli in a group of patients with schizophrenia and healthy individuals. We also examined the correlation between visual attention allocation and symptom severity in the patient group. Thirty adult patients with schizophrenia and 30 matched healthy controls participated in this study. Visual attention data were recorded while participants passively viewed emotional-neutral face pairs for 500 ms. The relationship between visual attention and symptom severity was assessed with the Positive and Negative Syndrome Scale (PANSS) in the schizophrenia group. Repeated-measures ANOVAs were used to compare the groups. Comparing the number of fixations made during face-pair presentation, we found that patients with schizophrenia made fewer fixations on faces, regardless of the expression of the face. Analysis of the number of fixations on negative-neutral pairs also revealed that the patients made fewer fixations on both neutral and negative faces. Analysis of the number of fixations on positive-neutral pairs showed only more fixations on positive relative to neutral expressions in both groups. We found no correlations between visual attention patterns to faces and symptom severity in the patient group. The results of this study suggest that the facial recognition deficit in schizophrenia is related to decreased attention to face stimuli. The finding of no group difference in visual attention for positive-neutral face pairs is in line with studies that have shown an increased ability to perceive positive emotion in these patients.

  7. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    Science.gov (United States)

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  8. Implicit conditioning of faces via the social regulation of emotion: ERP evidence of early attentional biases for security conditioned faces.

    Science.gov (United States)

    Beckes, Lane; Coan, James A; Morris, James P

    2013-08-01

    Not much is known about the neural and psychological processes that promote the initial conditions necessary for positive social bonding. This study explores one method of conditioned bonding utilizing dynamics related to the social regulation of emotion and attachment theory. This form of conditioning involves repeated presentations of negative stimuli followed by images of warm, smiling faces. L. Beckes, J. Simpson, and A. Erickson (2010) found that this conditioning procedure results in positive associations with the faces measured via a lexical decision task, suggesting they are perceived as comforting. This study found that the P1 ERP was similarly modified by this conditioning procedure and the P1 amplitude predicted lexical decision times to insecure words primed by the faces. The findings have implications for understanding how the brain detects supportive people, the flexibility and modifiability of early ERP components, and social bonding more broadly. Copyright © 2013 Society for Psychophysiological Research.

  9. Disrupted neural processing of emotional faces in psychopathy.

    Science.gov (United States)

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb the development of interpersonal relationships and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity, using psychophysiological interaction analysis, during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with the latter activation associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but that a disruption exists in the processing of emotional faces involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.

  10. Similar representations of emotions across faces and voices.

    Science.gov (United States)

    Kuhn, Lisa Katharina; Wydell, Taeko; Lavan, Nadine; McGettigan, Carolyn; Garrido, Lúcia

    2017-09-01

    [Correction Notice: An Erratum for this article was reported in Vol 17(6) of Emotion (see record 2017-18585-001). In the article, the copyright attribution was incorrectly listed and the Creative Commons CC-BY license disclaimer was incorrectly omitted from the author note. The correct copyright is "© 2017 The Author(s)" and the omitted disclaimer is below. All versions of this article have been corrected. "This article has been published under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Copyright for this article is retained by the author(s). Author(s) grant(s) the American Psychological Association the exclusive right to publish the article and identify itself as the original publisher."] Emotions are a vital component of social communication, carried across a range of modalities and via different perceptual signals such as specific muscle contractions in the face and in the upper respiratory system. Previous studies have found that emotion recognition impairments after brain damage depend on the modality of presentation: recognition from faces may be impaired whereas recognition from voices remains preserved, and vice versa. On the other hand, there is also evidence for shared neural activation during emotion processing in both modalities. In a behavioral study, we investigated whether there are shared representations in the recognition of emotions from faces and voices. We used a within-subjects design in which participants rated the intensity of facial expressions and nonverbal vocalizations for each of the 6 basic emotion labels. For each participant and each modality, we then computed a representation matrix with the intensity ratings of each emotion. These matrices allowed us to examine the patterns of confusions between emotions and to characterize the representations
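The comparison described above rests on building, per participant and modality, a matrix of intensity ratings (stimulus emotion × rated emotion) and then comparing the pattern of those matrices across modalities. A minimal sketch of that logic, with entirely hypothetical rating numbers (the study's actual data and analysis pipeline are not reproduced here), might look like this:

```python
# Hypothetical representation-matrix comparison: face vs. voice intensity
# ratings for six basic emotions, compared via Pearson correlation.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
N = len(EMOTIONS)

# Hypothetical face-rating matrix on a 1-7 scale: intended emotion rated
# highest (diagonal), with mild off-diagonal confusions.
face = [[6.0 if i == j else 1.0 + 0.2 * ((i + j) % 3) for j in range(N)]
        for i in range(N)]
# Hypothetical voice-rating matrix: same pattern plus small deterministic noise.
voice = [[face[i][j] + 0.1 * ((i * 7 + j) % 5 - 2) for j in range(N)]
         for i in range(N)]

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# Flatten both matrices and correlate entry-by-entry: a high r indicates
# similar emotion representations across the two modalities.
flat_face = [v for row in face for v in row]
flat_voice = [v for row in voice for v in row]
r = pearson(flat_face, flat_voice)
print(f"face-voice representation similarity: r = {r:.3f}")
```

Here the similarity between modalities reduces to a single correlation over matrix entries; the actual study characterizes the full confusion patterns rather than one summary number.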

  11. Are neutral faces of children really emotionally neutral?

    OpenAIRE

    小松, 佐穂子; 箱田, 裕司; Komatsu, Sahoko; Hakoda, Yuji

    2012-01-01

    In this study, we investigated whether people recognize emotions from the neutral faces of children (11 to 13 years old). We took facial images of 53 male and 54 female Japanese children who had been asked to keep a neutral facial expression. We then conducted an experiment in which 43 participants (19 to 34 years old) rated the strength of four emotions (happiness, surprise, sadness, and anger) for the facial images, using a 7-point scale. We found that (a) they rated both male and female face...

  12. Categorical Perception of emotional faces is not affected by aging

    Directory of Open Access Journals (Sweden)

    Mandy Rossignol

    2009-11-01

    Effects of normal aging on categorical perception (CP) of facial emotional expressions were investigated. One hundred healthy participants (20 to 70 years old; five age groups) had to identify morphed expressions ranging from neutrality to happiness, sadness, and fear. We analysed percentages and latencies of correct recognition for nonmorphed emotional expressions, percentages and latencies of emotional recognition for morphed faces, the locus of the boundaries along the different continua, and the number of intrusions. The results showed that unmorphed happy and fearful faces were better processed than unmorphed sad and neutral faces. For morphed faces, CP was confirmed, as latencies increased as a function of the distance between the displayed morph and the original unmorphed photograph. The locus of the categorical boundaries was not affected by age. Aging did not alter the accuracy of recognition for original pictures, nor the emotional recognition of morphed faces or the rate of intrusions. However, response latencies increased with age, for both unmorphed and morphed pictures. In conclusion, CP of facial expressions appears to be spared in aging.

  13. Cross-modal perception (face and voice) in emotions. ERPs and behavioural measures

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2007-04-01

    Emotion decoding constitutes a case of multimodal processing of cues from multiple channels. Previous behavioural and neuropsychological studies indicated that, when we have to decode emotions on the basis of multiple perceptual cues, cross-modal integration takes place. The present study investigates the simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), across a wide range of emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual patterns (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=30) were required to process the stimuli and to indicate their comprehension (by stim pad). ERP variations and behavioural data (response times, RTs) were submitted to repeated-measures analysis of variance (ANOVA). We considered two time intervals (150-250 and 250-350 ms post-stimulus) in order to explore the ERP variations. The ANOVA showed two different ERP effects with different cognitive functions: a negative deflection (N2), more anteriorly distributed (Fz), and a positive deflection (P2), more posteriorly distributed. N2 may be considered a marker of emotional content (sensitive to the type of emotion), whereas P2 may represent a marker of cross-modal integration, as it varied as a function of the congruous/incongruous condition, showing a higher peak for congruous than for incongruous stimuli. Finally, RTs were reduced in the congruous condition for some emotions (i.e., sadness), with an inverted effect for other emotions (i.e., fear, anger, and surprise).

  14. Enhanced amygdala reactivity to emotional faces in adults reporting childhood emotional maltreatment

    Science.gov (United States)

    van Tol, Marie-José; Demenescu, Liliana R.; van der Wee, Nic J. A.; Veltman, Dick J.; Aleman, André; van Buchem, Mark A.; Spinhoven, Philip; Penninx, Brenda W. J. H.; Elzinga, Bernet M.

    2013-01-01

    In the context of chronic childhood emotional maltreatment (CEM; emotional abuse and/or neglect), adequately responding to facial expressions is an important skill. Over time, however, this adaptive response may lead to a persistent vigilance for emotional facial expressions. The amygdala and the medial prefrontal cortex (mPFC) are key regions in face processing. However, the neurobiological correlates of face processing in adults reporting CEM are as yet unknown. We examined amygdala and mPFC reactivity to emotional faces (Angry, Fearful, Sad, Happy, Neutral) vs scrambled faces in healthy controls and unmedicated patients with depression and/or anxiety disorders reporting CEM before the age of 16 years (n = 60), and in controls and patients who reported no childhood abuse (n = 75). We found that CEM was associated with enhanced bilateral amygdala reactivity to emotional faces in general, independent of psychiatric status. Furthermore, we found no support for differential mPFC functioning, suggesting that amygdala hyper-responsivity to emotional facial perception in adults reporting CEM may be independent of top-down influences of the mPFC. These findings may be key to understanding the increased emotional sensitivity and interpersonal difficulties that have been reported in individuals with a history of CEM. PMID:22258799

  15. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa.

    Science.gov (United States)

    Vesker, Michael; Bahn, Daniela; Kauschke, Christina; Tschense, Monika; Degé, Franziska; Schwarzer, Gudrun

    2018-01-01

    In order to assess how the perception of audible speech and facial expressions influence one another for the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-years old), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes, and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

  16. Assessment of incongruent emotions in face and voice

    NARCIS (Netherlands)

    Takagi, S.; Tabei, K.-I.; Huis in 't Veld, E.M.J.; de Gelder, B.

    2013-01-01

    Information derived from facial and vocal nonverbal expressions plays an important role in social communication in the real and virtual worlds. In the present study, we investigated cultural differences between Japanese and Dutch participants in the multisensory perception of emotion. We used a face

  17. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is if a mobile robot without a face, the Nao, can

  18. Emotional faces influence evaluation of natural and transformed food.

    Science.gov (United States)

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence showed a direct relationship between feeding behavior and emotions. Despite this, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 pairs of visual stimuli composed of a neutral, happy, or disgusted face (5,000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1,500 ms). Food stimuli were grouped into pleasant foods, further divided into natural and transformed, and unpleasant rotten foods. The task consisted of judging the food's valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response based on the transformation level of the food. In general, the evaluation of natural foods was more rapid than that of transformed foods, perhaps because of their simplicity and healthier perception. In addition, transformed foods yielded responses incongruent with the preceding emotional face, whereas natural foods yielded responses congruent with it. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on the salience of pleasant food stimuli.

  19. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    Science.gov (United States)

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
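The component effects reported above (P100, N170, P200 amplitudes) are conventionally quantified as the mean voltage within a fixed post-stimulus time window. A hedged sketch of that measure on simulated single-channel data (the window bounds and sampling rate below are illustrative assumptions, not the study's recording parameters):

```python
import math

# Mean-amplitude measurement for an ERP component on a simulated epoch.
SRATE = 500  # Hz -> one sample every 2 ms (assumed, for illustration)

def mean_amplitude(erp, t_start_ms, t_end_ms, srate=SRATE):
    """Mean voltage (microvolts) of `erp` between two post-stimulus latencies."""
    i0 = int(t_start_ms * srate / 1000)
    i1 = int(t_end_ms * srate / 1000)
    window = erp[i0:i1]
    return sum(window) / len(window)

# Simulated 600-ms single-channel epoch: flat baseline plus a negative
# Gaussian deflection peaking at -5 microvolts around 170 ms (an N170-like wave).
erp = [-5.0 * math.exp(-((2 * i - 170) ** 2) / (2 * 20 ** 2))
       for i in range(300)]

# Average the voltage in a roughly N170-appropriate window (150-200 ms).
n170 = mean_amplitude(erp, 150, 200)
print(f"N170 mean amplitude: {n170:.2f} microvolts")
```

In a study like the one above, this per-trial (or per-condition average) amplitude is what enters the statistical comparison between the violent- and nonviolent-film groups.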

  20. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.


  2. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    Science.gov (United States)

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and that of the adults increased significantly for judgments of 7th-grade drawings. Happiness and sadness were decoded most accurately. There were no significant differences associated with sex. In their drawings, children used a symbol system that appears to be based on highlighting or exaggerating features of the innately governed facial expression of emotion.

  3. Vicarious Social Touch Biases Gazing at Faces and Facial Emotions.

    Science.gov (United States)

    Schirmer, Annett; Ng, Tabitha; Ebstein, Richard P

    2018-02-01

    Research has suggested that interpersonal touch promotes social processing and other-concern, and that women may respond to it more sensitively than men. In this study, we asked whether this phenomenon would extend to third-party observers who experience touch vicariously. In an eye-tracking experiment, participants (N = 64, 32 men and 32 women) viewed prime and target images with the intention of remembering them. Primes comprised line drawings of dyadic interactions with and without touch. Targets comprised two faces shown side-by-side, with one being neutral and the other being happy or sad. Analysis of prime fixations revealed that faces in touch interactions attracted longer gazing than faces in no-touch interactions. In addition, touch enhanced gazing at the area of touch in women but not men. Analysis of target fixations revealed that touch priming increased looking at both faces immediately after target onset, and subsequently, at the emotional face in the pair. Sex differences in target processing were nonsignificant. Together, the present results imply that vicarious touch biases visual attention to faces and promotes emotion sensitivity. In addition, they suggest that, compared with men, women are more aware of tactile exchanges in their environment. As such, vicarious touch appears to share important qualities with actual physical touch. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    Science.gov (United States)

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN and 42 healthy controls. Participants completed an emotion recognition task using faces portraying blended emotions, along with a body emotion recognition task using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions and a tendency to interpret non-angry faces as angry, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  5. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    Science.gov (United States)

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  6. Social and emotional relevance in face processing: Happy faces of future interaction partners enhance the LPP

    Directory of Open Access Journals (Sweden)

    Florian Bublatzky

    2014-07-01

    Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. Social relevance was manipulated by presenting pictures of two specific face actors as future interaction partners (meet condition), whereas two other face actors remained non-relevant. As a further control condition, all stimuli were presented without specific task instructions (passive viewing condition). A within-subject design (Facial Expression x Relevance x Task) was implemented, in which randomly ordered face stimuli of four actors (2 women) from the KDEF were presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of instructed social relevance. Whereas the meet condition was accompanied by unspecific effects regardless of relevance (P1, EPN), viewing potential interaction partners was associated with increased LPP amplitudes. The LPP was specifically enhanced for happy facial expressions of the future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.

  7. A face a mother could love: depression-related maternal neural responses to infant emotion faces.

    Science.gov (United States)

    Laurent, Heidemarie K; Ablow, Jennifer C

    2013-01-01

    Depressed mothers show negatively biased responses to their infants' emotional bids, perhaps due to faulty processing of infant cues. This study is the first to examine depression-related differences in mothers' neural response to their own infant's emotion faces, considering both effects of perinatal depression history and current depressive symptoms. Primiparous mothers (n = 22), half of whom had a history of major depressive episodes (with one episode occurring during pregnancy and/or postpartum), were exposed to images of their own and unfamiliar infants' joy and distress faces during functional neuroimaging. Group differences (depression vs. no-depression) and continuous effects of current depressive symptoms were tested in relation to neural response to own infant emotion faces. Compared to mothers with no psychiatric diagnoses, those with depression showed blunted responses to their own infant's distress faces in the dorsal anterior cingulate cortex. Mothers with higher levels of current symptomatology showed reduced responses to their own infant's joy faces in the orbitofrontal cortex and insula. Current symptomatology also predicted lower responses to own infant joy-distress in left-sided prefrontal and insula/striatal regions. These deficits in self-regulatory and motivational response circuits may help explain parenting difficulties in depressed mothers.

  8. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    Science.gov (United States)

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top + happy bottom), incongruent composite configurations (e.g., angry top + happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other, non-target part. Results indicate that the recognition of happy and angry expressions is neither strictly holistic nor analytic. Both routes were involved, but with different roles for analytic and holistic information depending on the emotion type, and different weights of local features between happy and angry expressions. Dissociable neural pathways were engaged depending on the emotional face configuration. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform and inferior occipital areas and the amygdala when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged the STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when features were seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by

  9. The Perception of Time While Perceiving Dynamic Emotional Faces

    Directory of Open Access Journals (Sweden)

    Wang On Li

    2015-08-01

    Emotion plays an essential role in the perception of time such that time is perceived to fly when events are enjoyable, while unenjoyable moments are perceived to drag. Previous studies have reported a time-drag effect when participants are presented with emotional facial expressions, regardless of the emotion presented. This effect can hardly be explained by induced emotion given the heterogeneous nature of emotional expressions. We conducted two experiments (n = 44 and n = 39) to examine the cognitive mechanism underlying this effect by presenting dynamic sequences of emotional expressions to participants. Each sequence started with a particular expression, then morphed to another. The presentation of dynamic facial expressions allows a comparison of the time-drag effect between homogeneous pairs of emotional expressions sharing similar valence and arousal and heterogeneous pairs. Sequences of seven durations (400 ms, 600 ms, 800 ms, 1,000 ms, 1,200 ms, 1,400 ms, 1,600 ms) were presented to participants, who were asked to judge whether each sequence was closer to 400 ms or 1,600 ms in a two-alternative forced-choice task. The data were then collated by condition and fit with cumulative Gaussian curves to estimate the point of subjective equivalence, i.e., the physical duration perceived as equal to 1,000 ms. Consistent with previous reports, a feeling of time dragging was induced regardless of the sequence presented, such that durations were perceived as longer than they actually were. In addition, dynamic facial expressions exerted a greater effect on perceived time drag than static expressions. The effect is most prominent when the dynamics involve an angry face or a change in valence. The significance of this sensitivity is discussed in terms of emotion perception and its evolutionary significance for our attention mechanism.
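    The psychometric procedure described in this record (fitting a cumulative Gaussian to two-alternative forced-choice responses to recover the point of subjective equivalence) can be sketched as follows. The seven probe durations match the study's design, but the response proportions and fitted values are illustrative, not the authors' data:

    ```python
    # Sketch: estimating the point of subjective equivalence (PSE) from
    # 2AFC duration judgments. Response proportions below are hypothetical,
    # chosen to illustrate a time-drag effect (PSE below 1,000 ms).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])  # ms
    # Proportion of "closer to 1,600 ms" responses at each probe duration.
    p_long = np.array([0.05, 0.15, 0.40, 0.65, 0.85, 0.95, 0.99])

    def cumulative_gaussian(x, mu, sigma):
        """Psychometric function: P("long") modeled as a cumulative Gaussian."""
        return norm.cdf(x, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(cumulative_gaussian, durations, p_long,
                               p0=[1000.0, 200.0])
    # mu is the PSE: the duration judged "long" on 50% of trials.
    print(f"PSE = {mu:.0f} ms, slope sigma = {sigma:.0f} ms")
    ```

    A fitted `mu` below 1,000 ms corresponds to the time-drag effect reported above: the physical duration that feels equivalent to 1,000 ms is shorter than 1,000 ms, so presented durations are overestimated.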

  10. Grounding Context in Face Processing: Color, Emotion and Gender

    Directory of Open Access Journals (Sweden)

    Sandrine Gil

    2015-03-01

    In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (versus green, mixed red/green, and achromatic) background, known to be valenced, on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task on facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was complemented by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender share a common valence-based dimension.

  11. Your emotion or mine: Labeling feelings alters emotional face perception- An ERP study on automatic and intentional affect labeling

    Directory of Open Access Journals (Sweden)

    Cornelia Herbert

    2013-07-01

    Empirical evidence suggests that words are powerful regulators of emotion processing. Although a number of studies have used words as contextual cues for emotion processing, the role of what is being labeled by the words (i.e., one's own emotion as compared to the emotion expressed by the sender) is poorly understood. The present study reports results from two experiments which used ERP methodology to evaluate the impact of emotional faces and self- versus sender-related emotional pronoun-noun pairs (e.g., my fear vs. his fear) as cues for emotional face processing. The influence of self- and sender-related cues on the processing of fearful, angry, and happy faces was investigated in two contexts: an automatic (Experiment 1) and an intentional affect labeling task (Experiment 2), along with control conditions of passive face processing. ERP patterns varied as a function of the label's reference (self vs. sender) and the intentionality of the labeling task (Experiment 1 vs. Experiment 2). In Experiment 1, self-related labels increased the motivational relevance of the emotional faces in the time-window of the EPN component. Processing of sender-related labels improved emotion recognition specifically for fearful faces in the N170 time-window. Spontaneous processing of affective labels modulated later stages of face processing as well: amplitudes of the late positive potential (LPP) were reduced for fearful, happy, and angry faces relative to the control condition of passive viewing. During intentional regulation (Experiment 2), amplitudes of the LPP were enhanced for emotional faces when subjects used the self-related emotion labels to label their own emotion during face processing, and they rated the faces as higher in arousal than the emotional faces that had been presented in the "label the sender's emotion" condition or the passive viewing condition. The present results argue in favor of a differentiated view of language-as-context for emotion processing.

  12. Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology.

    Science.gov (United States)

    Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis

    2018-05-06

    Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussed athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across sexes. (JINS, 2018, 24, 1-11).

  13. Reading emotions from faces in two indigenous societies.

    Science.gov (United States)

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression-the Universality Thesis-is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
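    When interpreting matching rates like those above, the relevant chance level depends on the number of faces in the response array; with five options, chance is 20%. A hedged sketch of that comparison using a binomial test (the trial count below is hypothetical, not taken from the studies):

    ```python
    # Sketch: is a matching rate above chance? With five faces in the
    # array, chance matching is 1/5. Trial counts here are hypothetical.
    from scipy.stats import binomtest

    n_trials = 36                        # hypothetical number of trials
    successes = round(0.58 * n_trials)   # e.g., 58% smiles-to-happiness matching
    result = binomtest(successes, n_trials, p=1/5, alternative="greater")
    print(f"{successes}/{n_trials} matches, p = {result.pvalue:.4f}")
    ```

    Even the "moderate" 56-58% matching rates for happiness would comfortably exceed a 20% chance level under these assumptions, whereas rates in the 7-22% range would not.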

  14. Passive and motivated perception of emotional faces: qualitative and quantitative changes in the face processing network.

    Directory of Open Access Journals (Sweden)

    Laurie R Skelly

    Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains.

  15. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals

    Science.gov (United States)

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-01-01

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task. PMID:17118930

  16. Differential emotion attribution to neutral faces of own and other races.

    Science.gov (United States)

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races with similar frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differences in visual experience, culture, racial stereotypes, or expectations about the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed a significant interaction between emotion attribution and face race on face-processing strategy, for example in the proportion of fixations on the eyes and in saccade amplitude. Additionally, pupil size was larger when processing Caucasian faces than when processing Chinese faces.

  17. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation durations on the face, which reflect a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  18. Veiled emotions: the effect of covered faces on emotion perception and attitudes

    NARCIS (Netherlands)

    Fischer, A.H.; Gillebaart, M.; Rotteveel, M.; Becker, D.; Vliek, M.

    2012-01-01

    The present study explores the relative absence of expressive cues and the effect of contextual cues on the perception of emotions and its effect on attitudes. The visibility of expressive cues was manipulated by showing films displaying female targets whose faces were either fully visible, covered

  19. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    Science.gov (United States)

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
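    The superadditivity test this record relies on is usually formalized as Miller's (1982) race model inequality: if emotion and identity are processed independently, the redundant-target response-time CDF can never exceed the sum of the two single-target CDFs. A minimal sketch with simulated reaction times (the RT distributions below are stand-ins, not the study's data):

    ```python
    # Sketch: testing Miller's (1982) race model inequality on reaction
    # times. The RT samples are simulated; a violation (redundant-target
    # CDF above the bound) is evidence against independent processing.
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical RTs (ms) for single-target and redundant-target trials.
    rt_emotion = rng.normal(620, 60, 200)      # emotion target alone
    rt_identity = rng.normal(640, 60, 200)     # identity target alone
    rt_redundant = rng.normal(540, 50, 200)    # both targets present

    def ecdf(samples, t):
        """Empirical cumulative distribution function evaluated at times t."""
        samples = np.sort(samples)
        return np.searchsorted(samples, t, side="right") / samples.size

    t = np.linspace(350, 700, 100)
    # Race model bound: F_red(t) <= F_emotion(t) + F_identity(t), capped at 1.
    bound = np.minimum(ecdf(rt_emotion, t) + ecdf(rt_identity, t), 1.0)
    violation = ecdf(rt_redundant, t) - bound
    # Positive values mean the redundant-target CDF exceeds the race model
    # bound at those latencies, implying coactivation (superadditivity).
    print("max violation:", violation.max())
    ```

    In the abstract above, "performance violated the predictions from a model assuming independent processing" corresponds to a positive violation of this bound at some latencies.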

  20. Are Max-Specified Infant Facial Expressions during Face-to-Face Interaction Consistent with Differential Emotions Theory?

    Science.gov (United States)

    Matias, Reinaldo; Cohn, Jeffrey F.

    1993-01-01

    Examined infant facial expressions at two, four, and six months of age during face-to-face play and a still-face interaction with their mothers. Contrary to differential emotions theory, at no age did proportions or durations of discrete and blended negative expressions differ; they also showed different patterns of developmental change. (MM)

  1. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Science.gov (United States)

    Clayson, Peter E; Larson, Michael J

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  2. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Directory of Open Access Journals (Sweden)

    Peter E Clayson

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  3. Facing emotions in narcolepsy with cataplexy: haemodynamic and behavioural responses during emotional stimulation.

    Science.gov (United States)

    de Zambotti, Massimiliano; Pizza, Fabio; Covassin, Naima; Vandi, Stefano; Cellini, Nicola; Stegagno, Luciano; Plazzi, Giuseppe

    2014-08-01

    Narcolepsy with cataplexy is a complex sleep disorder that affects the modulation of emotions: cataplexy, the key symptom of narcolepsy, is indeed strongly linked with emotions that usually trigger the episodes. Our study aimed to investigate haemodynamic and behavioural responses during emotional stimulation in narco-cataplexy. Twelve adult drug-naive narcoleptic patients (five males; age: 33.3 ± 9.4 years) and 12 healthy controls (five males; age: 30.9 ± 9.5 years) were exposed to emotional stimuli (pleasant, unpleasant and neutral pictures). Heart rate, arterial blood pressure and mean cerebral blood flow velocity of the middle cerebral arteries were continuously recorded using photoplethysmography and Doppler ultrasound. Ratings of valence and arousal and coping strategies were scored by the Self-Assessment Manikin and by questionnaires, respectively. Narcoleptic patients' haemodynamic responses to pictures overlapped with the data obtained from controls: decrease of heart rate and increase of mean cerebral blood flow velocity regardless of pictures' content, increase of systolic blood pressure during the pleasant condition, and relative reduction of heart rate during pleasant and unpleasant conditions. However, when compared with controls, narcoleptic patients reported lower arousal scores during the pleasant and neutral stimulation, and lower valence scores during the pleasant condition, respectively, and also a lower score at the 'focus on and venting of emotions' dimensions of coping. Our results suggested that adult narcoleptic patients, compared with healthy controls, inhibited their emotion-expressive behaviour to emotional stimulation, and that may be related to the development of adaptive cognitive strategies to face emotions avoiding cataplexy. © 2014 European Sleep Research Society.

  4. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    Science.gov (United States)

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

    Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical emotional-face discrimination task, we observed that discrimination of fearful versus neutral faces was reduced, but not of happy versus neutral faces. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity in the right dorsolateral prefrontal cortex that was evoked by fearful faces. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by increased attention to negative stimuli. Copyright © 2015 Elsevier B.V. and ECNP. All rights reserved.

  5. Adult age-differences in subjective impression of emotional faces are reflected in emotion-related attention and memory tasks

    Directory of Open Access Journals (Sweden)

    Joakim Svard

    2014-05-01

    Although younger and older adults appear to attend to and remember emotional faces differently, less is known about age-related differences in the subjective emotional impression (arousal, potency, and valence) of emotional faces and how these differences, in turn, are reflected in age differences in various emotional tasks. In the current study, we used the same facial emotional stimuli (angry and happy faces) in four tasks: emotional rating, attention, categorical perception, and visual short-term memory (VSTM). The aim of this study was to investigate effects of age on the subjective emotional impression of angry and happy faces and to examine whether any age differences were mirrored in measures of emotional behavior (attention, categorical perception, and memory). In addition, regression analyses were used to further study impression-behavior associations. Forty younger adults (range 20-30 years) and thirty-nine older adults (range 65-75 years) participated in the experiment. The emotional rating task showed that older adults perceived less arousal, potency, and valence than younger adults and that the difference was more pronounced for angry than happy faces. Similarly, the results of the attention and memory tasks demonstrated interaction effects between emotion and age, and age differences on these measures were larger for angry than for happy faces. Regression analyses confirmed that in both age groups, higher potency ratings predicted both visual search and visual short-term memory efficiency. Future studies should consider the possibility that age differences in the subjective emotional impression of facial emotional stimuli may explain age differences in attention to and memory of such stimuli.

  6. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation.

    Science.gov (United States)

    Jeong, Jeong-Won; Diwadkar, Vaibhav A; Chugani, Carla D; Sinsoongsud, Piti; Muzik, Otto; Behen, Michael E; Chugani, Harry T; Chugani, Diane C

    2011-02-14

    The powerful emotion inducing properties of music are well-known, yet music may convey differing emotional responses depending on environmental factors. We hypothesized that neural mechanisms involved in listening to music may differ when presented together with visual stimuli that conveyed the same emotion as the music when compared to visual stimuli with incongruent emotional content. We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces) congruent or incongruent for emotional content on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment was conducted in the context of a conventional block-design experiment. A block consisted of three emotional ON periods, music alone (happy or sad music), face alone (happy or sad faces), and music combined with faces where the music excerpt was played while presenting either congruent emotional faces or incongruent emotional faces. We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy congruence and sad congruence, i.e., the activation of STG when happy music was presented with happy faces was greater than the activation seen when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in STG and elicited greater signal change in bilateral FG. Behavioral testing supplemented these findings by showing that subject ratings of emotion in faces were influenced by emotion in music. When presented with happy music, happy faces were rated as more happy (p=0.051) and sad faces were rated as less sad (p=0.030). When presented with sad music, happy faces were rated as less

  7. Detection of Emotional Faces: Salient Physical Features Guide Effective Visual Search

    Science.gov (United States)

    Calvo, Manuel G.; Nummenmaa, Lauri

    2008-01-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent,…

  8. Electrophysiological correlates of emotional face processing in typically developing adults and adults with high functioning Autism

    OpenAIRE

    Barrie, Jennifer Nicole

    2012-01-01

    Emotional expressions have been found to affect various event-related potentials (ERPs). Furthermore, socio-emotional functioning is altered in individuals with autism, and a growing body of neuroimaging and electrophysiological evidence substantiates underlying neural differences for face processing in this population. However, relatively few studies have examined the time-course of emotional face processing in autism. This study examined how implicit (not the intended focus of attention) ve...

  9. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks

    DEFF Research Database (Denmark)

    Grady, Cheryl Lynn; Siebner, Hartwig R; Hornboll, Bettina

    2013-01-01

    Pharmacological manipulation of serotonin availability can alter the processing of facial expressions of emotion. Using a within-subject design, we measured the effect of serotonin on the brain's response to aversive face emotions with functional MRI while 20 participants judged the gender...... of neutral, fearful and angry faces. In three separate and counterbalanced sessions, participants received citalopram (CIT) to raise serotonin levels, underwent acute tryptophan depletion (ATD) to lower serotonin, or were studied without pharmacological challenge (Control). An analysis designed to identify...

  10. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

    OpenAIRE

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emo...

  11. Emotion regulation in mothers and young children faced with trauma.

    Science.gov (United States)

    Pat-Horenczyk, Ruth; Cohen, S; Ziv, Y; Achituv, M; Asulin-Peretz, L; Blanchard, T R; Schiff, M; Brom, D

    2015-01-01

    The present study investigated maternal emotion regulation as mediating the association between maternal posttraumatic stress symptoms and children's emotional dysregulation in a community sample of 431 Israeli mothers and children exposed to trauma. Little is known about the specific pathways through which maternal posttraumatic symptoms and deficits in emotion regulation contribute to emotional dysregulation. Inspired by the intergenerational process of relational posttraumatic stress disorder (PTSD), in which posttraumatic distress is transmitted from mothers to children, we suggest an analogous concept of relational emotion regulation, by which maternal emotion regulation problems may contribute to child emotion regulation deficits. Child emotion regulation problems were measured using the Child Behavior Checklist-Dysregulation Profile (CBCL-DP; T.M. Achenbach & I. Rescorla, 2000), which comprises three subscales of the CBCL: Attention, Aggression, and Anxiety/Depression. Maternal PTSD symptoms were assessed by the Posttraumatic Diagnostic Scale (E.B. Foa, L. Cashman, L. Jaycox, & K. Perry, 1997) and maternal emotion regulation by the Difficulties in Emotion Regulation Scale (K.L. Gratz & L. Roemer, 2004). Results showed that the child's emotion regulation problems were associated with both maternal posttraumatic symptoms and maternal emotion dysregulation. Further, maternal emotion regulation mediated the association between maternal posttraumatic symptoms and the child's regulation deficits. These findings highlight the central role of mothers' emotion regulation skills in the aftermath of trauma as it relates to children's emotion regulation skills. The degree of mothers' regulatory skills in the context of posttraumatic stress symptoms reflects a key process through which the intergenerational transmission of trauma may occur. Study results have critical implications for planning and developing clinical interventions geared toward the treatment of…
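The mediation analysis described in this record follows the standard product-of-coefficients decomposition. The sketch below uses hypothetical path coefficients chosen only to show the arithmetic; they are not the study's estimates.

```python
# Toy illustration of the mediation logic: the indirect effect of maternal
# PTSD symptoms on child dysregulation, transmitted through maternal emotion
# regulation (ER) difficulties, is the product of the two path coefficients.
a = 0.45  # hypothetical path: maternal PTSD symptoms -> maternal ER difficulties
b = 0.50  # hypothetical path: maternal ER difficulties -> child dysregulation
c = 0.40  # hypothetical total effect: maternal PTSD symptoms -> child dysregulation

indirect_effect = a * b              # effect carried through the mediator
direct_effect = c - indirect_effect  # residual direct effect (c' in the usual notation)
```

A significant indirect term (typically assessed with bootstrap confidence intervals) is what supports a mediation claim of this kind.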

  12. Time for a Change: College Students' Preference for Technology-Mediated Versus Face-to-Face Help for Emotional Distress.

    Science.gov (United States)

    Lungu, Anita; Sun, Michael

    2016-12-01

    Even with recent advances in psychological treatments and mobile technology, online computerized therapy is not yet popular. College students, with ubiquitous access to technology, experiencing high distress, and often nontreatment seekers, could be an important area for online treatment dissemination. Finding ways to reach out to college students by offering psychological interventions through technology, devices, and applications they often use, might increase their engagement in treatment. This study evaluates college students' reported willingness to seek help for emotional distress through novel delivery mediums, to play computer games for learning emotional coping skills, and to disclose personal information online. We also evaluated the role of ethnicity and level of emotional distress in help-seeking patterns. A survey exploring our domains of interest and the Mental Health Inventory (MHI; as a mental health index) were completed by 572 students (mean age 18.7 years, predominantly Asian American, female, and freshmen in college). More participants expressed preference for online versus face-to-face professional help. We found no relationship between MHI and help-seeking preference. A third of participants were likely to disclose at least as much information online as face-to-face. Ownership of mobile technology was pervasive. Asian Americans were more likely to be nontreatment seekers than Caucasians. Most participants were interested in serious games for emotional distress. Our results suggest that college students are very open to creative ways of receiving emotional help such as playing games and seeking emotional help online, suggesting a need for online evidence-based treatments.

  13. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    Directory of Open Access Journals (Sweden)

    Mariska eKret

    2013-02-01

    Full Text Available Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants’ fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions measured with electromyography (EMG). The behavioral results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, also vice versa. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. What we predicted and found was that angry and fearful cues, from the face or the body, attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that congruency between the emotional signals from the face and body ameliorates the recognition of the emotion.

  14. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    Science.gov (United States)

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

    Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence has suggested that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type and emotion intensity in face processing for individuals with and without ASD. Twelve- and 18-22-year-old children with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to typically developing (TD) individuals, at 12 years of age and during adulthood, individuals with ASD showed slower N170 to emotional faces. While the TD groups' P1 latency was significantly shorter in adults when compared to 12-year-olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or stimuli more broadly), among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Young Adults with Autism Spectrum Disorder Show Early Atypical Neural Activity during Emotional Face Processing

    Directory of Open Access Journals (Sweden)

    Rachel C. Leung

    2018-02-01

    Full Text Available Social cognition is impaired in autism spectrum disorder (ASD). The ability to perceive and interpret affect is integral to successful social functioning and has an extended developmental course. However, the neural mechanisms underlying emotional face processing in ASD are unclear. Using magnetoencephalography (MEG), the present study explored neural activation during implicit emotional face processing in young adults with and without ASD. Twenty-six young adults with ASD and 26 healthy controls were recruited. Participants indicated the location of a scrambled pattern (target) that was presented alongside a happy or angry face. Emotion-related activation sources for each emotion were estimated using the Empirical Bayes Beamformer (pcorr ≤ 0.001) in Statistical Parametric Mapping 12 (SPM12). Emotional faces elicited elevated fusiform, amygdala and anterior insula and reduced anterior cingulate cortex (ACC) activity in adults with ASD relative to controls. Within-group comparisons revealed that angry vs. happy faces elicited distinct neural activity in typically developing adults; there was no distinction in young adults with ASD. Our data suggest that difficulties in affect processing in ASD reflect atypical recruitment of traditional emotional processing areas. These early differences may contribute to difficulties in deriving social reward from faces, ascribing salience to faces, and an immature threat processing system, which collectively could result in deficits in emotional face processing.

  16. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    Directory of Open Access Journals (Sweden)

    Kris Evers

    2014-01-01

    Full Text Available Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  17. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    Science.gov (United States)

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  18. An emotional Stroop task with faces and words. A comparison of young and older adults.

    Science.gov (United States)

    Agustí, Ana I; Satorres, Encarnación; Pitarque, Alfonso; Meléndez, Juan C

    2017-08-01

    Given the contradictions of previous studies on the changes in attentional responses produced in aging, a Stroop emotional task was proposed to compare young and older adults' responses to words or faces with an emotional valence. The words happy or sad were superimposed on faces that express the emotion of happiness or sadness. The emotion expressed by the word and the face could agree or not (cued and uncued trials, respectively). 85 young and 66 healthy older adults had to identify both faces and words separately, and the interference between the two types of stimuli was examined. An interference effect was observed for both types of stimuli in both groups. There was more interference on positive faces and words than on negative stimuli. Older adults had more difficulty than younger adults in focusing on positive uncued trials, whereas there was no difference across samples on negative uncued trials. Copyright © 2017 Elsevier Inc. All rights reserved.
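The interference effect this record reports is conventionally computed as the difference in mean response time between uncued (incongruent) and cued (congruent) trials, separately per stimulus type. A minimal sketch with invented reaction times (not the study's data):

```python
# Stroop interference: mean RT on uncued (incongruent) trials minus
# mean RT on cued (congruent) trials, in milliseconds.
def interference(rt_uncued, rt_cued):
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_uncued) - mean(rt_cued)

# Illustrative RTs (ms); a larger value means more interference.
faces_effect = interference([700, 720, 710], [640, 660, 650])
words_effect = interference([680, 690, 700], [650, 660, 670])
```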

  19. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions.

    Science.gov (United States)

    Prete, Giulia; Laeng, Bruno; Tommasi, Luca

    2014-01-01

    It is well known that hemispheric asymmetries exist for both the analyses of low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of the above factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, when the emotion is displayed in the low-frequency information while an image of the same face with a neutral expression is superimposed to it. Despite hybrid faces being perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions also with lateral presentations. Happy faces were rated as more friendly and angry faces as less friendly with respect to neutral faces. In general, hybrid faces were evaluated as less friendly when they were presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. The results extend the validity of the valence hypothesis in the specific domain of unaware (subcortical) emotion processing.
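Hybrid faces of the kind this record describes are commonly built by combining the low spatial frequencies of an emotional image with the high spatial frequencies of a neutral image. The sketch below assumes a simple box blur as the low-pass filter and grayscale images as nested lists; it illustrates the general technique, not the authors' actual stimulus pipeline.

```python
def box_blur(img, k=1):
    """Crude low-pass filter: mean over a (2k+1) x (2k+1) neighborhood."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            patch = [img[yy][xx]
                     for yy in range(max(0, y - k), min(h, y + k + 1))
                     for xx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(patch) / len(patch)
    return out

def hybrid_face(emotional, neutral, k=1):
    """Low frequencies from the emotional image + high frequencies from the neutral one."""
    low = box_blur(emotional, k)                 # emotional content, low-pass
    blurred_neutral = box_blur(neutral, k)
    return [[low[y][x] + (neutral[y][x] - blurred_neutral[y][x])  # neutral high-pass
             for x in range(len(low[0]))] for y in range(len(low))]
```

One sanity check of the construction: when both inputs are the same image, the low- and high-pass parts recombine to the original image exactly.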

  20. Emotional Faces Capture Spatial Attention in 5-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Kit K. Elam

    2010-10-01

    Full Text Available Emotional facial expressions are important social cues that convey salient affective information. Infants, younger children, and adults all appear to orient spatial attention to emotional faces with a particularly strong bias to fearful faces. Yet in young children it is unclear whether or not both happy and fearful faces extract attention. Given that the processing of emotional faces is believed by some to serve an evolutionarily adaptive purpose, attentional biases to both fearful and happy expressions would be expected in younger children. However, the extent to which this ability is present in young children and whether or not this ability is genetically mediated is untested. Therefore, the aims of the current study were to assess the spatial-attentional properties of emotional faces in young children, with a preliminary test of whether this effect was influenced by genetics. Five-year-old twin pairs performed a dot-probe task. The results suggest that children preferentially direct spatial attention to emotional faces, particularly right visual field faces. The results provide support for the notion that the direction of spatial attention to emotional faces serves an evolutionarily adaptive function and may be mediated by genetic mechanisms.
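In a dot-probe task like the one this record uses, attentional bias toward emotional faces is usually scored as the mean RT when the probe replaces the neutral face minus the mean RT when it replaces the emotional face; positive values indicate attention drawn to the emotional face. A minimal sketch with illustrative reaction times (not the study's data):

```python
# Dot-probe bias score in ms: positive = participants were faster when the
# probe appeared at the emotional face's location (attention was already there).
def attention_bias(rt_probe_at_neutral, rt_probe_at_emotional):
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_emotional)

bias = attention_bias([520, 540, 530], [480, 500, 490])  # illustrative RTs (ms)
```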

  1. Modulation of the composite face effect by unintended emotion cues

    OpenAIRE

    Gray, Katie L. H.; Murphy, Jennifer; Marsh, Jade E.; Cook, Richard

    2017-01-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers ‘fuse’ the two halves together, perceptually. The illusory distortion induced by task-irrelevant (‘distractor’) halves hinders participants’ judgments about task-relevant (‘target’) halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, obser...

  2. How Context Influences Our Perception of Emotional Faces

    DEFF Research Database (Denmark)

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel

    2017-01-01

    Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions...... corresponding to one of the so called ‘basic emotions.’ However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early...... twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have...

  3. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    Science.gov (United States)

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoker male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.

  4. Social anhedonia is associated with neural abnormalities during face emotion processing.

    Science.gov (United States)

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a…

  6. Cyber Victimization in High School: Measurement, Overlap with Face-to-Face Victimization, and Associations with Social-Emotional Outcomes

    Science.gov (United States)

    Brown, Christina Flynn; Demaray, Michelle Kilpatrick; Tennant, Jaclyn E.; Jenkins, Lyndsay N.

    2017-01-01

    Cyber victimization is a contemporary problem facing youth and adolescents (Diamanduros, Downs, & Jenkins, 2008; Kowalski & Limber, 2007). It is imperative for researchers and school personnel to understand the associations between cyber victimization and student social-emotional outcomes. This article explores (a) gender differences in…

  7. Memory for faces and voices varies as a function of sex and expressed emotion.

    Science.gov (United States)

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
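The accuracy measure used in this record (hits minus false alarms) can be sketched as follows; the trial counts below are invented purely for illustration.

```python
# Corrected recognition accuracy for an old/new memory test:
# hit rate (old items called "old") minus false-alarm rate (new items called "old").
def recognition_accuracy(hits, misses, false_alarms, correct_rejections):
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return hit_rate - fa_rate

# Hypothetical counts for two expression conditions:
neutral_acc = recognition_accuracy(hits=40, misses=10, false_alarms=5, correct_rejections=45)
angry_acc = recognition_accuracy(hits=35, misses=15, false_alarms=10, correct_rejections=40)
```

Subtracting the false-alarm rate corrects for response bias, which is why an emotion can raise "remember" hit rates without raising accuracy, as the record notes.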

  8. The Effect of Self-Referential Expectation on Emotional Face Processing.

    Directory of Open Access Journals (Sweden)

    Mel McKendrick

    Full Text Available The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  9. Memory for faces and voices varies as a function of sex and expressed emotion.

    Directory of Open Access Journals (Sweden)

    Diana S Cortes

    Full Text Available We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.

  10. Brain and behavioral inhibitory control of kindergartners facing negative emotions.

    Science.gov (United States)

    Farbiash, Tali; Berger, Andrea

    2016-09-01

    Inhibitory control (IC) - one of the most critical functions underlying a child's ability to self-regulate - develops significantly throughout the kindergarten years. Experiencing negative emotions imposes challenges on executive functioning and may specifically affect IC. In this study, we examined kindergartners' IC and its related brain activity during a negative emotional situation: 58 children (aged 5.5-6.5 years) performed an emotion-induction Go/NoGo task. During this task, we recorded children's performance and brain activity, focusing on the fronto-central N2 component in the event-related potential (ERP) and the power of its underlying theta frequency. Compared to Go trials, inhibition of NoGo trials was associated with larger N2 amplitudes and theta power. The negative emotional experience resulted in better IC performance and, at the brain level, in larger theta power. Source localization of this effect showed that the brain activity related to IC during the negative emotional experience was principally generated in the posterior frontal regions. Furthermore, the band power measure was found to be a more sensitive index for children's inhibitory processes than N2 amplitudes. This is the first study to focus on kindergartners' IC while manipulating their emotional experience to induce negative emotions. Our findings suggest that a kindergartner's experience of negative emotion can result in improved IC and increases in associated aspects of brain activity. Our results also suggest the utility of time-frequency analyses in the study of brain processes associated with response inhibition in young children. © 2015 John Wiley & Sons Ltd.
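The theta-band power measure this record describes amounts to summing squared DFT magnitudes over the bins falling in the 4-7 Hz range. The sketch below uses a plain DFT on a synthetic one-second signal; it is not the study's EEG pipeline, and the sampling rate is an assumption.

```python
import math

# Power in a frequency band, via a direct DFT (illustrative, O(n^2); real EEG
# pipelines use FFTs or wavelets over many epochs and electrodes).
def band_power(signal, fs, f_lo, f_hi):
    n = len(signal)
    total = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / (n * n)
    return total

fs = 128  # Hz, an assumed sampling rate
sig = [math.sin(2 * math.pi * 6.0 * t / fs) for t in range(fs)]  # 1 s of a 6 Hz tone
theta = band_power(sig, fs, 4, 7)   # captures the 6 Hz component
alpha = band_power(sig, fs, 8, 12)  # near zero for this signal
```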

  11. Age-Group Differences in Interference from Young and Older Emotional Faces.

    Science.gov (United States)

    Ebner, Natalie C; Johnson, Marcia K

    2010-11-01

    Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

  12. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    Science.gov (United States)

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of surprised, disgusted, fearful, happy, and neutral facial expressions, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces, indicating that more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    Science.gov (United States)

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) was sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions, and the left amygdala might be involved in decoding or evaluating expressive faces during early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, reflecting its evolutionary significance (survival). These findings demonstrate the asymmetric engagement of the bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  14. Early life stress and trauma and enhanced limbic activation to emotionally valenced faces in depressed and healthy children.

    Science.gov (United States)

    Suzuki, Hideo; Luby, Joan L; Botteron, Kelly N; Dietrich, Rachel; McAvoy, Mark P; Barch, Deanna M

    2014-07-01

    Previous studies have examined the relationships between structural brain characteristics and early life stress in adults. However, there is limited evidence for functional brain variation associated with early life stress in children. We hypothesized that early life stress and trauma would be associated with increased functional brain activation response to negative emotional faces in children with and without a history of depression. Psychiatric diagnosis and life events in children (starting at age 3-5 years) were assessed in a longitudinal study. A follow-up magnetic resonance imaging (MRI) study acquired data (N = 115 at ages 7-12, 51% girls) on functional brain response to fearful, sad, and happy faces relative to neutral faces. We used a region-of-interest mask within cortico-limbic areas and conducted regression analyses and repeated-measures analysis of covariance. Greater activation responses to fearful, sad, and happy faces in the amygdala and its neighboring regions were found in children with greater life stress. Moreover, an association between life stress and left hippocampal and globus pallidus activity depended on children's diagnostic status. Finally, all children with greater life trauma showed greater bilateral amygdala and cingulate activity specific to sad faces but not the other emotional faces, although right amygdala activity was moderated by psychiatric status. These findings suggest that limbic hyperactivity may be a biomarker of early life stress and trauma in children and may have implications in the risk trajectory for depression and other stress-related disorders. However, this pattern varied based on emotion type and history of psychopathology. Copyright © 2014 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  15. Investigating emotional top down modulation of ambiguous faces by single pulse TMS on early visual cortices

    Directory of Open Access Journals (Sweden)

    Zachary Adam Yaple

    2016-06-01

Top-down processing is a mechanism in which memory, context and expectation are used to perceive stimuli. In this study we investigated how emotional content, induced by music mood, influences the perception of happy and sad emoticons. Using single-pulse TMS we stimulated the right occipital face area (rOFA), primary visual cortex (V1) and the vertex while subjects performed a face-detection task and listened to happy and sad music. At baseline, incongruent audio-visual pairings decreased performance, demonstrating that the perception of ambiguous faces depends on emotional context. However, face-identification performance decreased during rOFA stimulation regardless of emotional content. No effects were found for Cz (vertex) or V1 stimulation. These results suggest that the rOFA is important for processing faces regardless of emotion, whereas V1 stimulation had no effect. Our findings suggest that early visual cortex activity may not integrate emotional auditory information with visual information during top-down emotional modulation of faces.

  16. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-04-01

Neuropsychological studies have underlined the presence of distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear specific to emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among others, recent studies have examined the contribution of the two hemispheres to face comprehension as a function of emotion type (mainly the positive vs. negative distinction) and of the specific task (comprehending vs. producing facial expressions). An overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how it can become a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces under conscious vs. unconscious processing.

  17. The Impact of Top-Down Prediction on Emotional Face Processing in Social Anxiety

    Directory of Open Access Journals (Sweden)

    Guangming Ran

    2017-07-01

There is evidence that people with social anxiety show abnormal processing of emotional faces. To investigate the impact of top-down prediction on emotional face processing in social anxiety, brain responses of participants with high social anxiety (HSA) and low social anxiety (LSA) were recorded while they performed a variation of an emotional task, using high-temporal-resolution event-related potential techniques. Behaviorally, we observed an effect of prediction, with higher accuracy for predictable than unpredictable faces. Furthermore, we found that HSA, but not LSA, participants recognized angry faces more accurately than happy faces. For the P100 and P200 components, HSA participants showed enhanced brain activity for angry compared to happy faces, suggesting hypervigilance to angry faces. Importantly, HSA participants exhibited larger N170 amplitudes at right-hemisphere electrodes than LSA participants when they observed unpredictable angry faces, but not when the angry faces were predictable. This probably reflects top-down prediction compensating for a deficiency in building a holistic face representation in HSA participants.

  18. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    Science.gov (United States)

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Age-related differences in event-related potentials for early visual processing of emotional faces.

    Science.gov (United States)

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

    With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  20. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    Science.gov (United States)

    R Innes, Bobby; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  1. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments

    OpenAIRE

    Murphy, Karen; Ward, Zoe

    2017-01-01

Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for id...

  2. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    Science.gov (United States)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

Human emotion identification from image sequences is in high demand nowadays. Possible applications range from the automatic smile-shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between a building space and its residents. The highly perceptual nature of human emotions makes their classification and identification complex. The main difficulty arises from the subjective quality of the emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions have been developed in musical psychology. This work focuses on identifying human emotions evoked by musical pieces using human face tracking and optical flow analysis. A facial feature tracking algorithm used for estimating facial feature speed and position is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique provides robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in such fields as emotion-based musical background or mood-dependent radio.
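The record describes estimating relative speeds of tracked facial features. As a simplified, hypothetical stand-in for the full LBP-tracking and optical-flow pipeline, per-frame feature speeds can be computed from tracked landmark coordinates by finite differences:

```python
import math

def feature_speeds(positions, fps):
    """Per-frame speed (pixels/second) of one tracked facial feature.

    positions: list of (x, y) coordinates, one per video frame.
    fps: frames per second of the source sequence.
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        # displacement per frame times frame rate = pixels per second
        speeds.append(math.hypot(x1 - x0, y1 - y0) * fps)
    return speeds

# Hypothetical mouth-corner track over 3 frames at 25 fps
track = [(100.0, 200.0), (103.0, 204.0), (103.0, 204.0)]
print(feature_speeds(track, fps=25))  # [125.0, 0.0]
```

In the actual system, per-feature displacements would come from optical flow between consecutive frames rather than from pre-tracked coordinates, but the speed estimate feeding the emotion vector has this form.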

  3. Cognitive Biases for Emotional Faces in High- and Low-Trait Depressive Participants

    Directory of Open Access Journals (Sweden)

    Yi-Hsing Hsieh

    2004-10-01

This study examined the association between trait depression and information-processing biases. Thirty participants were divided into high- and low-trait depressive groups based on the median of their depressive subscale scores on the Basic Personality Inventory. Information-processing biases were measured using a deployment-of-attention task (DOAT) and a recognition memory task (RMT). For the DOAT, participants saw one emotional face paired with a neutral face of the same person, and then were forced to choose on which face the color patch had first occurred. The percentage of participants' choices favoring the happy, angry, or sad faces represented the selective attentional bias score for each emotion, respectively. For the RMT, participants rated different types of emotional faces and subsequently discriminated old faces from new faces. The memory strength for each type of face was calculated from hit and false-positive rates, based on signal detection theory. Compared with the low-trait depressive group, the high-trait depressive group showed a negative cognitive style: enhanced recognition memory for sad faces and weakened inhibition of attending to sad faces, suggesting that those with a high depressive trait may be vulnerable to interpersonal withdrawal.
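The record states that memory strength was derived from hit and false-positive rates using signal detection theory. The abstract does not give the formulas, so the following is the textbook equal-variance computation of sensitivity (d') and response criterion (c), with hypothetical rates:

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Sensitivity d' and response bias c under the equal-variance
    Gaussian signal detection model."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical rates for sad faces: symmetric hits and false alarms give c = 0
d, c = dprime_and_criterion(0.84, 0.16)  # d ≈ 1.99, c ≈ 0
```

A larger d' for sad faces in the high-trait group would correspond to the enhanced recognition memory for sad faces reported above; c captures the overall tendency to respond "old" regardless of face type.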

  4. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  5. Face-body integration of intense emotional expressions of victory and defeat.

    Directory of Open Access Journals (Sweden)

    Lili Wang

Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for stimuli with losing compared to winning bodies. The main effect of body expression was also observed for the N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies when face and body conveyed congruent emotional signals. Beyond the knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.

  6. The impact of emotional faces on social motivation in schizophrenia.

    Science.gov (United States)

    Radke, Sina; Pfersmann, Vera; Derntl, Birgit

    2015-10-01

    Impairments in emotion recognition and psychosocial functioning are a robust phenomenon in schizophrenia and may affect motivational behavior, particularly during socio-emotional interactions. To characterize potential deficits and their interplay, we assessed social motivation covering various facets, such as implicit and explicit approach-avoidance tendencies to facial expressions, in 27 patients with schizophrenia (SZP) and 27 matched healthy controls (HC). Moreover, emotion recognition abilities as well as self-reported behavioral activation and inhibition were evaluated. Compared to HC, SZP exhibited less pronounced approach-avoidance ratings to happy and angry expressions along with prolonged reactions during automatic approach-avoidance. Although deficits in emotion recognition were replicated, these were not associated with alterations in social motivation. Together with additional connections between psychopathology and several approach-avoidance processes, these results identify motivational impairments in SZP and suggest a complex relationship between different aspects of social motivation. In the context of specialized interventions aimed at improving social cognitive abilities in SZP, the link between such dynamic measures, motivational profiles and functional outcomes warrants further investigations, which can provide important leverage points for treatment. Crucially, our findings present first insights into the assessment and identification of target features of social motivation.

  7. Amygdala hypersensitivity in response to emotional faces in Tourette's patients.

    Science.gov (United States)

    Neuner, Irene; Kellermann, Thilo; Stöcker, Tony; Kircher, Tilo; Habel, Ute; Shah, Jon N; Schneider, Frank

    2010-10-01

Tourette's syndrome is characterised by motor and vocal tics as well as a high level of impulsivity and emotional dysregulation. Neuroimaging studies point to structural changes of the basal ganglia, prefrontal cortex and parts of the limbic system. However, there is no established link between behavioural symptoms and the structural changes in the amygdala. One aspect of daily social interaction is the perception of emotional facial expressions, closely linked to amygdala function. We therefore investigated via fMRI the implicit discrimination of six emotional facial expressions in 19 adult Tourette's patients. In comparison to the healthy control group, Tourette's patients showed significantly higher amygdala activation, especially pronounced for fearful, angry and neutral expressions. The BOLD activity of the left amygdala correlated negatively with the personality trait extraversion. We discuss these findings as a result of either deficient frontal inhibition due to structural changes or a desynchronization in the interaction of the cortico-striato-thalamo-cortical network within structures of the limbic system. Our data show an altered pattern of implicit emotion discrimination and emphasize the need to consider motor and non-motor symptoms of Tourette's syndrome in the choice of both behavioural and pharmacological treatment.

  8. A new method for face detection in colour images for emotional bio-robots

    Institute of Scientific and Technical Information of China (English)

    HAPESHI; Kevin

    2010-01-01

Emotional bio-robots have become a hot research topic in the last two decades. Though there has been some progress in the research, design and development of various emotional bio-robots, few of them can be used in practical applications. The study of emotional bio-robots demands multi-disciplinary co-operation. It involves computer science, artificial intelligence, 3D computation, engineering system modelling, analysis and simulation, bionics engineering, automatic control, image processing and pattern recognition, etc. Among these, face detection belongs to image processing and pattern recognition. An emotional robot must have the ability to recognize various objects; in particular, it is very important for a bio-robot to be able to recognize human faces in an image. In this paper, a face detection method is proposed for identifying any human faces in colour images using a human skin model and an eye detection method. First, the method detects skin regions in the input colour image after normalizing its luminance. Then, all face candidates are identified using an eye detection method. Compared with existing algorithms, this method relies only on the colour and geometrical data of the human face rather than on training datasets. Experimental results show that this method is effective and fast, and that it can be applied to the development of an emotional bio-robot with further improvements in speed and accuracy.
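The paper's exact skin model is not specified in the abstract. A common approach of this kind, sketched here with chrominance thresholds frequently cited in the skin-detection literature (not necessarily the authors' values), classifies pixels by their Cr/Cb components after a BT.601 RGB-to-YCrCb conversion:

```python
def rgb_to_crcb(r, g, b):
    """Chrominance components of the ITU-R BT.601 YCrCb transform (full range)."""
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    return cr, cb

def is_skin_pixel(r, g, b):
    """Classify a pixel as skin using Cr/Cb ranges commonly used in the
    literature; the paper's own thresholds may differ."""
    cr, cb = rgb_to_crcb(r, g, b)
    return 133 <= cr <= 173 and 77 <= cb <= 127

print(is_skin_pixel(200, 150, 120))  # True for a typical skin tone
print(is_skin_pixel(0, 255, 0))      # False for pure green
```

Candidate face regions would then be connected components of skin pixels, filtered by the geometric constraints and the eye-detection step described in the abstract.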

  9. Emotional facial expressions differentially influence predictions and performance for face recognition.

    Science.gov (United States)

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  10. Infants' Temperament and Mothers', and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    Science.gov (United States)

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before and after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing towards (vs. away from) objects. Additionally, the associations of infants' temperament, and of parents' negative affect/depression/anxiety, with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects. Maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, while it predicted less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions independent of infants' temperament. We conclude that infants' own temperamental dispositions for sadness, and their exposure to mothers' and fathers' depressed moods, may influence infants' attention to emotion-object associations in social learning contexts.

  11. Cognitive emotion regulation in children: Reappraisal of emotional faces modulates neural source activity in a frontoparietal network.

    Science.gov (United States)

    Wessing, Ida; Rehbein, Maimu A; Romer, Georg; Achtergarde, Sandra; Dobel, Christian; Zwitserlood, Pienie; Fürniss, Tilman; Junghöfer, Markus

    2015-06-01

Emotion regulation has an important role in child development and psychopathology. Reappraisal as a cognitive regulation technique can be used effectively by children. Moreover, an ERP component known to reflect emotional processing, the late positive potential (LPP), can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8-14 years reappraised emotional faces, while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indexes of emotional adjustment and with age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation and less anxiety. Results demonstrate that reappraisal changed activity within a frontoparietal network in children. Decreasing activity in the visual cortex with increasing age is suggested to reflect neural maturation. A similar decrease with adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with an advance in neural maturation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Interdependent mechanisms for processing gender and emotion: The special status of angry male faces

    Directory of Open Access Journals (Sweden)

    Daniel A Harris

    2016-07-01

    Full Text Available While some models of how various attributes of a face are processed have posited that face features, invariant physical cues such as gender or ethnicity as well as variant social cues such as emotion, may be processed independently (e.g., Bruce & Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby, Hoffman, & Gobbini, 2000). Here we use a contingent adaptation paradigm to investigate whether mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy-angry emotional continuum, regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1 we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased towards angry while female faces were biased towards happy. Interestingly, in the complementary Experiment 2 we did not find evidence for contingent adaptation; both male and female faces were biased towards angry. Our results highlight that evidence for contingent adaptation, and the underlying interdependent face processing mechanisms that would allow for it, may only be evident for certain combinations of face features. Such limits may be especially important in the case of social cues given how maladaptive it may be to stop responding to threatening information, with angry male faces considered the most threatening. The underlying neuronal mechanisms that could account for such asymmetric effects in contingent adaptation remain to be elucidated.

  13. Human sex differences in emotional processing of own-race and other-race faces.

    Science.gov (United States)

    Ran, Guangming; Chen, Xu; Pan, Yangu

    2014-06-18

    There is evidence that women and men show differences in the perception of affective facial expressions. However, none of the previous studies directly investigated sex differences in emotional processing of own-race and other-race faces. The current study addressed this issue using high-time-resolution event-related potential techniques. In total, data from 25 participants (13 women and 12 men) were analyzed. It was found that women showed increased N170 amplitudes to negative White faces compared with negative Chinese faces over the right hemisphere electrodes. This result suggests that women show enhanced sensitivity to other-race faces showing negative emotions (fear or disgust), a sensitivity that may be evolutionarily adaptive. However, the current data showed that men had increased N170 amplitudes to happy Chinese versus happy White faces over the left hemisphere electrodes, indicating that men show enhanced sensitivity to own-race faces showing positive emotions (happiness). In this respect, men might use past pleasant emotional experiences to boost recognition of own-race faces.

  14. Is emotion recognition the only problem in ADHD? Effects of pharmacotherapy on face and emotion recognition in children with ADHD.

    Science.gov (United States)

    Demirci, Esra; Erdogan, Ayten

    2016-12-01

    The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender, and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male and 29 female patients, 8-15 years of age, who were diagnosed as having combined-type ADHD (N = 26), hyperactive/impulsive-type ADHD (N = 21), or inattentive-type ADHD (N = 23) but had not previously used any medication for ADHD, and 35 male and 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had fewer correct answers on the RMET than the inattentive subtype, and fewer correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement on the RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition. Both OROS-MPH and atomoxetine affect emotion recognition. However, further studies on face and emotion recognition in ADHD are needed.

  15. From specificity to sensitivity: affective states modulate visual working memory for emotional expressive faces.

    Science.gov (United States)

    Maran, Thomas; Sachse, Pierre; Furtner, Marco

    2015-01-01

    Previous findings suggest that visual working memory (VWM) preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in VWM. To explore the influence of affective context on VWM for faces, we conducted two experiments using both a VWM task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low arousal control condition. Results indicated specifically increased sensitivity of VWM for angry looking faces in the neutral condition. Enhanced VWM for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in VWM to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of VWM along with flexible resource allocation. In VWM, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  16. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    Science.gov (United States)

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of socio-emotional abilities, characterizing humans as highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between the 5-HTTLPR/rs25531 polymorphisms, related to serotonin reuptake, and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained using a comprehensive, multivariate battery of maximal-effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two short (S) alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating the discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  17. From Specificity to Sensitivity: Affective states modulate visual working memory for emotional expressive faces

    Directory of Open Access Journals (Sweden)

    Thomas eMaran

    2015-08-01

    Full Text Available Previous findings suggest that visual working memory preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in visual working memory. To explore the influence of affective context on visual working memory for faces, we conducted two experiments using both a visual working memory task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low arousal control condition. Results indicated specifically increased sensitivity of visual working memory for angry looking faces in the neutral condition. Enhanced visual working memory for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in visual working memory to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of visual working memory along with flexible resource allocation. In visual working memory, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  18. Identification of emotions in mixed disgusted-happy faces as a function of depressive symptom severity.

    Science.gov (United States)

    Sanchez, Alvaro; Romero, Nuria; Maurage, Pierre; De Raedt, Rudi

    2017-12-01

    Interpersonal difficulties are common in depression, but their underlying mechanisms are not yet fully understood. The role of depression in the identification of mixed emotional signals with a direct interpersonal value remains unclear. The present study aimed to clarify this question. A sample of 39 individuals reporting a broad range of depression levels completed an emotion identification task where they viewed faces expressing three emotional categories (100% disgusted and 100% happy faces, as well as their morphed 50% disgusted - 50% happy exemplars). Participants were asked to identify the corresponding depicted emotion as "clearly disgusted", "mixed", or "clearly happy". Higher depression levels were associated with lower identification of positive emotions in 50% disgusted - 50% happy faces. The study was conducted with an analogue sample reporting individual differences in subclinical depression levels. Further research must replicate these findings in a clinical sample and clarify whether differential emotional identification patterns emerge in depression for different mixed negative-positive emotions (sad-happy vs. disgusted-happy). Depression may account for a lower bias to perceive positive states when ambiguous states from others include subtle signals of social threat (i.e., disgust), leading to an under-perception of positive social signals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

    Full Text Available Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  20. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    Science.gov (United States)

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Reaction times and face discrimination with emotional content

    Directory of Open Access Journals (Sweden)

    ANA MARÍA MARTÍNEZ

    2002-07-01

    Full Text Available Sixty-two university students, divided into two groups with a mean age of 21.6 years for the women and 22 years for the men, took part in a study of visual reaction times (VRTs) to stimuli with emotional content, taking into account stimulus position (start, middle, end), emotional content (neutral, friendly, threatening), and the combinations of the stimuli. The women showed longer RTs than the men in all experimental conditions. It was also observed, for both men and women, that RTs were longer when the stimulus to be discriminated was located in the middle position.

  2. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    Science.gov (United States)

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations, (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated them only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  3. Mixed emotions: Sensitivity to facial variance in a crowd of faces.

    Science.gov (United States)

    Haberman, Jason; Lee, Pegan; Whitney, David

    2015-01-01

    The visual system automatically represents summary information from crowds of faces, such as the average expression. This is a useful heuristic insofar as it provides critical information about the state of the world, not simply information about the state of one individual. However, the average alone is not sufficient for making decisions about how to respond to a crowd. The variance or heterogeneity of the crowd, the mixture of emotions, conveys information about the reliability of the average, essential for determining whether the average can be trusted. Despite its importance, the representation of variance within a crowd of faces has yet to be examined. This is addressed here in three experiments. In the first experiment, observers viewed a sample set of faces that varied in emotion, and then adjusted a subsequent set to match the variance of the sample set. To isolate variance as the summary statistic of interest, the average emotion of both sets was random. Results suggested that observers had information regarding crowd variance. The second experiment verified that this was indeed a uniquely high-level phenomenon, as observers were unable to derive the variance of an inverted set of faces as precisely as that of an upright set. The third experiment replicated and extended the first two experiments using the method of constant stimuli. Together, these results show that the visual system is sensitive to emergent information about the emotional heterogeneity, or ambivalence, in crowds of faces.

  4. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    Science.gov (United States)

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  5. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    Science.gov (United States)

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied, for the first time, psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind, balanced cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. When we compared psilocybin to placebo, an increase in reaction time for all three categories of affective stimuli was found. Psilocybin decreased the connectivity between the amygdala and the striatum during angry face discrimination. During happy face discrimination, the connectivity between the amygdala and the frontal pole was decreased. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala. Psilocybin decreases the connectivity between important nodes linked to emotion processing like the frontal pole or the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients. Copyright © 2018 Elsevier B.V. and ECNP. All rights reserved.

  6. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    Science.gov (United States)

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later, after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls; following psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.

  7. The not face: A grammaticalization of facial expressions of emotion.

    Science.gov (United States)

    Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M

    2016-05-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    Science.gov (United States)

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used transiently peak intense expression images of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisibly presented peak facial expression primes, confirming the unconscious perception of peak facial expressions observed in Experiment 2A.

  9. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    Science.gov (United States)

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

    Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole-face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces where features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness), emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance in the whole-face condition. We argue that in SC, interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  10. Cultural in-group advantage: emotion recognition in African American and European American faces and voices.

    Science.gov (United States)

    Wickline, Virginia B; Bailey, Wendy; Nowicki, Stephen

    2009-03-01

    The authors explored whether there were in-group advantages in emotion recognition of faces and voices by culture or geographic region. Participants were 72 African American students (33 men, 39 women), 102 European American students (30 men, 72 women), 30 African international students (16 men, 14 women), and 30 European international students (15 men, 15 women). The participants identified emotions in African American and European American faces and voices. Results showed an in-group advantage, sometimes by culture, less often by race, in recognizing facial and vocal emotional expressions. African international students were generally less accurate at interpreting American nonverbal stimuli than were European American, African American, and European international peers. Results suggest that, although partly universal, emotional expressions have subtle differences across cultures that persons must learn.

  11. Emotional memory and perception of emotional faces in patients suffering from depersonalization disorder.

    NARCIS (Netherlands)

    Montagne, B.; Sierra, M.; Medford, N.; Hunter, E.; Baker, D.J.; Kessels, R.P.C.; Haan, E.H.F. de; David, A.S.

    2007-01-01

    Previous work has shown that patients with depersonalization disorder (DPD) have reduced physiological responses to emotional stimuli, which may be related to subjective emotional numbing. This study investigated two aspects of affective processing in 13 patients with DPD according to the DSM-IV

  12. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    OpenAIRE

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybr...

  13. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    Science.gov (United States)

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  14. Sad benefit in face working memory: an emotional bias of melancholic depression.

    Science.gov (United States)

    Linden, Stefanie C; Jackson, Margaret C; Subramanian, Leena; Healy, David; Linden, David E J

    2011-12-01

    Emotion biases feature prominently in cognitive theories of depression and are a focus of psychological interventions. However, there is presently no stable neurocognitive marker of altered emotion-cognition interactions in depression. One reason may be the heterogeneity of major depressive disorder. Our aim in the present study was to find an emotional bias that differentiates patients with melancholic depression from controls, and patients with melancholic from those with non-melancholic depression. We used a working memory paradigm for emotional faces, where two faces with angry, happy, neutral, sad or fearful expression had to be retained over one second. Twenty patients with melancholic depression, 20 age-, education- and gender-matched control participants and 20 patients with non-melancholic depression participated in the study. We analysed performance on the working memory task using signal detection measures. We found an interaction between group and emotion on working memory performance that was driven by the higher performance for sad faces compared to other categories in the melancholic group. We computed a measure of "sad benefit", which distinguished melancholic and non-melancholic patients with good sensitivity and specificity. However, replication studies and formal discriminant analysis will be needed in order to assess whether emotion bias in working memory may become a useful diagnostic tool to distinguish these two syndromes. Copyright © 2011 Elsevier B.V. All rights reserved.
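
The signal detection measures used in the abstract above (sensitivity d' and response bias) can be sketched in a few lines; the trial counts and the log-linear correction below are illustrative assumptions, not values from the study.

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Signal detection measures for an old/new recognition task.

    Returns (d_prime, criterion_c). A log-linear correction (add 0.5
    to every cell) keeps perfect hit/false-alarm rates finite.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf                        # probit transform
    d_prime = z(hit_rate) - z(fa_rate)              # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))   # response bias c
    return d_prime, criterion

# Hypothetical counts for one emotion category: 18 hits, 2 misses,
# 3 false alarms, 17 correct rejections (20 old and 20 new faces).
dp, c = sdt_measures(18, 2, 3, 17)
print(round(dp, 2), round(c, 2))
```

A higher d' for sad faces than for the other categories, in the melancholic group only, is the "sad benefit" the authors report.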

  15. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    Science.gov (United States)

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities, both for fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback between physiological measures and the virtual agent. The goal of this study was to conduct an initial assessment of the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eye and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by each set of stimuli were similarly recognized by both men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in the eye regions in male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-Computer Interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  16. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research

    Directory of Open Access Journals (Sweden)

    Xu Chen

    2017-04-01

    Full Text Available Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions: secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants’ reaction times in responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual’s processing of positive emotional faces; for instance, the presentation of the partner’s name was associated with stronger activities in a wide range of brain regions and faster reaction times for positive facial expressions in the subjects. The current finding of higher activity in the left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of the activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of secure attachment priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and the early-stage information-processing system (attention), given the increased attention toward the effects of secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the following study has

  17. Abnormal early gamma responses to emotional faces differentiate unipolar from bipolar disorder patients.

    Science.gov (United States)

    Liu, T Y; Chen, Y S; Su, T P; Hsieh, J C; Chen, L F

    2014-01-01

    This study investigates the cortical abnormalities of early emotion perception in patients with major depressive disorder (MDD) and bipolar disorder (BD) using gamma oscillations. Twenty-three MDD patients, twenty-five BD patients, and twenty-four normal controls were enrolled and their event-related magnetoencephalographic responses were recorded during implicit emotional tasks. Our results demonstrated abnormal gamma activity within 100 ms in the emotion-related regions (amygdala, orbitofrontal cortex (OFC), anterior insula (AI), and superior temporal pole) in the MDD patients, suggesting that these patients may have dysfunctions or negativity biases in perceptual binding of emotional features at a very early stage. Decreased left superior medial frontal cortex (smFC) responses to happy faces in the MDD patients were correlated with the severity of their depression symptoms, indicating that decreased smFC activity perhaps underlies irregular positive emotion processing in depressed patients. In the BD patients, we showed abnormal activation in visual regions (inferior/middle occipital and middle temporal cortices) which responded to emotional faces within 100 ms, suggesting that BD patients may respond hyperactively to emotional features during perceptual binding. The discriminant function of gamma activation in the left smFC, right medial OFC, right AI/inferior OFC, and the right precentral cortex accurately classified 89.6% of patients as having unipolar or bipolar disorder.

  18. Are patients with schizophrenia impaired in processing non-emotional features of human faces?

    Directory of Open Access Journals (Sweden)

    Hayley eDarke

    2013-08-01

    Full Text Available It is known that individuals with schizophrenia exhibit signs of impaired face processing; however, the exact perceptual and cognitive mechanisms underlying these deficits are yet to be elucidated. One possible source of confusion in the current literature is the methodological and conceptual inconsistencies that can arise from the varied treatment of different aspects of face processing relating to emotional and non-emotional aspects of face perception. This review aims to disentangle the literature by focusing on the performance of patients with schizophrenia in a range of tasks that required processing of non-emotional features of face stimuli (e.g. identity or gender). We also consider the performance of patients on non-face stimuli that share common elements such as familiarity (e.g. cars) and social relevance (e.g. gait). We conclude by exploring whether observed deficits are best considered as face-specific and note that further investigation is required to properly assess the potential contribution of more generalised attentional or perceptual impairments.

  19. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yang L

    2015-08-01

    Full Text Available Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People’s Republic of China Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer’s disease. Thus, understanding emotional face recognition deficit in patients with amnestic MCI could be useful in determining progression of amnestic MCI. The purpose of this study was to investigate the features of emotional face processing in amnestic MCI by using event-related potentials (ERPs). Patients with amnestic MCI and healthy controls performed a face recognition task, giving old/new responses to previously studied and novel faces with different emotional messages as the stimulus material. Using the learning-recognition paradigm, the experiments were divided into two steps, ie, a learning phase and a test phase. ERPs were analyzed on electroencephalographic recordings. The behavior data indicated high emotion classification accuracy for patients with amnestic MCI and for healthy controls. The mean percentage of correct classifications was 81.19% for patients with amnestic MCI and 96.46% for controls. Our ERP data suggest that patients with amnestic MCI were still able to undertake personalizing processing for negative faces, but not for neutral or positive faces, in the early frontal processing stage. In the early time window, no differences in frontal old/new effect were found between patients with amnestic MCI and normal controls. However, in the late time window, the three types of stimuli did not elicit any old/new parietal effects in patients with amnestic MCI, suggesting their recollection was impaired. This impairment may be closely associated with amnestic MCI disease. We conclude from our data that face recognition processing and emotional memory is

  20. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    Science.gov (United States)

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster-acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorder. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic scopolamine. Healthy participants (n=15) and unmedicated patients with major depressive disorder (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, blood oxygen-level dependent signal was measured using functional MRI during a selective attention task. Two stimuli composed of superimposed pictures of faces and houses were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad), creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in blood oxygen-level dependent response to happy and sad faces under implicit and explicit conditions (emotion processing biases) within a priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate. Emotion processing biases toward emotional faces prior to treatment thus reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  1. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral to the facial expression to be identified. Both age groups had the highest recognition rates for facial expressions in the congruent context, followed by the neutral context, and the worst recognition rates in the incongruent context. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning characteristics of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  2. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    Science.gov (United States)

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

    The aim of this study was to determine, from photographs, the degree to which full face transplant patients have developed emotional expression, so that a rehabilitation process can later be planned according to that degree. As expected, in full face transplant cases the expressions may be confused or may not be performed to the level of a healthy control group. For the image-based analysis, a control group of 9 healthy males and 2 full-face transplant patients participated in the study. Appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods were adopted for recognizing a neutral expression and 6 emotional expressions: angry, scared, happy, hate, confused and sad. Feature extraction was carried out using each method alone and the two methods combined serially. For the performed expressions, the features extracted from the most distinctive zones of the facial area, the eye and mouth regions, were used to classify the emotions, and the combination of these region features was used to improve classifier performance. The ability of control subjects and transplant patients to perform emotional expressions was determined with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages, and the patients' results were compared with the healthy group. It was observed that the transplant patients do not reflect some emotional expressions, and that there were confusions among expressions.
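
The texture-feature-plus-nearest-neighbour pipeline described in this abstract can be illustrated with a minimal, pure-Python local binary pattern histogram and a 1-NN matcher. This is a sketch under stated assumptions, not the authors' implementation: the toy 4x4 patches, the chi-square distance, and the function names are all illustrative, and the paper additionally uses Gabor wavelet features and region-specific decision stages.

```python
def lbp_histogram(img):
    """Basic 8-neighbour local binary pattern histogram.

    `img` is a 2-D list of grey levels. Each interior pixel is compared
    with its 8 neighbours; a neighbour >= the centre sets one bit of an
    8-bit code, and the codes are tallied into a 256-bin histogram.
    """
    h, w = len(img), len(img[0])
    hist = [0] * 256
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= img[y][x]:
                    code |= 1 << bit
            hist[code] += 1
    return hist

def nearest_neighbour(query, gallery):
    """1-NN classification by chi-square distance between histograms."""
    def chi2(p, q):
        return sum((a - b) ** 2 / (a + b) for a, b in zip(p, q) if a + b)
    return min(gallery, key=lambda item: chi2(query, item[1]))[0]

# Tiny illustration: a 'striped' patch vs. a 'flat' patch.
striped = [[0, 9, 0, 9], [0, 9, 0, 9], [0, 9, 0, 9], [0, 9, 0, 9]]
flat = [[5] * 4 for _ in range(4)]
gallery = [("striped", lbp_histogram(striped)), ("flat", lbp_histogram(flat))]
print(nearest_neighbour(lbp_histogram(striped), gallery))
```

In the study's setting the gallery entries would be labelled emotional expressions and the query a patch from the eye or mouth region.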

  3. Emotional Face Identification in Youths with Primary Bipolar Disorder or Primary Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Seymour, Karen E.; Pescosolido, Matthew F.; Reidy, Brooke L.; Galvan, Thania; Kim, Kerri L.; Young, Matthew; Dickstein, Daniel P.

    2013-01-01

    Objective: Bipolar disorder (BD) and attention-deficit/hyperactivity disorder (ADHD) are often comorbid or confounded; therefore, we evaluated emotional face identification to better understand brain/behavior interactions in children and adolescents with either primary BD, primary ADHD, or typically developing controls (TDC). Method: Participants…

  4. Gender differences in the recognition of emotional faces: are men less efficient?

    Directory of Open Access Journals (Sweden)

    Ana Ruiz-Ibáñez

    2017-06-01

    Full Text Available As research on the recollection of stimuli with emotional valence indicates, emotions influence memory. Many studies of face and emotional facial expression recognition have focused on differences associated with age (young vs. old people) and gender (men vs. women). Nevertheless, these studies have produced contradictory results, so gender involvement needs to be studied in greater depth. The main objective of our research was to analyze differences in the recognition of faces with emotional facial expressions between two groups of university students aged 18-30, one composed of men and the other of women. The results showed statistically significant differences in corrected face recognition (hit rate - false alarm rate): the women demonstrated better recognition than the men. However, other variables analyzed, such as time or efficiency, did not provide conclusive results. Furthermore, a significant negative correlation between the time used and the efficiency in doing the task was found in the male group. This information reinforces not only the hypothesis of a gender difference in face recognition, in favor of women, but also those hypotheses that suggest different cognitive processing of facial stimuli in the two sexes. Finally, we argue for the necessity of further research on variables such as age or sociocultural level.
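
The corrected recognition score mentioned above is simply the hit rate minus the false-alarm rate, which discounts a liberal "yes" bias. A minimal sketch with made-up trial counts (the numbers are not from the study):

```python
def corrected_recognition(hits, old_trials, false_alarms, new_trials):
    """Corrected recognition = hit rate minus false-alarm rate.

    Answering 'old' to everything scores 0; perfect discrimination
    (all hits, no false alarms) scores 1.
    """
    return hits / old_trials - false_alarms / new_trials

# Hypothetical example: 40 of 50 studied faces recognized,
# 10 'old' responses to 50 novel faces.
print(round(corrected_recognition(40, 50, 10, 50), 2))
```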

  5. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    Science.gov (United States)

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  6. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    Science.gov (United States)

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing, in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD group, relative to the control group, showed greater activation in the amygdala, vPFC and striatum during the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  7. Ratings of Emotion in Laterally Presented Faces: Sex and handedness effects

    NARCIS (Netherlands)

    van Strien, J.W.; van Beek, S.

    2000-01-01

    Sixteen right-handed participants (8 male and 8 female students) and 16 left-handed participants (8 male and 8 female students) were presented with cartoon faces expressing emotions ranging from extremely positive to extremely negative. A forced-choice paradigm was used in which the participants

  8. Dysregulation in cortical reactivity to emotional faces in PTSD patients with high dissociation symptoms

    Directory of Open Access Journals (Sweden)

    Aleksandra Klimova

    2013-09-01

    Full Text Available Background: Predominant dissociation in posttraumatic stress disorder (PTSD) is characterized by restricted affective responses to positive stimuli. To date, no studies have examined neural responses to a range of emotional expressions in PTSD with high dissociative symptoms. Objective: This study tested the hypothesis that PTSD patients with high dissociative symptoms will display increased event-related potential (ERP) amplitudes in early components (N1, P1) to threatening faces (angry, fearful), and reduced later ERP amplitudes (Vertex Positive Potential (VPP), P3) to happy faces compared to PTSD patients with low dissociative symptoms. Methods: Thirty-nine civilians with PTSD were classified as high dissociative (n=16) or low dissociative (n=23) according to their responses on the Clinician Administered Dissociative States Scale. ERPs were recorded whilst participants viewed emotional (happy, angry, fear) and neutral facial expressions in a passive viewing task. Results: High dissociative PTSD patients displayed significantly increased N120 amplitude to the majority of facial expressions (neutral, happy, and angry) compared to low dissociative PTSD patients under conscious and preconscious conditions. The high dissociative PTSD group had significantly reduced VPP amplitude to happy faces in the conscious condition. Conclusion: High dissociative PTSD patients displayed increased early (preconscious) cortical responses to emotional stimuli, and specific reductions to happy facial expressions in later (conscious), face-specific components compared to low dissociative PTSD patients. Dissociation in PTSD may act to increase initial pre-attentive processing of affective stimuli, and specifically reduce cortical reactivity to happy faces when consciously processing these stimuli.

  9. Increased amygdala responses to emotional faces after psilocybin for treatment-resistant depression.

    Science.gov (United States)

    Roseman, Leor; Demetriou, Lysia; Wall, Matthew B; Nutt, David J; Carhart-Harris, Robin L

    2017-12-27

    Recent evidence indicates that psilocybin with psychological support may be effective for treating depression. Some studies have found that patients with depression show heightened amygdala responses to fearful faces and there is reliable evidence that treatment with SSRIs attenuates amygdala responses (Ma, 2015). We hypothesised that amygdala responses to emotional faces would be altered post-treatment with psilocybin. In this open-label study, 20 individuals diagnosed with moderate to severe, treatment-resistant depression underwent two separate dosing sessions with psilocybin. Psychological support was provided before, during and after these sessions and 19 completed fMRI scans one week prior to the first session and one day after the second and last. Neutral, fearful and happy faces were presented in the scanner and analyses focused on the amygdala. Group results revealed rapid and enduring improvements in depressive symptoms post psilocybin. Increased responses to fearful and happy faces were observed in the right amygdala post-treatment, and right amygdala increases to fearful versus neutral faces were predictive of clinical improvements at 1-week. Psilocybin with psychological support was associated with increased amygdala responses to emotional stimuli, an opposite effect to previous findings with SSRIs. This suggests fundamental differences in these treatments' therapeutic actions, with SSRIs mitigating negative emotions and psilocybin allowing patients to confront and work through them. Based on the present results, we propose that psilocybin with psychological support is a treatment approach that potentially revives emotional responsiveness in depression, enabling patients to reconnect with their emotions. ISRCTN, number ISRCTN14426797. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Facial Expression Aftereffect Revealed by Adaption to Emotion-Invisible Dynamic Bubbled Faces

    Science.gov (United States)

    Luo, Chengwen; Wang, Qingyun; Schyns, Philippe G.; Kingdom, Frederick A. A.; Xu, Hong

    2015-01-01

    Visual adaptation is a powerful tool to probe the short-term plasticity of the visual system. Adapting to local features such as oriented lines can distort our judgment of subsequently presented lines: the tilt aftereffect. The tilt aftereffect is believed to be processed at a low level of the visual cortex, such as V1. Adaptation to faces, on the other hand, can produce significant aftereffects in high-level traits such as identity, expression, and ethnicity. However, whether face adaptation necessitates awareness of face features is debatable. In the current study, we investigated whether facial expression aftereffects (FEAE) can be generated by partially visible faces. We first generated partially visible faces using the bubbles technique, in which the face was seen through randomly positioned circular apertures, and selected the bubbled faces for which the subjects were unable to identify happy or sad expressions. When the subjects adapted to static displays of these partial faces, no significant FEAE was found. However, when the subjects adapted to a dynamic video display of a series of different partial faces, a significant FEAE was observed. In both conditions, subjects could not identify facial expression in the individual adapting faces. These results suggest that our visual system is able to integrate unrecognizable partial faces over a short period of time and that the integrated percept affects our judgment on subsequently presented faces. We conclude that FEAE can be generated by partial faces with little facial expression cues, implying that our cognitive system fills in the missing parts during adaptation, or that subcortical structures are activated by the bubbled faces without conscious recognition of emotion during adaptation. PMID:26717572
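
The bubbles technique described above, revealing a face only through randomly positioned circular apertures, can be sketched as follows. This is a toy version under stated assumptions: it uses hard-edged circles on a small grey-level grid (published bubbles masks typically use Gaussian apertures), and the function names, sizes and counts are all illustrative.

```python
import random

def bubble_mask(height, width, n_bubbles, radius):
    """Boolean visibility mask of randomly placed circular apertures.

    True pixels are visible; everything else is hidden, in the spirit
    of the bubbles technique used to create the partial faces.
    """
    mask = [[False] * width for _ in range(height)]
    for _ in range(n_bubbles):
        cy, cx = random.randrange(height), random.randrange(width)
        for y in range(max(0, cy - radius), min(height, cy + radius + 1)):
            for x in range(max(0, cx - radius), min(width, cx + radius + 1)):
                if (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2:
                    mask[y][x] = True
    return mask

def apply_mask(img, mask, background=0):
    """Replace every pixel outside the apertures with the background."""
    return [[px if vis else background for px, vis in zip(img_row, mask_row)]
            for img_row, mask_row in zip(img, mask)]

random.seed(1)
face = [[row * 10 + col for col in range(8)] for row in range(8)]  # stand-in image
mask = bubble_mask(8, 8, n_bubbles=3, radius=2)
partial = apply_mask(face, mask)
visible = sum(v for row in mask for v in row)
print(0 < visible < 64)  # some, but never all, of the patch is revealed
```

Re-sampling a fresh mask per frame yields the "dynamic video display of a series of different partial faces" used in the adaptation condition.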

  11. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression.

    Science.gov (United States)

    Miskowiak, K W; Glerup, L; Vestbo, C; Harmer, C J; Reinecke, A; Macoveanu, J; Siebner, H R; Kessing, L V; Vinberg, M

    2015-05-01

    Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. Thirty healthy, never-depressed monozygotic (MZ) twins with a co-twin history of depression (high risk group: n = 13) or without co-twin history of depression (low-risk group: n = 17) were enrolled in a functional magnetic resonance imaging (fMRI) study. During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. High-risk twins showed increased neural response to happy and fearful faces in dorsal anterior cingulate cortex (ACC), dorsomedial prefrontal cortex (dmPFC), pre-supplementary motor area and occipito-parietal regions compared to low-risk twins. They also displayed stronger negative coupling between amygdala and pregenual ACC, dmPFC and temporo-parietal regions during emotional face processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low-risk controls. These effects occurred in the absence of differences between groups in mood, subjective state or coping. Different neural response and functional connectivity within fronto-limbic and occipito-parietal regions during emotional face processing and enhanced fear vigilance may be key endophenotypes for depression.

  12. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Glerup, L; Vestbo, C

    2015-01-01

    BACKGROUND: Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. METHOD: Thirty ... while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. RESULTS: High-risk twins showed increased neural response to happy and fearful faces ... These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low ...

  13. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. An exploration of emotional protection and regulation in nurse-patient interactions: The role of the professional face and the emotional mirror.

    Science.gov (United States)

    Cecil, Penelope; Glass, Nel

    2015-01-01

While interpersonal styles of nurse-patient communication have become more relaxed in recent years, nurses remain challenged in emotional engagement with patients and other health professionals. In order to preserve a professional distance, however slight, in patient care delivery, nurses need to be able to regulate their emotions. This research aimed to investigate nurses' perceptions of emotional protection and regulation in patient care delivery. A qualitative approach was used for the study, utilising in-depth semi-structured interviews and researcher reflective journaling. Participants were drawn from rural New South Wales. Following institutional ethics approval, 5 nurses were interviewed and reflective journaling commenced. The interviews and the reflective journal were transcribed verbatim. The results revealed that nurses' emotional regulation, demonstrated by a 'professional face', was an important strategy to enable delivery of quality care even though it resulted in emotional containment. Such regulation was a protective mechanism employed to look after the self and was critical in situations of emotional dissonance. The results also found that nurses experience emotional dissonance in situations where they have unresolved personal emotional issues, and that the latter was an individual motivator to manage emotions in the workplace. Emotions play a pivotal role within nurse-patient relationships. The professional face can be recognised as contributing to, and therefore maintaining, the emotional health of nurses in contemporary practice. This study foregrounds the importance of regulating emotions and nurturing nurses' emotional health in contemporary practice.

  15. Facing Complaining Customer and Suppressed Emotion at Worksite Related to Sleep Disturbance in Korea.

    Science.gov (United States)

    Lim, Sung Shil; Lee, Wanhyung; Hong, Kwanyoung; Jeung, Dayee; Chang, Sei Jin; Yoon, Jin Ha

    2016-11-01

This study aimed to investigate the effect of facing complaining customers and suppressing emotion at the worksite on sleep disturbance among the working population. We enrolled 13,066 paid workers (male = 6,839, female = 6,227) from the Working Condition Survey (2011). The odds ratios (OR) and 95% confidence intervals (CI) for the occurrence of sleep disturbance were calculated using multiple logistic regression models. Workers who always engaged complaining customers had a significantly higher risk for sleep disturbance than those who rarely did (OR [95% CI]: 5.46 [3.43-8.68] in male and 5.59 [3.30-9.46] in female workers). The OR (95% CI) for sleep disturbance was 1.78 (1.16-2.73) and 1.63 (1.02-2.63) for the male and female groups always suppressing their emotions at the workplace, compared with those who rarely did. Compared to those who both rarely engaged complaining customers and rarely suppressed their emotions at work, the OR (CI) for sleep disturbance was 9.66 (4.34-20.80) and 10.17 (4.46-22.07) for men and women always exposed to both factors. Sleep disturbance was thus affected by the interaction of both emotional demands (engaging complaining customers and suppressing emotions at the workplace). The level of emotional demand, including engaging complaining customers and suppressing emotions at the workplace, is significantly associated with sleep disturbance among the Korean working population.
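The ORs above come from multiple logistic regression, but the basic quantity they estimate can be illustrated with a crude (unadjusted) 2×2-table calculation and a Wald 95% confidence interval. A minimal sketch, using hypothetical counts that are not from this study (the function name and numbers are ours; the log-OR Wald formula is standard):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, purely for illustration
or_, lo, hi = odds_ratio_ci(30, 70, 20, 180)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Note that this crude OR ignores covariates; the study's estimates are adjusted via multiple logistic regression, so they would generally differ from the 2×2 calculation.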

  16. Training Approach-Avoidance of Smiling Faces Affects Emotional Vulnerability in Socially Anxious Individuals

    Directory of Open Access Journals (Sweden)

    Mike eRinck

    2013-08-01

Full Text Available Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): Although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer, Rinck, & Becker, 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs’ automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: It led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia.

  17. Training approach-avoidance of smiling faces affects emotional vulnerability in socially anxious individuals

    Science.gov (United States)

    Rinck, Mike; Telli, Sibel; Kampmann, Isabel L.; Woud, Marcella L.; Kerstholt, Merel; te Velthuis, Sarai; Wittkowski, Matthias; Becker, Eni S.

    2013-01-01

    Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer et al., 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs' automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: it led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia. PMID:23970862

  18. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces

    Directory of Open Access Journals (Sweden)

    Arash eJavanbakht

    2015-06-01

Full Text Available Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. 52 subjects from a longitudinal prospective study, recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task (EFAT). Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and mPFC responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  19. A face to remember: emotional expression modulates prefrontal activity during memory formation.

    Science.gov (United States)

    Sergerie, Karine; Lepage, Martin; Armony, Jorge L

    2005-01-15

    Emotion can exert a modulatory role on episodic memory. Several studies have shown that negative stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We used functional magnetic resonance imaging (fMRI) in humans to investigate the effects of expression (happy, neutral, and fearful) on prefrontal cortex (PFC) activity during the encoding of faces, using a subsequent memory effect paradigm. Our results show that activity in right PFC predicted memory for faces, regardless of expression, while a homotopic region in the left hemisphere was associated with successful encoding only for faces with an emotional expression. These findings are consistent with the proposed role of right dorsolateral PFC in successful encoding of nonverbal material, but also suggest that left DLPFC may be a site where integration of memory and emotional processes occurs. This study sheds new light on the current controversy regarding the hemispheric lateralization of PFC in memory encoding.

  20. Detection of emotional faces: salient physical features guide effective visual search.

    Science.gov (United States)

    Calvo, Manuel G; Nummenmaa, Lauri

    2008-08-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  1. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Directory of Open Access Journals (Sweden)

    Teresa A Victor

Full Text Available Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  2. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Science.gov (United States)

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Bellgowan, Patrick S F; Öhman, Arne; Drevets, Wayne C

    2012-01-01

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  3. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    Science.gov (United States)

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

Disgust is a core emotion evolved to detect and avoid the ingestion of poisonous food as well as the contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or if individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether a classification of dynamic facial expressions as happy or disgusted, and an emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces, even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect, and suggest that further studies should focus on reproducibility, specifying the experimental circumstances in which odor effects on facial expressions may be present versus absent.

  4. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces.

    Science.gov (United States)

    Garrido, Margarida V; Prada, Marília

    2017-01-01

The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used databases of human facial expressions. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female), each displaying an angry, a happy, and a neutral facial expression. Our main goals were to provide an additional and updated validation of this database, using a sample from a different nationality ( N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high on each dimension. Normative data for each stimulus (hits proportion, means, standard deviations, and confidence intervals per evaluative dimension) are available as supplementary material (available at https://osf.io/fvc4m/).

  5. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks--choice from array and free labeling--to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition; a novel nonsense facial expression was included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces

    Directory of Open Access Journals (Sweden)

    Margarida V. Garrido

    2017-12-01

Full Text Available The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used databases of human facial expressions. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female), each displaying an angry, a happy, and a neutral facial expression. Our main goals were to provide an additional and updated validation of this database, using a sample from a different nationality (N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high on each dimension. Normative data for each stimulus (hits proportion, means, standard deviations, and confidence intervals per evaluative dimension) are available as supplementary material (available at https://osf.io/fvc4m/).

  7. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  8. When does subliminal affective image priming influence the ability of schizophrenic patients to perceive face emotions?

    Science.gov (United States)

    Vaina, Lucia Maria; Rana, Kunjan D; Cotos, Ionela; Li-Yang, Chen; Huang, Melissa A; Podea, Delia

    2014-12-24

Deficits in face emotion perception are among the most pervasive of schizophrenia impairments, strongly affecting interpersonal communication and social skills. Schizophrenic patients (PSZ) and healthy control subjects (HCS) performed 2 psychophysical tasks. One, the SAFFIMAP test, was designed to determine the impact of subliminally presented affective or neutral images on the accuracy of face-expression (angry or neutral) perception. In the second test, FEP, subjects saw pictures of facial expressions and were asked to rate them as angry, happy, or neutral. The following clinical scales were used to determine the acute symptoms in PSZ: Positive and Negative Syndrome Scale (PANSS), Young Mania Rating Scale (YMRS), Hamilton Depression Scale (HAM-D), and Hamilton Anxiety Scale (HAM-A). On the SAFFIMAP test, unlike the HCS group, the PSZ group tended to categorize the neutral expression of test faces as angry, and their response to the test-face expression was not influenced by the affective content of the primes. In PSZ, the PANSS-positive score was significantly correlated with correct perception of angry faces for aggressive or pleasant primes. YMRS scores were strongly correlated with PSZ's tendency to recognize angry face expressions when the prime was a pleasant or a neutral image. The HAM-D score was positively correlated with categorizing the test faces as neutral, regardless of the affective content of the prime or of the test-face expression (angry or neutral). Despite its exploratory nature, this study provides the first evidence that conscious perception and categorization of facial emotions (neutral or angry) in PSZ is directly affected by the positive or negative symptoms of the disease as defined by their individual scores on the clinical diagnostic scales.

  9. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces.

    Science.gov (United States)

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another's face; self-face also elicits an enhanced P3 amplitude compared to another's face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  10. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    Directory of Open Access Journals (Sweden)

    Lili Guan

    2017-08-01

Full Text Available The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  11. The impact of early repeated pain experiences on stress responsiveness and emotionality at maturity in rats.

    Science.gov (United States)

    Page, Gayle G; Blakely, Wendy P; Kim, Miyong

    2005-01-01

The intensive care necessary for premature newborns is characterized by multiple procedures, many of which are painful. Given emerging evidence that such early pain during this time of high brain plasticity may affect long-term neurodevelopmental and social-emotional functioning, this study explored the impact of early repeated pain on emotionality and stress responsivity at maturity. From birth through postnatal day 7, Fischer 344 pups underwent paw needle prick either every day or every other day, underwent daily paw touch, or were left unperturbed. Each paw received the designated perturbation once per day. At maturity, some animals underwent emotionality testing: either a 4-day series of open field exposures or a single elevated plus-maze (EPM) exposure. The paw prick groups exhibited less open field habituation and occupied the EPM open arms more. Two weeks later, all animals were either subjected to forced swim or not. At 1 h post-swim, animals either underwent blood withdrawal for plasma corticosterone (CS) levels and ex vivo natural killer cell activity (NKCA) or were injected intravenously with radiolabeled NK-sensitive syngeneic MADB106 tumor cells and assessed for lung tumor retention. Sex was a major factor in the manifestation of perturbation-related differences in the biologic outcomes. Whereas postnatal pain differentially affected baseline tumor retention between males and females, only males exhibited perturbation-related differences in swim stress-induced increases in tumor retention and CS. Finally, male-female differences were evident in CS, NKCA, and tumor responses to swim stress. These findings suggest that early pain affects neurodevelopmental function in the mature organism; however, these relationships are complicated by sex differences, the postnatal pain schedule, and the outcome measured.

  12. Word wins over Face: Emotional Stroop effect activates the frontal cortical network

    Directory of Open Access Journals (Sweden)

    Shima Ovaysikia

    2011-01-01

Full Text Available The prefrontal cortex (PFC) has been implicated in higher order cognitive control of behaviour. Sometimes such control is executed through suppression of an unwanted response in order to avoid conflict. Conflict occurs when two simultaneously competing processes lead to different behavioral outcomes, as seen in tasks such as the anti-saccade, go/no-go and the Stroop task. We set out to examine whether different types of stimuli in a modified emotional Stroop task would cause similar interference effects as the original colour/word Stroop, and whether the required suppression mechanism(s) would recruit similar regions of the medial PFC (mPFC). By using emotional words and emotional faces in this Stroop experiment, we examined the two well-learned automatic behaviours of word reading and recognition of face expressions. In our emotional Stroop paradigm, words were processed faster than face expressions, with incongruent trials yielding longer reaction times (RT) and a larger number of errors compared to the congruent trials. This novel Stroop effect activated the anterior and inferior regions of the mPFC, namely the anterior cingulate cortex (ACC) and the inferior frontal gyrus (IFG), as well as the superior frontal gyrus. Our results suggest that prepotent behaviours such as reading and recognition of face expressions are stimulus-dependent and perhaps hierarchical, hence recruiting distinct regions of the mPFC. Moreover, the faster processing of word reading compared to reporting face expressions is indicative of the formation of stronger stimulus-response (SR) associations for an over-learned behaviour compared to an instinctive one, which could alternatively be explained through the distinction between awareness and selective attention.

  13. A note on age differences in mood-congruent versus mood-incongruent emotion processing in faces

    Directory of Open Access Journals (Sweden)

    Manuel C. Voelkle

    2014-06-01

    Full Text Available This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20–31 years; middle-aged: 44–55 years; older adults: 70–81 years) were asked to provide multidimensional emotion ratings of a total of 1,026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle, Ebner, Lindenberger, & Riediger, 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when in a positive mood and less likely to do so when in a negative mood. This did not apply to younger adults. The temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  14. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces.

    Science.gov (United States)

    Voelkle, Manuel C; Ebner, Natalie C; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

    (1) Does experienced mood affect emotion perception in faces and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  15. Gender differences in human single neuron responses to male emotional faces.

    Science.gov (United States)

    Newhoff, Morgan; Treiman, David M; Smith, Kris A; Steinmetz, Peter N

    2015-01-01

    Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n = 270), anterior cingulate cortex (n = 256), and ventromedial prefrontal cortex (n = 174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n = 15/66) of neurons in men were significantly affected by facial emotion, vs. 8% (n = 6/76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p < 0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
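
    The gender comparison above reduces to a 2x2 contingency table (emotion-responsive vs. non-responsive neurons, by gender). As an illustrative aside, not code from the study, the reported counts can be checked with Fisher's exact test; the sketch below implements the common minimum-likelihood two-sided rule using only the Python standard library, so the resulting p-value convention may differ slightly from whatever the authors used.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum hypergeometric probabilities of all tables (with the same margins)
    that are no more probable than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)

    def prob(x):  # P(top-left cell == x) under fixed margins
        return comb(row1, x) * comb(n - row1, col1 - x) / denom

    p_obs = prob(a)
    lo = max(0, col1 - (c + d))
    hi = min(row1, col1)
    # small tolerance guards against float round-off when comparing ties
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-9))

# Counts reported in the abstract: 15 of 66 neurons in men and
# 6 of 76 neurons in women were emotion-responsive.
p = fisher_exact_two_sided(15, 66 - 15, 6, 76 - 6)
print(round(p, 4))
```

    The same function applied to a perfectly balanced table returns 1.0, which is a quick sanity check on the implementation.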

  16. Functional Brain Activation to Emotional and non-Emotional Faces in Healthy Children: Evidence for Developmentally Undifferentiated Amygdala Function During the School Age Period

    Science.gov (United States)

    Pagliaccio, David; Luby, Joan L.; Gaffrey, Michael S.; Belden, Andrew C.; Botteron, Kelly N.; Harms, Michael P.; Barch, Deanna M.

    2013-01-01

    The amygdala is a key region in emotion processing. In particular, fMRI studies have demonstrated that the amygdala is active during the viewing of emotional faces. Previous research has consistently found greater amygdala responses to fearful faces as compared to neutral faces in adults, convergent with a focus in the animal literature on the amygdala's role in fear processing. Studies have found that the amygdala also responds differentially to other facial emotion types in adults. Yet, the literature regarding when this differential amygdala responsivity develops is limited and mixed. Thus, the goal of the current study was to examine amygdala responses to emotional and neutral faces in a relatively large sample of healthy school-age children (N = 52). While the amygdala was active in response to emotional and neutral faces, the results do not support the hypothesis that the amygdala responds differentially to emotional faces in 7-12-year-old children. Nonetheless, amygdala activity was correlated with the severity of subclinical depression symptoms and with emotion regulation skills. Additionally, sex differences were observed in frontal, temporal, and visual regions, as well as effects of pubertal development in visual regions. These findings suggest important differences in amygdala reactivity in childhood. PMID:23636982

  17. ‘Distracters’ do not always distract: Visual working memory for angry faces is enhanced by incidental emotional words.

    Directory of Open Access Journals (Sweden)

    Margaret Cecilia Jackson

    2012-10-01

    Full Text Available We are often required to filter out distraction in order to focus on a primary task during which working memory (WM) is engaged. Previous research has shown that negative versus neutral distracters presented during a visual WM maintenance period significantly impair memory for neutral information. However, the contents of WM are often also emotional in nature. The question we address here is how incidental information might impact upon visual WM when both this and the memory items contain emotional information. We presented emotional versus neutral words during the maintenance interval of an emotional visual WM face task. Participants encoded two angry or happy faces into WM, and several seconds into a 9-second maintenance period a negative, positive, or neutral word was flashed on the screen three times. A single neutral test face was presented for retrieval with a face identity that was either present or absent in the preceding study array. WM for angry face identities was significantly better when an emotional (negative or positive) versus a neutral (or no) word was presented. In contrast, WM for happy face identities was not significantly affected by word valence. These findings suggest that the presence of emotion within an intervening stimulus boosts the emotional value of threat-related information maintained in visual WM and thus improves performance. In addition, we show that incidental events that are emotional in nature do not always distract from an ongoing WM task.

  18. [Abnormal processing characteristics to basic emotional faces in the early phase in children with autism spectrum disorder].

    Science.gov (United States)

    Lin, Qiong-Xi; Wu, Gui-Hua; Zhang, Ling; Wang, Zeng-Jian; Pan, Ning; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2018-02-01

    To explore the recognition ability and abnormal early-phase processing characteristics for basic emotional faces in children with autism spectrum disorder (ASD). Photos of static Chinese faces with four basic emotions (fearful, happy, angry and sad) were used as stimuli. Twenty-five children with ASD and twenty-two age- and gender-matched typically developing children (normal controls) were asked to match the emotional faces with words. Event-related potential (ERP) data were recorded concurrently. N170 latencies for total emotion and fearful faces in the left temporal region were shorter than in the right in normal controls (P<0.05), but this was not observed in children with ASD. Furthermore, N170 latencies in the left temporal region of children with ASD were longer than those of normal controls for total emotion, fearful and happy faces (P<0.05), and their N170 latencies in the right temporal region tended to be longer than those of normal controls for angry and fearful faces. The holistic perception of emotional faces in the early cognitive processing phase is slower in children with ASD than in normal controls. The lateralized response in the early phase of recognizing emotional faces may be aberrant in children with ASD.

  19. Initial Orientation of Attention towards Emotional Faces in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Ahmadi

    2011-09-01

    Full Text Available Objective: Early recognition of negative emotions is considered to be of vital importance. It seems that children with attention deficit hyperactivity disorder (ADHD) have some difficulties recognizing facial emotional expressions, especially negative ones. This study investigated the preference of children with ADHD for negative (angry, sad) facial expressions compared to normal children. Method: Participants were 35 drug-naive boys with ADHD, aged 6-11 years, and 31 matched healthy children. Visual orientation data were recorded while participants viewed face pairs (negative-neutral pairs) shown for 3000 ms. The number of first fixations made to each expression was considered an index of initial orientation. Results: Group comparisons revealed no difference between the ADHD group and their matched healthy counterparts in initial orientation of attention. A tendency towards negative emotions was found within the normal group, while no difference was observed between initial allocation of attention toward negative and neutral expressions in children with ADHD. Conclusion: Children with ADHD do not have a significant preference for negative facial expressions. In contrast, normal children have a significant preference for negative facial emotions rather than neutral faces.

  20. Neutral face classification using personalized appearance models for fast and robust emotion detection.

    Science.gov (United States)

    Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha

    2015-09-01

    Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and thereby bypassing those frames during emotion classification saves computational power. In this paper, we propose a lightweight neutral-versus-emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at key emotion (KE) points using a statistical texture model, constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on a statistical texture model. Robustness to dynamic shift of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
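
    The pre-processing idea described above can be sketched at a high level: compare each key-point patch of the incoming frame against the user's reference neutral appearance, and skip the full emotion classifier only when everything matches. The following is a simplified illustration of that gating logic (normalized correlation against a single mean template per key point), not the paper's statistical texture model; the function name, key-point names, and threshold are all assumptions.

```python
import numpy as np

def is_neutral(frame_patches, neutral_model, thresh=0.8):
    """Gate a frame as 'neutral' when every key-point patch correlates
    strongly with the user's reference neutral patch, so the frame can
    bypass the heavier emotion classifier (illustrative only)."""
    for kp, patch in frame_patches.items():
        p = patch - patch.mean()
        r = neutral_model[kp] - neutral_model[kp].mean()
        den = np.linalg.norm(p) * np.linalg.norm(r)
        # normalized correlation in [-1, 1]; low similarity -> not neutral
        if den == 0 or float(np.dot(p, r)) / den < thresh:
            return False
    return True

# toy example with hypothetical key points around the mouth and one eye
neutral_model = {"mouth": np.array([1.0, 2.0, 3.0, 4.0]),
                 "eye":   np.array([2.0, 2.0, 1.0, 1.0])}
same  = {"mouth": np.array([1.1, 2.1, 3.1, 4.1]),       # same pattern, shifted
         "eye":   np.array([2.0, 2.0, 1.0, 1.0])}
smile = {"mouth": np.array([4.0, 3.0, 2.0, 1.0]),       # reversed pattern
         "eye":   np.array([2.0, 2.0, 1.0, 1.0])}
print(is_neutral(same, neutral_model))    # -> True  (skip classifier)
print(is_neutral(smile, neutral_model))   # -> False (run classifier)
```

    Normalized correlation makes the gate insensitive to brightness offsets, which is one reason texture-model approaches prefer it over raw pixel differences.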

  1. Oxytocin effects on emotional response to others' faces via serotonin system in autism: A pilot study.

    Science.gov (United States)

    Fukai, Mina; Hirosawa, Tetsu; Kikuchi, Mitsuru; Ouchi, Yasuomi; Takahashi, Tetsuya; Yoshimura, Yuko; Miyagishi, Yoshiaki; Kosaka, Hirotaka; Yokokura, Masamichi; Yoshikawa, Etsuji; Bunai, Tomoyasu; Minabe, Yoshio

    2017-09-30

    The oxytocin (OT)-related serotonergic system is thought to play an important role in the etiology and social symptoms of autism spectrum disorder (ASD). However, no evidence exists for a relation between the prosocial effect of chronic OT administration and the brain serotonergic system. Ten male subjects with ASD were administered OT for 8-10 weeks in an open-label, single-arm, non-randomized, uncontrolled manner. Before and during the OT treatment, positron emission tomography was used with the [11C]-3-amino-4-(2-[(dimethylamino)methyl]phenylthio)benzonitrile ([11C]DASB) radiotracer, and binding of the serotonin transporter ([11C]DASB BPND) was estimated. The main outcome measures were changes in [11C]DASB BPND and changes in the emotional response to others' faces. No significant change was found in the emotional response to others' faces after the 8-10 week OT treatment. However, the increased serotonin transporter (SERT) level in the striatum after treatment was correlated significantly with increased negative emotional response to human faces. This study revealed a relation between changes in the serotonergic system and in prosociality after chronic OT administration. Additional studies must be conducted to verify the chronic OT effects on social behavior via the serotonergic system. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  2. Intranasal Oxytocin Administration Dampens Amygdala Reactivity towards Emotional Faces in Male and Female PTSD Patients.

    Science.gov (United States)

    Koch, Saskia Bj; van Zuiden, Mirjam; Nawijn, Laura; Frijling, Jessie L; Veltman, Dick J; Olff, Miranda

    2016-05-01

    Post-traumatic stress disorder (PTSD) is a disabling psychiatric disorder. As a substantial part of PTSD patients responds poorly to currently available psychotherapies, pharmacological interventions boosting treatment response are needed. Because of its anxiolytic and pro-social properties, the neuropeptide oxytocin (OT) has been proposed as a promising strategy for treatment augmentation in PTSD. As a first step to investigate the therapeutic potential of OT in PTSD, we conducted a double-blind, placebo-controlled, cross-over functional MRI study examining OT administration effects (40 IU) on amygdala reactivity toward emotional faces in unmedicated male and female police officers with (n=37, 21 males) and without (n=40, 20 males) PTSD. Trauma-exposed controls were matched to PTSD patients based on age, sex, years of service and educational level. Under placebo, the expected valence-dependent amygdala reactivity (i.e., greater activity toward fearful-angry faces compared with happy-neutral faces) was absent in PTSD patients. OT administration dampened amygdala reactivity toward all emotional faces in male and female PTSD patients, but enhanced amygdala reactivity in healthy male and female trauma-exposed controls, independent of sex and stimulus valence. In PTSD patients, greater anxiety prior to scanning and greater amygdala reactivity during the placebo session were associated with greater reduction of amygdala reactivity after OT administration. Taken together, our results indicate presumably beneficial neurobiological effects of OT administration in male and female PTSD patients. Future studies should investigate OT administration in clinical settings to fully appreciate its therapeutic potential.

  3. Early handling and repeated cross-fostering have opposite effect on mouse emotionality.

    Directory of Open Access Journals (Sweden)

    Alessandra Luchetti

    2015-04-01

    Full Text Available Early life events have a crucial role in programming the individual phenotype, and exposure to traumatic experiences during infancy can increase later risk for a variety of neuropsychiatric conditions, including mood and anxiety disorders. Animal models of postnatal stress have been developed in rodents to explore molecular mechanisms responsible for the observed short- and long-lasting neurobiological effects of such manipulations. The main aim of this study was to compare the behavioral and hormonal phenotype of young and adult animals exposed to different postnatal treatments. Outbred mice were exposed to (i) the classical Handling protocol (H: 15 min/day of separation from the mother from day 1 to 14 of life) or to (ii) a Repeated Cross-Fostering protocol (RCF: adoption of litters from day 1 to 4 of life by different dams). Handled mice received more maternal care in infancy and showed the already described reduced emotionality at adulthood. Repeated cross-fostered animals did not differ in maternal care received, but showed enhanced sensitivity to separation from the mother in infancy and an altered respiratory response to 6% CO2 in breathing air in comparison with controls. Abnormal respiratory responses to hypercapnia are commonly found among humans with panic disorders, and point to RCF-induced instability of the early environment as a valid developmental model for panic disorder. The comparisons between short- and long-term effects of postnatal handling vs. RCF indicate that different types of early adversities are associated with different behavioral profiles, and evoke psychopathologies that can be distinguished according to the neurobiological systems disrupted by early-life manipulations.

  4. [Decrease in N170 evoked potential component latency during repeated presentation of face images].

    Science.gov (United States)

    Verkhliutov, V M; Ushakov, V L; Strelets, V B

    2009-01-01

    EEG from 28 channels was recorded in 15 healthy volunteers during the presentation of visual stimuli in the form of face and building images. The stimuli were presented in two series. The first series consisted of 60 face and 60 building images presented in random order; the second consisted of 30 face and 30 building images. The second series began 1.5-2 min after the end of the first one. No instruction was given to the participants. P1, N170 and VPP EP components were identified for both stimulus categories. These components were located in the medial parietal area (Brodmann area 40). P1 and N170 components were recorded in the superior temporal fissure (Brodmann area 21, STS region), the first component with a latency of 120 ms, the second with a latency of 155 ms. VPP was recorded with a latency of 190 ms (Brodmann area 19). Dynamic mapping of EP components with latencies from 97 to 242 ms revealed movement of the positive maxima from occipital to frontal areas through the temporal areas, and their subsequent return to occipital areas through the central ones. Comparison of EP components to face and building images revealed amplitude differences in the following areas: P1 in frontal, central and anterior temporal areas; N170 in frontal, central, temporal and parietal areas; VPP in all areas. N170 latency was also 12 ms shorter for face than for building images. It was proposed that this N170 latency decrease for face compared with building images is connected with the different spatial locations of the fusiform areas responsible for face and building image recognition. Priming, the effect revealed during repeated presentation of face images, is interpreted as a manifestation of the functional heterogeneity of the fusiform area responsible for face image recognition. The hypothesis is put forward that the parts of extrastriate cortex which are located closer to the central retinotopical
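
    As an illustrative aside (not code from the study), N170 latency measurements of the kind reported above are typically obtained by locating the most negative deflection of the averaged ERP within a post-stimulus search window. A minimal sketch, with the sampling rate and search window as assumed parameters:

```python
import numpy as np

def n170_latency_ms(erp, sfreq=1000.0, window=(130, 200)):
    """Return the latency (ms) of the most negative deflection in a window.

    erp    : 1-D averaged ERP trace, stimulus onset at index 0
    sfreq  : sampling rate in Hz
    window : search window in ms (N170 is a negativity ~130-200 ms
             post-stimulus; upper bound is exclusive)
    """
    lo = int(window[0] * sfreq / 1000.0)
    hi = int(window[1] * sfreq / 1000.0)
    peak_idx = lo + int(np.argmin(erp[lo:hi]))   # most negative sample
    return peak_idx * 1000.0 / sfreq

# toy trace: a negative Gaussian dip centered at 155 ms, sampled at 1000 Hz
t = np.arange(0, 400)                            # one sample per ms
erp = -np.exp(-((t - 155) ** 2) / (2 * 15.0 ** 2))
print(n170_latency_ms(erp))                      # -> 155.0
```

    A between-condition latency difference such as the 12 ms face-vs-building effect would then be the difference of two such peak latencies computed from the condition-wise averages.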

  5. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers

    Directory of Open Access Journals (Sweden)

    Laura A. Thomas

    2014-04-01

    Full Text Available Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N = 20), SMD (N = 18), and HV (N = 22) during “Aware” and “Non-aware” priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, or a blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms) and the shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  6. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    Science.gov (United States)

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms) and the shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    Science.gov (United States)

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  8. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    Science.gov (United States)

    Wang, Bo

    2013-01-01

    Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on the consolidation of memory for faces. During learning, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test in which they made "remember", "know", and "new" judgements. The findings are: (1) positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) for males, recognition for negative faces was equivalent to that for positive faces; for females, recognition for negative faces was better than that for positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation can extend to facial stimuli and that such an effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.

  9. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Pihl, Robert O; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L W; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W; Smolka, Michael N; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J

    2015-01-01

    To investigate the role of personality factors and attentional biases towards emotional faces in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Data were obtained as part of the IMAGEN study, conducted across 8 European sites with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict the mental disorders examined in the current sample. Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed.

  10. Scanning patterns of faces do not explain impaired emotion recognition in Huntington Disease: Evidence for a high level mechanism

    Directory of Open Access Journals (Sweden)

    Marieke van Asselen

    2012-02-01

    Full Text Available Previous studies in patients with amygdala lesions suggested that deficits in emotion recognition might be mediated by impaired scanning patterns of faces. Here we investigated whether scanning patterns also contribute to the selective impairment in recognition of disgust in Huntington disease (HD). To achieve this goal, we recorded eye movements during a two-alternative forced-choice emotion recognition task. HD patients in presymptomatic (n=16) and symptomatic (n=9) disease stages were tested and their performance was compared to a control group (n=22). In our emotion recognition task, participants had to indicate whether a face reflected one of six basic emotions. In addition, to determine whether emotion recognition was altered when participants were forced to look at a specific component of the face, we used a second task in which only limited facial information was provided (eyes/mouth in partially masked faces). Behavioural results showed no differences in the ability to recognize emotions between presymptomatic gene carriers and controls. However, an emotion recognition deficit was found for all six basic emotion categories in early-stage HD. Analysis of eye movement patterns showed that patients and controls used similar scanning strategies. Patterns of deficits were similar regardless of whether parts of the faces were masked or not, confirming that selective attention to particular face parts does not underlie the deficits. These results suggest that the emotion recognition deficits in symptomatic HD patients cannot be explained by impaired scanning patterns of faces. Furthermore, no selective deficit for recognition of disgust was found in presymptomatic HD patients.

  11. Gender Differences in Human Single Neuron Responses to Male Emotional Faces

    Directory of Open Access Journals (Sweden)

    Morgan Newhoff

    2015-09-01

    Full Text Available Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single-neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (n=236 neurons), hippocampus (n=270), anterior cingulate cortex (n=256), and ventromedial prefrontal cortex (n=174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n=15/66) of neurons in men were significantly affected by facial emotion, versus 8% (n=6/76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p<0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
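The gender comparison above reduces to a 2×2 contingency table (15 of 66 emotion-responsive neurons in men versus 6 of 76 in women). As an illustration of the test the abstract reports, Fisher's exact two-sided p-value can be computed from first principles with the hypergeometric distribution; the sketch below is our own from-scratch implementation, not the authors' analysis code.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables (with the same
    margins) that are no more likely than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c  # fixed margins
    def p_table(x):
        # P(X = x) for x successes in row 1 under the hypergeometric model
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    # small tolerance guards against float ties with the observed table
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Emotion-responsive amygdala neurons: men 15/66, women 6/76
p = fisher_exact_two_sided(15, 51, 6, 70)
```

For production work, `scipy.stats.fisher_exact` implements the same two-sided rule; the hand-rolled version is only meant to make the computation transparent.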

  12. Oxytocin and social pretreatment have similar effects on processing of negative emotional faces in healthy adult males

    Directory of Open Access Journals (Sweden)

    Anna Kis

    2013-08-01

    Full Text Available Oxytocin has been shown to affect several aspects of human social cognition, including facial emotion processing. There is also evidence that social stimuli (such as eye contact) can effectively modulate endogenous oxytocin levels. In the present study we directly tested whether intranasal oxytocin administration and pre-treatment with social stimuli had similar effects on face processing at the behavioural level. Subjects (N=52 healthy adult males) were presented with a set of faces with expressions of different valence (negative, neutral, positive) following different types of pretreatment (oxytocin (OT) or placebo (PL), and social interaction (Soc) or no social interaction (NSoc); N=13 in each group) and were asked to rate all faces for perceived emotion and trustworthiness. On the next day subjects' recognition memory was tested on a set of neutral faces, and additionally they had to rate each face again for trustworthiness and emotion. Subjects in both the OT and the Soc pretreatment groups (as compared to the PL and NSoc groups) gave higher emotion and trustworthiness scores for faces with negative emotional expression. Moreover, 24 h later, subjects in the OT and Soc groups (unlike those in the control groups) gave lower trustworthiness scores for previously negative faces than for faces previously seen as emotionally neutral or positive. In sum, these results provide the first direct evidence of the similar effects of intranasal oxytocin administration and social stimulation on the perception of negative facial emotions, as well as on the delayed recall of negative emotional information.

  13. Neural correlates of top-down processing in emotion perception: an ERP study of emotional faces in white noise versus noise-alone stimuli.

    Science.gov (United States)

    Lee, Kyu-Yong; Lee, Tae-Ho; Yoon, So-Jeong; Cho, Yang Seok; Choi, June-Seek; Kim, Hyun Taek

    2010-06-14

    In the present study, we investigated the neural correlates underlying the perception of emotion in response to facial stimuli, in order to elucidate the extent to which emotional perception is affected by top-down processes. Subjects performed a forced two-choice emotion discrimination task on ambiguous visual stimuli consisting of emotional faces embedded in different levels of visual white noise, including white noise-alone stimuli. ERP recordings and behavioral responses were analyzed according to four response categories: hit, miss, false alarm and correct rejection. We observed enlarged EPN and LPP amplitudes when subjects reported seeing fearful faces, and a typical emotional EPN response in the white noise-alone condition when fearful faces were not presented. The modulation of these two ERP components, characteristic of emotional processing, reflected the type of emotion each individual subjectively perceived. The results suggest that top-down modulation might be indispensable for emotional perception, which consists of two distinct stages of stimulus processing in the brain. (c) 2010 Elsevier B.V. All rights reserved.
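Sorting trials into hits, misses, false alarms and correct rejections, as above, is the standard signal-detection layout from which sensitivity (d') and response criterion (c) are derived (the same d' and c appear in other records in this listing). The sketch below uses hypothetical trial counts; `sdt_indices` is an illustrative helper, not code from the study.

```python
from statistics import NormalDist

def sdt_indices(hits, misses, fas, crs):
    """Signal-detection indices from trial counts. Here a 'hit' is
    reporting a face on face-present trials and a 'false alarm' is
    reporting a face on noise-alone trials.
    d' = z(hit rate) - z(FA rate); c = -(z(hit rate) + z(FA rate)) / 2."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    hit_rate = hits / (hits + misses)
    fa_rate = fas / (fas + crs)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, criterion

# Hypothetical counts: 80/100 hits, 20/100 false alarms
d, c = sdt_indices(hits=80, misses=20, fas=20, crs=80)
```

With these symmetric rates, d' is about 1.68 and c is 0 (no bias); extreme rates of 0 or 1 would need a correction (e.g. the log-linear rule) before applying the inverse CDF.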

  14. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    Science.gov (United States)

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement-that is, better long-term memory for emotional than for neutral stimuli-and the emotion-induced recognition bias-that is, a more liberal response criterion for emotional than for neutral stimuli. Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account-that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
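Studies of this kind typically construct LSF and HSF face stimuli with Gaussian or Fourier-domain filters. As a toy stand-in for that pipeline, the sketch below uses a simple mean filter as the low-pass stage and defines the high-pass band as the residual, so the two bands sum back to the original image exactly; the function names and the tiny 4×4 "image" are hypothetical.

```python
def low_pass(img, radius=1):
    """Crude spatial low-pass: mean filter over a (2r+1)^2 neighbourhood
    with clamped edges. The result keeps coarse (LSF-like) structure."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in range(-radius, radius + 1)
                    for dx in range(-radius, radius + 1)]
            out[y][x] = sum(vals) / len(vals)
    return out

def split_bands(img):
    """Split an image into a low- and a high-spatial-frequency band.
    By construction lsf + hsf reconstructs the original image."""
    lsf = low_pass(img)
    hsf = [[img[y][x] - lsf[y][x] for x in range(len(img[0]))]
           for y in range(len(img))]
    return lsf, hsf

img = [[0, 0, 255, 255]] * 4  # a hard vertical edge
lsf, hsf = split_bands(img)   # lsf blurs the edge; hsf keeps the edge detail
```

Real stimulus preparation would instead set a cutoff in cycles per image in the Fourier domain, but the decomposition idea (coarse layout in LSF, fine detail such as edges in HSF) is the same.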

  15. Dispositional fear, negative affectivity, and neuroimaging response to visually suppressed emotional faces.

    Science.gov (United States)

    Vizueta, Nathalie; Patrick, Christopher J; Jiang, Yi; Thomas, Kathleen M; He, Sheng

    2012-01-02

    "Invisible" stimulus paradigms provide a method for investigating basic affective processing in clinical and non-clinical populations. Neuroimaging studies utilizing continuous flash suppression (CFS) have shown increased amygdala response to invisible fearful versus neutral faces. The current study used CFS in conjunction with functional MRI to test for differences in brain reactivity to visible and invisible emotional faces in relation to two distinct trait dimensions relevant to psychopathology: negative affectivity (NA) and fearfulness. Subjects consisted of college students (N=31) assessed for fear/fearlessness along with dispositional NA. The main brain regions of interest included the fusiform face area (FFA), superior temporal sulcus (STS), and amygdala. Higher NA, but not trait fear, was associated with enhanced response to fearful versus neutral faces in STS and right amygdala (but not FFA), within the invisible condition specifically. The finding that NA rather than fearfulness predicted degree of amygdala reactivity to suppressed faces implicates the input subdivision of the amygdala in the observed effects. Given the central role of NA in anxiety and mood disorders, the current data also support use of the CFS methodology for investigating the neurobiology of these disorders. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect.

    Science.gov (United States)

    Madill, Mark; Murray, Janice E

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64-90 years) and 25 young adults (19-29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63-84 years) and 30 young adults (18-30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance was not influenced by the task-irrelevant affective images in either age group. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially process positive information only when it is relevant to the task at hand, providing no evidence of an age-related positivity effect in the processing of distracting non-face emotional images.

  17. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    Science.gov (United States)

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

    Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of top-down control. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
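Global Field Power, the "strength" measure referred to above, is conventionally defined as the standard deviation of the voltage across all electrodes at a single time point (Lehmann and Skrandies' formulation). A minimal sketch with hypothetical electrode values:

```python
from math import sqrt

def global_field_power(voltages):
    """Global Field Power at one time point: the population standard
    deviation of the voltage across all electrodes. Large GFP means a
    strong, spatially differentiated scalp field."""
    m = sum(voltages) / len(voltages)
    return sqrt(sum((v - m) ** 2 for v in voltages) / len(voltages))

# Hypothetical 4-electrode sample (microvolts): mean is 0, spread is 2
print(global_field_power([2.0, -2.0, 2.0, -2.0]))  # → 2.0
```

Computed at every sample, this yields a GFP waveform whose peaks mark the intervals (such as 168-189 ms above) where field strength is modulated.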

  18. Altered Functional Subnetwork During Emotional Face Processing: A Potential Intermediate Phenotype for Schizophrenia.

    Science.gov (United States)

    Cao, Hengyi; Bertolino, Alessandro; Walter, Henrik; Schneider, Michael; Schäfer, Axel; Taurisano, Paolo; Blasi, Giuseppe; Haddad, Leila; Grimm, Oliver; Otto, Kristina; Dixson, Luanna; Erk, Susanne; Mohnke, Sebastian; Heinz, Andreas; Romanczuk-Seiferth, Nina; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Cichon, Sven; Noethen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2016-06-01

    Although deficits in emotional processing are prominent in schizophrenia, it has been difficult to identify neural mechanisms related to the genetic risk for this highly heritable illness. Prior studies have not found consistent regional activation or connectivity alterations in first-degree relatives compared with healthy controls, suggesting that a more comprehensive search for connectomic biomarkers is warranted. To identify a potential systems-level intermediate phenotype linked to emotion processing in schizophrenia and to examine the psychological association, task specificity, test-retest reliability, and clinical validity of the identified phenotype. The study was performed in university research hospitals from June 1, 2008, through December 31, 2013. We examined 58 unaffected first-degree relatives of patients with schizophrenia and 94 healthy controls with an emotional face-matching functional magnetic resonance imaging paradigm. Test-retest reliability was analyzed with an independent sample of 26 healthy participants. A clinical association study was performed in 31 patients with schizophrenia and 45 healthy controls. Data analysis was performed from January 1 to September 30, 2014. Conventional amygdala activity and seeded connectivity measures, graph-based global and local network connectivity measures, Spearman rank correlation, intraclass correlation, and gray matter volumes. Among the 152 volunteers included in the relative-control sample, 58 were unaffected first-degree relatives of patients with schizophrenia (mean [SD] age, 33.29 [12.56]; 38 were women), and 94 were healthy controls without a first-degree relative with mental illness (mean [SD] age, 32.69 [10.09] years; 55 were women). A graph-theoretical connectivity approach identified significantly decreased connectivity in a subnetwork that primarily included the limbic cortex, visual cortex, and subcortex during emotional face processing (cluster-level P corrected for familywise error =

  19. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis.

    Science.gov (United States)

    Balconi, Michela; Lucchiari, Claudio

    2008-01-01

    It remains an open question whether it is possible to assign a single brain operation or psychological function for facial emotion decoding to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured while twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: subliminal (10 ms) vs supraliminal (150 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and the significance of the stimulus in terms of arousal can modulate power synchronization (ERD decrease) during the 150-350 ms time range: an early oscillatory event showed its peak at about 200 ms post-stimulus. GBA was enhanced more by supraliminal than by subliminal elaboration, as well as more by high-arousal (anger and fear) than by low-arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in the emotional processing of faces compared with neutral faces.

  20. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming eXiu

    2015-04-01

    Full Text Available This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data were acquired during incidental learning of positive (‘happy’), neutral, and negative (‘angry’ or ‘fearful’) faces. Dynamic Causal Modeling (DCM) was applied to the fMRI data to characterize effective connectivity within a brain network involving face perception areas (inferior occipital gyrus and fusiform gyrus) and areas related to successful memory formation (hippocampus, superior parietal lobule, amygdala and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feedforward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects, and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex, the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  1. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    Science.gov (United States)

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Reduced amygdala and ventral striatal activity to happy faces in PTSD is associated with emotional numbing.

    Directory of Open Access Journals (Sweden)

    Kim L Felmingham

    Full Text Available There has been a growing recognition of the importance of reward processing in PTSD, yet little is known of the underlying neural networks. This study tested the predictions that (1) individuals with PTSD would display reduced responses to happy facial expressions in ventral striatal reward networks, and (2) this reduction would be associated with emotional numbing symptoms. 23 treatment-seeking patients with Posttraumatic Stress Disorder were recruited from the treatment clinic at the Centre for Traumatic Stress Studies, Westmead Hospital, and 20 trauma-exposed controls were recruited from a community sample. We examined functional magnetic resonance imaging responses during the presentation of happy and neutral facial expressions in a passive viewing task. PTSD participants rated happy facial expressions as less intense than trauma-exposed controls did. Relative to controls, PTSD participants revealed lower activation to happy (versus neutral) faces in the ventral striatum and a trend for reduced activation in the left amygdala. A significant negative correlation was found between emotional numbing symptoms in PTSD and activity in right ventral striatal regions after controlling for depression, anxiety and PTSD severity. This study provides initial evidence that individuals with PTSD have lower reactivity to happy facial expressions, and that lower activation in ventral striatal-limbic reward networks may be associated with symptoms of emotional numbing.

  3. Neural markers of emotional face perception across psychotic disorders and general population.

    Science.gov (United States)

    Sabharwal, Amri; Kotov, Roman; Szekely, Akos; Leung, Hoi-Chung; Barch, Deanna M; Mohanty, Aprajita

    2017-07-01

    There is considerable variation in negative and positive symptoms of psychosis, global functioning, and emotional face perception (EFP), not only in schizophrenia but also in other psychotic disorders and healthy individuals. However, EFP impairment and its association with worse symptoms and global functioning have been examined largely in the domain of schizophrenia. The present study adopted a dimensional approach to examine the association of behavioral and neural measures of EFP with symptoms of psychosis and global functioning across individuals with schizophrenia spectrum (SZ; N = 28) and other psychotic (OP; N = 29) disorders, and never-psychotic participants (NP; N = 21). Behavioral and functional MRI data were recorded as participants matched emotional expressions of faces and geometrical shapes. Lower accuracy and increased activity in early visual regions, hippocampus, and amygdala during emotion versus shape matching were associated with higher negative, but not positive, symptoms and lower global functioning, across all participants. This association remained even after controlling for group-related (SZ, OP, and NP) variance, dysphoria, and antipsychotic medication status, except in amygdala. Furthermore, negative symptoms mediated the relationship between behavioral and brain EFP measures and global functioning. This study provides some of the first evidence supporting the specific relationship of EFP measures with negative symptoms and global functioning across psychotic and never-psychotic samples, and transdiagnostically across different psychotic disorders. Present findings help bridge the gap between basic EFP-related neuroscience research and clinical research in psychosis, and highlight EFP as a potential symptom-specific marker that tracks global functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Emotional expectations influence neural sensitivity to fearful faces in humans:An event-related potential study

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The present study tested whether neural sensitivity to salient emotional facial expressions was influenced by emotional expectations induced by a cue that validly predicted the expression of a subsequently presented target face. Event-related potentials (ERPs) elicited by fearful and neutral faces were recorded while participants performed a gender discrimination task under cued (‘expected’) and uncued (‘unexpected’) conditions. The behavioral results revealed that accuracy was lower for fearful compared with neutral faces in the unexpected condition, while accuracy was similar for fearful and neutral faces in the expected condition. ERP data revealed increased amplitudes in the P2 component and 200–250 ms interval for unexpected fearful versus neutral faces. By contrast, ERP responses were similar for fearful and neutral faces in the expected condition. These findings indicate that human neural sensitivity to fearful faces is modulated by emotional expectations. Although the neural system is sensitive to unpredictable emotionally salient stimuli, sensitivity to salient stimuli is reduced when these stimuli are predictable.

  5. What's good for the goose is not good for the gander: Age and gender differences in scanning emotion faces.

    Science.gov (United States)

    Sullivan, Susan; Campbell, Anna; Hutton, Sam B; Ruffman, Ted

    2017-05-01

    Research indicates that older adults' (≥60 years) emotion recognition is worse than that of young adults, that young and older men's emotion recognition is worse than that of young and older women (respectively), and that older adults look at mouths, relative to eyes, more than young adults do. Nevertheless, previous research has not compared older men's and women's looking at emotion faces, so the present study had two aims: (a) to examine whether the tendency to look at mouths is stronger amongst older men compared with older women and (b) to examine whether men's mouth looking correlates with better emotion recognition. We examined the emotion recognition abilities and spontaneous gaze patterns of young (n = 60) and older (n = 58) males and females as they labelled emotion faces. Older men spontaneously looked more at mouths than older women did, and older men's looking at mouths correlated with their emotion recognition, whereas women's looking at eyes correlated with their emotion recognition. The findings are discussed in relation to a growing body of research suggesting both age and gender differences in response to emotional stimuli, and the differential efficacy of mouth and eye looking for men and women. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    Science.gov (United States)

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the diagnostic assessment of nonverbal accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status (ps face recognition between NSSI and SA groups. Secondary analyses showed that compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status (p recognition errors on adult happy faces even when controlling for group status (p face recognition than TDC, but not inpatient adolescents who attempted suicide. Further results suggest the importance of psychopathology in emotional face recognition. Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.

  7. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia, who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies, his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  8. The NMDA antagonist ketamine and the 5-HT agonist psilocybin produce dissociable effects on structural encoding of emotional face expressions.

    Science.gov (United States)

    Schmidt, André; Kometer, Michael; Bachmann, Rosilla; Seifritz, Erich; Vollenweider, Franz

    2013-01-01

    Both glutamate and serotonin (5-HT) play a key role in the pathophysiology of emotional biases. Recent studies indicate that the glutamate N-methyl-D-aspartate (NMDA) receptor antagonist ketamine and the 5-HT receptor agonist psilocybin are implicated in emotion processing. However, as yet, no study has systematically compared their contribution to emotional biases. This study used event-related potentials (ERPs) and signal detection theory to compare the effects of the NMDA (via S-ketamine) and 5-HT (via psilocybin) receptor systems on non-conscious or conscious emotional face processing biases. S-ketamine or psilocybin was administered to two groups of healthy subjects in a double-blind, within-subject, placebo-controlled design. We behaviorally assessed objective thresholds for non-conscious discrimination in all drug conditions. Electrophysiological responses to fearful, happy, and neutral faces were subsequently recorded with the face-specific P100 and N170 ERPs. Both S-ketamine and psilocybin impaired the encoding of fearful faces, as expressed by a reduced N170 over parieto-occipital brain regions. In contrast, while S-ketamine also impaired the encoding of happy facial expressions, psilocybin had no effect on the N170 in response to happy faces. This study demonstrates that the NMDA and 5-HT receptor systems differentially contribute to the structural encoding of emotional face expressions as expressed by the N170. These findings suggest that the assessment of early visual evoked responses might allow detecting pharmacologically induced changes in emotional processing biases, and thus provides a framework to study the pathophysiology of dysfunctional emotional biases.

  9. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    Science.gov (United States)

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adults with ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. Cortical current density mapping of the N170 yielded a main neural source at the posterior section of the fusiform gyrus (maximal in the left hemisphere for words and in the right hemisphere for faces and simultaneous stimuli). Neural generators of the N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and the N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  10. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington's disease.

    Science.gov (United States)

    Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C

    2013-05-15

    Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences in emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition, compared to controls, of fearful, angry and surprised faces, whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. The effect of emotional design and online customer review on customer repeat purchase intention in online stores

    Science.gov (United States)

    Dewi, D. S.; Sudiarno, A.; Saputra, H.; Dewi, R. S.

    2018-04-01

    The number of internet users in Indonesia has increased rapidly over the last decade. A survey conducted by the Association of Internet Service Providers Indonesia shows that internet users have reached 34.9% of the total population of Indonesia. This increase has led to a shift in trading practice from conventional trade to online trade. It is predicted that in the coming years the number of online consumers in Indonesia will continue to increase, providing many opportunities for online business. However, the huge number of internet users is not necessarily matched by a correspondingly high number of e-purchases. It has therefore become of interest to many researchers to investigate the factors that influence online purchasing decisions. This research proposes a model that assesses the effect of emotional design and customer reviews on customers' repeat purchase intention. An online questionnaire was designed and distributed randomly through Google Forms. Of the 187 respondents who filled in the questionnaire, only 162 actually had experience with online purchasing. These data were then processed using statistical analysis, and a model was developed by applying a structural equation modeling (SEM) approach. This study revealed that customer reviews, especially objective reviews, have a significant effect on repeat purchase intention, and that emotional design, particularly visual attractiveness, also shows a significant effect on e-repeat purchase.
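
    As a simplified, hypothetical stand-in for the structural part of such an SEM (the study's actual model, variables, and data are not reproduced here), the core idea of regressing an outcome on several observed predictors can be sketched as an ordinary least-squares fit via the normal equations. All names and data below are illustrative only:

```python
# Hypothetical sketch: least-squares regression of a repeat-purchase
# intention score on two observed predictors, solved from the normal
# equations with plain Gaussian elimination (standard library only).

def ols(X, y):
    """Least-squares coefficients for y ~ X (X includes an intercept column)."""
    k = len(X[0])
    # Normal equations: (X'X) beta = X'y
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Forward elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Toy, noise-free data: intention = 1.0 + 0.6*objectivity + 0.3*attractiveness
rows = [(1.0, o, a) for o in (1, 2, 3, 4) for a in (1, 2, 3)]
X = [list(r) for r in rows]
y = [1.0 + 0.6 * o + 0.3 * a for _, o, a in rows]
coef = ols(X, y)  # recovers the generating coefficients
```

    A full SEM additionally models latent variables and the covariance structure among indicators; this sketch only conveys the regression-style path estimates at its core.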

  12. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses

    Science.gov (United States)

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G.; Alpers, Georg W.

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis-)advantageous to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety. PMID:25324792

  14. A Rapid Subcortical Amygdala Route for Faces Irrespective of Spatial Frequency and Emotion.

    Science.gov (United States)

    McFadyen, Jessica; Mermillod, Martial; Mattingley, Jason B; Halász, Veronika; Garrido, Marta I

    2017-04-05

    There is significant controversy over the existence and function of a direct subcortical visual pathway to the amygdala. It is thought that this pathway rapidly transmits low spatial frequency information to the amygdala independently of the cortex, and yet the directionality of this function has never been determined. We used magnetoencephalography to measure neural activity while human participants discriminated the gender of neutral and fearful faces filtered for low or high spatial frequencies. We applied dynamic causal modeling to demonstrate that the most likely underlying neural network consisted of a pulvinar-amygdala connection that was uninfluenced by spatial frequency or emotion, and a cortical-amygdala connection that conveyed high spatial frequencies. Crucially, data-driven neural simulations revealed a clear temporal advantage of the subcortical connection over the cortical connection in influencing amygdala activity. Thus, our findings support the existence of a rapid subcortical pathway that is nonselective in terms of the spatial frequency or emotional content of faces. We propose that the "coarseness" of the subcortical route may be better reframed as "generalized." SIGNIFICANCE STATEMENT The human amygdala coordinates how we respond to biologically relevant stimuli, such as threat or reward. It has been postulated that the amygdala first receives visual input via a rapid subcortical route that conveys "coarse" information, namely, low spatial frequencies. For the first time, the present paper provides direction-specific evidence from computational modeling that the subcortical route plays a generalized role in visual processing by rapidly transmitting raw, unfiltered information directly to the amygdala. This calls into question a widely held assumption across human and animal research that fear responses are produced faster by low spatial frequencies. Our proposed mechanism suggests organisms quickly generate fear responses to a wide range ...

  15. Avoidant decision making in social anxiety: The interaction of angry faces and emotional responses

    Directory of Open Access Journals (Sweden)

    Andre ePittig

    2014-09-01

    Full Text Available Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis-)advantageous to maximize overall gain. To create a decision conflict between approach of rewards and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety.

  16. Characterization and recognition of mixed emotional expressions in thermal face image

    Science.gov (United States)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

    Facial expressions in infrared imaging have been introduced to solve the problem of illumination, which is an integral constituent of visual imagery. The paper investigates facial skin temperature distribution across mixed thermal facial expressions in our own face database, of which six are basic expressions and the remaining 12 are mixtures of those basic expressions. Temperature analysis has been performed on three facial regions of interest (ROIs): periorbital, supraorbital and mouth. Temperature variability of the ROIs in different expressions has been measured using statistical parameters. The temperature variation measurements in the ROIs of a particular expression form a vector, which is later used in the recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized by positive-emotion-induced facial features and negative-emotion-induced facial features. The supraorbital region is a useful facial region that can differentiate basic expressions from mixed expressions. Analysis and interpretation of mixed expressions have been conducted with the help of box-and-whisker plots. A facial region containing a mixture of two expressions generally induces less temperature change than the corresponding facial region in a basic expression.

  17. Differences in neural and cognitive response to emotional faces in middle-aged dizygotic twins at familial risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Svendsen, A M B; Harmer, C J

    2017-01-01

    BACKGROUND: Negative bias and aberrant neural processing of emotional faces are trait-marks of depression but findings in healthy high-risk groups are conflicting. METHODS: Healthy middle-aged dizygotic twins (N = 42) underwent functional magnetic resonance imaging (fMRI): 22 twins had a co-twin history of depression (high-risk) and 20 were without co-twin history of depression (low-risk). During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task ... the amygdala and ventral prefrontal cortex and pregenual anterior cingulate. This was accompanied by greater fear-specific fronto-temporal response and reduced fronto-occipital response to all emotional faces relative to baseline. The risk groups showed no differences in mood, subjective state or coping ...

  18. Faces

    DEFF Research Database (Denmark)

    Mortensen, Kristine Køhler; Brotherton, Chloe

    2018-01-01

    ... for the face to be put into action. Based on an ethnographic study of Danish teenagers' use of SnapChat, we demonstrate how the face is used as a central medium for interaction with peers. Through the analysis of visual SnapChat messages we investigate how SnapChat requires the sender to put an 'ugly' face ... already secured their popular status on the heterosexual marketplace in the broad context of the school. Thus SnapChat functions both as a challenge to beauty norms of 'flawless faces' and as a reinscription of these same norms by further manifesting the exclusive status of the popular girl ...

  19. Association of Irritability and Anxiety With the Neural Mechanisms of Implicit Face Emotion Processing in Youths With Psychopathology.

    Science.gov (United States)

    Stoddard, Joel; Tseng, Wan-Ling; Kim, Pilyoung; Chen, Gang; Yi, Jennifer; Donahue, Laura; Brotman, Melissa A; Towbin, Kenneth E; Pine, Daniel S; Leibenluft, Ellen

    2017-01-01

    Psychiatric comorbidity complicates clinical care and confounds efforts to elucidate the pathophysiology of commonly occurring symptoms in youths. To our knowledge, few studies have simultaneously assessed the effect of 2 continuously distributed traits on brain-behavior relationships in children with psychopathology. To determine shared and unique effects of 2 major dimensions of child psychopathology, irritability and anxiety, on neural responses to facial emotions during functional magnetic resonance imaging. Cross-sectional functional magnetic resonance imaging study in a large, well-characterized clinical sample at a research clinic at the National Institute of Mental Health. The referred sample included youths ages 8 to 17 years: 93 youths with anxiety, disruptive mood dysregulation, and/or attention-deficit/hyperactivity disorders and 22 healthy youths. The child's irritability and anxiety were rated by both parent and child on the Affective Reactivity Index and Screen for Child Anxiety Related Disorders, respectively. Using functional magnetic resonance imaging, neural response was measured across the brain during gender labeling of varying intensities of angry, happy, or fearful face emotions. In mixed-effects analyses, the shared and unique effects of irritability and anxiety were tested on amygdala functional connectivity and activation to face emotions. The mean (SD) age of participants was 13.2 (2.6) years; of the 115 included, 64 were male. Irritability and/or anxiety influenced amygdala connectivity to the prefrontal and temporal cortex. Specifically, irritability and anxiety jointly influenced left amygdala to left medial prefrontal cortex connectivity during face emotion viewing (F(4,888) = 9.20; P ...) ... differences in neural response to face emotions in several areas (F(2,888) ≥ 13.45; all P ...) ... emotion dysregulation when very anxious and irritable youth process threat-related faces. Activation in the ventral visual circuitry suggests a mechanism ...

  20. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    Science.gov (United States)

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments with varying experience of humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, indicating an influence of the amount of exposure to humans. In addition, there was some evidence for the influence of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces. PMID:27074009

  1. Amygdala activation and its functional connectivity during perception of emotional faces in social phobia and panic disorder

    NARCIS (Netherlands)

    Demenescu, L.R.; Kortekaas, R.; Cremers, H.R.; Renken, R.J.; van Tol, M.J.; van der Wee, M.J.A.; Veltman, D.J.; den Boer, J.A.; Roelofs, K.; Aleman, A.

    Social phobia (SP) and panic disorder (PD) have been associated with aberrant amygdala responses to threat-related stimuli. The aim of the present study was to examine amygdala function and its connectivity with medial prefrontal cortex (mPFC) during emotional face perception in PD and SP, and the

  2. Association of Maternal Interaction with Emotional Regulation in 4 and 9 Month Infants During the Still Face Paradigm

    Science.gov (United States)

    Lowe, Jean R.; MacLean, Peggy C.; Duncan, Andrea F.; Aragón, Crystal; Schrader, Ronald M.; Caprihan, Arvind; Phillips, John P.

    2013-01-01

    This study used the Still Face Paradigm to investigate the influence of maternal interaction style on infants' emotion regulation responses. Seventy infant-mother dyads were seen at 4 months, and 25 of these same dyads were re-evaluated at 9 months. Maternal interactions were coded for attention seeking and contingent responding. Emotional regulation was described by infant stress reaction and overall positive affect. Results indicated that at both 4 and 9 months, mothers who used more contingent-responding interactions had infants who showed more positive affect. In contrast, mothers who used more attention-seeking play had infants who showed less positive affect after the Still Face Paradigm. Patterns of stress reaction were reversed, as mothers who used more attention-seeking play had infants with less negative affect. Implications for intervention and emotional regulation patterns over time are discussed. PMID:22217393

  3. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    Science.gov (United States)

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad, high, and low spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency, whereas healthy controls demonstrated significant broad-spatial-frequency-dependent processing in P100 amplitude. Vertex positive potential amplitude was significantly increased in high and broad spatial frequency, compared to low spatial frequency, in panic disorder. Early posterior negativity amplitude was significantly different between HSF and BSF, and between LSF and BSF processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that this unique spatial-frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Psychopathic traits are associated with reduced attention to the eyes of emotional faces among adult male non-offenders

    Directory of Open Access Journals (Sweden)

    Steven Mark Gillespie

    2015-10-01

    Full Text Available Psychopathic traits are linked with impairments in emotional facial expression recognition. These impairments may, in part, reflect reduced attention to the eyes of emotional faces. Although reduced attention to the eyes has been noted among children with conduct problems and callous-unemotional traits, similar findings are yet to be found in relation to psychopathic traits among adult male participants. Here we investigated the relationship of primary (selfish, uncaring) and secondary (impulsive, antisocial) psychopathic traits with attention to the eyes among adult male non-offenders during an emotion recognition task. We measured the number of fixations, and overall dwell time, on the eyes and the mouth of male and female faces showing the six basic emotions at varying levels of intensity. We found no relationship of primary or secondary psychopathic traits with recognition accuracy. However, primary psychopathic traits were associated with a reduced number of fixations, and lower overall dwell time, on the eyes relative to the mouth across expressions, intensity, and sex. Furthermore, the relationship of primary psychopathic traits with attention to the eyes of angry and fearful faces was influenced by the sex and intensity of the expression. We also showed that a greater number of fixations on the eyes, relative to the mouth, was associated with increased accuracy for angry and fearful expression recognition. These results are the first to show effects of psychopathic traits on attention to the eyes of emotional faces in an adult male sample, and may support amygdala-based accounts of psychopathy. These findings may also have methodological implications for clinical studies of emotion recognition.

  5. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents

    Directory of Open Access Journals (Sweden)

    Bianca G. van den Bulk

    2016-10-01

    Full Text Available Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N = 25), adolescents with CSA-related PTSD (N = 19) and healthy controls (N = 26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD groups reported more anxiety to fearful and neutral faces than adolescents from the control group, and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala.

  6. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents.

    Science.gov (United States)

    van den Bulk, Bianca G; Somerville, Leah H; van Hoof, Marie-José; van Lang, Natasja D J; van der Wee, Nic J A; Crone, Eveline A; Vermeiren, Robert R J M

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N=25), adolescents with CSA-related PTSD (N=19) and healthy controls (N=26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD group reported more anxiety to fearful and neutral faces than adolescents from the control group and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  7. Do bodily expressions compete with facial expressions? Time course of integration of emotional signals from the face and the body.

    Science.gov (United States)

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes.
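The ERP components analyzed in this record (P1, N2, P3) are conventionally quantified as the mean voltage of the averaged epoch within a latency window. A minimal sketch, assuming a hypothetical single-channel epoch and illustrative window boundaries (the actual windows and electrodes are study-specific):

```python
import numpy as np

def component_amplitude(epoch_uv, times_ms, window_ms):
    """Mean amplitude (microvolts) of an ERP epoch within a latency window."""
    t = np.asarray(times_ms)
    lo, hi = window_ms
    mask = (t >= lo) & (t <= hi)
    return float(np.asarray(epoch_uv, dtype=float)[mask].mean())

# Hypothetical epoch sampled every 4 ms from stimulus onset (0-796 ms),
# with a simulated positive deflection around a typical P1 latency.
times = np.arange(0, 800, 4)
epoch = np.zeros(times.shape)
epoch[(times >= 80) & (times <= 130)] = 3.0

p1_amplitude = component_amplitude(epoch, times, (80, 130))
n2_amplitude = component_amplitude(epoch, times, (200, 300))
```

Condition effects such as "fearful bodies elicited larger P3s" are then tested by comparing these window means across conditions and participants.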

  8. Face and emotion recognition deficits in Turner syndrome: a possible role for X-linked genes in amygdala development.

    Science.gov (United States)

    Lawrence, Kate; Kuntsi, Jonna; Coleman, Michael; Campbell, Ruth; Skuse, David

    2003-01-01

    Face recognition is thought to rely on configural visual processing. Where face recognition impairments have been identified, qualitatively delayed or anomalous configural processing has also been found. A group of women with Turner syndrome (TS) with monosomy for a single maternal X chromosome (45, Xm) showed an impairment in face recognition skills compared with normally developing women. However, normal configural face-processing abilities were apparent. The ability to recognize facial expressions of emotion, particularly fear, was also impaired in this TS subgroup. Face recognition and fear recognition accuracy were significantly correlated in the female control group but not in women with TS. The authors therefore suggest that anomalies in amygdala function may be a neurological feature of TS of this karyotype.

  9. Emotion

    Science.gov (United States)

    Choi, Sukwoo

    It was long accepted that emotions such as fear, anger and pleasure could not be studied with modern scientific tools. In the early period of emotion research, psychologists, not biologists, dominated the study of emotion and its disorders. Intuitively, one might think that emotion arises in the brain first and that bodily responses follow: we are sad first, and then we cry. However, a group of psychologists proposed that feeling instead follows bodily responses; that is, we feel sad because we cry. This proposal seems counterintuitive but became a popular hypothesis of emotion. Another illustration runs as follows. If you suddenly confront a large bear on a mountain, what are your responses? You might feel terrified first and then run, or you might run first and feel terrified afterwards. According to this hypothesis, the latter account is correct: you feel fear after, and even because, you run. Or imagine a date with someone you love deeply: your heart beats fast and your body temperature rises. If you then take a very cold bath, that "hot" feeling usually calms down; that is, you feel hot because your heart rate and body temperature have changed. While some evidence supports this hypothesis, other evidence does not. Patients whose spinal cords were severed at the cervical level in accidents still, in some cases, retained a significant amount of emotional feeling (although other patients lost most emotional experience). In addition, if the hypothesis were correct, one would expect a specific set of bodily responses for each specific emotion (e.g. a racing heart and reddened face for anger). However, some psychologists failed to find any such emotion-specific set of bodily responses, though others insisted that such specific responses exist. Based on these controversial

  10. Effects of Acute Alcohol Consumption on the Processing of Emotion in Faces: Implications for Understanding Alcohol-Related Aggression

    Science.gov (United States)

    Attwood, Angela S.; Munafò, Marcus R.

    2016-01-01

    The negative consequences of chronic alcohol abuse are well known, but heavy episodic consumption ("binge drinking") is also associated with significant personal and societal harms. Aggressive tendencies are increased after alcohol but the mechanisms underlying these changes are not fully understood. While effects on behavioural control are likely to be important, other effects may be involved given the widespread action of alcohol. Altered processing of social signals is associated with changes in social behaviours, including aggression, but until recently there has been little research investigating the effects of acute alcohol consumption on these outcomes. Recent work investigating the effects of acute alcohol on emotional face processing has suggested reduced sensitivity to submissive signals (sad faces) and increased perceptual bias towards provocative signals (angry faces) after alcohol consumption, which may play a role in alcohol-related aggression. Here we discuss a putative mechanism that may explain how alcohol consumption influences emotional processing and subsequent aggressive responding, via disruption of OFC-amygdala connectivity. While the importance of emotional processing on social behaviours is well established, research into acute alcohol consumption and emotional processing is still in its infancy. Further research is needed and we outline a research agenda to address gaps in the literature. PMID:24920135

  11. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    Science.gov (United States)

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive

  12. Infants’ Temperament and Mothers’, and Fathers’ Depression Predict Infants’ Attention to Objects Paired with Emotional Faces

    NARCIS (Netherlands)

    Aktar, E.; Mandell, D.J.; de Vente, W.; Majdandžić, M.; Raijmakers, M.E.J.; Bögels, S.M.

    2016-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze

  13. Preschooler's Faces in Spontaneous Emotional Contexts--How Well Do They Match Adult Facial Expression Prototypes?

    Science.gov (United States)

    Gaspar, Augusta; Esteves, Francisco G.

    2012-01-01

    Prototypical facial expressions of emotion, also known as universal facial expressions, are the underpinnings of most research concerning recognition of emotions in both adults and children. Data on natural occurrences of these prototypes in natural emotional contexts are rare and difficult to obtain in adults. By recording naturalistic…

  14. 3D Face Model Dataset: Automatic Detection of Facial Expressions and Emotions for Educational Environments

    Science.gov (United States)

    Chickerur, Satyadhyan; Joshi, Kartik

    2015-01-01

    Emotion detection using facial images is a technique that researchers have been using for the last two decades to try to analyze a person's emotional state given his/her image. Detection of various kinds of emotion using facial expressions of students in educational environment is useful in providing insight into the effectiveness of tutoring…

  15. P2-27: Electrophysiological Correlates of Conscious and Unconscious Processing of Emotional Faces in Individuals with High and Low Autistic Traits

    Directory of Open Access Journals (Sweden)

    Svjetlana Vukusic

    2012-10-01

    LeDoux (1996, The Emotional Brain) suggested that subconscious presentation of fearful emotional information is relayed to the amygdala along a rapid subcortical route. Rapid emotion processing is important because it alerts other parts of the brain to emotionally salient information. It also produces immediate reflexive responses to threatening stimuli, in contrast to slower conscious appraisal, and is therefore of important adaptive survival value. Current theoretical models of autism spectrum disorders (ASD) have linked impairments in the processing of emotional information to amygdala dysfunction. The impairment in face processing found in autism may thus be the result of impaired rapid subconscious processing of emotional information, which fails to make faces socially salient. Previous studies have examined subconscious processing of emotional stimuli with backward masking paradigms, using very brief presentations of emotional face stimuli followed by a mask. We conducted an event-related potential (ERP) study within a backward masking paradigm with subjects with low and high autistic tendencies as measured by the Autism Spectrum Quotient (AQ) questionnaire. The time course of processing of fearful and happy facial expressions and an emotionally neutral face was investigated during subliminal (16 ms) and supraliminal (166 ms) stimulus presentation. The task consisted of explicit categorization of emotional and neutral faces. We examined the ERP components N2, P3a, and also N170 for differences between subjects with low (<19) and high (>19) AQ scores.

  16. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    Science.gov (United States)

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  17. Effects on automatic attention due to exposure to pictures of emotional faces while performing Chinese word judgment tasks.

    Science.gov (United States)

    Junhong, Huang; Renlai, Zhou; Senqi, Hu

    2013-01-01

    Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low or high demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that the reaction time was slower and the performance accuracy was higher while performing the low cognitive load task than while performing the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early component of ERP around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites was significantly larger than those elicited by unattended neutral faces while performing the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attention resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces could automatically capture attention if residues of attention resources were available under the unattended condition.

  18. Attentional capture by emotional faces is contingent on attentional control settings

    DEFF Research Database (Denmark)

    Barratt, D.; Bundesen, Claus

    2012-01-01

    faster and attract more processing resources), responses to positive faces were slower when these were flanked by (response incompatible) negative faces as compared with positive or neutral faces, whereas responses to negative faces were unaffected by the identity of the flankers. Experiment 2...

  19. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    Science.gov (United States)

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., the number of go trials) and the valence (i.e., happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and in associated activation for correct no-go trials in the inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, the temporoparietal junction, the superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of responses to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. © 2008 Wiley-Liss, Inc.
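The "parallel quadratic trends" reported in this record, performance and no-go activation varying nonlinearly with the number of preceding go trials, can be estimated with an ordinary second-order polynomial fit. A sketch with hypothetical accuracy values (illustrative only, not the study's data):

```python
import numpy as np

def quadratic_trend(preceding_go_counts, values):
    """Fit values as a quadratic function of preceding go-trial count.

    Returns (quadratic, linear, intercept) least-squares coefficients;
    the sign of the quadratic term gives the direction of curvature.
    """
    a, b, c = np.polyfit(preceding_go_counts, values, 2)
    return float(a), float(b), float(c)

# Hypothetical no-go accuracy that dips and recovers as context lengthens:
go_counts = [1, 2, 3, 4, 5]
accuracy = [0.95, 0.88, 0.85, 0.88, 0.95]
quad, lin, intercept = quadratic_trend(go_counts, accuracy)
# quad > 0 here: an upward-curving (U-shaped) trend
```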

  20. The ties to unbind: Age-related differences in feature (un)binding in working memory for emotional faces

    Directory of Open Access Journals (Sweden)

    Didem Pehlivanoglu

    2014-04-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyperbinding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face's emotional expression (expression condition), or both identity and expression of the face (binding condition). Both age groups performed more slowly and with lower accuracy in the expression condition than in the binding condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory, over and beyond age-related differences observed in perceptual processing (0-Back) and attention/short-term memory (1-Back). Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task and were further magnified in the 2-Back task, indicating independent age-related differences in attention/short-term memory and working memory. Pupil dilation data confirmed that the attention/short-term memory version of the task (1-Back) was more effortful for older adults than for younger adults.

  1. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    Science.gov (United States)

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body patch-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  2. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    Directory of Open Access Journals (Sweden)

    Rossana Actis-Grosso

    2015-10-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or conveyed by moving bodies (patch-light displays, PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to recognize: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better in static faces and fear better in PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  3. What you want to avoid is what you see: Social avoidance motivation affects the interpretation of emotional faces

    OpenAIRE

    Nikitin, Jana; Freund, Alexandra M

    2015-01-01

    This study investigated the effects of habitual social approach and avoidance motivation on the classification of facial expressions of different visual clarity. Participants (N = 78) categorized partially masked emotional faces expressing either anger or happiness as positive or negative. Participants generally tended to interpret the facial expressions in a positive way. This positivity effect was reduced when persons were highly avoidance motivated. Social avoidance motivation predicted fe...

  4. Measuring Emotions in Marketing and Consumer Behavior : Is Face Reader an applicable tool?

    OpenAIRE

    Drozdova, Natalia

    2014-01-01

    This thesis investigates the topic of measuring emotions in marketing and consumer research. An overview of existing implicit and explicit methods of measuring emotions is presented in the thesis, followed by a literature review of methods used in empirical research during the last decade. The last part of the thesis focuses on automatic facial expression analysis as a tool for measuring emotional responses. A pilot study conducted by the Center of Service Innovations in the Norwegian School ...

  5. Infants' Temperament and Mothers', and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces

    OpenAIRE

    Aktar, Evin; Mandell, Dorothy J.; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E. J.; Bögels, Susan M.

    2015-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants’ attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations ...

  6. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    Science.gov (United States)

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature-selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, at least as indexed by the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. BESST (Bochum Emotional Stimulus Set)--a pilot validation study of a stimulus set containing emotional bodies and faces from frontal and averted views.

    Science.gov (United States)

    Thoma, Patrizia; Soria Bauser, Denise; Suchan, Boris

    2013-08-30

    This article introduces the freely available Bochum Emotional Stimulus Set (BESST), which contains pictures of bodies and faces depicting either a neutral expression or one of the six basic emotions (happiness, sadness, fear, anger, disgust, and surprise), presented from two different perspectives (0° frontal view vs. camera averted by 45° to the left). The set comprises 565 frontal view and 564 averted view pictures of real-life bodies with masked facial expressions and 560 frontal and 560 averted view faces which were synthetically created using the FaceGen 3.5 Modeller. All stimuli were validated in terms of categorization accuracy and the perceived naturalness of the expression. Additionally, each facial stimulus was morphed into three age versions (20/40/60 years). The results show high recognition of the intended facial expressions, even under speeded forced-choice conditions, as corresponds to common experimental settings. The average naturalness ratings for the stimuli range between medium and high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. Electrocortical Reactivity to Emotional Faces in Young Children and Associations with Maternal and Paternal Depression

    Science.gov (United States)

    Kujawa, Autumn; Hajcak, Greg; Torpey, Dana; Kim, Jiyon; Klein, Daniel N.

    2012-01-01

    Background: The late positive potential (LPP) is an event-related potential component that indexes selective attention toward motivationally salient information and is sensitive to emotional stimuli. Few studies have examined the LPP in children. Depression has been associated with reduced reactivity to negative and positive emotional stimuli,…

  9. Neural Reactivity to Emotional Faces May Mediate the Relationship between Childhood Empathy and Adolescent Prosocial Behavior

    Science.gov (United States)

    Flournoy, John C.; Pfeifer, Jennifer H.; Moore, William E.; Tackman, Allison M.; Masten, Carrie L.; Mazziotta, John C.; Iacoboni, Marco; Dapretto, Mirella

    2016-01-01

    Reactivity to others' emotions not only can result in empathic concern (EC), an important motivator of prosocial behavior, but can also result in personal distress (PD), which may hinder prosocial behavior. Examining neural substrates of emotional reactivity may elucidate how EC and PD differentially influence prosocial behavior. Participants…

  10. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    Science.gov (United States)

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting attention and maintenance of attention) in the processing of emotional faces. Compared to ND, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent a marginally less amount of time viewing happy faces compared with the ND group. No differences were found between the groups with respect to angry faces and orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages in the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.
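The maintenance-of-attention indices used in this eye-tracking study, first fixation duration and total fixation time on a face, reduce to simple aggregations over a trial's fixation sequence. A sketch assuming a hypothetical list of (area-of-interest, duration) fixations; the labels and numbers are illustrative:

```python
def maintenance_indices(fixations, aoi):
    """First fixation duration and total fixation time (ms) on an AOI.

    `fixations` is the ordered sequence of (aoi_label, duration_ms)
    fixations recorded during one trial.
    """
    durations = [d for label, d in fixations if label == aoi]
    first_fixation = durations[0] if durations else 0
    total_time = sum(durations)
    return first_fixation, total_time

# Hypothetical trial with a sad face and a neutral face on screen:
trial = [("sad", 250), ("neutral", 180), ("sad", 400)]
sad_first, sad_total = maintenance_indices(trial, "sad")  # 250, 650
```

A negative maintenance bias like the one reported for the MDD group would appear as larger first/total values for sad faces than for happy or neutral ones, averaged over trials.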

  11. Variations in the serotonin-transporter gene are associated with attention bias patterns to positive and negative emotion faces.

    Science.gov (United States)

    Pérez-Edgar, Koraly; Bar-Haim, Yair; McDermott, Jennifer Martin; Gorodetsky, Elena; Hodgkinson, Colin A; Goldman, David; Ernst, Monique; Pine, Daniel S; Fox, Nathan A

    2010-03-01

    Both attention biases to threat and a serotonin-transporter gene polymorphism (5-HTTLPR) have been linked to heightened neural activation to threat and the emergence of anxiety. The short allele of 5-HTTLPR may act via its effect on neurotransmitter availability, while attention biases shape broad patterns of cognitive processing. We examined individual differences in attention bias to emotion faces as a function of 5-HTTLPR genotype. Adolescents (N=117) were classified for presumed SLC6A4 expression based on 5-HTTLPR genotype: low (SS, SL(G), or L(G)L(G)), intermediate (SL(A) or L(A)L(G)), or high (L(A)L(A)). Participants completed the dot-probe task, measuring attention biases toward or away from angry and happy faces. Biases for angry faces increased with the genotype-predicted neurotransmission levels (low>intermediate>high). The reverse pattern was evident for happy faces. The data indicate a linear relation between 5-HTTLPR allelic status and attention biases to emotion, demonstrating a genetic mechanism for biased attention using ecologically valid stimuli that target socioemotional adaptation. Copyright 2009 Elsevier B.V. All rights reserved.
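The dot-probe bias score used in this record is conventionally the reaction-time difference between probe locations. An illustrative sketch, with made-up numbers rather than the study's data:

```python
# Illustrative sketch of the standard dot-probe bias score (numbers are
# made up, not the study's data): mean RT when the probe replaces the
# neutral face minus mean RT when it replaces the emotional face.
# Positive scores indicate attention toward the emotional face.

def attention_bias(rt_incongruent, rt_congruent):
    """Bias score in ms.

    rt_incongruent: RTs when the probe appeared at the neutral face's location.
    rt_congruent:   RTs when the probe appeared at the emotional face's location.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_incongruent) - mean(rt_congruent)

# Faster responses at the angry face's location -> bias toward threat.
print(attention_bias([520, 540, 530], [480, 500, 490]))  # -> 40.0
```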

  12. Memory for faces with emotional expressions in Alzheimer's disease and healthy older participants: positivity effect is not only due to familiarity.

    Science.gov (United States)

    Sava, Alina-Alexandra; Krolak-Salmon, Pierre; Delphin-Combe, Floriane; Cloarec, Morgane; Chainay, Hanna

    2017-01-01

    Young individuals better memorize initially seen faces with emotional rather than neutral expressions. Healthy older participants and Alzheimer's disease (AD) patients show better memory for faces with positive expressions. The socioemotional selectivity theory postulates that this positivity effect in memory reflects a general age-related preference for positive stimuli, subserving emotion regulation. Another explanation might be that older participants use compensatory strategies, often considering happy faces as previously seen. The question about the existence of this effect in tasks not permitting such compensatory strategies is still open. Thus, we compared the performance of healthy participants and AD patients for positive, neutral, and negative faces in such tasks. Healthy older participants and AD patients showed a positivity effect in memory, but there was no difference between emotional and neutral faces in young participants. Our results suggest that the positivity effect in memory is not entirely due to the sense of familiarity for smiling faces.

  13. Is fear in your head? A comparison of instructed and real-life expressions of emotion in the face and body.

    Science.gov (United States)

    Abramson, Lior; Marom, Inbal; Petranker, Rotem; Aviezer, Hillel

    2017-04-01

    The majority of emotion perception studies utilize instructed and stereotypical expressions of faces or bodies. While such stimuli are highly standardized and well-recognized, their resemblance to real-life expressions of emotion remains unknown. Here we examined facial and body expressions of fear and anger during real-life situations and compared their recognition to that of instructed expressions of the same emotions. In order to examine the source of the affective signal, expressions of emotion were presented as faces alone, bodies alone, and naturally, as faces with bodies. The results demonstrated striking deviations between recognition of instructed and real-life stimuli, which differed as a function of the emotion expressed. In real-life fearful expressions of emotion, bodies were far better recognized than faces, a pattern not found with instructed expressions of emotion. Anger reactions were better recognized from the body than from the face in both real-life and instructed stimuli. However, the real-life stimuli were overall better recognized than their instructed counterparts. These results indicate that differences between instructed and real-life expressions of emotion are prevalent and raise caution against an overreliance of researchers on instructed affective stimuli. The findings also demonstrate that in real life, facial expression perception may rely heavily on information from the contextualizing body. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Event-related brain responses to emotional words, pictures, and faces - a cross-domain comparison.

    Science.gov (United States)

    Bayer, Mareike; Schacht, Annekathrin

    2014-01-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal.

  15. Placing the face in context: cultural differences in the perception of facial emotion.

    Science.gov (United States)

    Masuda, Takahiko; Ellsworth, Phoebe C; Mesquita, Batja; Leu, Janxin; Tanida, Shigehito; Van de Veerdonk, Ellen

    2008-03-01

    Two studies tested the hypothesis that in judging people's emotions from their facial expressions, Japanese, more than Westerners, incorporate information from the social context. In Study 1, participants viewed cartoons depicting a happy, sad, angry, or neutral person surrounded by other people expressing the same emotion as the central person or a different one. The surrounding people's emotions influenced Japanese but not Westerners' perceptions of the central person. These differences reflect differences in attention, as indicated by eye-tracking data (Study 2): Japanese looked at the surrounding people more than did Westerners. Previous findings on East-West differences in contextual sensitivity generalize to social contexts, suggesting that Westerners see emotions as individual feelings, whereas Japanese see them as inseparable from the feelings of the group.

  16. Neural Reactivity to Emotional Faces Mediates the Relationship Between Childhood Empathy and Adolescent Prosocial Behavior

    Science.gov (United States)

    Flournoy, John C.; Pfeifer, Jennifer H.; Moore, William E.; Tackman, Allison; Masten, Carrie L.; Mazziotta, John C.; Iacoboni, Marco; Dapretto, Mirella

    2017-01-01

    Reactivity to others' emotions can result in empathic concern (EC), an important motivator of prosocial behavior, but can also result in personal distress (PD), which may hinder prosocial behavior. Examining neural substrates of emotional reactivity may elucidate how EC and PD differentially influence prosocial behavior. Participants (N=57) provided measures of EC, PD, prosocial behavior, and neural responses to emotional expressions at ages 10 and 13. Initial EC predicted subsequent prosocial behavior. Initial EC and PD predicted subsequent reactivity to emotions in the inferior frontal gyrus (IFG) and inferior parietal lobule, respectively. Activity in the IFG, a region linked to mirror neuron processes, as well as cognitive control and language, mediated the relation between initial EC and subsequent prosocial behavior. PMID:28262939
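The mediation claim in this record follows the standard product-of-coefficients logic: path a (predictor to mediator) times path b (mediator to outcome, controlling for the predictor). A minimal sketch with synthetic data, not the study's analysis pipeline:

```python
# Minimal sketch of simple-mediation logic (product of path coefficients),
# with synthetic data -- not the study's analysis pipeline. Path a:
# predictor -> mediator. Path b: mediator -> outcome, controlling for the
# predictor via Frisch-Waugh residualization.

def ols_slope(x, y):
    """Ordinary-least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

def residuals(x, y):
    """Residuals of y after removing its linear dependence on x."""
    s = ols_slope(x, y)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return [yi - (my + s * (xi - mx)) for xi, yi in zip(x, y)]

def indirect_effect(predictor, mediator, outcome):
    a = ols_slope(predictor, mediator)             # path a
    b = ols_slope(residuals(predictor, mediator),  # path b, with the
                  residuals(predictor, outcome))   # predictor partialled out
    return a * b

# Synthetic example: the outcome depends on the predictor only through the
# mediator, so the indirect effect equals the total effect (about 4.8 here).
print(indirect_effect([1, 2, 3, 4], [3, 3, 7, 7], [9, 9, 21, 21]))
```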

  17. Love withdrawal predicts electrocortical responses to emotional faces with performance feedback: a follow-up and extension.

    Science.gov (United States)

    Huffmeijer, Renske; Bakermans-Kranenburg, Marian J; Alink, Lenneke R A; van IJzendoorn, Marinus H

    2014-06-02

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. In addition, recent studies have begun to show that experiences of love withdrawal also relate to the neural processing of socio-emotional information relevant to a performance-relational consequence link, and can moderate effects of oxytocin on social information processing and behavior. The current study follows up on our previous results by attempting to confirm and extend previous findings indicating that experiences of maternal love withdrawal are related to electrocortical responses to emotional faces presented with performance feedback. More maternal love withdrawal was related to enhanced early processing of facial feedback stimuli (reflected in more positive VPP amplitudes, and confirming previous findings). However, attentional engagement with and processing of the stimuli at a later stage were diminished in those reporting higher maternal love withdrawal (reflected in less positive LPP amplitudes, and diverging from previous findings). Maternal love withdrawal affects the processing of emotional faces presented with performance feedback differently in different stages of neural processing.

  18. A framework for investigating the use of face features to identify spontaneous emotions

    OpenAIRE

    Bezerra, Giuliana Silva

    2014-01-01

    Emotion-based analysis has raised a lot of interest, particularly in areas such as forensics, medicine, music, psychology, and human-machine interfaces. Following this trend, the use of facial analysis (either automatic or human-based) is the most common subject to be investigated, since this type of data can easily be collected and is well accepted in the literature as a metric for inference of emotional states. Despite this popularity, due to several constraints found in real world scenarios (...

  19. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    Science.gov (United States)

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  20. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    OpenAIRE

    Rossana Actis-Grosso; Francesco Bossi; Paola Ricciardelli

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or c...

  1. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits

    OpenAIRE

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or ...

  2. Emotional Expression in Simple Line Drawings of a Robot's Face Leads to Higher Offers in the Ultimatum Game

    Directory of Open Access Journals (Sweden)

    Kazunori Terada

    2017-05-01

    In the present study, we investigated whether expressing emotional states using a simple line drawing to represent a robot's face can serve to elicit altruistic behavior from humans. An experimental investigation was conducted in which human participants interacted with a humanoid robot whose facial expression was shown on an LCD monitor that was mounted as its head (Study 1). Participants were asked to play the ultimatum game, which is usually used to measure human altruistic behavior. All participants were assigned to be the proposer and were instructed to decide their offer within 1 min by controlling a slider bar. The corners of the robot's mouth, as indicated by the line drawing, simply moved upward or downward depending on the position of the slider bar. The results suggest that the change in the facial expression depicted by a simple line drawing of a face significantly affected the participant's final offer in the ultimatum game. The offers were increased by 13% when subjects were shown contingent changes of facial expression. The results were compared with an experiment in a teleoperation setting in which participants interacted with another person through a computer display showing the same line drawings used in Study 1 (Study 2). The results showed that offers were 15% higher if participants were shown a contingent facial expression change. Together, Studies 1 and 2 indicate that emotional expression in simple line drawings of a robot's face elicits the same higher offer from humans as a human telepresence does.

  3. Emotional Expression in Simple Line Drawings of a Robot's Face Leads to Higher Offers in the Ultimatum Game.

    Science.gov (United States)

    Terada, Kazunori; Takeuchi, Chikara

    2017-01-01

    In the present study, we investigated whether expressing emotional states using a simple line drawing to represent a robot's face can serve to elicit altruistic behavior from humans. An experimental investigation was conducted in which human participants interacted with a humanoid robot whose facial expression was shown on an LCD monitor that was mounted as its head (Study 1). Participants were asked to play the ultimatum game, which is usually used to measure human altruistic behavior. All participants were assigned to be the proposer and were instructed to decide their offer within 1 min by controlling a slider bar. The corners of the robot's mouth, as indicated by the line drawing, simply moved upward or downward depending on the position of the slider bar. The results suggest that the change in the facial expression depicted by a simple line drawing of a face significantly affected the participant's final offer in the ultimatum game. The offers were increased by 13% when subjects were shown contingent changes of facial expression. The results were compared with an experiment in a teleoperation setting in which participants interacted with another person through a computer display showing the same line drawings used in Study 1 (Study 2). The results showed that offers were 15% higher if participants were shown a contingent facial expression change. Together, Studies 1 and 2 indicate that emotional expression in simple line drawings of a robot's face elicits the same higher offer from humans as a human telepresence does.
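The contingent slider-to-mouth mapping this record describes can be sketched as a simple linear function; the parameter names, pixel units, and offer range below are invented for illustration, not taken from the study's apparatus:

```python
# Toy sketch of the contingent expression mapping described above; the
# parameter names, pixel units, and offer range are invented for
# illustration, not taken from the study's apparatus.

def mouth_corner_y(offer, max_offer=1000, amplitude=10.0):
    """Vertical mouth-corner displacement (pixels) for a given slider offer.

    Offers above half the pot curve the mouth upward (smile, positive y);
    offers below half curve it downward (frown, negative y).
    """
    midpoint = max_offer / 2
    return amplitude * (offer - midpoint) / midpoint

print(mouth_corner_y(1000))  # -> 10.0 (full smile)
print(mouth_corner_y(500))   # -> 0.0 (neutral)
print(mouth_corner_y(0))     # -> -10.0 (full frown)
```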

  4. The Way Dogs (Canis familiaris) Look at Human Emotional Faces Is Modulated by Oxytocin. An Eye-Tracking Study

    Directory of Open Access Journals (Sweden)

    Anna Kis

    2017-10-01

    Dogs have been shown to excel in reading human social cues, including facial cues. In the present study we used eye-tracking technology to further study dogs’ face processing abilities. It was found that dogs discriminated between human facial regions in their spontaneous viewing pattern and looked most at the eye region independently of facial expression. Furthermore, dogs paid most attention to the first two images presented, after which their attention dramatically decreased; a finding that has methodological implications. Increasing evidence indicates that the oxytocin system is involved in dogs’ human-directed social competence, thus as a next step we investigated the effects of oxytocin on the processing of human facial emotions. It was found that oxytocin decreased dogs’ looking at human faces expressing an angry emotional expression. More interestingly, however, after oxytocin pre-treatment dogs’ preferential gaze toward the eye region when processing happy human facial expressions disappeared. These results provide the first evidence that oxytocin is involved in the regulation of human face processing in dogs. The present study is one of the few empirical investigations that explore eye gaze patterns in naïve and untrained pet dogs using a non-invasive eye-tracking technique, and thus offers a unique but largely untapped method for studying social cognition in dogs.

  5. How stable is activation in the amygdala and prefrontal cortex in adolescence? A study of emotional face processing across three measurements

    NARCIS (Netherlands)

    van den Bulk, B.G.; Koolschijn, P.C.M.P.; Meens, P.H.F.; van Lang, N.D.J.; van der Wee, N.J.A.; Rombouts, S.A.R.B.; Vermeiren, R.R.J.M.; Crone, E.A.

    2013-01-01

    Prior developmental functional magnetic resonance imaging (fMRI) studies have demonstrated elevated activation patterns in the amygdala and prefrontal cortex (PFC) in response to viewing emotional faces. As adolescence is a time of substantial variability in mood and emotional responsiveness, the

  6. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    Science.gov (United States)

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  7. Attentional Bias for Emotional Faces in Children with Generalized Anxiety Disorder

    Science.gov (United States)

    Waters, Allison M.; Mogg, Karin; Bradley, Brendan P.; Pine, Daniel S.

    2008-01-01

    Attentional bias for angry and happy faces in 7- to 12-year-old children with generalized anxiety disorder (GAD) is examined. Results suggest that an attentional bias toward threat faces depends on a certain degree of clinical severity and/or the type of anxiety diagnosis in children.

  8. Elevated responses to constant facial emotions in different faces in the human amygdala: an fMRI study of facial identity and expression

    Directory of Open Access Journals (Sweden)

    Weiller Cornelius

    2004-11-01

    Background: Human faces provide important signals in social interactions by conveying two main types of information, individual identity and emotional expression. The ability to readily assess both the variability and the consistency among emotional expressions in different individuals is central to one's own interpretation of the imminent environment. A factorial design was used to systematically test the interaction of either constant or variable emotional expressions with constant or variable facial identities in areas involved in face processing, using functional magnetic resonance imaging. Results: Previous studies suggest a predominant role of the amygdala in the assessment of emotional variability. Here we extend this view by showing that this structure activated to faces with changing identities that display constant emotional expressions. Within this condition, amygdala activation was dependent on the type and intensity of displayed emotion, with significant responses to fearful expressions and, to a lesser extent, to neutral and happy expressions. In contrast, the lateral fusiform gyrus showed a binary pattern of increased activation to changing stimulus features, while it was also differentially responsive to the intensity of displayed emotion when processing different facial identities. Conclusions: These results suggest that the amygdala might serve to detect constant facial emotions in different individuals, complementing its established role for detecting emotional variability.

  9. The role of emotion in dynamic audiovisual integration of faces and voices.

    Science.gov (United States)

    Kokinous, Jenny; Kotz, Sonja A; Tavano, Alessandro; Schröger, Erich

    2015-05-01

    We used human electroencephalogram to study early audiovisual integration of dynamic angry and neutral expressions. An auditory-only condition served as a baseline for the interpretation of integration effects. In the audiovisual conditions, the validity of visual information was manipulated using facial expressions that were either emotionally congruent or incongruent with the vocal expressions. First, we report an N1 suppression effect for angry compared with neutral vocalizations in the auditory-only condition. Second, we confirm early integration of congruent visual and auditory information as indexed by a suppression of the auditory N1 and P2 components in the audiovisual compared with the auditory-only condition. Third, audiovisual N1 suppression was modulated by audiovisual congruency in interaction with emotion: for neutral vocalizations, there was N1 suppression in both the congruent and the incongruent audiovisual conditions. For angry vocalizations, there was N1 suppression only in the congruent but not in the incongruent condition. Extending previous findings of dynamic audiovisual integration, the current results suggest that audiovisual N1 suppression is congruency- and emotion-specific and indicate that dynamic emotional expressions compared with non-emotional expressions are preferentially processed in early audiovisual integration. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  10. Face processing in chronic alcoholism: a specific deficit for emotional features.

    Science.gov (United States)

    Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P

    2008-04-01

    It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific to emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) basic visuo-spatial and facial identity processing; (2) simple reaction times; and (3) complex facial feature identification (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients showed preserved performance for visuo-spatial and facial identity processing, but their performance was impaired for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting that this deficit is specific to emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.

  11. The effect of emotionally valenced eye region images on visuocortical processing of surprised faces.

    Science.gov (United States)

    Li, Shuaixia; Li, Ping; Wang, Wei; Zhu, Xiangru; Luo, Wenbo

    2018-05-01

    In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. In prime-related ERPs, happy eyes elicited a larger P1 than neutral and fearful eyes, likely due to the recognition advantage provided by a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited a dramatically larger C1 than those in the neutral context, which reflected modulation by predictions during the earliest stages of face processing. There was a larger N170 with neutral and fearful eye contexts than with the happy context, suggesting faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating that motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at the early stages and elaborate processing at the late stages. Moreover, higher cognitive processes such as predictions and attention can modulate face processing from the earliest stages in a top-down manner. © 2017 Society for Psychophysiological Research.

  12. Faces of Shame: Implications for Self-Esteem, Emotion Regulation, Aggression, and Well-Being.

    Science.gov (United States)

    Velotti, Patrizia; Garofalo, Carlo; Bottazzi, Federica; Caretti, Vincenzo

    2017-02-17

    There is an increasing interest in psychological research on shame experiences and their associations with other aspects of psychological functioning and well-being, as well as with possible maladaptive outcomes. In an attempt to confirm and extend previous knowledge on this topic, we investigated the nomological network of shame experiences in a large community sample (N = 380; 66.1% females), adopting a multidimensional conceptualization of shame. Females reported higher levels of shame (in particular, bodily and behavioral shame), guilt, psychological distress, emotional reappraisal, and hostility. Males had higher levels of self-esteem, emotional suppression, and physical aggression. Shame feelings were associated with low self-esteem, hostility, and psychological distress in a consistent way across gender. Associations between characterological shame and emotional suppression, as well as between bodily shame and anger occurred only among females. Moreover, characterological and bodily shame added to the prediction of low self-esteem, hostility, and psychological distress above and beyond the influence of trait shame. Finally, among females, emotional suppression mediated the influence of characterological shame on hostility and psychological distress. These findings extend current knowledge on the nomological net surrounding shame experiences in everyday life, supporting the added value of a multidimensional conceptualization of shame feelings.

  13. Modulation of Attentional Blink with Emotional Faces in Typical Development and in Autism Spectrum Disorders

    Science.gov (United States)

    Yerys, Benjamin E.; Ruiz, Ericka; Strang, John; Sokoloff, Jennifer; Kenworthy, Lauren; Vaidya, Chandan J.

    2013-01-01

    Background: The attentional blink (AB) phenomenon was used to assess the effect of emotional information on early visual attention in typically developing (TD) children and children with autism spectrum disorders (ASD). The AB effect is the momentary perceptual unawareness that follows target identification in a rapid serial visual processing…

  14. Recognizing emotions in faces : effects of acute tryptophan depletion and bright light

    NARCIS (Netherlands)

    aan het Rot, Marije; Coupland, Nicholas; Boivin, Diane B.; Benkelfat, Chawki; Young, Simon N.

    2010-01-01

    In healthy never-depressed individuals, acute tryptophan depletion (ATD) may selectively decrease the accurate recognition of fearful facial expressions. Here we investigated the perception of facial emotions after ATD in more detail. We also investigated whether bright light, which can reverse

  15. Processing of masked and unmasked emotional faces under different attentional conditions: an electrophysiological investigation.

    Directory of Open Access Journals (Sweden)

    Marzia Del Zotto

    2015-10-01

    In order to investigate the interactions between non-spatial selective attention, awareness, and emotion processing, we carried out an ERP study using a backward masking paradigm, in which angry, fearful, happy, and neutral facial expressions were presented while participants attempted to detect the presence of one or the other category of facial expressions in the different experimental blocks. ERP results showed that negative emotions enhanced an early N170 response over temporal-occipital leads in both masked and unmasked conditions, independently of selective attention. A later effect arising at the P2 was linked to awareness. Finally, selective attention was found to affect the N2 and N3 components over occipito-parietal leads. Our findings reveal that (i) the initial processing of facial expressions arises prior to attention and awareness; (ii) attention and awareness give rise to temporally distinct periods of activation independently of the type of emotion, with only a partial degree of overlap; and (iii) selective attention appears to be influenced by the emotional nature of the stimuli, which in turn impinges on unconscious processing at a very early stage. This study confirms previous reports that negative facial expressions can be processed rapidly, in the absence of visual awareness and independently of selective attention. On the other hand, attention and awareness may operate in a synergistic way, depending on task demand.

  16. A dimensional approach to determine common and specific neurofunctional markers for depression and social anxiety during emotional face processing.

    Science.gov (United States)

    Luo, Lizhu; Becker, Benjamin; Zheng, Xiaoxiao; Zhao, Zhiying; Xu, Xiaolei; Zhou, Feng; Wang, Jiaojian; Kou, Juan; Dai, Jing; Kendrick, Keith M

    2018-02-01

    Major depressive disorder (MDD) and anxiety disorders are both prevalent and debilitating. High rates of comorbidity between MDD and social anxiety disorder (SAD) suggest common pathological pathways, including aberrant neural processing of interpersonal signals. In patient populations, the determination of common and distinct neurofunctional markers of MDD and SAD is often hampered by confounding factors, such as generally elevated anxiety levels and disorder-specific brain structural alterations. This study employed a dimensional disorder approach to map neurofunctional markers associated with levels of depression and social anxiety symptoms in a cohort of 91 healthy subjects using an emotional face processing paradigm. Examining linear associations between levels of depression and social anxiety, while controlling for trait anxiety, revealed that both were associated with exaggerated dorsal striatal reactivity to fearful and sad expression faces, respectively. Exploratory analysis revealed that depression scores were positively correlated with dorsal striatal functional connectivity during processing of fearful faces, whereas those of social anxiety showed a negative association during processing of sad faces. No linear relationships between levels of depression and social anxiety were observed during a facial-identity matching task or with brain structure. Together, the present findings indicate that dorsal striatal neurofunctional alterations might underlie aberrant interpersonal processing associated with both increased levels of depression and social anxiety. © 2017 Wiley Periodicals, Inc.

  17. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Directory of Open Access Journals (Sweden)

    Letizia Palumbo

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases, but in the opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  18. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Science.gov (United States)

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases, but in the opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  19. vMMN for schematic faces: automatic detection of change in emotional expression

    Directory of Open Access Journals (Sweden)

    Kairi eKreegipuu

    2013-10-01

    Our brain is able to automatically detect changes in sensory stimulation, including in vision. A large variety of feature changes in stimulation elicit a deviance-reflecting ERP component known as the mismatch negativity (MMN). The present study has three main goals: (1) to register the vMMN using a rapidly presented stream of schematic faces (neutral, happy, angry; adapted from Öhman et al., 2001); (2) to compare the vMMNs elicited by angry and happy schematic faces in two different paradigms, a traditional oddball design with frequent standard and rare target and deviant stimuli (12.5% each) and a version of an optimal multi-feature paradigm with several deviant stimuli (altogether 37.5% in the stimulus block); (3) to compare vMMNs to subjective ratings of valence, arousal and attention capture for happy and angry schematic faces, i.e., to estimate the effect of the affective value of stimuli on their automatic detection. Eleven observers (19-32 years, 6 women) took part in both experiments, an oddball and an optimum paradigm. Stimuli were rapidly presented schematic faces and an object with face features that served as the target stimulus, to be detected by a button press. Results show that a vMMN-type response at posterior sites was equally elicited in both experiments. Post-experimental reports confirmed that the angry face attracted more automatic attention than the happy face, but the difference did not emerge directly at the ERP level. Thus, when interested in studying change detection in facial expressions, we encourage the use of the optimum (multi-feature) design in order to save time and other experimental resources.

  20. It Is Not Just in Faces! Processing of Emotion and Intention from Biological Motion in Psychiatric Disorders

    Directory of Open Access Journals (Sweden)

    Łukasz Okruszek

    2018-02-01

    Social neuroscience offers a wide range of techniques that may be applied to study the social cognitive deficits that may underlie reduced social functioning—a common feature across many psychiatric disorders. At the same time, a significant proportion of research in this area has been conducted using paradigms that utilize static displays of faces or eyes. The use of point-light displays (PLDs) offers a viable alternative for studying recognition of emotion or intention inference while minimizing the amount of information presented to participants. This mini-review aims to summarize studies that have used PLDs to study emotion and intention processing in schizophrenia (SCZ), affective disorders, anxiety and personality disorders, eating disorders and neurodegenerative disorders. Two main conclusions can be drawn from the reviewed studies: First, the social cognitive problems found in most of the psychiatric samples using PLDs were of smaller magnitude than those found in studies presenting social information using faces or voices. Second, even though the information presented in PLDs is extremely limited, presentation of these types of stimuli is sufficient to elicit the disorder-specific social cognitive biases (e.g., mood-congruent bias in depression, increased threat perception in anxious individuals, aberrant body size perception in eating disorders) documented using other methodologies. Taken together, these findings suggest that point-light stimuli may be a useful method of studying social information processing in psychiatry. At the same time, some limitations of using this methodology are also outlined.

  1. Fusiform gyrus dysfunction is associated with perceptual processing efficiency to emotional faces in adolescent depression: a model-based approach

    Directory of Open Access Journals (Sweden)

    Tiffany Cheing Ho

    2016-02-01

    While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what the underlying neural correlates are. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision-making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., the drift rate parameter of the LBA), the MDD group showed significantly reduced responses in the left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.
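
The LBA model described above can be summarized in a few lines: each response option is a linear accumulator that races toward a threshold, and the drift rate captures perceptual processing efficiency. The following is an illustrative simulation only, not the authors' code; all parameter values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def lba_trial(drifts, A=0.5, b=1.0, t0=0.2, s=0.3):
    """Simulate one Linear Ballistic Accumulator trial.

    drifts : mean drift rates, one per response accumulator
             (higher drift = more efficient perceptual processing)
    A      : upper bound of the uniform start-point distribution
    b      : response threshold (b > A)
    t0     : non-decision time (stimulus encoding + motor response)
    s      : between-trial standard deviation of the drift rate
    """
    starts = rng.uniform(0, A, size=len(drifts))   # random head starts
    rates = rng.normal(drifts, s)                  # trial-to-trial drift noise
    rates = np.where(rates > 0, rates, 1e-6)       # keep accumulation positive
    times = (b - starts) / rates                   # time for each unit to reach b
    winner = int(np.argmin(times))                 # fastest accumulator responds
    return winner, t0 + times[winner]

# e.g. two accumulators ("happy" vs. "sad"), with the first better supported
choice, rt = lba_trial(drifts=[1.2, 0.8])
print(choice, round(rt, 3))
```

In hierarchical Bayesian fitting, the parameters above are estimated per participant under group-level priors rather than simulated, but the generative story is the same race.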

  2. Shall I call, text, post it online or just tell it face-to-face? How and why Flemish adolescents choose to share their emotions on- or offline

    OpenAIRE

    Vermeulen, Anne; Vandebosch, Heidi; Heirman, Wannes

    2017-01-01

    Abstract: Social sharing of emotions is a frequently used emotion regulation strategy. This study adds to the emotion regulation literature and the affordances-of-technologies perspective by providing a better understanding of with whom adolescents share emotions on- and offline, how they do this and why they use certain modes. In-depth interviews with 22 Flemish adolescents (aged 14-18) show that these youngsters share almost all experienced emotions, often with multiple recipients and using ...

  3. Social Anxiety Under Load: The Effects of Perceptual Load in Processing Emotional Faces

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Soares

    2015-04-01

    Previous studies in the social anxiety arena have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on social anxiety in socially threatening stimuli, such as angry faces, remains unseen. In the present study, fifty-four university students scoring high and low on the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire participated in a target letter discrimination task while task-irrelevant face stimuli (angry, disgust, happy, and neutral) were simultaneously presented. The results showed that high (compared to low) socially anxious individuals were more prone to distraction by task-irrelevant stimuli, particularly under high perceptual load conditions. More importantly, for such individuals, the accuracy proportions for angry faces significantly differed between the low and high perceptual load conditions, which is discussed in light of current evolutionary models of social anxiety.

  4. Social anxiety under load: the effects of perceptual load in processing emotional faces.

    Science.gov (United States)

    Soares, Sandra C; Rocha, Marta; Neiva, Tiago; Rodrigues, Paulo; Silva, Carlos F

    2015-01-01

    Previous studies in the social anxiety arena have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on social anxiety in socially threatening stimuli, such as angry faces, remains unseen. In the present study, 54 university students scoring high and low in the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire, participated in a target letter discrimination task while task-irrelevant face stimuli (angry, disgust, happy, and neutral) were simultaneously presented. The results showed that high (compared to low) socially anxious individuals were more prone to distraction by task-irrelevant stimuli, particularly under high perceptual load conditions. More importantly, for such individuals, the accuracy proportions for angry faces significantly differed between the low and high perceptual load conditions, which is discussed in light of current evolutionary models of social anxiety.

  5. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann Anja

    2014-05-01

    Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance, when watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating. For an EEG, for instance, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex. Scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire study participants. In addition, the results of such analyses are intuitive and easy to interpret even for laypeople. These convincing advantages led GfK to opt for facial analysis and to develop a tool suitable for measuring emotional responses to marketing stimuli, making it easily applicable in marketing research practice.

  6. Building Biases in Infancy: The Influence of Race on Face and Voice Emotion Matching

    Science.gov (United States)

    Vogel, Margaret; Monesson, Alexandra; Scott, Lisa S.

    2012-01-01

    Early in the first year of life infants exhibit equivalent performance distinguishing among people within their own race and within other races. However, with development and experience, their face recognition skills become tuned to groups of people they interact with the most. This developmental tuning is hypothesized to be the origin of adult…

  7. Attention Bias to Emotional Faces Varies by IQ and Anxiety in Williams Syndrome

    Science.gov (United States)

    McGrath, Lauren M.; Oates, Joyce M.; Dai, Yael G.; Dodd, Helen F.; Waxler, Jessica; Clements, Caitlin C.; Weill, Sydney; Hoffnagle, Alison; Anderson, Erin; MacRae, Rebecca; Mullett, Jennifer; McDougle, Christopher J.; Pober, Barbara R.; Smoller, Jordan W.

    2016-01-01

    Individuals with Williams syndrome (WS) often experience significant anxiety. A promising approach to anxiety intervention has emerged from cognitive studies of attention bias to threat. To investigate the utility of this intervention in WS, this study examined attention bias to happy and angry faces in individuals with WS (N = 46). Results showed…

  8. Face Emotion Processing in Depressed Children and Adolescents with and without Comorbid Conduct Disorder

    Science.gov (United States)

    Schepman, Karen; Taylor, Eric; Collishaw, Stephan; Fombonne, Eric

    2012-01-01

    Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23),…

  9. Priming Facial Gender and Emotional Valence: The Influence of Spatial Frequency on Face Perception in ASD

    Science.gov (United States)

    Vanmarcke, Steven; Wagemans, Johan

    2017-01-01

    Adolescents with and without autism spectrum disorder (ASD) performed two priming experiments in which they implicitly processed a prime stimulus, containing high and/or low spatial frequency information, and then explicitly categorized a target face either as male/female (gender task) or as positive/negative (Valence task). Adolescents with ASD…

  10. Hemispheric contributions to the processing of emotion in chimeric faces : behavioural and electrophysiological evidence

    OpenAIRE

    Geiger, Anja

    2005-01-01

    The face in general and facial expressions in particular have always been a focus of sociological, biological and psychological interest, with numerous different scientific approaches and objectives, investigating the expression and perception of facial affect or the social interaction of transmitting facial information between poser and perceiver. This study, dealing with two different aspects of facial expression, namely intensity and efficiency of facial expression, focuses on hemispheric c...

  11. Social anxiety under load: the effects of perceptual load in processing emotional faces

    OpenAIRE

    Soares, Sandra C.; Rocha, Marta; Neiva, Tiago; Rodrigues, Paulo; Silva, Carlos F.

    2015-01-01

    Previous studies in the social anxiety arena have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on social anxiety in socially threatening stimuli, such as angry faces, remains unseen. In the present study, 54 university students scoring high and low in the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire, participated in a target letter discrimination task while task-irrelevant fac...

  12. The telltale face: possible mechanisms behind defector and cooperator recognition revealed by emotional facial expression metrics.

    Science.gov (United States)

    Kovács-Bálint, Zsófia; Bereczkei, Tamás; Hernádi, István

    2013-11-01

    In this study, we investigated the role of facial cues in cooperator and defector recognition. First, a face image database was constructed from pairs of full face portraits of target subjects taken at the moment of decision-making in a prisoner's dilemma game (PDG) and in a preceding neutral task. Image pairs with no deficiencies (n = 67) were standardized for orientation and luminance. Then, confidence in defector and cooperator recognition was tested with image rating in a different group of lay judges (n = 62). Results indicate that (1) defectors were better recognized (58% vs. 47%), (2) they looked different from cooperators, (3) ratings were biased towards the cooperator category (p < .01), and (4) females were more confident in detecting defectors (p < .05). According to facial microexpression analysis, defection was strongly linked with depressed lower lips and less opened eyes. A significant correlation was found between the intensity of micromimics and the rating of images on the cooperator-defector dimension. In summary, facial expressions can be considered reliable indicators of momentary social dispositions in the PDG. Females may exhibit an evolutionarily based overestimation bias in detecting social visual cues of the defector face. © 2012 The British Psychological Society.
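
Recognition accuracies like the 58% vs. 47% reported above are commonly converted into signal-detection measures that separate sensitivity from response bias. The sketch below shows one standard computation; the 58% hit rate comes from the abstract, while the 42% false-alarm rate is an assumed value for illustration only.

```python
from statistics import NormalDist

def dprime(hit_rate, fa_rate):
    """Signal-detection sensitivity (d') and criterion (c).

    hit_rate : proportion of defector faces correctly judged "defector"
    fa_rate  : proportion of cooperator faces wrongly judged "defector"
    """
    z = NormalDist().inv_cdf          # inverse standard-normal CDF
    d = z(hit_rate) - z(fa_rate)      # sensitivity: separation of distributions
    c = -0.5 * (z(hit_rate) + z(fa_rate))  # criterion: response bias
    return d, c

# 0.58 from the abstract; 0.42 is an assumption for the example
d, c = dprime(0.58, 0.42)
print(round(d, 2), round(c, 2))
```

With these symmetric rates the criterion is near zero; a bias towards the cooperator category, as point (3) suggests, would show up as a positive c.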

  13. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V

    2017-01-01

    neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active ( n=15) or sham electroconvulsive therapy ( n=12). The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial...... expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster...... to faces after a single electroconvulsive therapy session, the observed trend changes after a single electroconvulsive therapy session point to an early shift in emotional processing that may contribute to antidepressant effects of electroconvulsive therapy....

  14. The emotional context facing nursing home residents' families: a call for role reinforcement strategies from nursing homes and the community.

    Science.gov (United States)

    Bern-Klug, Mercedes

    2008-01-01

    Identify useful concepts related to the emotional context facing family members of nursing home residents. These concepts can be used in future studies to design and test interventions that benefit family caregivers. Secondary data analyses of qualitative ethnographic data. Two nursing homes in a large Midwestern city; 8 months of data collection in each. 44 family members of nursing home residents whose health was considered "declining." Role theory was used to design and help interpret the findings. Data included transcripts of conversations between family members and researchers and were analyzed using a coding scheme developed for the secondary analysis. Comments about emotions related to the social role of family member were grouped into three categories: relief related to admission, stress, and decision-making support/stress. Subcategories of stress include the role strain associated with "competing concerns" and the psychological pressures of 1) witnessing the decline of a loved one in a nursing home, and 2) guilt about placement. Decision-making was discussed as a challenge which family members did not want to face alone; support from the resident, health care professionals, and other family members was appreciated. Family members may benefit from role reinforcement activities provided by nursing home staff and community members. All nursing home staff members (in particular social workers) and physicians are called upon to provide education and support regarding nursing home admissions, during the decline of the resident, and especially regarding medical decision-making. Community groups are asked to support the family member by offering assistance with concrete tasks (driving, visiting, etc.) and social support.

  15. Neural Correlates of Task-Irrelevant First and Second Language Emotion Words — Evidence from the Face-Word Stroop Task

    Directory of Open Access Journals (Sweden)

    Lin Fan

    2016-11-01

    Emotionally valenced words have thus far not been empirically examined in a bilingual population with the emotional face-word Stroop paradigm. Chinese-English bilinguals were asked to identify the facial expressions of emotion with their first-language (L1) or second-language (L2) task-irrelevant emotion words superimposed on the face pictures. We attempted to examine how the emotional content of words modulates behavioral performance and cerebral functioning in the bilinguals' two languages. The results indicated that there were significant congruency effects for both L1 and L2 emotion words, and that identifiable differences in the magnitude of the Stroop effect between the two languages were also observed, suggesting that L1 is more capable of activating the emotional response to word stimuli. For the event-related potential (ERP) data, an N350-550 effect was observed only in the L1 task, with greater negativity for incongruent than congruent trials. The size of the N350-550 effect differed across languages, whereas no identifiable language distinction was observed in the effect of the conflict slow potential (conflict SP). Finally, a more pronounced negative amplitude at 230-330 ms was observed in L1 than in L2, but only for incongruent trials. This negativity, likened to an orthographic decoding N250, may reflect the extent of attention to emotion word processing at the word-form level, while the N350-550 reflects a complicated set of processes in conflict processing. Overall, the face-word congruency effect reflected an identifiable language distinction at 230-330 and 350-550 ms, which provides supporting evidence for theoretical proposals assuming attenuated emotionality of L2 processing.
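
The behavioral congruency effect reported above is conventionally computed as mean performance on incongruent trials minus mean performance on congruent trials. A minimal sketch, with hypothetical reaction times rather than data from the study:

```python
def congruency_effect(congruent_rts, incongruent_rts):
    """Stroop congruency effect: mean incongruent RT minus mean congruent RT (ms).

    A larger positive value means the task-irrelevant emotion word
    interfered more with identifying the facial expression.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical per-trial RTs (ms) for one participant in the L1 condition
l1_effect = congruency_effect([610, 595, 630], [660, 645, 690])
print(round(l1_effect, 1))  # prints 53.3
```

Comparing this quantity between the L1 and L2 conditions is what reveals the language difference in Stroop magnitude that the abstract describes.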

  16. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    Science.gov (United States)

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-03-01

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a u-shaped modulation, with stronger reactions towards both the most abstract and the most realistic faces compared to medium-stylized faces. For abstract faces, the N170 was generated more occipitally than for real faces, implying a stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both the N170 and EPN, on the N170, realism and expression interacted. Finally, the LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face-processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics.

  17. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory.

    Science.gov (United States)

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-03-23

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the "uncanny valley" effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a u-shaped modulation, with stronger reactions towards both the most abstract and the most realistic faces compared to medium-stylized faces. For abstract faces, the N170 was generated more occipitally than for real faces, implying a stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both the N170 and EPN, on the N170, realism and expression interacted. Finally, the LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face-processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics.

  18. Repeated Short-term (2h×14d) Emotional Stress Induces Lasting Depression-like Behavior in Mice.

    Science.gov (United States)

    Kim, Kyoung-Shim; Kwon, Hye-Joo; Baek, In-Sun; Han, Pyung-Lim

    2012-03-01

    Chronic behavioral stress is a risk factor for depression. To understand chronic stress effects and the mechanisms underlying stress-induced emotional changes, various animal models have been developed. We recently reported that mice treated with restraint for 2 h daily for 14 consecutive days (2h-14d or 2h×14d) show lasting depression-like behavior. Restraint provokes emotional stress in the body, but the nature of the stress induced by restraint is presumably more complex than emotional stress alone. So a question remained unsolved: whether a similar procedure with purely "emotional" stress is sufficient to cause depression-like behavior. To address this, we subjected mice to "emotional" constraint for 2h×14d by making them stand individually on a small stepping platform placed in a water bucket a quarter full of water; the stress evoked by this procedure was termed "water-bucket stress". The water-bucket stress activated the hypothalamus-pituitary-adrenal (HPA) system in a manner similar to restraint, as evidenced by elevation of serum glucocorticoids. After the 2h×14d water-bucket stress, mice showed behavioral changes attributable to depression-like behavior, which were stably detected more than 3 weeks after the last water-bucket stress session. Administration of the antidepressant imipramine for 20 days, starting after the last emotional constraint, completely reversed the stress-induced depression-like behavior. These results suggest that emotional stress evoked for 2h×14d stably induces depression-like behavior in mice, as does the 2h×14d restraint.

  19. Emotions

    DEFF Research Database (Denmark)

    Kristensen, Liv Kondrup; Otrel-Cass, Kathrin

    2017-01-01

    Observing science classroom activities presents an opportunity to observe the emotional aspect of interactions, and this chapter presents how this can be done and why. Drawing on ideas proposed by French philosopher Maurice Merleau-Ponty, emotions are theorized as publicly embodied enactments......, where differences in behavior between people shape emotional responses. Merleau-Ponty’s theorization of the body and feelings is connected to embodiment while examining central concepts such as consciousness and perception. Merleau-Ponty describes what he calls the emotional atmosphere and how it shapes...... the ways we experience events and activities. We use our interpretation of his understanding of emotions to examine an example of a group of year 8 science students who were engaged in a physics activity. Using the analytical framework of analyzing bodily stance by Goodwin, Cekaite, and Goodwin...

  20. Developing an eBook-Integrated High-Fidelity Mobile App Prototype for Promoting Child Motor Skills and Taxonomically Assessing Children's Emotional Responses Using Face and Sound Topology.

    Science.gov (United States)

    Brown, William; Liu, Connie; John, Rita Marie; Ford, Phoebe

    2014-01-01

    Developing gross and fine motor skills and expressing complex emotion is critical for child development. We introduce "StorySense", an eBook-integrated mobile app prototype that can sense face and sound topologies and identify movement and expression to promote children's motor skills and emotional development. Currently, most interactive eBooks on mobile devices only leverage "low-motor" interaction (i.e. tapping or swiping). Our app senses a greater breadth of motion (e.g. clapping, snapping, and face tracking), and dynamically alters the storyline according to physical responses in ways that encourage the performance of predetermined motor skills ideal for a child's gross and fine motor development. In addition, our app can capture changes in facial topology, which can later be mapped using the Facial Action Coding System (FACS) for later interpretation of emotion. StorySense expands the human computer interaction vocabulary for mobile devices. Potential clinical applications include child development, physical therapy, and autism.

  1. Emotion talk in the context of young people self?harming: facing the feelings in family therapy

    OpenAIRE

    Rogers, Alice; Schmidt, Petra

    2016-01-01

    This article describes the use of emotion talk in the context of using a manualised approach to family therapy where the presenting problem is self?harm. Whilst we understand that there is an internal aspect to emotion, we also consider emotions to be socially purposeful, culturally constructed and interactional. We found that within the presenting families, negative emotions were often talked about as located within the young person. Through using ?emotion talk? (Fredman, 2004) in deconstruc...

  3. Emotion talk in the context of young people self-harming: facing the feelings in family therapy.

    Science.gov (United States)

    Rogers, Alice; Schmidt, Petra

    2016-04-01

    This article describes the use of emotion talk in the context of using a manualised approach to family therapy where the presenting problem is self-harm. Whilst we understand that there is an internal aspect to emotion, we also consider emotions to be socially purposeful, culturally constructed and interactional. We found that within the presenting families, negative emotions were often talked about as located within the young person. Through using 'emotion talk' (Fredman, 2004) in deconstructing and tracking emotions and exploring how emotions connected to family-of-origin and cultural contexts, we developed an interactional understanding of these emotions. This led to better emotional regulation within the family and offered alternative ways of relating. The article discusses the use of relational reflexivity, and using the therapist's and team's emotions to enable the therapeutic process, encouraging reflexivity on the self of the therapist in relation to work with emotions. Practitioner points: Emotions can be seen both as a reflection of feelings experienced by the individual and as a communication. An interactional understanding of emotions can be used therapeutically. Therapists should explore emotional displays and track the interactional patterns within the therapeutic system. Therapists should be self-reflexive about ways of doing emotions and use this awareness in practice.

  4. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    Science.gov (United States)

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus is a putative human chemosignal. Androstadienone has been studied with respect to effects on mood states, attractiveness ratings, and physiological and neural activation. With the current experiment, we aimed to explore how androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex, and individual olfactory sensitivity to androstadienone. To do so, we investigated 56 healthy individuals (including 29 females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone and once under placebo exposure, in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing, while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task- and emotion-specific manner, as a reduction in interference processes in the emotional Stroop task was only apparent for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times for congruent compared to incongruent trials. At the same time, women were also slightly affected by smelling androstadienone, as they classified angry faces more often correctly under androstadienone. For the emotional dot-probe task no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was correlated with neither reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.
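
    As a concrete illustration of the interference measure described above (a generic sketch, not the study's analysis code): emotional Stroop interference is commonly scored as the mean reaction-time difference between incongruent and congruent trials, so a smaller difference indicates reduced interference, as reported here for men viewing angry faces under androstadienone.

```python
def stroop_interference(congruent_rts_ms, incongruent_rts_ms):
    """Mean RT(incongruent) minus mean RT(congruent), in milliseconds.

    Positive values indicate interference; values near zero indicate
    little or no interference.
    """
    def mean(xs):
        return sum(xs) / len(xs)
    return mean(incongruent_rts_ms) - mean(congruent_rts_ms)

# Hypothetical reaction times for one participant:
print(stroop_interference([520, 540, 530], [580, 600, 590]))  # 60.0
```
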

  5. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    OpenAIRE

    Lili Guan; Yufang Zhao; Yige Wang; Yujie Chen; Juan Yang

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the ...

  6. When the face reveals what words do not: facial expressions of emotion, smiling, and the willingness to disclose childhood sexual abuse.

    Science.gov (United States)

    Bonanno, George A; Keltner, Dacher; Noll, Jennie G; Putnam, Frank W; Trickett, Penelope K; LeJeune, Jenna; Anderson, Cameron

    2002-07-01

    For survivors of childhood sexual abuse (CSA), verbal disclosure is often complex and painful. The authors examined the voluntary disclosure-nondisclosure of CSA in relation to nonverbal expressions of emotion in the face. Consistent with hypotheses derived from recent theorizing about the moral nature of emotion, CSA survivors who did not voluntarily disclose CSA showed greater facial expressions of shame, whereas CSA survivors who voluntarily disclosed CSA expressed greater disgust. Expressions of disgust also signaled sexual abuse accompanied by violence. Consistent with recent theorizing about smiling behavior, CSA nondisclosers made more polite smiles, whereas nonabused participants expressed greater genuine positive emotion. Discussion addressed the implications of these findings for the study of disclosure of traumatic events, facial expression, and the links between morality and emotion.

  7. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    Science.gov (United States)

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  8. Specific Patterns of Emotion Recognition from Faces in Children with ASD: Results of a Cross-Modal Matching Paradigm

    Science.gov (United States)

    Golan, Ofer; Gordon, Ilanit; Fichman, Keren; Keinan, Giora

    2018-01-01

    Children with ASD show emotion recognition difficulties, as part of their social communication deficits. We examined facial emotion recognition (FER) in intellectually disabled children with ASD and in younger typically developing (TD) controls, matched on mental age. Our emotion-matching paradigm employed three different modalities: facial, vocal…

  9. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study.

    Science.gov (United States)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V; Macoveanu, Julian; Harmer, Catherine J; Jørgensen, Anders; Revsbech, Rasmus; Jensen, Hans M; Paulson, Olaf B; Siebner, Hartwig R; Jørgensen, Martin B

    2017-09-01

    Negative neurocognitive bias is a core feature of major depressive disorder that is reversed by pharmacological and psychological treatments. This double-blind functional magnetic resonance imaging study investigated for the first time whether electroconvulsive therapy modulates negative neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active (n=15) or sham (n=12) electroconvulsive therapy session. The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster-forming threshold of Z>3.1. At a more lenient threshold (Z>2.3), there were trends toward electroconvulsive therapy-induced changes in parahippocampal and superior frontal responses to fearful versus happy faces, as well as in fear-specific functional connectivity between the amygdala and occipito-temporal regions. Across all patients, greater fear-specific amygdala-occipital coupling correlated with lower fear vigilance. Despite the absence of a statistically significant shift in neural response to faces after a single electroconvulsive therapy session, the observed trend changes point to an early shift in emotional processing that may contribute to the antidepressant effects of electroconvulsive therapy.

  10. Put on a happy face! Inhibitory control and socioemotional knowledge predict emotion regulation in 5- to 7-year-olds.

    Science.gov (United States)

    Hudson, Amanda; Jacques, Sophie

    2014-07-01

    Children's developing capacity to regulate emotions may depend on individual characteristics and other abilities, including age, sex, inhibitory control, theory of mind, and emotion and display rule knowledge. In the current study, we examined the relations between these variables and children's (N=107) regulation of emotion in a disappointing gift paradigm as well as their relations with the amount of effort to control emotion children exhibited after receiving the disappointing gift. Regression analyses were also conducted to identify unique predictors. Children's understanding of others' emotions and emotion display rules, as well as their inhibitory control skills, emerged as significant correlates of emotion regulation and predicted children's responses to the disappointing gift even after controlling for other relevant variables. Age and inhibitory control significantly predicted the amount of overt effort that went into regulating emotions, as did emotion knowledge (albeit only marginally). Together, findings suggest that effectively regulating emotions requires (a) knowledge of context-appropriate emotions along with (b) inhibitory skills to implement that knowledge. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Emotion

    DEFF Research Database (Denmark)

    Jantzen, Christian; Vetner, Mikael

    2006-01-01

    An emotion is an evaluative response to a significant event; it has affective valence and motivates the organism in relation to the world of objects (its surroundings). Emotions give rise to affect: to pain (negative affect) or pleasure (positive affect). Both positive and negative emotions influence the organism's...

  12. Face-to-Face and Online: An Investigation of Children's and Adolescents' Bullying Behavior through the Lens of Moral Emotions and Judgments

    Science.gov (United States)

    Conway, Lauryn; Gomez-Garibello, Carlos; Talwar, Victoria; Shariff, Shaheen

    2016-01-01

    The current study investigated the influence of type of aggression (cyberbullying or traditional bullying) and participant role (bystander or perpetrator) on children's and adolescents' self-attribution of moral emotions and judgments, while examining the influence of chronological age. Participants (N = 122, 8-16 years) evaluated vignettes and were…

  13. Abnormal left and right amygdala-orbitofrontal cortical functional connectivity to emotional faces: state versus trait vulnerability markers of depression in bipolar disorder.

    Science.gov (United States)

    Versace, Amelia; Thompson, Wesley K; Zhou, Donli; Almeida, Jorge R C; Hassel, Stefanie; Klein, Crystal R; Kupfer, David J; Phillips, Mary L

    2010-03-01

    Amygdala-orbitofrontal cortical (OFC) functional connectivity (FC) to emotional stimuli, and its relationships with white matter, remain little examined in individuals with bipolar disorder (BD). Thirty-one BD (type I; n = 17 remitted; n = 14 depressed) and 24 age- and gender-ratio-matched healthy individuals (HC) viewed neutral, mild, and intense happy or sad emotional faces in two experiments. FC was computed as linear and nonlinear dependence measures between amygdala and OFC time series. Effects of group, laterality, and emotion intensity upon amygdala-OFC FC, and relationships between amygdala-OFC FC and white matter fractional anisotropy (FA), were examined. BD versus HC showed significantly greater right amygdala-OFC FC to sad faces, and a significant relationship (p = .001) between left amygdala-OFC FC to sad faces and FA was found in HC. In BD, antidepressants were associated with significantly reduced left amygdala-OFC FC to mild sad faces (p = .001). In BD, abnormally elevated right amygdala-OFC FC to sad stimuli might represent a trait vulnerability for depression, whereas abnormally elevated left amygdala-OFC FC to sad stimuli and abnormally reduced amygdala-OFC FC to intense happy stimuli might represent a depression state marker. Abnormal FC measures might normalize with antidepressant medications in BD. Nonlinear amygdala-OFC FC-FA relationships in BD and HC require further study. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
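
    To illustrate the "linear dependence" component of such a functional-connectivity measure (a simplified sketch, not the authors' pipeline): the linear part is essentially a Pearson correlation between the two regions' BOLD time series, with the ROI names below purely hypothetical.

```python
import math

def pearson_fc(ts_a, ts_b):
    """Pearson correlation between two ROI time series as a simple
    linear functional-connectivity estimate."""
    n = len(ts_a)
    ma = sum(ts_a) / n
    mb = sum(ts_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(ts_a, ts_b))
    sa = math.sqrt(sum((a - ma) ** 2 for a in ts_a))
    sb = math.sqrt(sum((b - mb) ** 2 for b in ts_b))
    return cov / (sa * sb)

# Toy time series for two regions (perfectly linearly related):
amygdala = [1.0, 2.0, 3.0, 4.0]
ofc = [2.0, 4.0, 6.0, 8.0]
print(round(pearson_fc(amygdala, ofc), 3))  # 1.0
```

    Nonlinear dependence measures (e.g. based on mutual information) extend this idea to relationships a correlation coefficient cannot capture.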

  14. Motivated emotion and the rally around the flag effect: liberals are motivated to feel collective angst (like conservatives) when faced with existential threat.

    Science.gov (United States)

    Porat, Roni; Tamir, Maya; Wohl, Michael J A; Gur, Tamar; Halperin, Eran

    2018-04-18

    A careful look at societies facing threat reveals a unique phenomenon in which liberals and conservatives react emotionally and attitudinally in a similar manner, rallying around the conservative flag. Previous research suggests that this rally effect is the result of liberals shifting in their attitudes and emotional responses toward the conservative end. Whereas theories of motivated social cognition provide a motivation-based account of cognitive processes (i.e. attitude shift), it remains unclear whether emotional shifts are, in fact, also a motivation-based process. Herein, we propose that under threat, liberals are motivated to feel existential concern about their group's future vitality (i.e. collective angst) to the same extent as conservatives, because this group-based emotion elicits support for ingroup protective action. Within the context of the Palestinian-Israeli conflict, we tested and found support for this hypothesis both inside (Study 1) and outside (Study 2) the laboratory. We did so using a behavioural index of motivation to experience collective angst. We discuss the implications of our findings for understanding motivated emotion regulation in the context of intergroup threat.

  15. Job resources and recovery experiences to face difficulties in emotion regulations at work: a diary study among nurses

    NARCIS (Netherlands)

    Blanco-Donoso, L.M.; Garrosa, E.; Demerouti, E.; Moreno-Jiménez, B.

    2017-01-01

    The present study examines the role of daily difficulties in emotion regulation at work in nurses' daily well-being and how certain job resources and recovery experiences influence this relationship. We hypothesized that daily difficulties to regulate emotions at work would be significantly and

  16. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    Science.gov (United States)

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  17. The Moderating Effect of Self-Reported State and Trait Anxiety on the Late Positive Potential to Emotional Faces in 6–11-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Georgia Chronaki

    2018-02-01

    Introduction: The emergence of anxiety during childhood is accompanied by the development of attentional biases to threat. However, the neural mechanisms underlying these biases are poorly understood. In addition, previous research has not examined whether state and trait anxiety are independently associated with threat-related biases. Methods: We compared ERP waveforms during the processing of emotional faces in a population sample of 58 6–11-year-olds who completed self-reported measures of trait and state anxiety and depression. Results: The P1 was larger to angry than neutral faces in the left hemisphere, though early components (P1, N170) were not strongly associated with child anxiety or depression. In contrast, Late Positive Potential (LPP) amplitudes to angry (vs. neutral) faces were significantly and positively associated with symptoms of anxiety/depression. In addition, the difference between LPPs for angry (vs. neutral) faces was independently associated with state and trait anxiety symptoms. Discussion: Neural responses to facial emotion in children with elevated symptoms of anxiety and depression were most evident at later processing stages characterized as evaluative and effortful. The findings support cognitive models of threat perception in anxiety and indicate that trait elements of anxiety and more transitory fluctuations in anxious affect are important in understanding individual variation in the neural response to threat in late childhood.

  18. Greater Pupil Size in Response to Emotional Faces as an Early Marker of Social-Communicative Difficulties in Infants at High Risk for Autism.

    Science.gov (United States)

    Wagner, Jennifer B; Luyster, Rhiannon J; Tager-Flusberg, Helen; Nelson, Charles A

    2016-01-01

    When scanning faces, individuals with autism spectrum disorder (ASD) have shown reduced visual attention (e.g., less time on eyes) and atypical autonomic responses (e.g., heightened arousal). To understand how these differences might explain sub-clinical variability in social functioning, 9-month-olds, with or without a family history of ASD, viewed emotionally expressive faces, and gaze and pupil diameter (a measure of autonomic activation) were recorded using eye-tracking. Infants at high risk for ASD with no subsequent clinical diagnosis (HRA-) and low-risk controls (LRC) showed similar face scanning and attention to eyes and mouth. Attention was overall greater to eyes than mouth, but this varied as a function of the emotion presented. HRA- showed significantly larger pupil size than LRC. Correlations between scanning at 9 months, pupil size at 9 months, and 18-month social-communicative behavior revealed positive associations between pupil size and attention to both face and eyes at 9 months in LRC, and a negative association between 9-month pupil size and 18-month social-communicative behavior in HRA-. The present findings point to heightened autonomic arousal in HRA-. Further, with greater arousal relating to worse social-communicative functioning at 18 months, this work points to a mechanism by which unaffected siblings might develop atypical social behavior.

  19. Event-related brain responses to emotional words, pictures, and faces – a cross-domain comparison

    Science.gov (United States)

    Bayer, Mareike; Schacht, Annekathrin

    2014-01-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal. PMID:25339927

  20. Attachment and couple satisfaction as predictors of expressed emotion in women facing breast cancer and their partners in the immediate post-surgery period.

    Science.gov (United States)

    Favez, Nicolas; Cairo Notari, Sarah; Antonini, Tania; Charvoz, Linda

    2017-02-01

    To investigate expressed emotion (EE) in couples facing breast cancer in the immediate post-surgery period. EE may be predictive of psychological disturbances that hinder both partners' capacities to cope with the stress of the disease. Severity of the disease, attachment tendencies, and couple satisfaction were tested as predictors of EE. The design was cross-sectional. Couples (N = 61) were interviewed 2 weeks after the women's breast surgery. Expressed emotion was assessed in women and in partners with the Five-Minute Speech Sample, with a focus on overt and covert criticisms. Self-reported EE, attachment tendencies, and couple satisfaction were assessed with questionnaires. Hierarchical regression analyses were performed to test the predictors and possible interactions between them. Both partners expressed overt and covert criticisms; women expressed more overt criticisms than did their partners. Cancer stage was inversely related to the number of overt criticisms in women and to the number of covert criticisms in partners. Regression analyses showed that in women, higher attachment anxiety and lower couple satisfaction were positive predictors of overt criticisms; in partners, a higher cancer stage was a negative predictor of overt and covert criticisms. Practitioners should pay attention to the couple relationship in breast cancer. EE is most likely to appear when the cancer stage is low, showing that even when the medical prognosis is optimal, relational and emotional disturbances may occur. Statement of contribution What is already known on this subject? The couple relationship is of paramount importance in breast cancer. Expressed emotion (EE) is related to negative individual and relational psychological outcomes in psychiatric and somatic diseases. Expressed emotion has not yet been studied in the context of breast cancer. What does this study add? Expressed emotion is present in breast cancer situations, especially when the cancer stage is low. There

  1. The impact of oxytocin administration and maternal love withdrawal on event-related potential (ERP) responses to emotional faces with performance feedback.

    Science.gov (United States)

    Huffmeijer, Renske; Alink, Lenneke R A; Tops, Mattie; Grewen, Karen M; Light, Kathleen C; Bakermans-Kranenburg, Marian J; van Ijzendoorn, Marinus H

    2013-03-01

    This is the first experimental study to use event-related potentials (ERPs) to examine the effect of oxytocin administration on the neural processing of facial stimuli in female participants. Using a double-blind, placebo-controlled within-subjects design, we studied the effects of 16 IU of intranasal oxytocin on ERPs to pictures combining performance feedback with emotional facial expressions in 48 female undergraduate students. Participants also reported on the amount of love withdrawal they experienced from their mothers. Vertex positive potential (VPP) and late positive potential (LPP) amplitudes were more positive after oxytocin compared to placebo administration. This suggests that oxytocin increased attention to the feedback stimuli (LPP) and enhanced the processing of emotional faces (VPP). Oxytocin heightened processing of the happy and disgusted faces primarily for those reporting less love withdrawal. Significant associations with LPP amplitude suggest that more maternal love withdrawal relates to the allocation of attention toward the motivationally relevant combination of negative feedback with a disgusted face. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Adjunctive selective estrogen receptor modulator increases neural activity in the hippocampus and inferior frontal gyrus during emotional face recognition in schizophrenia.

    Science.gov (United States)

    Ji, E; Weickert, C S; Lenroot, R; Kindler, J; Skilleter, A J; Vercammen, A; White, C; Gur, R E; Weickert, T W

    2016-05-03

    Estrogen has been implicated in the development and course of schizophrenia, with most evidence suggesting a neuroprotective effect. Treatment with raloxifene, a selective estrogen receptor modulator, can reduce symptom severity, improve cognition and normalize brain activity during learning in schizophrenia. People with schizophrenia are especially impaired in the identification of negative facial emotions. The present study was designed to determine the extent to which adjunctive raloxifene treatment would alter abnormal neural activity during angry facial emotion recognition in schizophrenia. Twenty people with schizophrenia (12 men, 8 women) participated in a 13-week, randomized, double-blind, placebo-controlled, crossover trial of adjunctive raloxifene treatment (120 mg per day orally) and performed a facial emotion recognition task during functional magnetic resonance imaging after each treatment phase. Two-sample t-tests in regions of interest selected a priori were performed to assess activation differences between raloxifene and placebo conditions during the recognition of angry faces. Adjunctive raloxifene significantly increased activation in the right hippocampus and left inferior frontal gyrus compared with the placebo condition (family-wise error corrected) during angry facial emotion recognition in schizophrenia. These findings support the hypothesis that estrogen plays a modifying role in schizophrenia and show that adjunctive raloxifene treatment may reverse abnormal neural activity during facial emotion recognition, which is relevant to impaired social functioning in men and women with schizophrenia.

  3. Computer-Assisted Face Processing Instruction Improves Emotion Recognition, Mentalizing, and Social Skills in Students with ASD

    Science.gov (United States)

    Rice, Linda Marie; Wall, Carla Anne; Fogel, Adam; Shic, Frederick

    2015-01-01

    This study examined the extent to which a computer-based social skills intervention called "FaceSay"™ was associated with improvements in affect recognition, mentalizing, and social skills of school-aged children with Autism Spectrum Disorder (ASD). "FaceSay"™ offers students simulated practice with eye gaze, joint attention,…

  4. The Perception of Four Basic Emotions in Human and Nonhuman Faces by Children with Autism and Other Developmental Disabilities

    Science.gov (United States)

    Gross, Thomas F.

    2004-01-01

    Children with autism, mental retardation, or language disorders, and children in a clinical control group, were shown photographs of human female, orangutan, and canine (boxer) faces expressing happiness, sadness, anger, surprise and a neutral expression. For each species of faces, children were asked to identify the happy, sad, angry,…

  5. The influence of combined cognitive plus social-cognitive training on amygdala response during face emotion recognition in schizophrenia.

    Science.gov (United States)

    Hooker, Christine I; Bruce, Lori; Fisher, Melissa; Verosky, Sara C; Miyakawa, Asako; D'Esposito, Mark; Vinogradov, Sophia

    2013-08-30

    Both cognitive and social-cognitive deficits impact functional outcome in schizophrenia. Cognitive remediation studies indicate that targeted cognitive and/or social-cognitive training improves behavioral performance on trained skills. However, the neural effects of training in schizophrenia and their relation to behavioral gains are largely unknown. This study tested whether a 50-h intervention which included both cognitive and social-cognitive training would influence neural mechanisms that support social cognition. Schizophrenia participants completed a computer-based intervention of either auditory-based cognitive training (AT) plus social-cognition training (SCT) (N=11) or non-specific computer games (CG) (N=11). Assessments included a functional magnetic resonance imaging (fMRI) task of facial emotion recognition, and behavioral measures of cognition, social cognition, and functional outcome. The fMRI results showed the predicted group-by-time interaction. Results were strongest for emotion recognition of happy, surprise and fear: relative to CG participants, AT+SCT participants showed a neural activity increase in bilateral amygdala, right putamen and right medial prefrontal cortex. Across all participants, pre-to-post intervention neural activity increase in these regions predicted behavioral improvement on an independent emotion perception measure (MSCEIT: Perceiving Emotions). Among AT+SCT participants alone, neural activity increase in right amygdala predicted behavioral improvement in emotion perception. The findings indicate that combined cognition and social-cognition training improves neural systems that support social-cognition skills. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Emotion and food. Do the emotions expressed on other people's faces affect the desire to eat liked and disliked food products?

    Science.gov (United States)

    Barthomeuf, L; Rousset, S; Droit-Volet, S

    2009-02-01

    The aim of the present study was to test if pleasure, neutrality and disgust expressed by other individuals on a photograph could affect the desire to eat liked or disliked food products. Forty-four men and women were presented with two series of photographs. The first series of photographs was composed of six food photographs: three liked and three disliked food products. The second series consisted of the same photographs presented with eaters expressing three different emotions: disgust, pleasure or neutrality. Results showed that the effect of the presence of an eater, and of emotions expressed by this eater, depended on the food category. For the liked foods, the desire to eat was higher when these foods were presented alone than with an eater expressing neutral emotion. When the eater expressed pleasure, the desire to eat these liked foods did not significantly increase. In contrast, when the eater expressed disgust, the desire to eat them significantly decreased. When the foods were disliked, the influence of the pleasant social context was stronger than for the liked foods. The desire to eat the disliked foods actually increased in the presence of an eater expressing pleasure. On the contrary, the disgust and neutral context had no effect on the desire for disliked foods.

  7. Facing the challenge of teaching emotions to individuals with low- and high-functioning autism using a new Serious game: a pilot study

    Science.gov (United States)

    2014-01-01

    Background It is widely accepted that emotion processing difficulties are involved in Autism Spectrum Conditions (ASC). An increasing number of studies have focused on the development of training programs and have shown promising results. However, most of these programs are appropriate for individuals with high-functioning ASC (HFA) but exclude individuals with low-functioning ASC (LFA). We have developed a computer-based game called JeStiMulE based on logical skills to teach emotions to individuals with ASC, independently of their age, intellectual, verbal and academic level. The aim of the present study was to verify the usability of JeStiMulE (i.e., its adaptability, effectiveness and efficiency) on a heterogeneous ASC group. We hypothesized that after JeStiMulE training, a performance improvement would be found in emotion recognition tasks. Methods A heterogeneous group of thirty-three children and adolescents with ASC received two one-hour JeStiMulE sessions per week over four weeks. In order to verify the usability of JeStiMulE, game data were collected for each participant. Furthermore, all participants were presented before and after training with five emotion recognition tasks, two including pictures of game avatars (faces and gestures) and three including pictures of real-life characters (faces, gestures and social scenes). Results Descriptive data showed suitable adaptability, effectiveness and efficiency of JeStiMulE. Results revealed a significant main effect of Session on avatars (ANOVA: F(1,32) = 98.48). A significant triple interaction involving Emotion was also found for avatars (ANOVA: F(6,192) = 2.84, P = .01). This triple interaction was close to significance for pictures of real-life characters (ANOVA: F(12,384) = 1.73, P = .057). Post-hoc analyses revealed a significant increase after training in 30 out of 35 conditions. Conclusions JeStiMulE appears to be a promising tool to teach emotion recognition not only to individuals with HFA but…

  8. One Size Does Not Fit All: Face Emotion Processing Impairments in Semantic Dementia, Behavioural-Variant Frontotemporal Dementia and Alzheimer's Disease Are Mediated by Distinct Cognitive Deficits

    OpenAIRE

    Miller, Laurie A.; Hsieh, Sharpley; Lah, Suncica; Savage, Sharon; Hodges, John R.; Piguet, Olivier

    2011-01-01

    Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and E...

  9. Dialogue Design for a Robot-Based Face-Mirroring Game to Engage Autistic Children with Emotional Expressions

    NARCIS (Netherlands)

    Chevalier, Pauline; Li, Jamy Jue; Ainger, Eloise; Alcorn, Alyssa M.; Babovic, Snezana; Charisi, Vicky; Petrovic, Suncica; Schadenberg, Bob Rinse; Pellicano, Elizabeth; Evers, Vanessa; Kheddar, Abderrahmane; Yoshida, Eiichi; Ge, Shuzhi Sam; Suzuki, Kenji; Cabibihan, John-John; Eyssel, Friederike; He, Hongsheng

    2017-01-01

    We present design strategies for Human Robot Interaction for school-aged autistic children with limited receptive language. Applying these strategies to the DE-ENIGMA project (a large EU project addressing emotion recognition in autistic children) supported development of a new activity for facial…

  10. Facing the Music or Burying Our Heads in the Sand?: Adaptive Emotion Regulation in Midlife and Late Life

    Science.gov (United States)

    Waldinger, Robert J.; Schulz, Marc S.

    2011-01-01

    Defenses that keep threatening information out of awareness are posited to reduce anxiety at the cost of longer-term dysfunction. By contrast, socioemotional selectivity theory suggests that preference for positively-valenced information is a late-life manifestation of adaptive emotion regulation. Using longitudinal data on 61 men, we examined links between emotion regulation indices informed by these distinct conceptualizations: defenses in earlier adulthood and selective memory for positively-valenced images in late-life. Use of avoidant defenses in midlife predicted poorer memory for positive, negative, and neutral images nearly 4 decades later. Late-life satisfaction was positively linked with midlife engaging defenses but negatively linked at the trend level with concurrent positive memory bias. PMID:21544264

  11. In the face of threat: neural and endocrine correlates of impaired facial emotion recognition in cocaine dependence.

    Science.gov (United States)

    Ersche, K D; Hagan, C C; Smith, D G; Jones, P S; Calder, A J; Williams, G B

    2015-05-26

    The ability to recognize facial expressions of emotion in others is a cornerstone of human interaction. Selective impairments in the recognition of facial expressions of fear have frequently been reported in chronic cocaine users, but the nature of these impairments remains poorly understood. We used the multivariate method of partial least squares and structural magnetic resonance imaging to identify gray matter brain networks that underlie facial affect processing in both cocaine-dependent (n = 29) and healthy male volunteers (n = 29). We hypothesized that disruptions in neuroendocrine function in cocaine-dependent individuals would explain their impairments in fear recognition by modulating the relationship with the underlying gray matter networks. We found that cocaine-dependent individuals exhibited significant impairments in the recognition not only of fearful but also of angry facial expressions. Although recognition accuracy of threatening expressions co-varied in all participants with distinctive gray matter networks implicated in fear and anger processing, in cocaine users it was less well predicted by these networks than in controls. The weaker brain-behavior relationships for threat processing were also mediated by distinctly different factors. Fear recognition impairments were influenced by variations in intelligence levels, whereas anger recognition impairments were associated with comorbid opiate dependence and related reduction in testosterone levels. We also observed an inverse relationship between testosterone levels and the duration of crack and opiate use. Our data provide novel insight into the neurobiological basis of abnormal threat processing in cocaine dependence, which may shed light on new opportunities facilitating the psychosocial integration of these patients.

  12. Functional connectivity decreases in autism in emotion, self, and face circuits identified by Knowledge-based Enrichment Analysis.

    Science.gov (United States)

    Cheng, Wei; Rolls, Edmund T; Zhang, Jie; Sheng, Wenbo; Ma, Liang; Wan, Lin; Luo, Qiang; Feng, Jianfeng

    2017-03-01

    A powerful new method is described called Knowledge-based functional connectivity Enrichment Analysis (KEA) for interpreting resting state functional connectivity, using circuits that are functionally identified using search terms with the Neurosynth database. The method derives its power by focusing on neural circuits, sets of brain regions that share a common biological function, instead of trying to interpret single functional connectivity links. This provides a novel way of investigating how task- or function-related networks have resting state functional connectivity differences in different psychiatric states, provides a new way to bridge the gap between task and resting-state functional networks, and potentially helps to identify brain networks that might be treated. The method was applied to interpreting functional connectivity differences in autism. Functional connectivity decreases at the network circuit level in 394 patients with autism compared with 473 controls were found in networks involving the orbitofrontal cortex, anterior cingulate cortex, middle temporal gyrus, and the precuneus, in networks that are implicated in the sense of self, face processing, and theory of mind. The decreases were correlated with symptom severity. Copyright © 2017. Published by Elsevier Inc.

  13. Don't make me angry, you wouldn't like me when I'm angry: Volitional choices to act or inhibit are modulated by subliminal perception of emotional faces.

    Science.gov (United States)

    Parkinson, Jim; Garfinkel, Sarah; Critchley, Hugo; Dienes, Zoltan; Seth, Anil K

    2017-04-01

    Volitional action and self-control--feelings of acting according to one's own intentions and of being in control of one's own actions--are fundamental aspects of human conscious experience. However, it is unknown whether high-level cognitive control mechanisms are affected by socially salient but nonconscious emotional cues. In this study, we manipulated free choice decisions to act or withhold an action by subliminally presenting emotional faces: In a novel version of the Go/NoGo paradigm, participants made speeded button-press responses to Go targets, withheld responses to NoGo targets, and made spontaneous, free choices to execute or withhold the response for Choice targets. Before each target, we presented emotional faces, backwards masked to render them nonconscious. In Intentional trials, subliminal angry faces made participants more likely to voluntarily withhold the action, whereas fearful and happy faces had no effects. In a second experiment, the faces were made supraliminal, which eliminated the effects of angry faces on volitional choices. A third experiment measured neural correlates of the effects of subliminal angry faces on intentional choice using EEG. After replicating the behavioural results found in Experiment 1, we identified a frontal-midline theta component-associated with cognitive control processes-which is present for volitional decisions, and is modulated by subliminal angry faces. This suggests a mechanism whereby subliminally presented "threat" stimuli affect conscious control processes. In summary, nonconscious perception of angry faces increases choices to inhibit, and subliminal influences on volitional action are deep-seated and ecologically embedded.

  14. Sex-specific mediation effect of the right fusiform face area volume on the association between variants in repeat length of AVPR1A RS3 and altruistic behavior in healthy adults.

    Science.gov (United States)

    Wang, Junping; Qin, Wen; Liu, Feng; Liu, Bing; Zhou, Yuan; Jiang, Tianzi; Yu, Chunshui

    2016-07-01

    Microsatellite variants in the arginine vasopressin receptor 1A gene (AVPR1A) RS3 have been associated with normal variation in social behaviors and with autism spectrum disorders (ASDs) in a sex-specific manner. However, neural mechanisms underlying these associations remain largely unknown. We hypothesized that AVPR1A RS3 variants affect altruistic behavior by modulating the gray matter volume (GMV) of specific brain regions in a sex-specific manner. We investigated 278 young healthy adults using the Dictator Game to assess altruistic behavior. All subjects were genotyped and the main effect of AVPR1A RS3 repeat polymorphisms and the interaction of genotype-by-sex on the GMV were assessed in a voxel-wise manner. We observed that male subjects with relatively short repeats allocated less money to others and exhibited a significantly smaller GMV in the right fusiform face area (FFA) compared with male long homozygotes. In male subjects, the GMV of the right FFA exhibited a significant positive correlation with altruistic behavior. A mixed mediation and moderation analysis further revealed both a significant mediation effect of the GMV of the right FFA on the association between AVPR1A RS3 repeat polymorphisms and allocation sums and a significant moderation effect of sex (only in males) on the mediation effect. Post hoc analysis showed that the GMV of the right FFA was significantly smaller in male subjects carrying allele 426 than in non-426 carriers. These results suggest that the GMV of the right FFA may be a potential mediator whereby the genetic variants in AVPR1A RS3 affect altruistic behavior in healthy male subjects. Hum Brain Mapp 37:2700-2709, 2016. © 2016 Wiley Periodicals, Inc.
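    The mixed mediation and moderation analysis above is specific to the authors' pipeline, but the core quantity it rests on, an indirect effect estimated as the product of an a-path (predictor → mediator) and a b-path (mediator → outcome, controlling for the predictor), can be sketched with ordinary least squares. Below is a minimal numpy illustration with synthetic data; the variable names and effect sizes are hypothetical assumptions, not the study's measurements:

```python
import numpy as np

def ols_coefs(design, y):
    """Least-squares coefficients for y regressed on the design matrix."""
    coefs, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coefs

def simple_mediation(x, m, y):
    """Estimate the paths of a simple mediation model X -> M -> Y.

    a: effect of X on M; b: effect of M on Y controlling for X;
    indirect effect = a * b; 'direct' is the residual effect of X on Y.
    """
    ones = np.ones_like(x)
    a = ols_coefs(np.column_stack([ones, x]), m)[1]
    coefs = ols_coefs(np.column_stack([ones, x, m]), y)
    direct, b = coefs[1], coefs[2]
    return {"a": a, "b": b, "direct": direct, "indirect": a * b}

# Synthetic data: X drives M (a = 0.5), M drives Y (b = 0.7), no direct path.
rng = np.random.default_rng(42)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(scale=0.5, size=n)
y = 0.7 * m + rng.normal(scale=0.5, size=n)
est = simple_mediation(x, m, y)  # est["indirect"] should be near 0.35
```

    In practice the uncertainty of the indirect effect is usually assessed with bootstrap confidence intervals rather than read off the point estimate alone.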

  15. The Secrets of Faces

    OpenAIRE

    Enquist, Magnus; Ghirlanda, Stefano

    1998-01-01

    This is a comment on an article by Perrett et al., in the same issue of Nature, investigating face perception. With computer graphics, Perrett and colleagues have produced exaggerated male and female faces, and asked people to rate them with respect to femininity or masculinity, and personality traits such as intelligence, emotionality and so on. The key question is: what information do faces (and sexual signals in general) convey? One view, supported by Perrett and colleagues, is that all a...

  16. Deployment Repeatability

    Science.gov (United States)

    2016-04-01

    evaluating the deployment repeatability builds upon the testing or analysis of deployment kinematics (Chapter 6) and adds repetition. Introduction...material yield or failure during a test. For the purposes of this chapter, zero shift will refer to permanent changes in the structure, while reversible ...the content of other chapters in this book: Gravity Compensation (Chapter 4) and Deployment Kinematics and Dynamics (Chapter 6). Repeating the

  17. Face it, don't Facebook it: Impacts of Social Media Addiction on Mindfulness, Coping Strategies and the Consequence on Emotional Exhaustion.

    Science.gov (United States)

    Sriwilai, Kanokporn; Charoensukmongkol, Peerayuth

    2016-10-01

    Addiction to social media has now become a problem that societies are concerned with. The aim of the present study is to investigate the impacts that social media addiction has on mindfulness and choice of coping strategy, as well as to explore the consequences on emotional exhaustion. The survey data were collected from 211 employees in 13 enterprises in Thailand. Results from partial least square structural equation modelling revealed that people who are highly addicted to social media tended to have lower mindfulness and tended to use emotion-focused coping to deal with stress. Lack of mindfulness and the decision to use emotion-focused coping are also subsequently associated with higher emotional exhaustion. Copyright © 2015 John Wiley & Sons, Ltd.

  18. Developing an eBook-Integrated High-Fidelity Mobile App Prototype for Promoting Child Motor Skills and Taxonomically Assessing Children’s Emotional Responses Using Face and Sound Topology

    Science.gov (United States)

    Brown, William; Liu, Connie; John, Rita Marie; Ford, Phoebe

    2014-01-01

    Developing gross and fine motor skills and expressing complex emotion is critical for child development. We introduce “StorySense”, an eBook-integrated mobile app prototype that can sense face and sound topologies and identify movement and expression to promote children’s motor skills and emotional development. Currently, most interactive eBooks on mobile devices only leverage “low-motor” interaction (i.e. tapping or swiping). Our app senses a greater breadth of motion (e.g. clapping, snapping, and face tracking), and dynamically alters the storyline according to physical responses in ways that encourage the performance of predetermined motor skills ideal for a child’s gross and fine motor development. In addition, our app can capture changes in facial topology, which can later be mapped using the Facial Action Coding System (FACS) for interpretation of emotion. StorySense expands the human computer interaction vocabulary for mobile devices. Potential clinical applications include child development, physical therapy, and autism. PMID:25954336

  19. Smiling faces and cash bonuses: Exploring common affective coding across positive and negative emotional and motivational stimuli using fMRI.

    Science.gov (United States)

    Park, Haeme R P; Kostandyan, Mariam; Boehler, C Nico; Krebs, Ruth M

    2018-06-01

    Although it is clear that emotional and motivational manipulations yield a strong influence on cognition and behaviour, these domains have mostly been investigated in independent research lines. Therefore, it remains poorly understood how far these affective manipulations overlap in terms of their underlying neural activations, especially in light of previous findings that suggest a shared valence mechanism across multiple affective processing domains (e.g., monetary incentives, primary rewards, emotional events). This is particularly interesting considering the commonality between emotional and motivational constructs in terms of their basic affective nature (positive vs. negative), but dissociations in terms of instrumentality, in that only reward-related stimuli are typically associated with performance-contingent outcomes. Here, we aimed to examine potential common neural processes triggered by emotional and motivational stimuli in matched tasks within participants using functional magnetic resonance imaging (fMRI). Across tasks, we found shared valence effects in the ventromedial prefrontal cortex and left inferior frontal gyrus (part of dorsolateral prefrontal cortex), with increased activity for positive and negative stimuli, respectively. Despite this commonality, emotion and reward tasks featured differential behavioural patterns in that negative valence effects (performance costs) were exclusive to emotional stimuli, while positive valence effects (performance benefits) were only observed for reward-related stimuli. Overall, our data suggest a common affective coding mechanism across different task domains and support the idea that monetary incentives entail signed basic valence signals, above and beyond the instruction to perform both gain and loss trials as accurately as possible to maximise the outcome.

  20. Unconsciously Triggered Emotional Conflict by Emotional Facial Expressions

    Science.gov (United States)

    Chen, Antao; Cui, Qian; Zhang, Qinglin

    2013-01-01

    The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed the emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be triggered by unconsciously presented emotional information, but participants may not adjust their subsequent performance trial-by-trial to reduce this conflict. PMID:23409084

  1. Repeating Marx

    DEFF Research Database (Denmark)

    Fuchs, Christian; Monticelli, Lara

    2018-01-01

    This introduction sets out the context of the special issue “Karl Marx @ 200: Debating Capitalism & Perspectives for the Future of Radical Theory”, which was published on the occasion of Marx’s bicentenary on 5 May 2018. First, we give a brief overview of contemporary capitalism’s development...... and its crises. Second, we argue that it is important to repeat Marx today. Third, we reflect on lessons learned from 200 years of struggles for alternatives to capitalism. Fourth, we give an overview of the contributions in this special issue. Taken together, the contributions in this special issue show...... that Marx’s theory and politics remain key inspirations for understanding exploitation and domination in 21st-century society and for struggles that aim to overcome these phenomena and establish a just and fair society. We need to repeat Marx today....

  2. That "poker face" just might lose you the game! The impact of expressive suppression and mimicry on sensitivity to facial expressions of emotion.

    Science.gov (United States)

    Schneider, Kristin G; Hempel, Roelie J; Lynch, Thomas R

    2013-10-01

    Successful interpersonal functioning often requires both the ability to mask inner feelings and the ability to accurately recognize others' expressions--but what if effortful control of emotional expressions impacts the ability to accurately read others? In this study, we examined the influence of self-controlled expressive suppression and mimicry on facial affect sensitivity--the speed with which one can accurately identify gradually intensifying facial expressions of emotion. Muscle activity of the brow (corrugator, related to anger), upper lip (levator, related to disgust), and cheek (zygomaticus, related to happiness) were recorded using facial electromyography while participants randomized to one of three conditions (Suppress, Mimic, and No-Instruction) viewed a series of six distinct emotional expressions (happiness, sadness, fear, anger, surprise, and disgust) as they morphed from neutral to full expression. As hypothesized, individuals instructed to suppress their own facial expressions showed impairment in facial affect sensitivity. Conversely, mimicry of emotion expressions appeared to facilitate facial affect sensitivity. Results suggest that it is difficult for a person to be able to simultaneously mask inner feelings and accurately "read" the facial expressions of others, at least when these expressions are at low intensity. The combined behavioral and physiological data suggest that the strategies an individual selects to control his or her own expression of emotion have important implications for interpersonal functioning.

  3. Deployment Repeatability

    Science.gov (United States)

    2016-08-31

    large cohort of trials to spot unusual cases. However, deployment repeatability is inherently a nonlinear phenomenon, which makes modeling difficult...and GEMS tip position were both tracked during ground testing by a laser target tracking system. Earlier SAILMAST testing in 2005 [8] used...recalls the strategy used by SRTM, where a constellation of lights was installed at the tip of the boom and a modified star tracker was used to track tip

  4. Facial and prosodic emotion recognition in social anxiety disorder.

    Science.gov (United States)

    Tseng, Huai-Hsuan; Huang, Yu-Lien; Chen, Jian-Ting; Liang, Kuei-Yu; Lin, Chao-Cheng; Chen, Sue-Huei

    2017-07-01

    Patients with social anxiety disorder (SAD) have a cognitive preference to negatively evaluate emotional information. In particular, the preferential biases in prosodic emotion recognition in SAD have been much less explored. The present study aims to investigate whether SAD patients retain negative evaluation biases across visual and auditory modalities when given sufficient response time to recognise emotions. Thirty-one SAD patients and 31 age- and gender-matched healthy participants completed a culturally suitable non-verbal emotion recognition task and received clinical assessments for social anxiety and depressive symptoms. A repeated measures analysis of variance was conducted to examine group differences in emotion recognition. Compared to healthy participants, SAD patients were significantly less accurate at recognising facial and prosodic emotions, and spent more time on emotion recognition. The differences were mainly driven by the lower accuracy and longer reaction times for recognising fearful emotions in SAD patients. Within the SAD patients, lower accuracy of sad face recognition was associated with higher severity of depressive and social anxiety symptoms, particularly with avoidance symptoms. These findings may represent a cross-modality pattern of avoidance in the later stage of identifying negative emotions in SAD. This pattern may be linked to clinical symptom severity.

  5. Dystonia: Emotional and Mental Health

    Science.gov (United States)

    Although dystonia is a movement disorder that impacts ... emotion as well as muscle movement. For years, mental health professionals have recognized that coping with a chronic ...

  6. Free-Labeling Facial Expressions and Emotional Situations in Children Aged 3-7 Years: Developmental Trajectory and a Face Inferiority Effect

    Science.gov (United States)

    Wang, Zhenhong; Lü, Wei; Zhang, Hui; Surina, Alyssa

    2014-01-01

    Chinese children (N = 185, aged 3-7 years) were assessed on their abilities to freely label facial expressions and emotional situations. Results indicated that the overall accuracy of free-labeling facial expressions increased relatively quickly in children aged 3-5 years, but slowed down in children aged 5-7 years. In contrast, the overall…

  7. How Do Typically Developing Deaf Children and Deaf Children with Autism Spectrum Disorder Use the Face When Comprehending Emotional Facial Expressions in British Sign Language?

    Science.gov (United States)

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-01-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their…

  8. Emotions and trait emotional intelligence among ultra-endurance runners.

    Science.gov (United States)

    Lane, Andrew M; Wilson, Mathew

    2011-07-01

    The aim of this study was to investigate relationships between trait emotional intelligence and emotional state changes over the course of an ultra-endurance foot race covering a route of approximately 175 miles (282 km) and held in set stages over six days. A repeated measures field design that sought to maintain ecological validity was used. Trait emotional intelligence was defined as a relatively stable concept that should predict adaptive emotional states experienced over the duration of the race and therefore associate with pleasant emotions during a 6-stage endurance event. Thirty-four runners completed a self-report measure of trait emotional intelligence before the event started. Participants reported emotional states before and after each of the six races. Repeated measures ANOVA results showed significant variations in emotions over time and a main effect for trait emotional intelligence. Runners high in self-report trait emotional intelligence also reported higher pleasant and lower unpleasant emotions than runners low in trait emotional intelligence. Findings lend support to the notion that trait emotional intelligence associates with adaptive psychological states, suggesting that it may be a key individual difference that explains why some athletes respond to repeated bouts of hard exercise better than others. Future research should test the effectiveness of interventions designed to enhance trait emotional intelligence and examine the attendant impact on emotional responses to intense exercise during multi-stage events. Copyright © 2011. Published by Elsevier Ltd.
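    The repeated measures ANOVA reported in studies like this one can, for the one-way within-subjects case, be computed directly: the condition-by-subject residual serves as the error term. Below is a minimal numpy sketch with synthetic "pleasant emotion" scores (8 hypothetical runners by 6 stages); it illustrates the statistic, not the authors' actual analysis:

```python
import numpy as np

def rm_anova_oneway(data):
    """One-way repeated measures ANOVA.

    data: (n_subjects, k_conditions) array of scores.
    Returns the F statistic and its degrees of freedom.
    """
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()    # between conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_total = ((data - grand) ** 2).sum()
    ss_error = ss_total - ss_cond - ss_subj                    # condition x subject residual
    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    F = (ss_cond / df_cond) / (ss_error / df_error)
    return F, df_cond, df_error

# Synthetic scores: a clear stage effect plus stable per-runner offsets and noise.
rng = np.random.default_rng(0)
stage_effect = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
subject_offset = rng.normal(size=(8, 1))
scores = stage_effect + subject_offset + rng.normal(scale=0.1, size=(8, 6))
F, df1, df2 = rm_anova_oneway(scores)  # large F: stage effect dwarfs the residual
```

    Because subject offsets are removed as their own sum of squares, the design gains power over a between-subjects ANOVA on the same data.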

  9. Face to Face

    OpenAIRE

    Robert Leckey

    2013-01-01

This paper uses Queer theory, specifically literature on Bowers v. Hardwick, to analyze debates over legislation proposed in Quebec regarding covered faces. Queer theory sheds light on legal responses to the veil. Parliamentary debates in Quebec reconstitute the polity, notably as secular and united. The paper highlights the contradictory and unstable character of four binaries: legislative text versus social practice, act versus status, majority versus minority, and knowable versus unknowable...

  10. Sensual expressions on faces

    NARCIS (Netherlands)

    Hendriks, A.W.C.J.; Engels, R.C.M.E.; Roek, M.A.E.

    2009-01-01

    We explored the possibility that an emotional facial expression exists specifically for signalling sexual interest. We selected photographs of twenty-eight fashion models (male and female) with large portfolios (range 81 - 1593), choosing only face photographs in which the model was looking into the

  11. The Caledonian face test: A new test of face discrimination.

    Science.gov (United States)

    Logan, Andrew J; Wilkinson, Frances; Wilson, Hugh R; Gordon, Gael E; Loffler, Gunter

    2016-02-01

This study aimed to develop a clinical test of face perception which is applicable to a wide range of patients and can capture normal variability. The Caledonian face test utilises synthetic faces which combine simplicity with sufficient realism to permit individual identification. Face discrimination thresholds (i.e. minimum difference between faces required for accurate discrimination) were determined in an "odd-one-out" task. The difference between faces was controlled by an adaptive QUEST procedure. A broad range of face discrimination sensitivity was determined from a group (N=52) of young adults (mean 5.75%; SD 1.18; range 3.33-8.84%). The test is fast (3-4 min), repeatable (test-retest r² = 0.795) and demonstrates a significant inversion effect. The potential to identify impairments of face discrimination was evaluated by testing LM, who reported a lifelong difficulty with face perception. While LM's impairment on two established face tests was close to the criterion for significance (Z-scores of -2.20 and -2.27), on the Caledonian face test her Z-score was -7.26, implying a more than threefold higher sensitivity. The new face test provides a quantifiable and repeatable assessment of face discrimination ability. The enhanced sensitivity suggests that the Caledonian face test may be capable of detecting more subtle impairments of face perception than available tests. Copyright © 2015 Elsevier Ltd. All rights reserved.
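The adaptive QUEST procedure mentioned above can be sketched as a Bayesian staircase: maintain a prior over candidate thresholds, update it after each response, and place the next trial at the current posterior estimate. The following is an illustrative toy version; the psychometric-function parameters, grid, and simulated observer are assumptions, not the authors' implementation:

```python
# Sketch of a QUEST-style adaptive procedure for estimating a face
# discrimination threshold (illustrative only). A grid prior over
# thresholds is updated by Bayes' rule after each simulated trial,
# and the next stimulus is placed at the posterior (geometric) mean.
import math, random

random.seed(1)

def p_correct(stim, thresh, guess=1/3, slope=3.5):
    """Assumed Weibull psychometric function for a 3AFC odd-one-out task."""
    return 1 - (1 - guess) * math.exp(-((stim / thresh) ** slope))

grid = [1.05 ** i for i in range(-60, 61)]   # candidate thresholds (% face difference)
prior = [math.exp(-0.5 * math.log(t) ** 2) for t in grid]   # broad lognormal prior

true_thresh = 5.75          # simulated observer (group mean from the abstract)
stim = 10.0                 # starting face difference (%)
for trial in range(80):
    correct = random.random() < p_correct(stim, true_thresh)
    # Bayesian update: multiply prior by the likelihood of the response
    prior = [w * (p_correct(stim, t) if correct else 1 - p_correct(stim, t))
             for w, t in zip(prior, grid)]
    z = sum(prior)
    prior = [w / z for w in prior]
    # Place the next trial at the posterior geometric mean threshold
    stim = math.exp(sum(w * math.log(t) for w, t in zip(prior, grid)))

print(f"estimated threshold of about {stim:.2f}%")
```

After enough trials the estimate concentrates near the simulated observer's true threshold, which is what makes adaptive procedures fast (3-4 min in the test above) compared with fixed stimulus sets.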

  12. What Is Going On? The Process of Generating Questions about Emotion and Social Cognition in Bipolar Disorder and Schizophrenia with Cartoon Situations and Faces

    Directory of Open Access Journals (Sweden)

    Bryan D. Fantie

    2018-04-01

Full Text Available Regarding the notion of putative “best” practices in social neuroscience and science in general, we contend that following established procedures has advantages, but prescriptive uniformity in methodology can obscure flaws, bias thinking, stifle creativity, and restrict exploration. Generating hypotheses is at least as important as testing hypotheses. To illustrate this process, we describe the following exploratory study. Psychiatric patients have difficulties with social functioning that adversely affect their quality of life. To investigate these impediments, we compared the performances of patients with schizophrenia and those with bipolar disorder to healthy controls on a task that involved matching photographs of facial expressions to a faceless protagonist in each of a series of drawn cartoon emotion-related situations. These scenarios involved either a single character (Nonsocial) or multiple characters (Social). The Social scenarios were also Congruent (with everyone in the cartoon displaying the same emotion) or Noncongruent (with everyone displaying a different emotion than the protagonist should). In this preliminary study, both patient groups produced lower scores than controls (p < 0.001) but did not perform differently from each other. All groups performed best on the social-congruent items and worst on the social-noncongruent items (p < 0.001). Performance varied inversely with illness duration, but not symptom severity. Complete emotional, social, cognitive, or perceptual inability is unlikely because these patient groups could still do this task. Nevertheless, the differences we saw could be functionally meaningful and clinically significant and deserve further exploration. Therefore, we stress the need to continue developing novel, alternative ways to explore social cognition in patients with psychiatric disorders and to clarify which elements of the multidimensional process contribute to difficulties in daily functioning.

  13. Impacto emocional en estudiantes de pedagogía ante eventos de maltrato en la práctica profesional / Emotional impact on pedagogy students faced with abuse during professional practice

    Directory of Open Access Journals (Sweden)

    Denisse Alejandra Jaramillo Sandoval

    2015-12-01

Full Text Available The objective of this study was to understand the emotional experience of pedagogy students faced with abusive events caused by teachers of the host institution during their professional practice. Using a qualitative approach and the critical incidents technique, we interviewed 12 students who experienced abuse during their practice and gathered 15 critical incidents. From this sample, five abusive behaviors were categorized; anguish, anger, and insecurity were the main emotions the students expressed. For the students, the experience had an intense emotional impact and was associated with high levels of stress. We discuss, on the one hand, how these events led the students to question their professional identity significantly and, on the other, the importance of strengthening socio-emotional skills in teacher training, so that conflicts are managed properly and psychological well-being is preserved.

  14. Emotion modelling towards affective pathogenesis.

    Science.gov (United States)

    Bas, James Le

    2009-12-01

    Objective: There is a need in psychiatry for models that integrate pathological states with normal systems. The interaction of arousal and emotion is the focus of an exploration of affective pathogenesis. Method: Given that the explicit causes of affective disorder remain nascent, methods of linking emotion and disorder are evaluated. Results: A network model of emotional families is presented, in which emotions exist as quantal gradients. Morbid emotional states are seen as the activation of distal emotion sites. The phenomenology of affective disorders is described with reference to this model. Recourse is made to non-linear dynamic theory. Conclusions: Metaphoric emotion models have face validity and may prove a useful heuristic.

  15. Exploring the unconscious using faces.

    Science.gov (United States)

    Axelrod, Vadim; Bar, Moshe; Rees, Geraint

    2015-01-01

    Understanding the mechanisms of unconscious processing is one of the most substantial endeavors of cognitive science. While there are many different empirical ways to address this question, the use of faces in such research has proven exceptionally fruitful. We review here what has been learned about unconscious processing through the use of faces and face-selective neural correlates. A large number of cognitive systems can be explored with faces, including emotions, social cueing and evaluation, attention, multisensory integration, and various aspects of face processing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Face to Face

    Directory of Open Access Journals (Sweden)

    Robert Leckey

    2013-12-01

Full Text Available This paper uses Queer theory, specifically literature on Bowers v. Hardwick, to analyze debates over legislation proposed in Quebec regarding covered faces. Queer theory sheds light on legal responses to the veil. Parliamentary debates in Quebec reconstitute the polity, notably as secular and united. The paper highlights the contradictory and unstable character of four binaries: legislative text versus social practice, act versus status, majority versus minority, and knowable versus unknowable. As with contradictory propositions about homosexuality, contradiction does not undermine discourse but makes it stronger and more agile.

  17. How Children Use Emotional Prosody: Crossmodal Emotional Integration?

    Science.gov (United States)

    Gil, Sandrine; Hattouti, Jamila; Laval, Virginie

    2016-01-01

    A crossmodal effect has been observed in the processing of facial and vocal emotion in adults and infants. For the first time, we assessed whether this effect is present in childhood by administering a crossmodal task similar to those used in seminal studies featuring emotional faces (i.e., a continuum of emotional expressions running from…

  18. About-face on face recognition ability and holistic processing.

    Science.gov (United States)

    Richler, Jennifer J; Floyd, R Jackie; Gauthier, Isabel

    2015-01-01

    Previous work found a small but significant relationship between holistic processing measured with the composite task and face recognition ability measured by the Cambridge Face Memory Test (CFMT; Duchaine & Nakayama, 2006). Surprisingly, recent work using a different measure of holistic processing (Vanderbilt Holistic Face Processing Test [VHPT-F]; Richler, Floyd, & Gauthier, 2014) and a larger sample found no evidence for such a relationship. In Experiment 1 we replicate this unexpected result, finding no relationship between holistic processing (VHPT-F) and face recognition ability (CFMT). A key difference between the VHPT-F and other holistic processing measures is that unique face parts are used on each trial in the VHPT-F, unlike in other tasks where a small set of face parts repeat across the experiment. In Experiment 2, we test the hypothesis that correlations between the CFMT and holistic processing tasks are driven by stimulus repetition that allows for learning during the composite task. Consistent with our predictions, CFMT performance was correlated with holistic processing in the composite task when a small set of face parts repeated over trials, but not when face parts did not repeat. A meta-analysis confirms that relationships between the CFMT and holistic processing depend on stimulus repetition. These results raise important questions about what is being measured by the CFMT, and challenge current assumptions about why faces are processed holistically.

  19. Modeling the Experience of Emotion

    OpenAIRE

    Broekens, Joost

    2009-01-01

    Affective computing has proven to be a viable field of research comprised of a large number of multidisciplinary researchers resulting in work that is widely published. The majority of this work consists of computational models of emotion recognition, computational modeling of causal factors of emotion and emotion expression through rendered and robotic faces. A smaller part is concerned with modeling the effects of emotion, formal modeling of cognitive appraisal theory and models of emergent...

  20. Happy faces are preferred regardless of familiarity--sad faces are preferred only when familiar.

    Science.gov (United States)

    Liao, Hsin-I; Shimojo, Shinsuke; Yeh, Su-Ling

    2013-06-01

    Familiarity leads to preference (e.g., the mere exposure effect), yet it remains unknown whether it is objective familiarity, that is, repetitive exposure, or subjective familiarity that contributes to preference. In addition, it is unexplored whether and how different emotions influence familiarity-related preference. The authors investigated whether happy or sad faces are preferred or perceived as more familiar and whether this subjective familiarity judgment correlates with preference for different emotional faces. An emotional face--happy or sad--was paired with a neutral face, and participants rated the relative preference and familiarity of each of the paired faces. For preference judgment, happy faces were preferred and sad faces were less preferred, compared with neutral faces. For familiarity judgment, happy faces did not show any bias, but sad faces were perceived as less familiar than neutral faces. Item-by-item correlational analyses show preference for sad faces--but not happy faces--positively correlate with familiarity. These results suggest a direct link between positive emotion and preference, and argue at least partly against a common cause for familiarity and preference. Instead, facial expression of different emotional valence modulates the link between familiarity and preference.

  1. [Emotional intelligence and oscillatory responses on the emotional facial expressions].

    Science.gov (United States)

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy or neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier (between 100 and 500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon presentation of angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative emotions.

  2. Patterns of feelings in face to face negotiation: a Sino-Dutch pilot study

    NARCIS (Netherlands)

    Ulijn, J.M.; Rutkowski, A.F.; Kumar, Rajesh; Zhu, Y.

    2005-01-01

We conducted a pilot study to compare the emotions experienced by Dutch and Chinese students during a face-to-face negotiation role play. Emotions play an important role in negotiations because they influence the behaviour and judgments of negotiators. The Data Printer case developed by Greenhalgh

  3. Touch communicates distinct emotions.

    Science.gov (United States)

    Hertenstein, Matthew J; Keltner, Dacher; App, Betsy; Bulleit, Brittany A; Jaskolka, Ariane R

    2006-08-01

    The study of emotional signaling has focused almost exclusively on the face and voice. In 2 studies, the authors investigated whether people can identify emotions from the experience of being touched by a stranger on the arm (without seeing the touch). In the 3rd study, they investigated whether observers can identify emotions from watching someone being touched on the arm. Two kinds of evidence suggest that humans can communicate numerous emotions with touch. First, participants in the United States (Study 1) and Spain (Study 2) could decode anger, fear, disgust, love, gratitude, and sympathy via touch at much-better-than-chance levels. Second, fine-grained coding documented specific touch behaviors associated with different emotions. In Study 3, the authors provide evidence that participants can accurately decode distinct emotions by merely watching others communicate via touch. The findings are discussed in terms of their contributions to affective science and the evolution of altruism and cooperation. (c) 2006 APA, all rights reserved

  4. Emotional response to musical repetition.

    Science.gov (United States)

    Livingstone, Steven R; Palmer, Caroline; Schubert, Emery

    2012-06-01

    Two experiments examined the effects of repetition on listeners' emotional response to music. Listeners heard recordings of orchestral music that contained a large section repeated twice. The music had a symmetric phrase structure (same-length phrases) in Experiment 1 and an asymmetric phrase structure (different-length phrases) in Experiment 2, hypothesized to alter the predictability of sensitivity to musical repetition. Continuous measures of arousal and valence were compared across music that contained identical repetition, variation (related), or contrasting (unrelated) structure. Listeners' emotional arousal ratings differed most for contrasting music, moderately for variations, and least for repeating musical segments. A computational model for the detection of repeated musical segments was applied to the listeners' emotional responses. The model detected the locations of phrase boundaries from the emotional responses better than from performed tempo or physical intensity in both experiments. These findings indicate the importance of repetition in listeners' emotional response to music and in the perceptual segmentation of musical structure.
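A simple way to detect a repeated segment in a continuous emotional-response series, in the spirit of (though not identical to) the computational model described above, is to find the lag at which the series best correlates with itself. A hypothetical sketch with synthetic data:

```python
# Illustrative sketch (not the authors' model) of detecting a repeated
# segment in a continuous emotion-rating time series: the lag with the
# highest autocorrelation reveals the period of the repetition.
import random

random.seed(0)

def autocorr(x, lag):
    """Pearson correlation between the series and itself shifted by `lag`."""
    n = len(x) - lag
    a, b = x[:n], x[lag:]
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sum((u - ma) ** 2 for u in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sa * sb) if sa and sb else 0.0

# Hypothetical arousal ratings: a 20-sample phrase heard three times
phrase = [random.random() for _ in range(20)]
series = phrase * 3
best = max(range(5, 31), key=lambda lag: autocorr(series, lag))
print(best)   # prints 20, the phrase length
```

Peaks in this self-similarity profile mark candidate phrase boundaries, which is the kind of segmentation the listeners' emotional responses supported in the experiments above.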

  5. About Face

    Medline Plus

Full Text Available PTSD: We've been there. After ...

  6. About Face

    Medline Plus

Full Text Available PTSD: We've been there. After a traumatic ...

  7. Improved emotional conflict control triggered by the processing priority of negative emotion.

    Science.gov (United States)

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-04-18

The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas related to emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas coupled negatively with the dorsolateral prefrontal cortex (DLPFC). In contrast, a significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC coupled negatively mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operates differently for negative and positive faces: it is implemented more efficiently in the negative face condition, whereas it is more devoted to inhibiting internal interference in the positive face condition. This study thus provides a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict.

  8. Brain mechanisms of emotions.

    Science.gov (United States)

    Simonov, P V

    1997-01-01

At the 23rd International Congress of Physiological Sciences (Tokyo, 1965), the results of experiments led us to the conclusion that emotions are determined by the actual need and the estimated probability (possibility) of its satisfaction. Low probability of need satisfaction leads to negative emotions, which the subject actively minimizes. Increased probability of satisfaction, as compared to the earlier forecast, generates positive emotions, which the subject tries to maximize, that is, to enhance, to prolong, to repeat. We named our concept the Need-Informational Theory of Emotions. According to this theory, motivation, emotion, and estimation of probability have different neuromorphological substrates. Activation of the frontal parts of the neocortex through the hypothalamic motivatiogenic structures orients behavior to signals with a high probability of reinforcement. At the same time, the hippocampus is necessary for reactions to signals of low-probability events, which are typical for the emotionally excited brain. By comparing motivational excitation with available stimuli or their engrams, the amygdala selects a dominant motivation, destined to be satisfied in the first instance. In the cases of classical conditioning and escape reactions, reinforcement was related to involvement of the negative emotion's hypothalamic neurons, while in the course of avoidance reactions the positive emotion's neurons were involved. The role of the left and right frontal neocortex in the appearance of positive or negative emotions depends on their informational (cognitive) functions.

  9. Quantified Faces

    DEFF Research Database (Denmark)

    Sørensen, Mette-Marie Zacher

    2016-01-01

…artist Marnix de Nijs' Physiognomic Scrutinizer is an interactive installation whereby the viewer's face is scanned and identified with historical figures. The American artist Zach Blas' project Fag Face Mask consists of three-dimensional portraits that blend biometric facial data from 30 gay men's faces… and critically examine bias in surveillance technologies, as well as scientific investigations, regarding the stereotyping mode of the human gaze. The American artist Heather Dewey-Hagborg creates three-dimensional portraits of persons she has “identified” from their garbage. Her project from 2013 entitled…

  10. Reading faces and Facing words

    DEFF Research Database (Denmark)

    Robotham, Julia Emma; Lindegaard, Martin Weis; Delfi, Tzvetelina Shentova

It has long been argued that perceptual processing of faces and words is largely independent, highly specialised and strongly lateralised. Studies of patients with either pure alexia or prosopagnosia have strongly contributed to this view. The aim of our study was to investigate how visual perception of faces and words is affected by unilateral posterior stroke. Two patients with lesions in their dominant hemisphere and two with lesions in their non-dominant hemisphere were tested on sensitive tests of face and word perception during the stable phase of recovery. Despite all patients having unilateral lesions, we found no patient with a selective deficit in either reading or face processing. Rather, the patients showing a deficit in processing either words or faces were also impaired with the other category. One patient performed within the normal range on all tasks. In addition, all patients…

  11. About Face

    Medline Plus

Full Text Available …

  12. About Face

    Medline Plus

Full Text Available ... not feeling better, you may have PTSD (posttraumatic stress disorder). This is AboutFace: in these videos, Veterans, family members, and clinicians share their experiences with PTSD ...

  13. About Face

    Medline Plus

Full Text Available …

  14. About Face

    Medline Plus

Full Text Available …

  15. Designing Emotionally Expressive Robots

    DEFF Research Database (Denmark)

    Tsiourti, Christiana; Weiss, Astrid; Wac, Katarzyna

    2017-01-01

Socially assistive agents, be it virtual avatars or robots, need to engage in social interactions with humans and express their internal emotional states, goals, and desires. In this work, we conducted a comparative study to investigate how humans perceive emotional cues expressed by humanoid robots through five communication modalities (face, head, body, voice, locomotion) and examined whether the degree of a robot's human-like embodiment affects this perception. In an online survey, we asked people to identify emotions communicated by Pepper – a highly human-like robot – and Hobbit – a robot… for robots…

  16. Understanding Schemas and Emotion in Early Childhood

    Science.gov (United States)

    Arnold, Cath

    2010-01-01

    This book makes explicit connections between young children's spontaneous repeated actions and their representations of their emotional worlds. Drawing on the literature on schemas, attachment theory and family contexts, the author takes schema theory into the territory of the emotions, making it relevant to the social and emotional development…

  17. Configuration perception and face memory, and face context effects in developmental prosopagnosia.

    Science.gov (United States)

    Huis in 't Veld, Elisabeth; Van den Stock, Jan; de Gelder, Beatrice

    2012-01-01

    This study addresses two central and controversial issues in developmental prosopagnosia (DP), configuration- versus feature-based face processing and the influence of affective information from either facial or bodily expressions on face recognition. A sample of 10 DPs and 10 controls were tested with a previously developed face and object recognition and memory battery (Facial Expressive Action Stimulus Test, FEAST), a task measuring the influence of emotional faces and bodies on face identity matching (Face-Body Compound task), and an emotionally expressive face memory task (Emotional Face Memory task, FaMe-E). We show that DPs were impaired in upright, but not inverted, face matching but they performed at the level of controls on part-to-whole matching. Second, DPs showed impaired memory for both neutral and emotional faces and scored within the normal range on the Face-Body Compound task. Third, configural perception but not feature-based processing was significantly associated with memory performance. Taken together the results indicate that DPs have a deficit in configural processing at the perception stage that may underlie the memory impairment.

  18. Emergent emotion

    OpenAIRE

    O'Connell, Elaine Finbarr

    2016-01-01

I argue that emotion is ontologically emergent and sui generis. I argue that emotion meets both of two individually necessary and jointly sufficient conditions for ontological emergence. These are (i) that emotion necessarily has constituent parts to which it cannot be reduced, and (ii) that emotion has a causal effect on its constituent parts (i.e. emotion demonstrates downward causation). I argue that emotion is partly cognitive, partly constituted by feelings and partly perceptual…

  19. About Face

    Medline Plus

Full Text Available ... at first. But if it's been months or years since the trauma and you're not feeling better, you may have PTSD (posttraumatic stress disorder). This is AboutFace: in these videos, Veterans, family members, ...

  20. About Face

    Medline Plus

Full Text Available ... PTSD: We've been there. After a traumatic event — ... you're not feeling better, you may have PTSD (posttraumatic stress disorder). ...

  1. Positive Emotional Engagement and Autism Risk

    Science.gov (United States)

    Lambert-Brown, Brittany L.; McDonald, Nicole M.; Mattson, Whitney I.; Martin, Katherine B.; Ibañez, Lisa V.; Stone, Wendy L.; Messinger, Daniel S.

    2015-01-01

    Positive emotional engagement develops in the context of face-to-face interactions during the first 6 months of life. Deficits in emotional engagement are characteristic of autism spectrum disorder (ASD) and may characterize the younger siblings of children with ASD (high-risk siblings). High-risk siblings are likely to exhibit a broad range of…

  2. Emotional intelligence and emotions associated with optimal and dysfunctional athletic performance.

    Science.gov (United States)

    Lane, Andrew M; Devonport, Tracey J; Soos, Istvan; Karsai, Istvan; Leibinger, Eva; Hamar, Pal

    2010-01-01

This study investigated relationships between self-report measures of emotional intelligence and memories of pre-competitive emotions before optimal and dysfunctional athletic performance. Participant-athletes (n = 284) completed a self-report measure of emotional intelligence and two measures of pre-competitive emotions: a) emotions experienced before an optimal performance, and b) emotions experienced before a dysfunctional performance. Consistent with theoretical predictions, repeated-measures MANOVA results demonstrated pleasant emotions associated with optimal performance and unpleasant emotions associated with dysfunctional performance. Emotional intelligence correlated with pleasant emotions in both performances, with individuals reporting low scores on the self-report emotional intelligence scale appearing to experience intense unpleasant emotions before dysfunctional performance. We suggest that future research should investigate relationships between emotional intelligence and emotion-regulation strategies used by athletes. Key points: athletes reporting high self-report emotional intelligence scores tend to experience pleasant emotions; optimal performance is associated with pleasant emotions and dysfunctional performance with unpleasant emotions; emotional intelligence might help athletes recognize which emotional states help performance.

  3. The time course of face processing: startle eyeblink response modulation by face gender and expression.

    Science.gov (United States)

    Duval, Elizabeth R; Lovelace, Christopher T; Aarant, Justin; Filion, Diane L

    2013-12-01

    The purpose of this study was to investigate the effects of both facial expression and face gender on startle eyeblink response patterns at varying lead intervals (300, 800, and 3500ms) indicative of attentional and emotional processes. We aimed to determine whether responses to affective faces map onto the Defense Cascade Model (Lang et al., 1997) to better understand the stages of processing during affective face viewing. At 300ms, there was an interaction between face expression and face gender with female happy and neutral faces and male angry faces producing inhibited startle. At 3500ms, there was a trend for facilitated startle during angry compared to neutral faces. These findings suggest that affective expressions are perceived differently in male and female faces, especially at short lead intervals. Future studies investigating face processing should take both face gender and expression into account. © 2013.

  4. Human wagering behavior depends on opponents' faces.

    Directory of Open Access Journals (Sweden)

    Erik J Schlicht

    Full Text Available Research in competitive games has exclusively focused on how opponent models are developed through previous outcomes and how people's decisions relate to normative predictions. Little is known about how rapid impressions of opponents operate and influence behavior in competitive economic situations, although such subjective impressions have been shown to influence cooperative decision-making. This study investigates whether an opponent's face influences players' wagering decisions in a zero-sum game with hidden information. Participants made risky choices in a simplified poker task while being presented with opponents whose faces differentially correlated with subjective impressions of trust. Surprisingly, we find that threatening face information has little influence on wagering behavior, but faces relaying positive emotional characteristics impact people's decisions. Thus, people took significantly longer and made more mistakes against emotionally positive opponents. Differences in reaction times and percent correct were greatest around the optimal decision boundary, indicating that face information is predominantly used when making decisions during medium-value gambles. Mistakes against emotionally positive opponents resulted from increased folding rates, suggesting that participants may have believed that these opponents were betting with hands of greater value than other opponents. According to these results, the best "poker face" for bluffing may not be a neutral face, but rather a face that contains emotional correlates of trustworthiness. Moreover, it suggests that rapid impressions of an opponent play an important role in competitive games, especially when people have little or no experience with an opponent.

  5. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    Science.gov (United States)

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels-which correspond to discrete emotion concepts-affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia-who have difficulty labeling their own emotions-struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. (c) 2015 APA, all rights reserved.
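    The sensitivity (d') and response-bias (c) indices used in such signal detection analyses are the standard equal-variance measures. A minimal sketch follows; this is not the authors' code, and the trial counts below are invented purely for illustration:

```python
from statistics import NormalDist

def dprime_and_c(hits, misses, false_alarms, correct_rejections):
    """Equal-variance signal-detection indices.

    d' = z(hit rate) - z(false-alarm rate)          (sensitivity)
    c  = -(z(hit rate) + z(false-alarm rate)) / 2   (bias; negative c
         means a liberal criterion, i.e. more "yes" answers)
    A log-linear correction keeps z() finite when a rate is 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    c = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, c

# Invented counts: a neutral-word condition vs. an emotion-word condition
d1, c1 = dprime_and_c(hits=40, misses=10, false_alarms=10, correct_rejections=40)
d2, c2 = dprime_and_c(hits=45, misses=5, false_alarms=20, correct_rejections=30)
```

    A more liberal criterion after an emotion-word prime shows up as a more negative c even when d' changes comparatively little.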

  6. Emotional Demands, Emotional Labour and Occupational Outcomes in School Principals: Modelling the Relationships

    Science.gov (United States)

    Maxwell, Aimee; Riley, Philip

    2017-01-01

    Most research into emotional labour is focussed on front-line service staff and health professionals, in short-term interactions. Little exists exploring the emotional labour involved in repeated on-going interactions by educational leaders with key stakeholders. This study explored the relationships between emotional demands, three emotional…

  7. (How) do medical students regulate their emotions?

    Science.gov (United States)

    Doulougeri, Karolina; Panagopoulou, Efharis; Montgomery, Anthony

    2016-12-12

    Medical training can be a challenging and emotionally intense period for medical students. However, the emotions experienced by medical students in the face of challenging situations and the emotion regulation strategies they use remain relatively unexplored. The aim of the present study was to explore the emotions elicited by memorable incidents reported by medical students and the associated emotion regulation strategies. Peer interviewing was used to collect medical students' memorable incidents. Medical students at both the preclinical and clinical stages of medical school were eligible to participate. In total, 104 medical students provided memorable incidents; only the 54 narratives that included references to emotions and emotion regulation (from 47 clinical and 7 preclinical students) were further analyzed. Forty-seven of the 54 incidents described a negative event associated with negative emotions. The most frequently mentioned emotion was shock and surprise, followed by feelings of embarrassment, sadness, anger and tension or anxiety. The most frequent reaction was inaction, often associated with emotion regulation strategies such as distraction, focusing on a task, suppression of emotions and reappraisal. When students witnessed mistreatment or disrespect exhibited towards patients, the regulation strategy used involved focusing on and comforting the patient. The present study sheds light on the strategies medical students use to deal with intense negative emotions. The vast majority reported inaction in the face of a challenging situation and the use of more subtle strategies to deal with the emotional impact of the incident.

  8. Generational Differences of Emotional Expression

    Institute of Scientific and Technical Information of China (English)

    李学勇

    2014-01-01

    As a kind of subjective psychological activity, emotion can only be known and perceived through some form of expression. Because the expressing subjects differ, differences in emotional expression appear not only among individuals but also between generations: the old conceal their emotions, the young express theirs boldly, and the middle-aged are rational and restrained in their expression. Facing and understanding such differences is the premise and foundation for building a harmonious relationship between the generations.

  9. A Fuzzy Approach for Facial Emotion Recognition

    Science.gov (United States)

    Gîlcă, Gheorghe; Bîzdoacă, Nicu-George

    2015-09-01

    This article deals with an emotion recognition system based on fuzzy sets. Human faces are detected in images with the Viola-Jones algorithm, and the Camshift algorithm is used to track them in video sequences. The detected faces are passed to the fuzzy decision system, which operates on fuzzified measurements of the face: eyebrow, eyelid and mouth. The system can easily determine the emotional state of a person.
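    As a rough illustration of how such a fuzzy decision system can work, the sketch below classifies an emotion from two normalized facial measurements using triangular membership functions and min-based fuzzy AND rules. Both the feature set and the rule base here are invented for illustration; the paper's system uses its own eyebrow, eyelid and mouth measurements and rules:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_emotion(mouth_open, eyebrow_raise):
    """Toy Mamdani-style classifier on two measurements normalized to [0, 1].

    Rule strengths use min() as fuzzy AND; defuzzification is
    winner-take-all over the rule outputs.
    """
    mouth_wide = tri(mouth_open, 0.4, 1.0, 1.6)
    mouth_closed = tri(mouth_open, -0.6, 0.0, 0.6)
    brow_high = tri(eyebrow_raise, 0.4, 1.0, 1.6)
    brow_low = tri(eyebrow_raise, -0.6, 0.0, 0.6)
    rules = {
        "surprise": min(mouth_wide, brow_high),
        "happiness": min(mouth_wide, brow_low),
        "anger": min(mouth_closed, brow_low),
        "neutral": min(mouth_closed, brow_high),  # placeholder rule
    }
    return max(rules, key=rules.get)
```

    A real system would add more features (eyelid opening, mouth corners), more output classes, and a centroid defuzzification step, but the rule-firing structure is the same.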

  10. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    Science.gov (United States)

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typical development individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  11. [Negative symptoms, emotion and cognition in schizophrenia].

    Science.gov (United States)

    Fakra, E; Belzeaux, R; Azorin, J-M; Adida, M

    2015-12-01

    For a long time, the treatment of schizophrenia has been essentially focussed on managing positive symptoms. Yet, even if these symptoms are the most noticeable, negative symptoms are more enduring, resistant to pharmacological treatment and associated with a worse prognosis. In the last two decades, attention has shifted towards cognitive deficit, as this deficit is most robustly associated with functional outcome. But it appears that the modest improvement in cognition obtained in schizophrenia through pharmacological treatment or, more purposely, by cognitive enhancement therapy, has only led to limited amelioration of functional outcome. Authors have claimed that pure cognitive processes, such as those evaluated and trained in many of these programs, may be too distant from real-life conditions, as the latter are largely based on social interactions. Consequently, the field of social cognition, at the interface of cognition and emotion, has emerged. In the first part of this article we examine the links, in schizophrenia, between negative symptoms, cognition and emotions from a therapeutic standpoint. Nonetheless, the investigation of emotion in schizophrenia may also hold promise for understanding the physiopathology of this disorder. In the second part, we propose to illustrate this research by relying on the heuristic value of an elementary marker of social cognition, facial affect recognition. Facial affect recognition has repeatedly been reported to be impaired in schizophrenia, and some authors have argued that this deficit could constitute an endophenotype of the illness. We here examine how facial affect processing has been used to explore broader emotion dysfunction in schizophrenia, through behavioural and imaging studies. In particular, fMRI paradigms using facial affect have shown particular patterns of amygdala engagement in schizophrenia, suggesting an intact potential to engage the limbic system, which may however not be advantageous. Finally, we

  12. Understanding Mixed Emotions: Paradigms and Measures

    Science.gov (United States)

    Kreibig, Sylvia D.; Gross, James J.

    2017-01-01

    In this review, we examine the paradigms and measures available for experimentally studying mixed emotions in the laboratory. For eliciting mixed emotions, we describe a mixed emotions film library that allows for the repeated elicitation of a specific homogeneous mixed emotional state and appropriately matched pure positive, pure negative, and neutral emotional states. For assessing mixed emotions, we consider subjective and objective measures that fall into univariate, bivariate, and multivariate measurement categories. As paradigms and measures for objectively studying mixed emotions are still in their early stages, we conclude by outlining future directions that focus on the reliability, temporal dynamics, and response coherence of mixed emotions paradigms and measures. This research will build a strong foundation for future studies and significantly advance our understanding of mixed emotions. PMID:28804752

  13. Social appraisal influences recognition of emotions.

    Science.gov (United States)

    Mumenthaler, Christian; Sander, David

    2012-06-01

    The notion of social appraisal emphasizes the importance of a social dimension in appraisal theories of emotion by proposing that the way an individual appraises an event is influenced by the way other individuals appraise and feel about the same event. This study directly tested this proposal by asking participants to recognize dynamic facial expressions of emotion (fear, happiness, or anger in Experiment 1; fear, happiness, anger, or neutral in Experiment 2) in a target face presented at the center of a screen while a contextual face, which appeared simultaneously in the periphery of the screen, expressed an emotion (fear, happiness, anger) or not (neutral) and either looked at the target face or not. We manipulated gaze direction to be able to distinguish between a mere contextual effect (gaze away from both the target face and the participant) and a specific social appraisal effect (gaze toward the target face). Results of both experiments provided evidence for a social appraisal effect in emotion recognition, which differed from the mere effect of contextual information: Whereas facial expressions were identical in both conditions, the direction of the gaze of the contextual face influenced emotion recognition. Social appraisal facilitated the recognition of anger, happiness, and fear when the contextual face expressed the same emotion. This facilitation was stronger than the mere contextual effect. Social appraisal also allowed better recognition of fear when the contextual face expressed anger and better recognition of anger when the contextual face expressed fear. 2012 APA, all rights reserved

  14. Emotional Intelligence (EI): A Therapy for Higher Education Students

    Science.gov (United States)

    Machera, Robert P.; Machera, Precious C.

    2017-01-01

    This study investigates the need to design and develop emotional intelligence curriculum for students in higher education. Emotional intelligence curriculum may be used as a therapy that provides skills to manage high emotions faced by generation "Y", on a day-to-day basis. Generation "Y" is emotionally challenged with: drug…

  15. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Science.gov (United States)

    Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P

    2015-01-01

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  16. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Directory of Open Access Journals (Sweden)

    Alexander J Kirkham

    Full Text Available This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  17. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball

    Directory of Open Access Journals (Sweden)

    Arik eCheshin

    2016-02-01

    Full Text Available Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers' facial displays influence how pitches are assessed and responded to. Using footage from MLB World Series finals, we isolated incidents where the pitcher's face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers' facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports.

  18. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball.

    Science.gov (United States)

    Cheshin, Arik; Heerdink, Marc W; Kossakowski, Jolanda J; Van Kleef, Gerben A

    2016-01-01

    Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers' facial displays influence how pitches are assessed and responded to. Using footage from the Major League Baseball World Series finals, we isolated incidents where the pitcher's face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers' facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports.

  19. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  20. Selective Attention to Emotional Stimuli: What IQ and Openness Do, and Emotional Intelligence Does Not

    Science.gov (United States)

    Fiori, Marina; Antonakis, John

    2012-01-01

    We examined how general intelligence, personality, and emotional intelligence--measured as an ability using the MSCEIT--predicted performance on a selective-attention task requiring participants to ignore distracting emotion information. We used a visual prime in which participants saw a pair of faces depicting emotions; their task was to focus on…

  1. Faced with a dilemma

    DEFF Research Database (Denmark)

    Christensen, Anne Vinggaard; Christiansen, Anne Hjøllund; Petersson, Birgit

    2013-01-01

    METHODS: A qualitative study consisting of ten individual interviews with Danish midwives, all of whom had taken part in late TOP. RESULTS: Current practice of late TOP resembles the practice of normal deliveries and is influenced by a growing personalisation of the aborted foetus. The midwives strongly supported women's legal right to choose TOP, and considerations about the foetus' right to live were suppressed. Midwives experienced a dilemma when faced with aborted foetuses that looked like newborns and when aborted foetuses showed signs of life after a termination. Furthermore, they were critical of how physicians counsel women/couples after prenatal diagnosis. CONCLUSIONS: The midwives' practice in relation to late TOP was characterised by an acknowledgement of the growing ethical status of the foetus and the emotional reactions of the women/couples going through late TOP. Other professions as well as structural...

  2. Cultural similarities and differences in perceiving and recognizing facial expressions of basic emotions.

    Science.gov (United States)

    Yan, Xiaoqian; Andrews, Timothy J; Young, Andrew W

    2016-03-01

    The ability to recognize facial expressions of basic emotions is often considered a universal human ability. However, recent studies have suggested that this commonality has been overestimated and that people from different cultures use different facial signals to represent expressions (Jack, Blais, Scheepers, Schyns, & Caldara, 2009; Jack, Caldara, & Schyns, 2012). We investigated this possibility by examining similarities and differences in the perception and categorization of facial expressions between Chinese and white British participants using whole-face and partial-face images. Our results showed no cultural difference in the patterns of perceptual similarity of expressions from whole-face images. When categorizing the same expressions, however, both British and Chinese participants were slightly more accurate with whole-face images of their own ethnic group. To further investigate potential strategy differences, we repeated the perceptual similarity and categorization tasks with presentation of only the upper or lower half of each face. Again, the perceptual similarity of facial expressions was similar between Chinese and British participants for both the upper and lower face regions. However, participants were slightly better at categorizing facial expressions of their own ethnic group for the lower face regions, indicating that the way in which culture shapes the categorization of facial expressions is largely driven by differences in information decoding from this part of the face. (c) 2016 APA, all rights reserved).

  3. Revisiting the TALE repeat.

    Science.gov (United States)

    Deng, Dong; Yan, Chuangye; Wu, Jianping; Pan, Xiaojing; Yan, Nieng

    2014-04-01

    Transcription activator-like (TAL) effectors specifically bind to double-stranded (ds) DNA through a central domain of tandem repeats. Each TAL effector (TALE) repeat comprises 33-35 amino acids and recognizes one specific DNA base through a highly variable residue at a fixed position in the repeat. Structural studies have revealed the molecular basis of DNA recognition by TALE repeats. Examination of the overall structure reveals that the basic building block of the TALE protein, namely a helical hairpin, is one helix shifted from the previously defined TALE motif. Here we wish to suggest a structure-based re-demarcation of the TALE repeat, which starts with the residues that bind to the DNA backbone phosphate and concludes with the base-recognition hyper-variable residue. This new numbering system is consistent with the α-solenoid superfamily to which TALE belongs, and reflects the structural integrity of TAL effectors. In addition, it confers an integral number of TALE repeats that matches the number of bound DNA bases. We then present fifteen crystal structures of engineered dHax3 variants in complex with target DNA molecules, which elucidate the structural basis for the recognition of bases adenine (A) and guanine (G) by reported or uncharacterized TALE codes. Finally, we analyzed the sequence-structure correlation of the amino acid residues within a TALE repeat. The structural analyses reported here may advance the mechanistic understanding of TALE proteins and facilitate the design of TALEN with improved affinity and specificity.
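    The canonical TALE code pairs each repeat's hyper-variable residues (the repeat-variable diresidue, RVD) with one DNA base: NI recognizes A, HD recognizes C, NG recognizes T, and NN prefers G while also tolerating A. A minimal sketch of target prediction from an RVD string follows; it is illustrative only, not the authors' tool:

```python
# Canonical TALE code: repeat-variable diresidue (RVD) -> preferred DNA base.
RVD_CODE = {"NI": "A", "HD": "C", "NG": "T", "NN": "G"}  # NN also tolerates A

def predict_target(rvds):
    """Map a list of RVDs to the DNA sequence a TALE array would bind.

    Under the structure-based re-demarcation described above, each repeat
    contributes exactly one base, so len(rvds) equals the target length.
    """
    return "".join(RVD_CODE[r] for r in rvds)

# A hypothetical 6-repeat array reading 5'-ACGTGA-3'
target = predict_target(["NI", "HD", "NN", "NG", "NN", "NI"])
```

    This one-repeat-per-base correspondence is exactly what the re-demarcated numbering makes explicit.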

  4. Reconfigurable multiport EPON repeater

    Science.gov (United States)

    Oishi, Masayuki; Inohara, Ryo; Agata, Akira; Horiuchi, Yukio

    2009-11-01

    An extended-reach EPON repeater is one of the solutions for effectively expanding FTTH service areas. In this paper, we propose a reconfigurable multi-port EPON repeater for the effective accommodation of multiple ODNs with a single OLT line card. The proposed repeater, which has multiple ports on both the OLT and ODN sides and consists of TRs, BTRs with the CDR function, and a reconfigurable electrical matrix switch, can accommodate multiple ODNs on a single OLT line card by controlling the connections of the matrix switch. Although conventional EPON repeaters require a full set of OLT line cards from the initial installation stage, the proposed repeater can dramatically reduce the number of required line cards, especially when the number of subscribers is less than half of the maximum registerable users per OLT. Numerical calculation results show that the extended-reach EPON system with the proposed repeater can save 17.5% of the initial installation cost compared with a conventional repeater, and remains less expensive than conventional systems up to the maximum number of subscribers, especially when the percentage of ODNs in lightly populated areas is higher.

  5. Laterality Biases to Chimeric Faces in Asperger Syndrome: What Is Right about Face-Processing?

    Science.gov (United States)

    Ashwin, Chris; Wheelwright, Sally; Baron-Cohen, Simon

    2005-01-01

    People show a left visual field (LVF) bias for faces, i.e., involving the right hemisphere of the brain. Lesion and neuroimaging studies confirm the importance of the right-hemisphere and suggest separable neural pathways for processing facial identity vs. emotions. We investigated the hemispheric processing of faces in adults with and without…

  6. Perception of face and body expressions using electromyography, pupillometry and gaze measures

    NARCIS (Netherlands)

    Kret, M.E.; Stekelenburg, J.J.; Roelofs, K.; de Gelder, B.

    2013-01-01

    Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important as well. In these experiments we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body

  7. Quantum repeated games revisited

    International Nuclear Information System (INIS)

    Frąckiewicz, Piotr

    2012-01-01

    We present a scheme for playing quantum repeated 2 × 2 games based on Marinatto and Weber’s approach to quantum games. As a potential application, we study the twice repeated Prisoner’s Dilemma game. We show that results not available in the classical game can be obtained when the game is played in the quantum way. Before we present our idea, we comment on the previous scheme of playing quantum repeated games proposed by Iqbal and Toor. We point out the drawbacks that make their results unacceptable. (paper)

  8. [Vitiligo and emotions].

    Science.gov (United States)

    Nogueira, Lucas S C; Zancanaro, Pedro C Q; Azambuja, Roberto D

    2009-01-01

    On average, vitiligo affects one percent of the world population. More than 75% of patients have a negative self-image on account of the disease. The emotional impact of the dermatosis is frequently neglected by the caretaker, which has a negative influence on therapy and prognosis. OBJECTIVE: To assess the effect of vitiligo on patients' emotions and discuss the mind-body interaction and its impact on the disease. METHODS: In their first medical visit, one hundred patients with various forms of vitiligo answered a question about which emotions were elicited by the presence of the spots. RESULTS: Eighty-eight percent of the patients with spots in exposed areas complained of unpleasant emotions versus twenty-seven percent of those with spots in unexposed areas. The most frequently reported emotions were fear, specifically of expansion of the spots (71%), shame (57%), insecurity (55%), sadness (55%) and inhibition (53%). CONCLUSION: Chronic illnesses generate in human beings a negative experience propitiated by the expectation of suffering. Besides appropriate scientific guidance, vitiligo patients need emotional comfort. Treatment outcomes and patients' compliance with treatment, and even their resilience to face occasional therapeutic failures, rely on a good physician-patient relationship. At a time when doctors make use of reputable therapeutic resources, it is indispensable that dermatologists become able to evaluate the patient in an integrative fashion.

  9. The Interplay between Emotion and Cognition in Autism Spectrum Disorder: Implications for Developmental Theory

    Science.gov (United States)

    Gaigg, Sebastian B.

    2012-01-01

    Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that is clinically defined by abnormalities in reciprocal social and communicative behaviors and an inflexible adherence to routinised patterns of thought and behavior. Laboratory studies repeatedly demonstrate that autistic individuals experience difficulties in recognizing and understanding the emotional expressions of others and naturalistic observations show that they use such expressions infrequently and inappropriately to regulate social exchanges. Dominant theories attribute this facet of the ASD phenotype to abnormalities in a social brain network that mediates social-motivational and social-cognitive processes such as face processing, mental state understanding, and empathy. Such theories imply that only emotion related processes relevant to social cognition are compromised in ASD but accumulating evidence suggests that the disorder may be characterized by more widespread anomalies in the domain of emotions. In this review I summarize the relevant literature and argue that the social-emotional characteristics of ASD may be better understood in terms of a disruption in the domain-general interplay between emotion and cognition. More specifically I will suggest that ASD is the developmental consequence of early emerging anomalies in how emotional responses to the environment modulate a wide range of cognitive processes including those that are relevant to navigating the social world. PMID:23316143

  10. The interplay between emotion and cognition in Autism Spectrum Disorder: Implications for developmental theory

    Directory of Open Access Journals (Sweden)

    Sebastian B Gaigg

    2012-12-01

    Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that is clinically defined by abnormalities in reciprocal social and communicative behaviours and an inflexible adherence to routinised patterns of thought and behaviour. Laboratory studies repeatedly demonstrate that autistic individuals experience difficulties in recognising and understanding the emotional expressions of others and naturalistic observations show that they use such expressions infrequently and inappropriately to regulate social exchanges. Dominant theories attribute this facet of the ASD phenotype to abnormalities in a social brain network that mediates social-motivational and social-cognitive processes such as face processing, mental state understanding and empathy. Such theories imply that only emotion related processes relevant to social cognition are compromised in ASD but accumulating evidence suggests that the disorder may be characterised by more widespread anomalies in the domain of emotions. In this review I summarise the relevant literature and argue that the social-emotional characteristics of ASD may be better understood in terms of a disruption in the domain-general interplay between emotion and cognition. More specifically I will suggest that ASD is the developmental consequence of early-emerging anomalies in how emotional responses to the environment modulate a wide range of cognitive processes including those that are relevant to navigating the social world.

  12. Neural correlates of emotional intelligence in a visual emotional oddball task: an ERP study.

    Science.gov (United States)

    Raz, Sivan; Dan, Orrie; Zysberg, Leehu

    2014-11-01

    The present study was aimed at identifying potential behavioral and neural correlates of Emotional Intelligence (EI) by using scalp-recorded Event-Related Potentials (ERPs). EI levels were defined according to both a self-report questionnaire and a performance-based ability test. We identified ERP correlates of emotional processing by using a visual-emotional oddball paradigm, in which subjects were confronted with one frequent standard stimulus (a neutral face) and two deviant stimuli (a happy and an angry face). The effects of these faces were then compared across groups with low and high EI levels. The ERP results indicate that participants with high EI exhibited significantly greater mean amplitudes of the P1, P2, N2, and P3 ERP components in response to emotional and neutral faces, at frontal, posterior-parietal and occipital scalp locations. P1, P2 and N2 are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. The later P3 component has been thought to reflect more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation. These results may suggest greater recruitment of resources to process all emotional and non-emotional faces at early and late processing stages among individuals with higher EI. The present study underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Famous face recognition, face matching, and extraversion.

    Science.gov (United States)

    Lander, Karen; Poyarekar, Siddhi

    2015-01-01

    It has been previously established that extraverts who are skilled at interpersonal interaction perform significantly better than introverts on a face-specific recognition memory task. In our experiment we further investigate the relationship between extraversion and face recognition, focusing on famous face recognition and face matching. Results indicate that more extraverted individuals perform significantly better on an upright famous face recognition task and show significantly larger face inversion effects. However, our results did not find an effect of extraversion on face matching or inverted famous face recognition.

  14. IntraFace.

    Science.gov (United States)

    De la Torre, Fernando; Chu, Wen-Sheng; Xiong, Xuehan; Vicente, Francisco; Ding, Xiaoyu; Cohn, Jeffrey

    2015-05-01

    Within the last 20 years, there has been an increasing interest in the computer vision community in automated facial image analysis algorithms. This has been driven by applications in animation, market research, autonomous driving, surveillance, and facial editing, among others. To date, there exist several commercial packages for specific facial image analysis tasks such as facial expression recognition, facial attribute analysis or face tracking. However, free and easy-to-use software that incorporates all these functionalities is unavailable. This paper presents IntraFace (IF), a publicly available software package for automated facial feature tracking, head pose estimation, facial attribute recognition, and facial expression analysis from video. In addition, IF includes a newly developed technique for unsupervised synchrony detection to discover correlated facial behavior between two or more persons, a relatively unexplored problem in facial image analysis. In tests, IF achieved state-of-the-art results for emotion expression and action unit detection in three databases, FERA, CK+ and RU-FACS; measured audience reaction to a talk given by one of the authors; and discovered synchrony for smiling in videos of parent-infant interaction. IF is free of charge for academic use at http://www.humansensing.cs.cmu.edu/intraface/.

  15. Repeat migration and disappointment.

    Science.gov (United States)

    Grant, E K; Vanderkamp, J

    1986-01-01

    This article investigates the determinants of repeat migration among the 44 regions of Canada, using information from a large micro-database which spans the period 1968 to 1971. The explanation of repeat migration probabilities is a difficult task, and this attempt is only partly successful. Many of the explanatory variables are not significant, and the overall explanatory power of the equations is not high. In the area of personal characteristics, the variables related to age, sex, and marital status are generally significant and with expected signs. The distance variable has a strongly positive effect on onward move probabilities. Variables related to prior migration experience have an important impact that differs between return and onward probabilities. In particular, the occurrence of prior moves has a striking effect on the probability of onward migration. The variable representing disappointment, or relative success of the initial move, plays a significant role in explaining repeat migration probabilities. The disappointment variable represents the ratio of actual versus expected wage income in the year after the initial move, and its effect on both repeat migration probabilities is always negative and almost always highly significant. The repeat probabilities diminish after a year's stay in the destination region, but disappointment in the most recent year still has a bearing on the delayed repeat probabilities. While the quantitative impact of the disappointment variable is not large, it is difficult to draw comparisons since similar estimates are not available elsewhere.

  16. The Face-to-Face Light Detection Paradigm: A New Methodology for Investigating Visuospatial Attention Across Different Face Regions in Live Face-to-Face Communication Settings.

    Science.gov (United States)

    Thompson, Laura A; Malloy, Daniel M; Cone, John M; Hendrickson, David L

    2010-01-01

    We introduce a novel paradigm for studying the cognitive processes used by listeners within interactive settings. This paradigm places the talker and the listener in the same physical space, creating opportunities for investigations of attention and comprehension processes taking place during interactive discourse situations. An experiment was conducted to compare results from previous research using videotaped stimuli to those obtained within the live face-to-face task paradigm. A headworn apparatus is used to briefly display LEDs on the talker's face in four locations as the talker communicates with the participant. In addition to the primary task of comprehending speeches, participants make a secondary task light detection response. In the present experiment, the talker gave non-emotionally-expressive speeches that were used in past research with videotaped stimuli. Signal detection analysis was employed to determine which areas of the face received the greatest focus of attention. Results replicate previous findings using videotaped methods.
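
    The signal detection analysis described above derives a sensitivity index (d') and a response criterion (c) from hit and false-alarm rates at each face location. A minimal sketch with hypothetical detection counts (the log-linear correction for extreme rates is an assumption, not necessarily the authors' choice):

```python
from statistics import NormalDist

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and criterion (c) from raw detection counts.

    A log-linear correction (+0.5 to each count) keeps hit or
    false-alarm rates of 0 or 1 from producing infinite z-scores.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical counts for lights presented at one face region:
d, c = dprime_and_criterion(hits=45, misses=5, false_alarms=10, correct_rejections=40)
# d is high (good detection at this location); c < 0 indicates a liberal bias.
```

    Comparing d' across the four LED locations then indicates which face regions received the greatest focus of attention.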

  17. Informational need of emotional stress

    Science.gov (United States)

    Simonov, P. V.; Frolov, M. V.

    According to the informational theory of emotions [1], emotions in humans depend on the power of some need (motivation) and the estimation by the subject of the probability (possibility) of the need satisfaction (the goal achievement). Low probability of need satisfaction leads to negative emotions, actively minimized by the subject. Increased probability of satisfaction, as compared to earlier forecast, generates positive emotions, which the subject tries to maximize, i.e. to enhance, to prolong, to repeat. The informational theory of emotions encompasses their reflective function, the laws of their appearance, the regulatory significance of emotions, and their role in organization of behavior. The level of emotional stress influences the operator's performance. A decrease in the emotional tonus leads to drowsiness, lack of vigilance, missing of significant signals and to slower reactions. An extremely high stress level disorganizes the activity, complicates it with a trend toward incorrect actions and reactions to insignificant signals (false alarms). The neurophysiological mechanisms of the influence of emotions on perceptual activity and operator performance as well as the significance of individuality are discussed.

  18. [The brain mechanisms of emotions].

    Science.gov (United States)

    Simonov, P V

    1997-01-01

    At the 23rd International Congress of Physiological Sciences (Tokyo, 1965) the results of experiments brought us to the conclusion that emotions are determined by the actual need and the estimation of the probability (possibility) of its satisfaction. Low probability of need satisfaction leads to negative emotions, actively minimized by the subject. Increased probability of satisfaction, as compared to the earlier forecast, generates positive emotions which the subject tries to maximize, that is to enhance, to prolong, to repeat. We named our concept the Need-Informational Theory of Emotions. According to this theory, motivation, emotion and estimation of probability have different neuromorphological substrates. Activated by the motivatiogenic structures of the hypothalamus, the frontal parts of the neocortex orient behavior toward signals with a high probability of reinforcement. At the same time, the hippocampus is necessary for reactions to signals of low-probability events, which is typical of an emotionally excited brain. By comparing motivational excitation with available stimuli or their engrams, the amygdala selects the dominant motivation, destined to be satisfied in the first instance. In classical conditioning and escape reactions, reinforcement involved the hypothalamic neurons of negative emotion, whereas in avoidance reactions the neurons of positive emotion were involved. The role of the left and right frontal neocortex in the appearance of positive or negative emotions depends on their informational (cognitive) functions.

  19. DAT by perceived MC interaction on human prefrontal activity and connectivity during emotion processing.

    Science.gov (United States)

    Taurisano, Paolo; Blasi, Giuseppe; Romano, Raffaella; Sambataro, Fabio; Fazio, Leonardo; Gelao, Barbara; Ursini, Gianluca; Lo Bianco, Luciana; Di Giorgio, Annabella; Ferrante, Francesca; Papazacharias, Apostolos; Porcelli, Annamaria; Sinibaldi, Lorenzo; Popolizio, Teresa; Bertolino, Alessandro

    2013-12-01

    Maternal care (MC) and dopamine modulate brain activity during emotion processing in inferior frontal gyrus (IFG), striatum and amygdala. Reuptake of dopamine from the synapse is performed by the dopamine transporter (DAT), whose abundance is predicted by variation in its gene (DAT 3'VNTR; 10 > 9-repeat alleles). Here, we investigated the interaction between perceived MC and DAT 3'VNTR genotype on brain activity during processing of aversive facial emotional stimuli. Sixty-one healthy subjects were genotyped for DAT 3'VNTR and categorized in low and high MC individuals. They underwent functional magnetic resonance imaging while performing a task requiring gender discrimination of facial stimuli with angry, fearful or neutral expressions. An interaction between facial expression, DAT genotype and MC was found in left IFG, such that low MC and homozygosity for the 10-repeat allele are associated with greater activity during processing of fearful faces. This greater activity was also inversely correlated with a measure of emotion control as scored with the Big Five Questionnaire. Moreover, MC and DAT genotype described a double dissociation on functional connectivity between IFG and amygdala. These findings suggest that perceived early parental bonding may interact with DAT 3'VNTR genotype in modulating brain activity during emotionally relevant inputs.

  20. Encouraging Preadolescent Emotional Intelligence through Leadership Activity

    Science.gov (United States)

    Alvarado, John Henry

    2010-01-01

    The study sought to determine effects of leadership activity on emotional intelligence in preadolescents. Ninety-two Central California Valley sixth grade students in two schools and four classes were assessed on emotional intelligence. Treatment and comparison groups were identified. A Two-Way Repeated Measures ANOVA examined change over time…

  1. BRAIN, EMOTION, AND MORAL JUDGEMENT

    OpenAIRE

    Ting, Fransisca

    2016-01-01

    The dual process theory posits that people rely on their emotions (especially negative emotions) when they are faced with personal moral dilemmas, such as pushing a person off a footbridge in order to stop a trolley that would otherwise kill five people. In an fMRI investigation, the medial frontal gyrus, posterior cingulate gyrus, and bilateral angular gyrus are more activated in considering a personal moral dilemma, leading people to make a characteristically deontologica...

  2. Cultural affordances and emotional experience: socially engaging and disengaging emotions in Japan and the United States.

    Science.gov (United States)

    Kitayama, Shinobu; Mesquita, Batja; Karasawa, Mayumi

    2006-11-01

    The authors hypothesized that whereas Japanese culture encourages socially engaging emotions (e.g., friendly feelings and guilt), North American culture fosters socially disengaging emotions (e.g., pride and anger). In two cross-cultural studies, the authors measured engaging and disengaging emotions repeatedly over different social situations and found support for this hypothesis. As predicted, Japanese showed a pervasive tendency to reportedly experience engaging emotions more strongly than they experienced disengaging emotions, but Americans showed a reversed tendency. Moreover, as also predicted, Japanese subjective well-being (i.e., the experience of general positive feelings) was more closely associated with the experience of engaging positive emotions than with that of disengaging emotions. Americans tended to show the reversed pattern. The established cultural differences in the patterns of emotion suggest the consistent and systematic cultural shaping of emotion over time.

  3. Virtual & Real Face to Face Teaching

    Science.gov (United States)

    Teneqexhi, Romeo; Kuneshka, Loreta

    2016-01-01

    In traditional "face to face" lessons, during the time the teacher writes on a black or white board, the students are always behind the teacher. Sometimes, this happens even in the recorded lesson in videos. Most of the time during the lesson, the teacher shows to the students his back not his face. We do not think the term "face to…

  4. Food-Induced Emotional Resonance Improves Emotion Recognition.

    Science.gov (United States)

    Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia

    2016-01-01

    The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce-which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one.

  6. Misremembering emotion: Inductive category effects for complex emotional stimuli.

    Science.gov (United States)

    Corbin, Jonathan C; Crawford, L Elizabeth; Vavra, Dylan T

    2017-07-01

    Memories of objects are biased toward what is typical of the category to which they belong. Prior research on memory for emotional facial expressions has demonstrated a bias towards an emotional expression prototype (e.g., slightly happy faces are remembered as happier). We investigate an alternate source of bias in memory for emotional expressions - the central tendency bias. The central tendency bias skews reconstruction of a memory trace towards the center of the distribution for a particular attribute. This bias has been attributed to a Bayesian combination of an imprecise memory for a particular object with prior information about its category. Until now, studies examining the central tendency bias have focused on simple stimuli. We extend this work to socially relevant, complex, emotional facial expressions. We morphed facial expressions on a continuum from sad to happy. Different ranges of emotion were used in four experiments in which participants viewed individual expressions and, after a variable delay, reproduced each face by adjusting a morph to match it. Estimates were biased toward the center of the presented stimulus range, and the bias increased at longer memory delays, consistent with the Bayesian prediction that as trace memory loses precision, category knowledge is given more weight. The central tendency effect persisted within and across emotion categories (sad, neutral, and happy). This article expands the scope of work on inductive category effects to memory for complex, emotional stimuli.
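
    The Bayesian account sketched above treats the reconstructed face as a precision-weighted average of the noisy memory trace and the category prior. Under Gaussian assumptions (and with hypothetical numbers), longer delays reduce trace precision and so pull estimates further toward the category center:

```python
def reconstructed_estimate(trace, trace_sd, category_mean, category_sd):
    """Posterior mean when a Gaussian category prior is combined with a
    Gaussian memory trace (likelihood): a precision-weighted average."""
    w_trace = 1 / trace_sd ** 2   # precision of the memory trace
    w_prior = 1 / category_sd ** 2  # precision of the category prior
    return (w_trace * trace + w_prior * category_mean) / (w_trace + w_prior)

# A 70%-happy morph remembered against a mid-range (50%) category prior:
short_delay = reconstructed_estimate(70, trace_sd=5, category_mean=50, category_sd=15)
long_delay = reconstructed_estimate(70, trace_sd=15, category_mean=50, category_sd=15)
# short_delay stays near 70; long_delay is pulled further toward 50,
# mirroring the larger bias observed at longer memory delays.
```

    This is only an illustrative sketch of the precision-weighting idea; the numbers and the Gaussian form are assumptions, not the authors' fitted model.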

  7. Age Differences in the Complexity of Emotion Perception.

    Science.gov (United States)

    Kim, Seungyoun; Geren, Jennifer L; Knight, Bob G

    2015-01-01

    The current study examined age differences in the number of emotion components used in the judgment of emotion from facial expressions. Fifty-eight younger and 58 older adults were compared on the complexity of perception of emotion from standardized facial expressions that were either clear or ambiguous exemplars of emotion. Using an intra-individual factor analytic approach, results showed that older adults used more emotion components in perceiving emotion in faces than younger adults. Both age groups reported greater emotional complexity for the clear and prototypical emotional stimuli. Age differences in emotional complexity were more pronounced for the ambiguous expressions compared with the clear expressions. These findings demonstrate that older adults showed increased elaboration of emotion, particularly when emotion cues were subtle and provide support for greater emotion differentiation in older adulthood.

  8. Enhanced attention amplifies face adaptation.

    Science.gov (United States)

    Rhodes, Gillian; Jeffery, Linda; Evangelista, Emma; Ewing, Louise; Peters, Marianne; Taylor, Libby

    2011-08-15

    Perceptual adaptation not only produces striking perceptual aftereffects, but also enhances coding efficiency and discrimination by calibrating coding mechanisms to prevailing inputs. Attention to simple stimuli increases adaptation, potentially enhancing its functional benefits. Here we show that attention also increases adaptation to faces. In Experiment 1, face identity aftereffects increased when attention to adapting faces was increased using a change detection task. In Experiment 2, figural (distortion) face aftereffects increased when attention was increased using a snap game (detecting immediate repeats) during adaptation. Both were large effects. Contributions of low-level adaptation were reduced using free viewing (both experiments) and a size change between adapt and test faces (Experiment 2). We suggest that attention may enhance adaptation throughout the entire cortical visual pathway, with functional benefits well beyond the immediate advantages of selective processing of potentially important stimuli. These results highlight the potential to facilitate adaptive updating of face-coding mechanisms by strategic deployment of attentional resources. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Impaired face recognition is associated with social inhibition.

    Science.gov (United States)

    Avery, Suzanne N; VanDerKlok, Ross M; Heckers, Stephan; Blackford, Jennifer U

    2016-02-28

    Face recognition is fundamental to successful social interaction. Individuals with deficits in face recognition are likely to have social functioning impairments that may lead to heightened risk for social anxiety. A critical component of social interaction is how quickly a face is learned during initial exposure to a new individual. Here, we used a novel Repeated Faces task to assess how quickly memory for faces is established. Face recognition was measured over multiple exposures in 52 young adults ranging from low to high in social inhibition, a core dimension of social anxiety. High social inhibition was associated with a smaller slope of change in recognition memory over repeated face exposure, indicating participants with higher social inhibition showed smaller improvements in recognition memory after seeing faces multiple times. We propose that impaired face learning is an important mechanism underlying social inhibition and may contribute to, or maintain, social anxiety. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Towards the neurobiology of emotional body language.

    Science.gov (United States)

    de Gelder, Beatrice

    2006-03-01

    People's faces show fear in many different circumstances. However, when people are terrified, as well as showing emotion, they run for cover. When we see a bodily expression of emotion, we immediately know what specific action is associated with a particular emotion, leaving little need for interpretation of the signal, as is the case for facial expressions. Research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are automatically perceived and understood, and their role in emotional communication and decision-making.

  11. Evidence for unintentional emotional contagion beyond dyads.

    Directory of Open Access Journals (Sweden)

    Guillaume Dezecache

    Little is known about the spread of emotions beyond dyads. Yet, it is of importance for explaining the emergence of crowd behaviors. Here, we experimentally addressed whether emotional homogeneity within a crowd might result from a cascade of local emotional transmissions in which the perception of another's emotional expression produces, in the observer's face and body, sufficient information to allow for the transmission of the emotion to a third party. We reproduced a minimal element of a crowd situation and recorded the facial electromyographic activity and the skin conductance response of an individual C observing the face of an individual B, who was watching an individual A displaying full-body expressions of either joy or fear. Critically, individual B did not know that she was being watched. We show that the emotions of joy and fear displayed by A were spontaneously transmitted to C through B, even when the emotional information available in B's face could not be explicitly recognized. These findings demonstrate that one is tuned to react to others' emotional signals and to unintentionally produce subtle but sufficient emotional cues to induce emotional states in others. This phenomenon could be the mark of a spontaneous cooperative behavior whose function is to communicate survival-value information to conspecifics.

  12. The Pentapeptide Repeat Proteins

    OpenAIRE

    Vetting, Matthew W.; Hegde, Subray S.; Fajardo, J. Eduardo; Fiser, Andras; Roderick, Steven L.; Takiff, Howard E.; Blanchard, John S.

    2006-01-01

    The Pentapeptide Repeat Protein (PRP) family has over 500 members in the prokaryotic and eukaryotic kingdoms. These proteins are composed of, or contain domains composed of, tandemly repeated amino acid sequences with a consensus sequence of [S,T,A,V][D,N][L,F]-[S,T,R][G]. The biochemical function of the vast majority of PRP family members is unknown. The three-dimensional structure of the first member of the PRP family was determined for the fluoroquinolone resistance protein (MfpA) from Myc...

  13. Acute exercise attenuates negative affect following repeated sad mood inductions in persons who have recovered from depression.

    Science.gov (United States)

    Mata, Jutta; Hogan, Candice L; Joormann, Jutta; Waugh, Christian E; Gotlib, Ian H

    2013-02-01

    Identifying factors that may protect individuals from developing Major Depressive Disorder (MDD) in the face of stress is critical. In the current study we experimentally tested whether such a potentially protective factor, engaging in acute exercise, reduces the adverse effects of repeated sad mood inductions in individuals who have recovered from depression. We hypothesized that recovered depressed participants who engage in acute exercise report a smaller increase in negative affect (NA) and a smaller decrease in positive affect (PA) when exposed to a repeated sad mood induction (i.e., habituation), whereas participants who do not exercise show sensitization (i.e., increased NA and decreased PA in response to a repeated adverse stimulus). Forty-one women recovered from MDD and 40 healthy control women were randomly assigned to either exercise for 15 minutes or quiet rest. Afterward, participants were exposed to two sad mood inductions and reported their levels of affect throughout the study. Recovered depressed participants who had not exercised exhibited higher NA after the second sad mood induction, a finding consistent with sensitization. In contrast, both recovered depressed participants who had engaged in acute exercise and healthy control participants showed no increase in NA in response to the repeated sad mood induction. Participants who exercised reported higher PA after the exercise bout; however, our hypothesis concerning reported PA trajectories following the sad mood inductions was not supported. Results suggest that exercise can serve as a protective factor in the face of exposure to repeated emotional stressors, particularly concerning NA in individuals who have recovered from depression. 2013 APA, all rights reserved

  14. Emotional engineering

    CERN Document Server

    In an age of increasing complexity, diversification and change, customers expect services that cater to their needs and tastes. Emotional Engineering vol 2 describes how these expectations can be satisfied and managed throughout the product life cycle, if producers focus their attention more on emotion. Emotional engineering provides the means to integrate products to create a new social framework and develops services beyond product realization to create value across a full lifetime. Fourteen chapters cover a wide range of topics that can be applied to product, process and industry development, with special attention paid to the increasing importance of sensing in the age of extensive and frequent changes, including: • Multisensory stimulation and user experience • Physiological measurement • Tactile sensation • Emotional quality management • Mental model • Kansei engineering. Emotional Engineering vol 2 builds on Dr Fukuda’s previous book, Emotional Engineering, and provides read...

  15. Wordsworthian Emotion

    Institute of Scientific and Technical Information of China (English)

    张敏

    2010-01-01

    As a great poet of British Romanticism, Wordsworth is not the practitioner of an artistic craft designed to satisfy the "taste" of a literary connoisseur. He is, instead, "a man speaking to men" with his uniqueness in emotion. This paper attempts to demonstrate how Wordsworth conveys emotion with poetic language. Wordsworthian "emotion recollected in tranquility" is simple, pure and genuine, which is the true art of Wordsworth's poems.

  16. Repeated Causal Decision Making

    Science.gov (United States)

    Hagmayer, York; Meder, Bjorn

    2013-01-01

    Many of our decisions refer to actions that have a causal impact on the external environment. Such actions may not only allow for the mere learning of expected values or utilities but also for acquiring knowledge about the causal structure of our world. We used a repeated decision-making paradigm to examine what kind of knowledge people acquire in…

  17. simple sequence repeat (SSR)

    African Journals Online (AJOL)

    In the present study, 78 mapped simple sequence repeat (SSR) markers representing 11 linkage groups of adzuki bean were evaluated for transferability to mungbean and related Vigna spp. Forty-one markers amplified characteristic bands in at least one Vigna species. The transferability percentage across the genotypes ranged ...

  18. Agency and facial emotion judgment in context.

    Science.gov (United States)

    Ito, Kenichi; Masuda, Takahiko; Li, Liman Man Wai

    2013-06-01

    Past research showed that East Asians' belief in holism was expressed as their tendencies to include background facial emotions into the evaluation of target faces more than North Americans. However, this pattern can be interpreted as North Americans' tendency to downplay background facial emotions due to their conceptualization of facial emotion as volitional expression of internal states. Examining this alternative explanation, we investigated whether different types of contextual information produce varying degrees of effect on one's face evaluation across cultures. In three studies, European Canadians and East Asians rated the intensity of target facial emotions surrounded with either affectively salient landscape sceneries or background facial emotions. The results showed that, although affectively salient landscapes influenced the judgment of both cultural groups, only European Canadians downplayed the background facial emotions. The role of agency as differently conceptualized across cultures and multilayered systems of cultural meanings are discussed.

  19. The hierarchical brain network for face recognition.

    Science.gov (United States)

    Zhen, Zonglei; Fang, Huizhen; Liu, Jia

    2013-01-01

    Numerous functional magnetic resonance imaging (fMRI) studies have identified multiple cortical regions that are involved in face processing in the human brain. However, few studies have characterized the face-processing network as a functioning whole. In this study, we used fMRI to identify face-selective regions in the entire brain and then explore the hierarchical structure of the face-processing network by analyzing functional connectivity among these regions. We identified twenty-five regions mainly in the occipital, temporal and frontal cortex that showed a reliable response selective to faces (versus objects) across participants and across scan sessions. Furthermore, these regions were clustered into three relatively independent sub-networks in a face-recognition task on the basis of the strength of functional connectivity among them. The functionality of the sub-networks likely corresponds to the recognition of individual identity, retrieval of semantic knowledge and representation of emotional information. Interestingly, when the task was switched to object recognition from face recognition, the functional connectivity between the inferior occipital gyrus and the rest of the face-selective regions were significantly reduced, suggesting that this region may serve as an entry node in the face-processing network. In sum, our study provides empirical evidence for cognitive and neural models of face recognition and helps elucidate the neural mechanisms underlying face recognition at the network level.

  20. Individual differences in emotion lateralisation and the processing of emotional information arising from social interactions.

    Science.gov (United States)

    Bourne, Victoria J; Watling, Dawn

    2015-01-01

    Previous research examining the possible association between emotion lateralisation and social anxiety has found conflicting results. In this paper two studies are presented to assess two aspects related to different features of social anxiety: fear of negative evaluation (FNE) and emotion regulation. Lateralisation for the processing of facial emotion was measured using the chimeric faces test. Individuals with greater FNE were more strongly lateralised to the right hemisphere for the processing of anger, happiness and sadness; and, for the processing of fearful faces the relationship was found for females only. Emotion regulation strategies were reduced to two factors: positive strategies and negative strategies. For males, but not females, greater reported use of negative emotion strategies is associated with stronger right hemisphere lateralisation for processing negative emotions. The implications for further understanding the neuropsychological processing of emotion in individuals with social anxiety are discussed.

  1. Induction of depressed and elated mood by music influences the perception of facial emotional expressions in healthy subjects.

    Science.gov (United States)

    Bouhuys, A L; Bloem, G M; Groothuis, T G

    1995-04-04

    The judgements of healthy subjects rating the emotional expressions of a set of schematically drawn faces were validated (study 1) to examine the relationship between mood (depressed/elated) and the judgement of the emotional expressions of these faces (study 2). Study 1: 30 healthy subjects judged 12 faces with respect to the emotions they express (fear, happiness, anger, sadness, disgust, surprise, rejection and invitation). It was found that a particular face could reflect various emotions. All eight emotions were reflected in the set of faces and the emotions were consensually judged. Moreover, gender differences in judgement could be established. Study 2: In a cross-over design, 24 healthy subjects judged the faces after listening to depressing or elating music. The faces were subdivided into six 'ambiguous' faces (i.e., expressing similar amounts of positive and negative emotions) and six 'clear' faces (i.e., faces showing a preponderance of positive or negative emotions). In addition, these two types of faces were distinguished with respect to the intensity of the emotions they express. Eleven subjects who showed substantial differences in experienced depression after listening to the music were selected for further analysis. It was found that, when feeling more depressed, the subjects perceived more rejection/sadness in ambiguous faces (displaying less intensive emotions) and less invitation/happiness in clear faces. In addition, subjects saw more fear in clear faces that express less intensive emotions. Hence, results show a depression-related negative bias in the perception of facial displays.

  2. Crossmodal binding of fear in voice and face

    NARCIS (Netherlands)

    Dolan, R.J.; Morris, J.S.; de Gelder, B.

    2001-01-01

    In social environments, multiple sensory channels are simultaneously engaged in the service of communication. In this experiment, we were concerned with defining the neuronal mechanisms for a perceptual bias in processing simultaneously presented emotional voices and faces. Specifically, we were

  3. Higher resting heart rate variability predicts skill in expressing some emotions.

    Science.gov (United States)

    Tuck, Natalie L; Grant, Rosemary C I; Sollers, John J; Booth, Roger J; Consedine, Nathan S

    2016-12-01

    Vagally mediated heart rate variability (vmHRV) is a measure of cardiac vagal tone, and is widely viewed as a physiological index of the capacity to regulate emotions. However, studies have not directly tested whether vmHRV is associated with the ability to facially express emotions. In extending prior work, the current report tested links between resting vmHRV and the objectively assessed ability to facially express emotions, hypothesizing that higher vmHRV would predict greater expressive skill. Eighty healthy women completed self-reported measures, before attending a laboratory session in which vmHRV and the ability to express six emotions in the face were assessed. A repeated measures analysis of variance revealed a marginal main effect for vmHRV on skill overall; individuals with higher resting vmHRV were only better able to deliberately facially express anger and interest. Findings suggest that differences in resting vmHRV are associated with the objectively assessed ability to facially express some, but not all, emotions, with potential implications for health and well-being. © 2016 Society for Psychophysiological Research.

  4. Emotion work and well-being of secondary school educators / Christelle Alfrida Visser

    OpenAIRE

    Visser, Christelle Alfrida

    2006-01-01

    Emotions play a profound role in the workplace, especially in the human service profession. Service agents, for example educators, are expected to express socially desired emotions in a service interaction with learners. This direct face-to-face contact with learners requires a lot of emotions and in order to advance educational goals, teachers perform Emotion Work. Factors like the individual factor Emotional Intelligence and organisational factors like Job Autonomy, Superviso...

  5. Extended Emotions

    DEFF Research Database (Denmark)

    Krueger, Joel; Szanto, Thomas

    2016-01-01

    beyond the neurophysiological confines of organisms; some even argue that emotions can be socially extended and shared by multiple agents. Call this the extended emotions thesis (ExE). In this article, we consider different ways of understanding ExE in philosophy, psychology, and the cognitive sciences...

  6. European cinema: face to face with Hollywood

    NARCIS (Netherlands)

    Elsaesser, T.

    2005-01-01

    In the face of renewed competition from Hollywood since the early 1980s and the challenges posed to Europe's national cinemas by the fall of the Wall in 1989, independent filmmaking in Europe has begun to re-invent itself. European Cinema: Face to Face with Hollywood re-assesses the different

  7. A Face Inversion Effect without a Face

    Science.gov (United States)

    Brandman, Talia; Yovel, Galit

    2012-01-01

    Numerous studies have attributed the face inversion effect (FIE) to configural processing of internal facial features in upright but not inverted faces. Recent findings suggest that face mechanisms can be activated by faceless stimuli presented in the context of a body. Here we asked whether faceless stimuli with or without body context may induce…

  8. Emotional Intelligence in medical practice

    Directory of Open Access Journals (Sweden)

    Abu Hasan Sarkar

    2016-08-01

    Full Text Available Emotional Intelligence is the ability to perceive, express, understand and regulate one’s inner emotions and the emotions of others. It is considered to be a ‘must have’ competence in the workplace. Several scientific studies have proven that the application of emotional intelligence is effective in improving the teaching-learning process and that it leads to organizational growth; however, only limited work has been carried out to assess its effectiveness in the practice of medicine, especially in India. Various scales have been developed to measure emotional intelligence but they are not universally applicable because emotional intelligence depends upon culture and personal background among other factors. In recent years in India, conflicts between patients and doctors have had serious, sometimes fatal, consequences for the physician. Behavior, when faced with a potential conflict-like situation, depends to a great extent on the emotional intelligence of the physician. Emotional intelligence of medical students and medical professionals can be honed through exposure to the medical humanities which are known to promote patient-centered care. Building better physician-patient relationships might help in averting doctor-patient conflict.

  9. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Science.gov (United States)

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  11. Emotion and Interhemispheric Interactions in Binocular Rivalry

    Directory of Open Access Journals (Sweden)

    K L Ritchie

    2013-10-01

    Full Text Available Previous research has shown that fear-related stimuli presented in peripheral vision are preferentially processed over stimuli depicting other emotions. Furthermore, emotional content can influence dominance duration in binocular rivalry, with the period of dominance for an emotional image (e.g. a fearful face