WorldWideScience

Sample records for facial affect processing

  1. Exploring the nature of facial affect processing deficits in schizophrenia.

    NARCIS (Netherlands)

    Wout, M. van 't; Aleman, A.; Kessels, R.P.C.; Cahn, W.; Haan, E.H.F. de; Kahn, R.S.

    2007-01-01

    Schizophrenia has been associated with deficits in facial affect processing, especially negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as we

  2. Exploring the nature of facial affect processing deficits in schizophrenia

    NARCIS (Netherlands)

    Wout, Mascha van 't; Aleman, Andre; Kessels, Roy P. C.; Cahn, Wiepke; Haan, Edward H. F. de; Kahn, Rene S.

    2007-01-01

    Schizophrenia has been associated with deficits in facial affect processing, especially negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as we

  3. Relation between facial affect recognition and configural face processing in antipsychotic-free schizophrenia.

    Science.gov (United States)

    Fakra, Eric; Jouve, Elisabeth; Guillaume, Fabrice; Azorin, Jean-Michel; Blin, Olivier

    2015-03-01

    Deficit in facial affect recognition is a well-documented impairment in schizophrenia, closely connected to social outcome. This deficit could be related to psychopathology, but also to a broader dysfunction in processing facial information. In addition, patients with schizophrenia inadequately use configural information, a type of processing that relies on spatial relationships between facial features. To date, no study has specifically examined the link between symptoms and misuse of configural information in the deficit in facial affect recognition. Unmedicated schizophrenia patients (n = 30) and matched healthy controls (n = 30) performed a facial affect recognition task and a face inversion task, which tests the aptitude to rely on configural information. In patients, regressions were carried out between facial affect recognition, symptom dimensions and inversion effect. Patients, compared with controls, showed a deficit in facial affect recognition and a lower inversion effect. Negative symptoms and lower inversion effect could account for 41.2% of the variance in facial affect recognition. This study confirms the presence of a deficit in facial affect recognition, and also of dysfunctional use of configural information, in antipsychotic-free patients. Negative symptoms and poor processing of configural information explained a substantial part of the deficient recognition of facial affect. We speculate that this deficit may be caused by several factors, among which psychopathology and the failure to use configural information correctly stand independently. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  4. Facial Affect Processing and Depression Susceptibility: Cognitive Biases and Cognitive Neuroscience

    Science.gov (United States)

    Bistricky, Steven L.; Ingram, Rick E.; Atchley, Ruth Ann

    2011-01-01

    Facial affect processing is essential to social development and functioning and is particularly relevant to models of depression. Although cognitive and interpersonal theories have long described different pathways to depression, cognitive-interpersonal and evolutionary social risk models of depression focus on the interrelation of interpersonal…

  5. The ANK3 gene and facial affect processing: An ERP study.

    Science.gov (United States)

    Zhao, Wan; Zhang, Qiumei; Yu, Ping; Zhang, Zhifang; Chen, Xiongying; Gu, Huang; Zhai, Jinguo; Chen, Min; Du, Boqi; Deng, Xiaoxiang; Ji, Feng; Wang, Chuanyue; Xiang, Yu-Tao; Li, Dawei; Wu, Hongjie; Dong, Qi; Luo, Yuejia; Li, Jun; Chen, Chuansheng

    2016-09-01

    ANK3 is one of the most promising candidate genes for bipolar disorder (BD). A polymorphism (rs10994336) within the ANK3 gene has been associated with BD in at least three genome-wide association studies [McGuffin et al., 2003; Kieseppä, 2004; Edvardsen et al., 2008]. Because facial affect processing is disrupted in patients with BD, the current study aimed to explore whether the BD risk alleles are associated with the N170, an early event-related potential (ERP) component related to facial affect processing. We collected data from two independent samples of healthy individuals (Ns = 83 and 82, respectively) to test the association between rs10994336 and the N170, which is sensitive to facial affect processing. Repeated-measures analysis of covariance in both samples consistently revealed significant main effects of rs10994336 genotype (Sample I: F (1, 72) = 7.24, P = 0.009; Sample II: F (1, 69) = 11.81, P = 0.001), but no significant interaction of genotype × electrodes (Ps > 0.05) or genotype × emotional conditions (Ps > 0.05). These results suggest that rs10994336 is linked to an early ERP component reflecting facial structural encoding during facial affect processing. These results shed new light on the brain mechanisms associated with this risk SNP and related disorders such as BD. © 2016 Wiley Periodicals, Inc.
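
    For orientation, the N170 measure referred to above is typically quantified as the mean amplitude of the face-locked ERP in a short post-stimulus window. A minimal sketch of that step on synthetic data follows; the array shapes, sampling, and window are illustrative assumptions, not the authors' pipeline.

        import numpy as np

        # Hypothetical baseline-corrected single-trial EEG in microvolts:
        # shape (n_trials, n_channels, n_times).
        rng = np.random.default_rng(0)
        times = np.linspace(-0.2, 0.6, 410)                   # seconds
        epochs_uv = rng.normal(0.0, 5.0, (80, 2, times.size))

        # Mean amplitude in a typical N170 window (about 140-200 ms), per trial and channel;
        # averaging over trials gives the per-subject value that enters a group-level model.
        win = (times >= 0.14) & (times <= 0.20)
        n170_trials = epochs_uv[:, :, win].mean(axis=2)       # (n_trials, n_channels)
        print(n170_trials.mean(axis=0))                       # one value per channel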

  6. Decline of executive processes affects identification of emotional facial expression in aging.

    Science.gov (United States)

    García-Rodríguez, Beatriz; Fusari, Anna; Fernández-Guinea, Sara; Frank, Ana; Molina, José Antonio; Ellgring, Heiner

    2011-02-01

    The current study examined the hypothesis that older people have a selective deficit in the identification of emotional facial expressions (EFEs) when the task conditions require the mechanism of the central executive. We used a Dual Task (DT) paradigm to assess the role of visuo-spatial interference in working memory when processing emotional faces under two conditions: DT at encoding and DT at retrieval. Previous studies have revealed a loss of the ability to identify specific emotional facial expressions (EFEs) in old age. This has been consistently associated with a decline in the ability to coordinate the performance of two tasks concurrently. Working memory is usually tested using DT paradigms. With regard to aging, there is evidence that DT costs during encoding are substantial. In contrast, the introduction of a secondary task after the primary task (i.e., at retrieval) had less detrimental effects on primary-task performance in both younger and older adults. Our results demonstrate that aging is associated with higher DT costs when EFEs are identified concurrently with a visuo-spatial task. In contrast, there was no significant age-related decline when the two tasks were presented sequentially. This suggests a deficit of the central executive rather than visuo-spatial memory deficits. The current data provide further support for the hypothesis that emotional processing is "top-down" controlled, and suggest that the deficits in emotional processing of older people depend, above all, on specific cognitive impairment.

  7. How does gaze direction affect facial processing in social anxiety? An ERP study.

    Science.gov (United States)

    Li, Dan; Yu, Fengqiong; Ye, Rong; Chen, Xingui; Xie, Xinhui; Zhu, Chunyan; Wang, Kai

    2017-02-09

    Previous behavioral studies have demonstrated an effect of eye gaze direction on the processing of emotional expressions in adults with social anxiety. However, specific brain responses to the interaction between gaze direction and facial expressions in social anxiety remain unclear. The present study aimed to explore the time course of such interaction using event-related potentials (ERPs) in participants with social anxiety. High socially anxious individuals and low socially anxious individuals were asked to identify the gender of angry or neutral faces with direct or averted gaze while their behavioral performance and electrophysiological data were monitored. We found that identification of angry faces with direct but not averted gaze elicited larger N2 amplitude in high socially anxious individuals compared to low socially anxious individuals, while identification of neutral faces did not produce any gaze modulation effect. Moreover, the N2 was correlated with increased anxiety severity upon exposure to angry faces with direct gaze. Therefore, our results suggest that gaze direction modulates the processing of threatening faces in social anxiety. The N2 component elicited by angry faces with direct gaze could be a state-dependent biomarker of social anxiety and may be an important reference biomarker for social anxiety diagnosis and intervention.

  8. Exploiting facial expressions for affective video summarisation

    NARCIS (Netherlands)

    Joho, H.; Jose, J.M.; Valenti, R.; Sebe, N.; Marchand-Maillet, S.; Kompatsiaris, I.

    2009-01-01

    This paper presents an approach to affective video summarisation based on the facial expressions (FX) of viewers. A facial expression recognition system was deployed to capture a viewer's face and his/her expressions. The user's facial expressions were analysed to infer personalised affective scenes

  9. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure

    Science.gov (United States)

    Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong

    2017-01-01

    Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure continues to be controversial. The aim of the present study was to investigate attentional bias in facial recognition after short-term exposure to violent video games and to characterize the neural correlates of this effect. In order to accomplish this, participants were exposed to either neutral or violent video games for 25 min and then event-related potentials (ERPs) were recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face from a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face from a crowd of emotional faces. We found a reliable N2pc component during the facilitation task; however, no differences were observed between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, N2pc responses were not observed when participants viewed happy faces following violent video game exposure; however, a weak N2pc response was observed after neutral video game exposure. These results provided only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces. PMID:28249033
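
    For readers unfamiliar with the N2pc mentioned above, it is conventionally computed as contralateral minus ipsilateral activity at lateral posterior electrodes relative to the side of the target. A minimal sketch on made-up averaged waveforms follows; the electrode pair and measurement window are common choices, not taken from this study.

        import numpy as np

        # Hypothetical averaged ERPs (microvolts) at PO7 (left) and PO8 (right)
        # for targets presented in the left vs. right visual hemifield.
        rng = np.random.default_rng(1)
        times = np.linspace(-0.1, 0.5, 301)
        po7_left_target, po8_left_target = rng.normal(size=(2, times.size))
        po7_right_target, po8_right_target = rng.normal(size=(2, times.size))

        # N2pc = contralateral minus ipsilateral activity, averaged over both hemifields.
        contra = (po8_left_target + po7_right_target) / 2.0   # electrode opposite the target side
        ipsi = (po7_left_target + po8_right_target) / 2.0     # electrode on the target side
        n2pc = contra - ipsi

        # Typical quantification: mean difference in a 200-300 ms window.
        win = (times >= 0.20) & (times <= 0.30)
        print(n2pc[win].mean())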

  10. Subliminal Cues Bias Perception of Facial Affect in Patients with Generalized Social Phobia: Evidence for Enhanced Unconscious Threat Processing

    Directory of Open Access Journals (Sweden)

    Aiste eJusyte

    2014-08-01

    Full Text Available Socially anxious individuals have been shown to exhibit altered processing of facial affect, especially expressions signalling threat. Enhanced unaware processing has been suggested as an important mechanism that may give rise to anxious conscious cognition and behavior. This study investigated whether individuals with social anxiety disorder (SAD) are perceptually more vulnerable to the biasing effects of subliminal threat cues compared to healthy controls. In a perceptual judgment task, 23 SAD and 23 matched control participants were asked to rate the affective valence of parametrically manipulated affective expressions ranging from neutral to angry. Each trial was preceded by subliminal presentation of an angry/neutral cue. The SAD group tended to rate target faces as angry when the preceding subliminal stimulus was angry vs. neutral, while healthy participants were not biased by the subliminal stimulus presentation. The perceptual bias in SAD was also associated with longer reaction time latencies in the subliminal angry cue condition. The results provide further support for enhanced unconscious threat processing in SAD individuals. The implications for etiology, maintenance and treatment of SAD are discussed.

  11. Facial Expression at Retrieval Affects Recognition of Facial Identity

    Directory of Open Access Journals (Sweden)

    Wenfeng eChen

    2015-06-01

    Full Text Available It is well known that memory can be modulated by emotional stimuli at the time of encoding and consolidation. For example, happy faces create better identity recognition than faces with certain other expressions. However, the influence of facial expression at the time of retrieval remains unknown in the literature. To separate the potential influence of expression at retrieval from its effects at earlier stages, we had participants learn neutral faces but manipulated facial expression at the time of memory retrieval in a standard old/new recognition task. The results showed a clear effect of facial expression, where happy test faces were identified more successfully than angry test faces. This effect is unlikely due to greater image similarity between the neutral learning face and the happy test face, because image analysis showed that the happy test faces are in fact less similar to the neutral learning faces relative to the angry test faces. In the second experiment, we investigated whether this emotional effect is influenced by the expression at the time of learning. We employed angry or happy faces as learning stimuli, and angry, happy, and neutral faces as test stimuli. The results showed that the emotional effect at retrieval is robust across different encoding conditions with happy or angry expressions. These findings indicate that emotional expressions affect the retrieval process in identity recognition, and identity recognition does not rely on emotional association between learning and test faces.
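
    The image-similarity check mentioned above (whether the happy test faces resemble the neutral learning faces more than the angry test faces do) can be approximated with a pixel-wise correlation. A minimal sketch with synthetic grayscale images follows; the metric and the images are illustrative assumptions, not the authors' analysis.

        import numpy as np

        # Hypothetical aligned grayscale face images of one identity (H x W arrays).
        rng = np.random.default_rng(5)
        neutral_face = rng.random((128, 128))
        happy_face = neutral_face + rng.normal(0.0, 0.15, neutral_face.shape)
        angry_face = neutral_face + rng.normal(0.0, 0.10, neutral_face.shape)

        def image_similarity(a, b):
            # Pearson correlation between flattened pixel intensities.
            return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

        print(image_similarity(neutral_face, happy_face),
              image_similarity(neutral_face, angry_face))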

  12. Processing faces and facial expressions.

    Science.gov (United States)

    Posamentier, Mette T; Abdi, Hervé

    2003-09-01

    This paper reviews processing of facial identity and expressions. The issue of the independence of these two systems has been addressed from different approaches over the past 25 years. More recently, neuroimaging techniques have provided researchers with new tools to investigate how facial information is processed in the brain. First, findings from "traditional" approaches to identity and expression processing are summarized. The review then covers findings from neuroimaging studies on face perception, recognition, and encoding. Processing of the basic facial expressions is detailed in light of behavioral and neuroimaging data. Whereas data from experimental and neuropsychological studies support the existence of two systems, the neuroimaging literature yields a less clear picture because it shows considerable overlap in activation patterns in response to the different face-processing tasks. Further, activation patterns in response to facial expressions support the notion that distinct neural substrates are involved in processing different facial expressions.

  13. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    Science.gov (United States)

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated whether deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ- and age-matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity, or by facial identity…

  14. Stability of Facial Affective Expressions in Schizophrenia

    Directory of Open Access Journals (Sweden)

    H. Fatouros-Bergman

    2012-01-01

    Full Text Available Thirty-two videorecorded interviews were conducted by two interviewers with eight patients diagnosed with schizophrenia. Each patient was interviewed four times: three weekly interviews by the first interviewer and one additional interview by the second interviewer. Sixty-four selected sequences in which the patients were speaking about psychotic experiences were scored for facial affective behaviour with the Emotional Facial Action Coding System (EMFACS). In accordance with previous research, the results show that patients diagnosed with schizophrenia express negative facial affectivity. Facial affective behaviour does not seem to depend on temporality, since within-subjects ANOVA revealed no substantial changes in the amount of affect displayed across the weekly interview occasions. Whereas previous studies found contempt to be the most frequent affect in patients, in the present material disgust was as common, but depended on the interviewer. The results suggest that facial affectivity in these patients is dominated primarily by the negative emotions of disgust and, to a lesser extent, contempt, and that this appears to be a fairly stable feature.

  15. How facial attractiveness affects sustained attention.

    Science.gov (United States)

    Li, Jie; Oksama, Lauri; Hyönä, Jukka

    2016-10-01

    The present study investigated whether and how facial attractiveness affects sustained attention. We adopted a multiple-identity tracking paradigm, using attractive and unattractive faces as stimuli. Participants were required to track moving target faces amid distractor faces and report the final location of each target. In Experiment 1, the attractive and unattractive faces differed in both the low-level properties (i.e., luminance, contrast, and color saturation) and high-level properties (i.e., physical beauty and age). The results showed that the attractiveness of both the target and distractor faces affected the tracking performance: The attractive target faces were tracked better than the unattractive target faces; when the targets and distractors were both unattractive male faces, the tracking performance was poorer than when they were of different attractiveness. In Experiment 2, the low-level properties of the facial images were equalized. The results showed that the attractive target faces were still tracked better than unattractive targets, while the effects related to distractor attractiveness disappeared. Taken together, the results indicate that during attentional tracking the high-level properties related to the attractiveness of the target faces can be automatically processed, and can then facilitate sustained attention on the attractive targets, with or without the contribution of low-level properties. On the other hand, only low-level properties of the distractor faces can be processed. When the distractors share similar low-level properties with the targets, they can be grouped together, so that it would be more difficult to sustain attention on the individual targets. © 2016 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  16. Scenes Differing in Spatial Frequencies Affect Facial Expression Processing: Evidence from ERP (场景的不同空间频率信息对面部表情加工的影响: 来自ERP的证据)

    Institute of Scientific and Technical Information of China (English)

    杨亚平; 徐强; 张林; 邓培状; 梁宁建

    2015-01-01

    Facial expressions are fundamental emotional stimuli, as they convey important information in social interaction. Most empirical research on facial expression processing has focused on isolated faces. But in everyday life, faces are embedded in a surrounding context: fearful faces, for example, are usually accompanied by tense bodies, and happy faces appear at birthday parties more often than in sickrooms. The scenes in which faces are embedded provide a typical visual context. Recently, some studies have attempted to investigate the influence of emotional scenes on facial expression processing. Although a few previous studies in this field demonstrated that scene effects on facial expression processing exist, they did not further explore the specific processing mechanism underlying these effects. Because of its excellent temporal resolution, the present study used event-related potentials (ERPs) to investigate the effects of scenes containing different spatial frequencies on facial expression processing. Our hypothesis was that the different spatial frequencies of scenes affect facial expression processing in different ways. Eighteen right-handed college students (11 females; age range 17-24 years; mean age 20.67 ± 1.91 years) were paid to participate in the experiment. Thirty-two face pictures (16 females and 16 males) with fearful and neutral expressions and thirty-two scene pictures (16 negative scenes and 16 neutral scenes) were presented. Spatial frequency content in the original scene stimuli (broad-band, BSF) was filtered using a high-pass cut-off (> 16 cpi) for the high spatial frequency (HSF) scene stimuli and a low-pass cut-off for the low spatial frequency (LSF) scene stimuli. Our ERP results showed that, for scenes with broad-band spatial frequency, fearful faces appearing in neutral scenes elicited larger N170 amplitudes than the same faces appearing in negative scenes, in both the right and left hemispheres; these effects were not found for scenes with high or low spatial frequencies.

  17. Facial age affects emotional expression decoding

    Directory of Open Access Journals (Sweden)

    Mara eFölster

    2014-02-01

    Full Text Available Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers’ age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and folds may render facial expressions of older adults harder to decode. In this paper, we review theoretical frameworks and empirical findings on age effects on decoding emotional expressions, with an emphasis on age-of-face effects. We conclude that the age of the face plays an important role for facial expression decoding. Lower expressivity, age-related changes in the face, less elaborated emotion schemas for older faces, negative attitudes toward older adults, and different visual scan patterns and neural processing of older than younger faces may lower decoding accuracy for older faces. Furthermore, age-related stereotypes and age-related changes in the face may bias the attribution of specific emotions such as sadness to older faces.

  18. Facial age affects emotional expression decoding.

    Science.gov (United States)

    Fölster, Mara; Hess, Ursula; Werheid, Katja

    2014-01-01

    Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers' age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and folds may render facial expressions of older adults harder to decode. In this paper, we review theoretical frameworks and empirical findings on age effects on decoding emotional expressions, with an emphasis on age-of-face effects. We conclude that the age of the face plays an important role for facial expression decoding. Lower expressivity, age-related changes in the face, less elaborated emotion schemas for older faces, negative attitudes toward older adults, and different visual scan patterns and neural processing of older than younger faces may lower decoding accuracy for older faces. Furthermore, age-related stereotypes and age-related changes in the face may bias the attribution of specific emotions such as sadness to older faces.

  19. Fearful faces in schizophrenia - The relationship between patient characteristics and facial affect recognition

    NARCIS (Netherlands)

    van't Wout, Mascha; van Dijke, Annemiek; Aleman, Andre; Kessels, Roy P. C.; Pijpers, Wietske; Kahn, Rene S.

    2007-01-01

    Although schizophrenia has often been associated with deficits in facial affect recognition, it is debated whether the recognition of specific emotions is affected and if these facial affect-processing deficits are related to symptomatology or other patient characteristics. The purpose of the presen

  20. Categorical Perception of Affective and Linguistic Facial Expressions

    Science.gov (United States)

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…

  1. Facial Affect Recognition and Social Anxiety in Preschool Children

    Science.gov (United States)

    Ale, Chelsea M.; Chorney, Daniel B.; Brice, Chad S.; Morris, Tracy L.

    2010-01-01

    Research relating anxiety and facial affect recognition has focused mostly on school-aged children and adults and has yielded mixed results. The current study sought to demonstrate an association among behavioural inhibition and parent-reported social anxiety, shyness, social withdrawal and facial affect recognition performance in 30 children,…

  3. Neurobiological mechanisms associated with facial affect recognition deficits after traumatic brain injury.

    Science.gov (United States)

    Neumann, Dawn; McDonald, Brenna C; West, John; Keiski, Michelle A; Wang, Yang

    2016-06-01

    The neurobiological mechanisms that underlie facial affect recognition deficits after traumatic brain injury (TBI) have not yet been identified. Using functional magnetic resonance imaging (fMRI), the study aims were to (1) determine whether there are differences in brain activation during facial affect processing in people with TBI who have facial affect recognition impairments (TBI-I) relative to people with TBI and healthy controls who do not have facial affect recognition impairments (TBI-N and HC, respectively); and (2) identify relationships between neural activity and facial affect recognition performance. A facial affect recognition screening task performed outside the scanner was used to determine group classification; TBI patients who performed more than one standard deviation below normative scores were classified as TBI-I, while TBI patients with normal scores were classified as TBI-N. An fMRI facial recognition paradigm was then performed within the 3T environment. Results from 35 participants are reported (TBI-I = 11, TBI-N = 12, and HC = 12). For the fMRI task, TBI-I and TBI-N groups scored significantly lower than the HC group. Blood oxygenation level-dependent (BOLD) signals for facial affect recognition, compared to a baseline condition of viewing a scrambled face, revealed lower neural activation in the right fusiform gyrus (FG) in the TBI-I group than in the HC group. Right fusiform gyrus activity correlated with accuracy on the facial affect recognition tasks (both within and outside the scanner). Decreased FG activity suggests facial affect recognition deficits after TBI may be the result of impaired holistic face processing. Future directions and clinical implications are discussed.
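
    The group-assignment rule described above (impaired if screening performance falls more than one standard deviation below the normative mean) reduces to a simple z-score cut-off. A minimal sketch with made-up scores and norms:

        import numpy as np

        # Hypothetical screening scores and normative statistics (all values made up).
        screening_scores = np.array([28.0, 21.5, 25.0, 19.0, 26.5])
        norm_mean, norm_sd = 26.0, 3.0

        z = (screening_scores - norm_mean) / norm_sd
        group = np.where(z < -1.0, "TBI-I", "TBI-N")  # impaired if > 1 SD below normal
        print(list(zip(screening_scores.tolist(), group.tolist())))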

  4. [Measuring impairment of facial affects recognition in schizophrenia. Preliminary study of the facial emotions recognition task (TREF)].

    Science.gov (United States)

    Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N

    2015-06-01

    without psychiatric diagnosis. The study also allowed the identification of cut-off scores; results more than 2 standard deviations below the healthy control average (61.57%) pointed to a facial affect recognition deficit. The TREF appears to be a useful tool to identify facial affect recognition impairment in schizophrenia. Neuropsychologists who have tried this task have given positive feedback. The TREF is easy to use (duration of about 15 minutes), easy to apply in subjects with attentional difficulties, and tests facial affect recognition at ecologically valid intensity levels. These results have to be confirmed in the future with larger sample sizes, and in comparison with other tasks that evaluate facial affect recognition processes. Copyright © 2014 L’Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  5. Facial age affects emotional expression decoding

    OpenAIRE

    2014-01-01

    Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers' age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and fo...

  6. Facial age affects emotional expression decoding

    OpenAIRE

    2014-01-01

    Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers' age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and fol...

  7. Schizophrenia and processing of facial emotions: Sex matters

    NARCIS (Netherlands)

    Scholten, MRM; Aleman, A; Montagne, B; Kahn, RS

    2005-01-01

    The aim of this study was to examine sex differences in emotion processing in patients with schizophrenia and control subjects. To this end, 53 patients with schizophrenia (28 men and 25 women), and 42 controls (21 men and 21 women) were assessed with the use of a facial affect recognition morphing

  8. Facial Affect Reciprocity in Dyadic Interactions

    Science.gov (United States)

    2012-09-01

    The available excerpt for this record includes only reference-list fragments, among them the NEO Five-Factor Inventory (NEO-FFI; Costa & McCrae, 1989, 1992), a 60-item version of form S of the NEO-PI-R that provides a measure of the five-factor model of personality.

  9. Facial Edema Evaluation Using Digital Image Processing

    Directory of Open Access Journals (Sweden)

    A. E. Villafuerte-Nuñez

    2013-01-01

    Full Text Available The main objective of the facial edema evaluation is providing the information needed to determine the effectiveness of anti-inflammatory drugs in development. This paper presents a system that measures the four main variables present in facial edemas: trismus, blush (coloration), temperature, and inflammation. Measurements are obtained by using image processing and the combination of different devices such as a projector, a PC, a digital camera, a thermographic camera, and a cephalostat. Data analysis and processing are performed using MATLAB. Facial inflammation is measured by comparing three-dimensional reconstructions of inflammatory variations using the fringe projection technique. Trismus is measured by converting pixels to centimeters in a digitally obtained image of an open mouth. Blushing changes are measured by obtaining and comparing the RGB histograms from facial edema images at different times. Finally, temperature changes are measured using a thermographic camera. Some tests using controlled measurements of every variable are presented in this paper. The results allow evaluating the measurement system before its use in a real test, using the pain model approved by the US Food and Drug Administration (FDA), which consists of extracting the third molar to generate the facial edema.
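
    As an illustration of the blush measurement described above (comparing RGB histograms of face images taken at different times), here is a minimal NumPy sketch; the authors used MATLAB, and the images, bin count, and comparison metric below are illustrative assumptions.

        import numpy as np

        # Hypothetical 8-bit RGB face crops taken at two time points (H x W x 3 arrays).
        rng = np.random.default_rng(2)
        img_t0 = rng.integers(0, 256, (240, 240, 3), dtype=np.uint8)
        img_t1 = rng.integers(0, 256, (240, 240, 3), dtype=np.uint8)

        def rgb_hist(img, bins=32):
            # Concatenated, normalised per-channel histograms.
            h = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
            h = np.concatenate(h).astype(float)
            return h / h.sum()

        # A simple change score: correlation between the two histograms
        # (lower correlation = larger colour change, e.g. more blushing).
        corr = np.corrcoef(rgb_hist(img_t0), rgb_hist(img_t1))[0, 1]
        print(f"histogram correlation: {corr:.3f}")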

  10. Facial-affect recognition and visual scanning behaviour in the course of schizophrenia.

    Science.gov (United States)

    Streit, M; Wölwer, W; Gaebel, W

    1997-04-11

    The performance of schizophrenic in-patients in facial expression identification was assessed in an acute phase and in a partly remitted phase of the illness. During visual exploration of the face stimuli, the patients' eye movements were recorded using an infrared-corneal-reflection technique. Compared to healthy controls, patients demonstrated a significant deficit in facial-affect recognition. In addition, schizophrenics differed from controls in several eye movement parameters such as length of mean scan path and mean duration of fixation. Both the facial-affect recognition deficit and the eye movement abnormalities remained stable over time. However, performance in facial-affect recognition and eye movement abnormalities were not correlated. Patients with flattened affect showed relatively selective scan pattern characteristics. In contrast, affective flattening was not correlated with performance in facial-affect recognition. Dosage of neuroleptic medication did not affect the results. The main findings of the study suggest that schizophrenia is associated with disturbances in primarily unrelated neurocognitive operations mediating visuomotor processing and facial expression analysis. Given their stability over time, the disturbances might have a trait-like character.
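
    The two eye-movement parameters named above, length of the scan path and mean duration of fixation, can be computed directly from a fixation sequence. A minimal sketch with hypothetical fixation data:

        import numpy as np

        # Hypothetical fixations: x, y positions in pixels and durations in milliseconds.
        fix_xy = np.array([[320, 240], [400, 250], [380, 300], [300, 310]], dtype=float)
        fix_dur_ms = np.array([210.0, 180.0, 240.0, 200.0])

        scan_path_len = np.linalg.norm(np.diff(fix_xy, axis=0), axis=1).sum()  # total saccade distance
        mean_fix_dur = fix_dur_ms.mean()

        print(f"scan path length: {scan_path_len:.1f} px, mean fixation duration: {mean_fix_dur:.0f} ms")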

  11. Recognition of facial affect in girls with conduct disorder.

    Science.gov (United States)

    Pajer, Kathleen; Leininger, Lisa; Gardner, William

    2010-02-28

    Impaired recognition of facial affect has been reported in youths and adults with antisocial behavior. However, few of these studies have examined subjects with the psychiatric disorders associated with antisocial behavior, and there are virtually no data on females. Our goal was to determine whether facial affect recognition was impaired in adolescent girls with conduct disorder (CD). Performance on the Ekman Pictures of Facial Affect (POFA) task was compared in 35 girls with CD (mean age 17.9 ± 0.95 years; 38.9% African-American) and 30 girls who had no lifetime history of psychiatric disorder (mean age 17.6 ± 0.77 years; 30% African-American). Forty-five slides representing the six emotions in the POFA were presented one at a time; stimulus duration was 5 s. Multivariate analyses indicated that CD vs. control status was not significantly associated with the total number of correct answers or with the number of correct answers for any specific emotion. Effect sizes were all considered small. Within-CD analyses did not demonstrate a significant effect for aggressive antisocial behavior on facial affect recognition. Our findings suggest that girls with CD are not impaired in facial affect recognition. However, we did find that girls with a history of trauma/neglect made a greater number of errors in recognizing fearful faces. Explanations for these findings are discussed and implications for future research presented. 2009 Elsevier B.V. All rights reserved.

  12. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    Directory of Open Access Journals (Sweden)

    Uta-Susan Donges

    Full Text Available There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

  13. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    Science.gov (United States)

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

  14. Unwanted facial hair: affects, effects and solutions.

    Science.gov (United States)

    Blume-Peytavi, U; Gieler, U; Hoffmann, R; Lavery, S; Shapiro, J

    2007-01-01

    The following is a review of a satellite symposium held at the EHRS Meeting in June 2006. U.B.P. reminded the audience that unwanted facial hair (UFH) is an important issue; over 40% of the women in the general population have some degree of UFH, and its psychological and psychosocial impact should not be underestimated. The treatment of UFH involves many different disciplines, and the symposium offered the latest thinking in different aspects of the disorder. S.L. outlined the current concepts surrounding polycystic ovarian syndrome, and U.G. addressed the psychological aspects of UFH. J.S. described the current treatment options for UFH, followed by U.B.P.'s evidence-based therapy review. Finally, R.H. reviewed the latest trial results with Trichoscan, a method being investigated for assessing UFH removal.

  15. The Relationships between Processing Facial Identity, Emotional Expression, Facial Speech, and Gaze Direction during Development

    Science.gov (United States)

    Spangler, Sibylle M.; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna

    2010-01-01

    Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding…

  16. The relationships between processing facial identity, emotional expression, facial speech, and gaze direction during development.

    Science.gov (United States)

    Spangler, Sibylle M; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna

    2010-01-01

    Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding facial speech, emotional expression, and gaze direction or, alternatively, according to facial speech, emotional expression, and gaze direction while disregarding facial identity. Reaction times showed that children and adults were able to direct their attention selectively to facial identity despite variations of other kinds of face information, but when sorting according to facial speech and emotional expression, they were unable to ignore facial identity. In contrast, gaze direction could be processed independently of facial identity in all age groups. Apart from shorter reaction times and fewer classification errors, no substantial change in processing facial information was found to be correlated with age. We conclude that adult-like face processing routes are employed from 5 years of age onward.

  17. Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.

    Directory of Open Access Journals (Sweden)

    Annemiek van Dijke

    Full Text Available Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD did not significantly differ from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.

  18. Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.

    Science.gov (United States)

    van Dijke, Annemiek; van 't Wout, Mascha; Ford, Julian D; Aleman, André

    2016-01-01

    Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD did not significantly differ from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.
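
    The labeling-accuracy and misattribution measures reported in this record can be read off a confusion matrix of shown versus labeled expressions. A minimal sketch with made-up trial data (not the study's data):

        import numpy as np

        expressions = ["neutral", "angry", "fearful", "happy"]
        idx = {e: i for i, e in enumerate(expressions)}

        # Hypothetical trial-wise data: expression shown and label given by the participant.
        shown = ["neutral", "neutral", "angry", "fearful", "neutral", "happy"]
        labeled = ["fearful", "neutral", "angry", "angry", "fearful", "happy"]

        conf = np.zeros((4, 4), dtype=int)
        for s, l in zip(shown, labeled):
            conf[idx[s], idx[l]] += 1

        accuracy_per_expression = conf.diagonal() / conf.sum(axis=1)   # labeling accuracy
        neutral_as_fearful = conf[idx["neutral"], idx["fearful"]] / conf[idx["neutral"]].sum()
        print(accuracy_per_expression, neutral_as_fearful)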

  19. Serotonin transporter gene-linked polymorphism affects detection of facial expressions.

    Directory of Open Access Journals (Sweden)

    Ai Koizumi

    Full Text Available Previous studies have demonstrated that the serotonin transporter gene-linked polymorphic region (5-HTTLPR) affects the recognition of facial expressions and attention to them. However, the relationship between 5-HTTLPR and the perceptual detection of others' facial expressions, the process which takes place prior to emotional labeling (i.e., recognition), is not clear. To examine whether the perceptual detection of emotional facial expressions is influenced by the allelic variation (short/long) of 5-HTTLPR, happy and sad facial expressions were presented at weak and mid intensities (25% and 50%). Ninety-eight participants, genotyped for 5-HTTLPR, judged whether emotion in images of faces was present. Participants with short alleles showed higher sensitivity (d') to happy than to sad expressions, while participants with long allele(s) showed no such positivity advantage. This effect of 5-HTTLPR was found at different facial expression intensities among males and females. The results suggest that at the perceptual stage, a short allele enhances the processing of positive facial expressions rather than that of negative facial expressions.
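
    The sensitivity index d' reported above is the standard signal-detection measure: the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal sketch with made-up counts and a simple correction for extreme rates:

        from scipy.stats import norm

        def d_prime(hits, misses, false_alarms, correct_rejections):
            # Signal-detection sensitivity with a small correction so rates of 0 or 1 stay finite.
            hit_rate = (hits + 0.5) / (hits + misses + 1.0)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            return norm.ppf(hit_rate) - norm.ppf(fa_rate)

        # Hypothetical counts for detecting happy expressions at the 25% intensity level.
        print(d_prime(hits=34, misses=6, false_alarms=8, correct_rejections=32))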

  20. Poor Facial Affect Recognition among Boys with Duchenne Muscular Dystrophy

    Science.gov (United States)

    Hinton, V. J.; Fee, R. J.; De Vivo, D. C.; Goldstein, E.

    2007-01-01

    Children with Duchenne or Becker muscular dystrophy (MD) have delayed language and poor social skills and some meet criteria for Pervasive Developmental Disorder, yet they are identified by molecular, rather than behavioral, characteristics. To determine whether comprehension of facial affect is compromised in boys with MD, children were given a…

  2. Focal Length Affects Depicted Shape and Perception of Facial Images.

    Science.gov (United States)

    Třebický, Vít; Fialová, Jitka; Kleisner, Karel; Havlíček, Jan

    2016-01-01

    Static photographs are currently the most often employed stimuli in research on social perception. The method of photograph acquisition might affect the depicted subject's facial appearance and thus also the impression of such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce various levels of image distortion. Here we tested whether different focal lengths (50, 85, 105 mm) affect the depicted shape and perception of female and male faces. We collected three portrait photographs of 45 participants (22 females, 23 males) under standardized conditions and camera settings, varying only the focal length. Subsequently, the three photographs from each individual were shown on screen in a randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph and overall facial shape was analysed employing geometric morphometric methods (GMM). Our results showed that photographs taken with 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant compared to the images taken with longer focal lengths. Further, shorter focal lengths produced faces with smaller fWHR. Subsequent GMM revealed that focal length significantly affected the overall facial shape of the photographed subjects. Thus the methodology of photograph acquisition, focal length in this case, can significantly affect the results of studies using photographic stimuli, perhaps due to different levels of perspective distortion that influence the shapes and proportions of morphological traits.
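
    The facial width-to-height ratio (fWHR) measured above is conventionally the bizygomatic width divided by the upper-face height (roughly brow to upper lip). A minimal sketch from hypothetical landmark coordinates:

        import numpy as np

        # Hypothetical 2D facial landmarks in image pixels.
        left_zygion = np.array([112.0, 240.0])
        right_zygion = np.array([268.0, 238.0])
        mid_brow = np.array([190.0, 180.0])
        upper_lip = np.array([191.0, 305.0])

        bizygomatic_width = np.linalg.norm(right_zygion - left_zygion)
        upper_face_height = np.linalg.norm(upper_lip - mid_brow)
        print(f"fWHR = {bizygomatic_width / upper_face_height:.2f}")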

  3. Modulation of α power and functional connectivity during facial affect recognition.

    Science.gov (United States)

    Popov, Tzvetan; Miller, Gregory A; Rockstroh, Brigitte; Weisz, Nathan

    2013-04-03

    Research has linked oscillatory activity in the α frequency range, particularly in sensorimotor cortex, to processing of social actions. Results further suggest involvement of sensorimotor α in the processing of facial expressions, including affect. The sensorimotor face area may be critical for the perception of emotional facial expression, but the role it plays is unclear. The present study sought to clarify how oscillatory brain activity contributes to or reflects processing of facial affect during changes in facial expression. Neuromagnetic oscillatory brain activity was monitored while 30 volunteers viewed videos of human faces that changed their expression from neutral to fearful, neutral, or happy expressions. Induced changes in α power during the different morphs, source analysis, and graph-theoretic metrics served to identify the role of α power modulation and cross-regional coupling by means of phase synchrony during facial affect recognition. Changes from neutral to emotional faces were associated with a 10-15 Hz power increase localized in bilateral sensorimotor areas, together with occipital power decrease, preceding reported emotional expression recognition. Graph-theoretic analysis revealed that, in the course of a trial, the balance between sensorimotor power increase and decrease was associated with decreased and increased transregional connectedness as measured by node degree. Results suggest that modulations in α power facilitate early registration, with sensorimotor cortex including the sensorimotor face area largely functionally decoupled and thereby protected from additional, disruptive input, and that a subsequent α power decrease, together with increased connectedness of sensorimotor areas, facilitates successful facial affect recognition.
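
    The node-degree measure referred to above comes from treating the matrix of phase synchrony between sensors as a graph and counting each sensor's suprathreshold connections. A minimal sketch on a hypothetical phase-locking-value matrix (the threshold is an illustrative assumption):

        import numpy as np

        # Hypothetical phase-locking values between 10 sensors (symmetric, zero diagonal).
        rng = np.random.default_rng(3)
        plv = rng.uniform(0.0, 1.0, (10, 10))
        plv = (plv + plv.T) / 2.0
        np.fill_diagonal(plv, 0.0)

        adjacency = plv > 0.6                # keep only strongly synchronised sensor pairs
        node_degree = adjacency.sum(axis=1)  # number of connections per sensor
        print(node_degree)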

  4. On Assisting a Visual-Facial Affect Recognition System with Keyboard-Stroke Pattern Information

    Science.gov (United States)

    Stathopoulou, I.-O.; Alepis, E.; Tsihrintzis, G. A.; Virvou, M.

    Towards realizing a multimodal affect recognition system, we are considering the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy in affective user models. Specifically, we present and discuss the development and evaluation process of two corresponding affect recognition subsystems, with emphasis on the recognition of 6 basic emotional states, namely happiness, sadness, surprise, anger and disgust as well as the emotion-less state which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information and the combination of the two modalities can lead to better results towards building a multimodal affect recognition system.
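
    One simple way to combine the visual-facial and keyboard-stroke subsystems described above is late fusion of their per-class probabilities. A weighted-average sketch follows; the probabilities and weights are made up, and this particular fusion rule is an assumption rather than the method reported in the record.

        import numpy as np

        emotions = ["happiness", "sadness", "surprise", "anger", "disgust", "neutral"]

        # Hypothetical per-class probabilities from the two subsystems for one observation.
        p_facial = np.array([0.30, 0.10, 0.25, 0.15, 0.05, 0.15])
        p_keyboard = np.array([0.20, 0.30, 0.10, 0.20, 0.05, 0.15])

        w_facial = 0.6                                   # assumed modality weight
        fused = w_facial * p_facial + (1 - w_facial) * p_keyboard
        print(emotions[int(np.argmax(fused))])           # fused emotion decision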

  5. Overview of impaired facial affect recognition in persons with traumatic brain injury.

    Science.gov (United States)

    Radice-Neumann, Dawn; Zupan, Barbra; Babbage, Duncan R; Willer, Barry

    2007-07-01

    To review the literature on affect recognition for persons with traumatic brain injury (TBI). It is suggested that impairment of affect recognition could be a significant problem for the TBI population, and treatment strategies are recommended based on research for persons with autism. Research demonstrates that persons with TBI often have difficulty determining emotion from facial expressions. Studies show that poor interpersonal skills, which are associated with impaired affect recognition, are linked to a variety of negative outcomes. Theories suggest that facial affect recognition is achieved by interpreting important facial features and processing one's own emotions. These skills are often affected by TBI, depending on the areas damaged. Affect recognition impairments have also been identified in persons with autism. Successful interventions have already been developed for the autism population. Comparable neuroanatomical and behavioural findings between TBI and autism suggest that treatment approaches for autism may also benefit those with TBI. Impaired facial affect recognition appears to be a significant problem for persons with TBI. Theories of affect recognition, strategies used in autism and teaching techniques commonly used in TBI need to be considered when developing treatments to improve affect recognition in persons with brain injury.

  6. Event-related theta synchronization predicts deficit in facial affect recognition in schizophrenia.

    Science.gov (United States)

    Csukly, Gábor; Stefanics, Gábor; Komlósi, Sarolta; Czigler, István; Czobor, Pál

    2014-02-01

    Growing evidence suggests that abnormalities in the synchronized oscillatory activity of neurons in schizophrenia may lead to impaired neural activation and temporal coding and thus lead to neurocognitive dysfunctions, such as deficits in facial affect recognition. To gain an insight into the neurobiological processes linked to facial affect recognition, we investigated both induced and evoked oscillatory activity by calculating the Event Related Spectral Perturbation (ERSP) and the Inter Trial Coherence (ITC) during facial affect recognition. Fearful and neutral faces as well as nonface patches were presented to 24 patients with schizophrenia and 24 matched healthy controls while EEG was recorded. The participants' task was to recognize facial expressions. Because previous findings with healthy controls showed that facial feature decoding was associated primarily with oscillatory activity in the theta band, we analyzed ERSP and ITC in this frequency band in the time interval of 140-200 ms, which corresponds to the N170 component. Event-related theta activity and phase-locking to facial expressions, but not to nonface patches, predicted emotion recognition performance in both controls and patients. Event-related changes in theta amplitude and phase-locking were found to be significantly weaker in patients compared with healthy controls, which is in line with previous investigations showing decreased neural synchronization in the low frequency bands in patients with schizophrenia. Neural synchrony is thought to underlie distributed information processing. Our results indicate a less effective functioning in the recognition process of facial features, which may contribute to a less effective social cognition in schizophrenia.
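
    As a concrete illustration of the two measures named above, the following sketch computes inter-trial coherence (ITC) and event-related spectral perturbation (ERSP) for a single channel at a theta-band frequency using a Morlet wavelet; the sampling rate, baseline window, wavelet parameters, and simulated trials are assumptions, not the study's settings.

    import numpy as np

    def morlet(fs, freq, n_cycles=5.0):
        # Complex Morlet wavelet spanning +/- 3 SD of its Gaussian envelope
        sigma_t = n_cycles / (2 * np.pi * freq)
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
        return np.exp(2j * np.pi * freq * t) * np.exp(-t ** 2 / (2 * sigma_t ** 2))

    def itc_and_ersp(trials, fs, freq=6.0, baseline=slice(0, 100)):
        # trials: (n_trials, n_samples) array from one channel
        w = morlet(fs, freq)
        analytic = np.array([np.convolve(tr, w, mode="same") for tr in trials])
        # ITC: length of the mean unit phase vector across trials (0 = random, 1 = perfect locking)
        itc = np.abs(np.mean(analytic / np.abs(analytic), axis=0))
        # ERSP: trial-averaged power in dB relative to the pre-stimulus baseline
        mean_power = (np.abs(analytic) ** 2).mean(axis=0)
        ersp = 10 * np.log10(mean_power / mean_power[baseline].mean())
        return itc, ersp

    # Simulated example: 40 trials, 1 s epochs at 500 Hz with a 200 ms pre-stimulus baseline
    rng = np.random.default_rng(1)
    trials = rng.standard_normal((40, 500))
    itc, ersp = itc_and_ersp(trials, fs=500)
    n170_window = slice(100 + int(0.14 * 500), 100 + int(0.20 * 500))  # 140-200 ms post-stimulus
    print(itc[n170_window].mean(), ersp[n170_window].mean())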

  7. Affective priming using facial expressions modulates liking for abstract art.

    Directory of Open Access Journals (Sweden)

    Albert Flexas

    We examined the influence of affective priming on the appreciation of abstract artworks using an evaluative priming task. Facial primes (showing happiness, disgust or no emotion) were presented under brief (Stimulus Onset Asynchrony, SOA = 20 ms) and extended (SOA = 300 ms) conditions. Differences in aesthetic liking for abstract paintings depending on the emotion expressed in the preceding primes provided a measure of the priming effect. The results showed that, for the extended SOA, artworks were liked more when preceded by happiness primes and less when preceded by disgust primes. Facial expressions of happiness, though not of disgust, exerted similar effects in the brief SOA condition. Subjective measures and a forced-choice task revealed no evidence of prime awareness in the suboptimal condition. Our results are congruent with findings showing that the affective transfer elicited by priming biases evaluative judgments, extending previous research to the domain of aesthetic appreciation.

  8. Affective Priming Using Facial Expressions Modulates Liking for Abstract Art

    Science.gov (United States)

    Flexas, Albert; Rosselló, Jaume; Christensen, Julia F.; Nadal, Marcos; Olivera La Rosa, Antonio; Munar, Enric

    2013-01-01

    We examined the influence of affective priming on the appreciation of abstract artworks using an evaluative priming task. Facial primes (showing happiness, disgust or no emotion) were presented under brief (Stimulus Onset Asynchrony, SOA = 20ms) and extended (SOA = 300ms) conditions. Differences in aesthetic liking for abstract paintings depending on the emotion expressed in the preceding primes provided a measure of the priming effect. The results showed that, for the extended SOA, artworks were liked more when preceded by happiness primes and less when preceded by disgust primes. Facial expressions of happiness, though not of disgust, exerted similar effects in the brief SOA condition. Subjective measures and a forced-choice task revealed no evidence of prime awareness in the suboptimal condition. Our results are congruent with findings showing that the affective transfer elicited by priming biases evaluative judgments, extending previous research to the domain of aesthetic appreciation. PMID:24260350

  10. Focal Length Affects Depicted Shape and Perception of Facial Images.

    Directory of Open Access Journals (Sweden)

    Vít Třebický

    Static photographs are currently the most often employed stimuli in research on social perception. The method of photograph acquisition might affect the depicted subject's facial appearance and thus also the impression of such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce various levels of image distortion. Here we tested whether different focal lengths (50, 85, 105 mm) affect depicted shape and perception of female and male faces. We collected three portrait photographs of 45 (22 females, 23 males) participants under standardized conditions and camera setting varying only in the focal length. Subsequently, the three photographs from each individual were shown on screen in a randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph and overall facial shape was analysed employing geometric morphometric methods (GMM). Our results showed that photographs taken with 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant compared to the images taken with longer focal lengths. Further, shorter focal lengths produced faces with smaller fWHR. Subsequent GMM revealed focal length significantly affected overall facial shape of the photographed subjects. Thus methodology of photograph acquisition, focal length in this case, can significantly affect results of studies using photographic stimuli, perhaps due to different levels of perspective distortion that influence shapes and proportions of morphological traits.
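
    The facial width-to-height ratio mentioned above is commonly computed as bizygomatic width divided by upper-face height. A toy sketch with hypothetical landmark coordinates is shown below; the landmark names and pixel values are illustrative and do not reproduce the study's measurement protocol.

    from dataclasses import dataclass

    @dataclass
    class Landmark:
        x: float  # pixels
        y: float  # pixels

    def fwhr(left_zygion: Landmark, right_zygion: Landmark,
             brow: Landmark, upper_lip: Landmark) -> float:
        # Bizygomatic width divided by upper-face height
        width = abs(right_zygion.x - left_zygion.x)
        height = abs(upper_lip.y - brow.y)
        return width / height

    # Hypothetical landmark coordinates measured on one frontal photograph
    print(round(fwhr(Landmark(120, 300), Landmark(420, 300),
                     Landmark(270, 230), Landmark(270, 430)), 3))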

  11. Perceptual and affective mechanisms in facial expression recognition: An integrative review.

    Science.gov (United States)

    Calvo, Manuel G; Nummenmaa, Lauri

    2016-09-01

    Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.

  12. The Enfacement Illusion Is Not Affected by Negative Facial Expressions.

    Science.gov (United States)

    Beck, Brianna; Cardini, Flavia; Làdavas, Elisabetta; Bertini, Caterina

    2015-01-01

    Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one's own face to assimilate another person's face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer's motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant's own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other's face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing.

  13. Ventrolateral prefrontal cortex and the effects of task demand context on facial affect appraisal in schizophrenia.

    Science.gov (United States)

    Leitman, David I; Wolf, Daniel H; Loughead, James; Valdez, Jeffrey N; Kohler, Christian G; Brensinger, Colleen; Elliott, Mark A; Turetsky, Bruce I; Gur, Raquel E; Gur, Ruben C

    2011-01-01

    Schizophrenia patients display impaired performance and brain activity during facial affect recognition. These impairments may reflect stimulus-driven perceptual decrements and evaluative processing abnormalities. We differentiated these two processes by contrasting responses to identical stimuli presented under different contexts. Seventeen healthy controls and 16 schizophrenia patients performed an fMRI facial affect detection task. Subjects identified an affective target presented amongst foils of differing emotions. We hypothesized that targeting affiliative emotions (happiness, sadness) would create a task demand context distinct from that generated when targeting threat emotions (anger, fear). We compared affiliative foil stimuli within a congruent affiliative context with identical stimuli presented in an incongruent threat context. Threat foils were analysed in the same manner. Controls activated right orbitofrontal cortex (OFC)/ventrolateral prefrontal cortex (VLPFC) more to affiliative foils in threat contexts than to identical stimuli within affiliative contexts. Patients displayed reduced OFC/VLPFC activation to all foils, and no activation modulation by context. This lack of context modulation coincided with a 2-fold decrement in foil detection efficiency. Task demands produce contextual effects during facial affective processing in regions activated during affect evaluation. In schizophrenia, reduced modulation of OFC/VLPFC by context coupled with reduced behavioural efficiency suggests impaired ventral prefrontal control mechanisms that optimize affective appraisal.

  14. Facial identity and facial expression are initially integrated at visual perceptual stages of face processing.

    Science.gov (United States)

    Fisher, Katie; Towler, John; Eimer, Martin

    2016-01-08

    It is frequently assumed that facial identity and facial expression are analysed in functionally and anatomically distinct streams within the core visual face processing system. To investigate whether expression and identity interact during the visual processing of faces, we employed a sequential matching procedure where participants compared either the identity or the expression of two successively presented faces, and ignored the other irrelevant dimension. Repetitions versus changes of facial identity and expression were varied independently across trials, and event-related potentials (ERPs) were recorded during task performance. Irrelevant facial identity and irrelevant expression both interfered with performance in the expression and identity matching tasks. These symmetrical interference effects show that neither identity nor expression can be selectively ignored during face matching, and suggest that they are not processed independently. N250r components to identity repetitions that reflect identity matching mechanisms in face-selective visual cortex were delayed and attenuated when there was an expression change, demonstrating that facial expression interferes with visual identity matching. These findings provide new evidence for interactions between facial identity and expression within the core visual processing system, and question the hypothesis that these two attributes are processed independently.

  15. Processing speaker affect during spoken sentence comprehension

    NARCIS (Netherlands)

    van Leeuwen, A.R.; Quené, H.; van Berkum, J.J.A.

    2013-01-01

    We often smile (and frown) while we talk. Speakers use facial expression, posture and prosody to provide additional cues that signal speaker stance.

  16. Maternal emotion dysregulation is related to heightened mother-infant synchrony of facial affect.

    Science.gov (United States)

    Lotzin, Annett; Schiborr, Julia; Barkmann, Claus; Romer, Georg; Ramsauer, Brigitte

    2016-05-01

    A heightened synchrony between the mother's and infant's facial affect predicts adverse infant development. We know that maternal psychopathology is related to mother-infant facial affect synchrony, but it is unclear how maternal psychopathology is transmitted to mother-infant synchrony. One pathway might be maternal emotion dysregulation. We examined (a) whether maternal emotion dysregulation is positively related to facial affect synchrony and (b) whether maternal emotion dysregulation mediates the effect of maternal psychopathology on mother-infant facial affect synchrony. We observed 68 mothers with mood disorders and their 4- to 9-month-old infants in the Still-Face paradigm during two play interactions. The mother's and infant's facial affect were rated from high negative to high positive, and the degree of synchrony between the mother's and infant's facial affect was computed with a time-series analysis. Emotion dysregulation was measured with the Difficulties in Emotion Regulation Scale, and psychopathology was assessed with the Symptom Checklist-90-Revised. Higher maternal emotion dysregulation was significantly associated with higher facial affect synchrony; emotion dysregulation fully mediated the effect of maternal psychopathology on facial affect synchrony. Our findings demonstrate that maternal emotion dysregulation rather than maternal psychopathology per se places mothers and infants at risk for heightened facial affect synchrony.
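
    One simple way to quantify the kind of facial affect synchrony described above is the peak lagged correlation between the two affect time series. The sketch below illustrates this under assumed parameters (coding scale, sampling interval, lag range); it is not the study's time-series model.

    import numpy as np

    def lagged_correlation(mother, infant, max_lag=5):
        # Return the lag (in samples) and correlation where |r| peaks
        mother, infant = np.asarray(mother, float), np.asarray(infant, float)
        best_lag, best_r = 0, 0.0
        for lag in range(-max_lag, max_lag + 1):
            if lag < 0:
                a, b = mother[:lag], infant[-lag:]
            elif lag > 0:
                a, b = mother[lag:], infant[:-lag]
            else:
                a, b = mother, infant
            r = np.corrcoef(a, b)[0, 1]
            if abs(r) > abs(best_r):
                best_lag, best_r = lag, r
        return best_lag, best_r

    # Example: affect coded once per second from -2 (high negative) to +2 (high positive)
    mother = [0, 1, 2, 2, 1, 0, -1, -1, 0, 1, 2, 1]
    infant = [0, 0, 1, 2, 2, 1, 0, -1, -1, 0, 1, 2]
    # Here the infant's affect follows the mother's by one sample (a negative lag under this convention)
    print(lagged_correlation(mother, infant))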

  17. Can we distinguish emotions from faces? Investigation of implicit and explicit processes of peak facial expressions

    Directory of Open Access Journals (Sweden)

    Yanmei Wang

    2016-08-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used transient, peak-intensity expression images of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and the body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that winning face primes facilitated reactions to winning body targets, whereas losing face primes inhibited reactions to winning body targets, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning body targets and losing body targets were influenced by the invisible peak facial expression primes.

  18. Traditional facial tattoos disrupt face recognition processes.

    Science.gov (United States)

    Buttle, Heather; East, Julie

    2010-01-01

    Factors that are important to successful face recognition, such as features, configuration, and pigmentation/reflectance, are all subject to change when a face has been engraved with ink markings. Here we show that the application of facial tattoos, in the form of spiral patterns (typically associated with the Maori tradition of a Moko), disrupts face recognition to a similar extent as face inversion, with recognition accuracy little better than chance performance (2AFC). These results indicate that facial tattoos can severely disrupt our ability to recognise a face that previously did not have the pattern.

  19. Predicting the Accuracy of Facial Affect Recognition: The Interaction of Child Maltreatment and Intellectual Functioning

    Science.gov (United States)

    Shenk, Chad E.; Putnam, Frank W.; Noll, Jennie G.

    2013-01-01

    Previous research demonstrates that both child maltreatment and intellectual performance contribute uniquely to the accurate identification of facial affect by children and adolescents. The purpose of this study was to extend this research by examining whether child maltreatment affects the accuracy of facial recognition differently at varying…

  1. Facial affect perception and mentalizing abilities in female patients with persistent somatoform pain disorder.

    Science.gov (United States)

    Schönenberg, M; Mares, L; Smolka, R; Jusyte, A; Zipfel, S; Hautzinger, M

    2014-08-01

    Numerous studies have demonstrated a robust link between alexithymic traits and somatic complaints in patients suffering from psychosomatic disorders, while less is known about disease-related impairments in the processing of affective social information. Deficits in emotion recognition can lead to misinterpretations of social signals and induce distress in interpersonal interactions. This, in turn, might contribute to somatoform symptomatology in affected individuals. The aim of the present study was to investigate basal facial affect recognition as well as higher-order cognitive mind-reading skills in order to further clarify the association between alexithymia and the processing of social affective information in a homogenous sample of patients suffering from somatoform pain. We employed a series of animated morph clips that gradually displayed the onset and development of the six basic emotional expressions to investigate facial affect perception in a female sample of patients diagnosed with persistent somatoform pain disorder (PSPD) and matched healthy controls. In addition, all participants were presented with the Movie for the Assessment of Social Cognition to explore mind-reading abilities. Specifically impaired mentalizing skills and increased alexithymic traits were observed in PSPD, while emotional facial expression recognition appeared to be intact in these patients. PSPD subjects tend to overattribute inappropriate affective states to others, which could be the consequence of the inability to adequately experience and express their own emotional reactions. This cognitive bias might lead to the experience of poor psychosocial functioning and has the potential to negatively impact the course and outcome of this psychopathology. © 2014 European Pain Federation - EFIC®

  2. Task difficulty and response complexity modulate affective priming by emotional facial expressions.

    Science.gov (United States)

    Sassi, Federica; Campoy, Guillermo; Castillo, Alejandro; Inuggi, Alberto; Fuentes, Luis J

    2014-05-01

    In this study we used an affective priming task to address the issue of whether the processing of emotional facial expressions occurs automatically, independently of attention or attentional resources. Participants had to attend either to the emotional expression of the prime face or to a nonemotional feature of the prime face, the glasses. When participants attended to the glasses (emotion unattended), they had to report whether the face wore glasses or not (the easy glasses condition) or whether the glasses were rounded or squared (the difficult shape condition). Affective priming, measured on valence decisions on target words, was mainly defined as interference from incongruent rather than facilitation from congruent trials. Significant priming effects were observed only in the emotion and glasses tasks but not in the shape task. When the key-response mapping increased in complexity, taxing working memory load, affective priming effects were reduced equally for the three types of tasks. Thus, attentional load and working memory load contributed additively to the observed reduction in affective priming. These results cast doubt on the automaticity of processing emotional facial expressions.
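
    For readers unfamiliar with the facilitation/interference decomposition mentioned above, the toy calculation below shows how the two components can be separated against a neutral-prime baseline; the reaction times and the assumption of a neutral baseline condition are illustrative, not the authors' data or analysis script.

    import numpy as np

    def priming_components(rt_congruent, rt_incongruent, rt_neutral):
        # Facilitation = neutral - congruent; interference = incongruent - neutral (ms)
        facilitation = np.mean(rt_neutral) - np.mean(rt_congruent)
        interference = np.mean(rt_incongruent) - np.mean(rt_neutral)
        return facilitation, interference

    # Hypothetical mean reaction times (ms) for valence decisions on target words
    congruent = [612, 598, 605, 620]
    incongruent = [655, 648, 660, 642]
    neutral = [618, 610, 622, 615]
    print(priming_components(congruent, incongruent, neutral))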

  3. Facial Affect Recognition Training Through Telepractice: Two Case Studies of Individuals with Chronic Traumatic Brain Injury.

    Science.gov (United States)

    Williamson, John; Isaki, Emi

    2015-01-01

    The use of a modified Facial Affect Recognition (FAR) training to identify emotions was investigated with two case studies of adults with moderate to severe chronic (> five years) traumatic brain injury (TBI). The modified FAR training was administered via telepractice to target social communication skills. Therapy consisted of identifying emotions through static facial expressions, personally reflecting on those emotions, and identifying sarcasm and emotions within social stories and role-play. Pre- and post-therapy measures included static facial photos to identify emotion and the Prutting and Kirchner Pragmatic Protocol for social communication. Both participants with chronic TBI showed gains on identifying facial emotions on the static photos.

  4. Assessing the Utility of a Virtual Environment for Enhancing Facial Affect Recognition in Adolescents with Autism

    Science.gov (United States)

    Bekele, Esubalew; Crittendon, Julie; Zheng, Zhi; Swanson, Amy; Weitlauf, Amy; Warren, Zachary; Sarkar, Nilanjan

    2014-01-01

    Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer generated avatar. The system assessed performance (i.e.,…

  5. The Relation of Facial Affect Recognition and Empathy to Delinquency in Youth Offenders

    Science.gov (United States)

    Carr, Mary B.; Lutjemeier, John A.

    2005-01-01

    Associations among facial affect recognition, empathy, and self-reported delinquency were studied in a sample of 29 male youth offenders at a probation placement facility. Youth offenders were asked to recognize facial expressions of emotions from adult faces, child faces, and cartoon faces. Youth offenders also responded to a series of statements…

  8. Facial Emotion Processing in Aviremic HIV-infected Adults.

    Science.gov (United States)

    González-Baeza, A; Carvajal, F; Bayón, C; Pérez-Valero, I; Montes-Ramírez, M; Arribas, J R

    2016-08-01

    Emotional processing in human immunodeficiency virus-seropositive (HIV+) individuals has scarcely been studied. We included HIV+ individuals (n = 107) on antiretroviral therapy (≥2 years) who completed 6 facial processing tasks and neurocognitive testing. We compared HIV+ and healthy adult (HA) participants (n = 40) in overall performance of each facial processing task. Multiple logistic regressions were conducted to explore predictors of poorer accuracy in those measures in which HIV+ individuals performed more poorly than HA participants. We separately explored the impact of neurocognitive status, antiretroviral regimen, and hepatitis C virus (HCV) coinfection on task performance. We found similar performance in overall facial emotion discrimination, recognition, and recall between HIV+ and HA participants. The HIV+ group had poorer recognition of particular negative emotions. Lower WAIS-III Vocabulary scores and active HCV predicted poorer accuracy in recognition of particular emotions. Our results suggest that permanent damage to emotion-related brain systems might persist despite long-term effective antiretroviral therapy.

  9. Processing of Facial Expressions of Emotions by Adults with Down Syndrome and Moderate Intellectual Disability

    Science.gov (United States)

    Carvajal, Fernando; Fernandez-Alcaraz, Camino; Rueda, Maria; Sarrion, Louise

    2012-01-01

    The processing of facial expressions of emotions by 23 adults with Down syndrome and moderate intellectual disability was compared with that of adults with intellectual disability of other etiologies (24 matched in cognitive level and 26 with mild intellectual disability). Each participant performed 4 tasks of the Florida Affect Battery and an…

  10. Processing of facial blends of emotion: support for right hemisphere cognitive aging.

    Science.gov (United States)

    Prodan, Calin I; Orbelo, Diana M; Ross, Elliott D

    2007-02-01

    Clinical research on facial emotions has focused primarily on differences between right and left hemiface. Social psychology, however, has suggested that differences between upper versus lower facial displays may be more important, especially during social interactions. We demonstrated previously that upper facial displays are perceived preferentially by the right hemisphere, while lower facial displays are perceived preferentially by the left hemisphere. A marginal age-related effect was observed. The current research expands our original cohort to include 26 elderly individuals over age 62. Fifty-six strongly right-handed, healthy adult volunteers were tested tachistoscopically by flashing randomized facial displays of emotion to the right and left visual fields. The stimuli consisted of line drawings displaying various combinations of emotions on the upper and lower face. The subjects were tested under two conditions: without attend instructions and with instructions to attend to the upper face. Based on linear regression and discriminant analyses modeling age, subject performance could be divided into two distinct groups: Young (under 62 years) and Old (over 62 years). Without attend instructions, both groups robustly identified the emotion displayed on the lower face, regardless of visual field presentation. With instructions to attend to the upper face, the Old group demonstrated a markedly decreased ability to identify upper facial displays, compared to the Young group. The most significant difference was noted in the left visual field/right hemisphere. Our results demonstrate a significant decline in the processing of upper facial emotions by the right hemisphere in older individuals, thus providing partial support for the right hemisphere hypothesis of cognitive aging. The decreased ability to perceive upper facial displays coupled with age-related deficits in processing affective prosody may well cause impaired psychosocial competency in the elderly.

  11. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescents ... -attentive change detection on social-emotional information.

  12. How facial expressions of emotion affect distance perception

    Directory of Open Access Journals (Sweden)

    Nam-Gyoon eKim

    2015-11-01

    Facial expressions of emotion are thought to convey expressers’ behavioral intentions, thus priming observers’ approach and avoidance tendencies appropriately. The present study examined whether detecting expressions of behavioral intent influences perceivers’ estimation of the expresser’s distance from them. Eighteen undergraduates (9 male and 9 female) participated in the study. Six facial expressions were chosen on the basis of degree of threat—anger, hate (threatening expressions), shame, surprise (neutral expressions), pleasure and joy (safe expressions). Each facial expression was presented on a tablet PC held by an assistant covered by a black drape who stood 1 m, 2 m, or 3 m away from participants. Participants performed a visual matching task to report the perceived distance. Results showed that facial expression influenced distance estimation, with faces exhibiting threatening or safe expressions judged closer than those showing neutral expressions. Females’ judgments were more likely to be influenced, but these influences largely disappeared beyond the 2 m distance. These results suggest that facial expressions of emotion (particularly threatening or safe emotions) influence others’ (especially females’) distance estimations but only within close proximity.

  13. Facial feedback affects valence judgments of dynamic and static emotional expressions

    Directory of Open Access Journals (Sweden)

    Sylwia eHyniewska

    2015-03-01

    The ability to judge others’ emotions is required for the establishment and maintenance of smooth interactions in a community. Several lines of evidence suggest that the attribution of meaning to a face is influenced by the facial actions produced by an observer during the observation of a face. However, empirical studies testing causal relationships between observers’ facial actions and emotion judgments have reported mixed findings. This study is the first to measure emotion judgments in terms of valence and arousal dimensions while comparing dynamic versus static presentations of facial expressions. We presented pictures and videos of facial expressions of anger and happiness. Participants (N = 36) were asked to differentiate between the gender of faces by activating the corrugator supercilii muscle (brow lowering) and zygomaticus major muscle (cheek raising). They were also asked to evaluate the internal states of the stimuli using the affect grid while maintaining the facial action until they finished responding. The cheek-raising condition increased the attributed valence scores compared with the brow-lowering condition. This effect of facial actions was observed for static as well as for dynamic facial expressions. These data suggest that facial feedback mechanisms contribute to the judgment of the valence of emotional facial expressions.

  14. Facial and affective reactions to tastes and their modulation by sadness and joy.

    Science.gov (United States)

    Greimel, Ellen; Macht, Michael; Krumhuber, Eva; Ellgring, Heiner

    2006-09-30

    This study examined adults' affective and facial reactions to tastes which differ in quality and valence, and the impact of sadness and joy on these reactions. Thirty-six male and female subjects participated voluntarily. Subjects each tasted 6 ml of a sweet chocolate drink, a bitter quinine solution (0.0015 M) and a bitter-sweet soft drink. Following a baseline period, either joy or sadness was induced using film clips before the same taste stimuli were presented for a second time. Subjects rated the drinks' pleasantness and intensity of taste immediately after each stimulus presentation. Facial reactions were videotaped and analysed using the Facial Action Coding System (FACS [P. Ekman, W.V. Friesen, Facial Action Coding System: Manual. Palo Alto, CA: Consulting Psychologists Press; 1978., P. Ekman, W. Friesen, J. Hager, Facial Action Coding System. Salt Lake City, Utah: Research Nexus; 2002.]). The results strongly indicated that the tastes produced specific facial reactions that bear strong similarities to the facial reactivity patterns found in human newborns. The data also suggest that some adults' facial reactions serve additional communicative functions. Emotions modulated taste ratings, but not facial reactions to tastes. In particular, ratings of the sweet stimulus were modulated in congruence with emotion quality, such that joy increased and sadness decreased the pleasantness and sweetness of the sweet stimulus. No emotion-congruent modulation was found for the pleasantness and intensity ratings of the bitter or the bitter-sweet stimulus. This 'robustness' of bitter taste ratings may reflect a biologically meaningful mechanism.

  15. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Shih

    2010-01-01

    This paper presents a novel and effective method for facial expression recognition covering happiness, disgust, fear, anger, sadness, surprise, and the neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. An entropy criterion is applied to select effective Gabor features, a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA); it solves the small sample size and ill-posed problems suffered by QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experimental results demonstrate that our approach can accurately and robustly recognize facial expressions.
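
    The core RDA idea referenced here can be sketched in a few lines: each class covariance is shrunk towards the pooled covariance and towards a scaled identity matrix, interpolating between QDA and LDA. The following sketch is illustrative only; the regularization parameters (which the paper tunes with PSO), the toy data, and the equal-prior Gaussian classifier are assumptions rather than the paper's implementation.

    import numpy as np

    def rda_covariances(X, y, lam=0.5, gamma=0.1):
        # Regularized class covariances: shrink towards pooled (lam) and towards identity (gamma)
        pooled = np.cov(X, rowvar=False)
        covs = {}
        for c in np.unique(y):
            Sc = np.cov(X[y == c], rowvar=False)
            S = (1 - lam) * Sc + lam * pooled
            S = (1 - gamma) * S + gamma * np.trace(S) / S.shape[0] * np.eye(S.shape[0])
            covs[c] = S
        return covs

    def rda_predict(x, X, y, lam=0.5, gamma=0.1):
        # Gaussian class-conditional scoring with the RDA covariances (equal priors assumed)
        covs = rda_covariances(X, y, lam, gamma)
        scores = {}
        for c, S in covs.items():
            diff = x - X[y == c].mean(axis=0)
            _, logdet = np.linalg.slogdet(S)
            scores[c] = -0.5 * (diff @ np.linalg.solve(S, diff) + logdet)
        return max(scores, key=scores.get)

    # Toy two-class data in place of Gabor feature vectors
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 1.0, (30, 4)), rng.normal(2, 1.5, (30, 4))])
    y = np.array([0] * 30 + [1] * 30)
    print(rda_predict(np.array([1.8, 2.1, 1.9, 2.2]), X, y))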

  16. Impact of Chronic Hypercortisolemia on Affective Processing

    Science.gov (United States)

    Langenecker, Scott A.; Weisenbach, Sara L.; Giordani, Bruno; Briceno, Emily M.; GuidottiBreting, Leslie M.; Schallmo, Michael-Paul; Leon, Hadia M.; Noll, Douglas C.; Zubieta, Jon-Kar; Schteingart, David E.; Starkman, Monica N.

    2011-01-01

    Cushing syndrome (CS) is the classic condition of cortisol dysregulation, and cortisol dysregulation is the prototypic finding in Major Depressive Disorder (MDD). We hypothesized that subjects with active CS would show dysfunction in frontal and limbic structures relevant to affective networks, and also manifest poorer facial affect identification accuracy, a finding reported in MDD. Twenty-one patients with confirmed CS (20 ACTH-dependent and 1 ACTH-independent) were compared to 21 healthy control subjects. Identification of affective facial expressions (Facial Emotion Perception Test) was conducted in a 3 Tesla GE fMRI scanner using BOLD fMRI signal. The impact of disease (illness duration, current hormone elevation and degree of disruption of circadian rhythm), performance, and comorbid conditions secondary to hypercortisolemia were evaluated. CS patients made more errors in categorizing facial expressions and had less activation in left anterior superior temporal gyrus, a region important in emotion processing. CS patients showed higher activation in frontal, medial, and subcortical regions relative to controls. Two regions of elevated activation in CS, left middle frontal and lateral posterior/pulvinar areas, were positively correlated with accuracy in emotion identification in the CS group, reflecting compensatory recruitment. In addition, within the CS group, greater activation in left dorsal anterior cingulate was related to greater severity of hormone dysregulation. In conclusion, cortisol dysregulation in CS patients is associated with problems in accuracy of affective discrimination and altered activation of brain structures relevant to emotion perception, processing and regulation, similar to the performance decrements and brain regions shown to be dysfunctional in MDD. PMID:21787793

  17. Scattered Data Processing Approach Based on Optical Facial Motion Capture

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2013-01-01

    In recent years, animation reconstruction of facial expressions has become a popular research field in computer science, and motion capture-based facial expression reconstruction is now emerging in this field. Based on facial motion data obtained using a passive optical motion capture system, we propose a scattered data processing approach, which aims to solve the common problems of missing data and noise. To recover missing data, given the nonlinear relationships between the current missing marker and its neighbors, we propose an improved version of a previous method in which we use the motion of three muscles rather than one to recover the missing data. To reduce noise, we initially apply preprocessing to eliminate impulsive noise, before our proposed third-order quasi-uniform B-spline-based fitting method is used to reduce the remaining noise. Our experiments showed that the principles that underlie this method are simple and straightforward, and it delivered acceptable precision during reconstruction.
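
    In the spirit of the spline-based denoising described above, the sketch below smooths one noisy marker coordinate with a cubic smoothing spline from SciPy as a stand-in; the smoothing condition, frame rate, and simulated trajectory are assumptions, not the paper's implementation.

    import numpy as np
    from scipy.interpolate import splrep, splev

    def smooth_trajectory(t, coord, s):
        # Cubic smoothing spline over time; s bounds the summed squared residuals
        tck = splrep(t, coord, k=3, s=s)
        return splev(t, tck)

    # Simulated 1-D marker coordinate: a slow expression movement plus measurement noise
    rng = np.random.default_rng(3)
    t = np.linspace(0.0, 2.0, 120)                    # 2 s of capture at 60 fps
    clean = 5.0 * np.sin(2 * np.pi * 0.8 * t)
    noisy = clean + rng.normal(0.0, 0.3, t.size)
    smoothed = smooth_trajectory(t, noisy, s=t.size * 0.3 ** 2)   # s ~ n * noise variance
    # Mean squared error against the clean signal, before and after smoothing
    print(np.mean((noisy - clean) ** 2), np.mean((smoothed - clean) ** 2))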

  18. Holistic face processing can inhibit recognition of forensic facial composites.

    Science.gov (United States)

    McIntyre, Alex H; Hancock, Peter J B; Frowd, Charlie D; Langton, Stephen R H

    2016-04-01

    Facial composite systems help eyewitnesses to show the appearance of criminals. However, likenesses created by unfamiliar witnesses will not be completely accurate, and people familiar with the target can find them difficult to identify. Faces are processed holistically; we explore whether this impairs identification of inaccurate composite images and whether recognition can be improved. In Experiment 1 (n = 64) an imaging technique was used to make composites of celebrity faces more accurate and identification was contrasted with the original composite images. Corrected composites were better recognized, confirming that errors in production of the likenesses impair identification. The influence of holistic face processing was explored by misaligning the top and bottom parts of the composites (cf. Young, Hellawell, & Hay, 1987). Misalignment impaired recognition of corrected composites but identification of the original, inaccurate composites significantly improved. This effect was replicated with facial composites of noncelebrities in Experiment 2 (n = 57). We conclude that, like real faces, facial composites are processed holistically: recognition is impaired because unlike real faces, composites contain inaccuracies and holistic face processing makes it difficult to perceive identifiable features. This effect was consistent across composites of celebrities and composites of people who are personally familiar. Our findings suggest that identification of forensic facial composites can be enhanced by presenting composites in a misaligned format.

  19. Gender Differences in the Motivational Processing of Facial Beauty

    Science.gov (United States)

    Levy, Boaz; Ariely, Dan; Mazar, Nina; Chi, Won; Lukas, Scott; Elman, Igor

    2008-01-01

    Gender may be involved in the motivational processing of facial beauty. This study applied a behavioral probe, known to activate brain motivational regions, to healthy heterosexual subjects. Matched samples of men and women were administered two tasks: (a) key pressing to change the viewing time of average or beautiful female or male facial…

  20. Women's Greater Ability to Perceive Happy Facial Emotion Automatically: Gender Differences in Affective Priming

    OpenAIRE

    Uta-Susan Donges; Anette Kersting; Thomas Suslow

    2012-01-01

    There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral...

  1. Dissociating Face Identity and Facial Expression Processing Via Visual Adaptation

    Directory of Open Access Journals (Sweden)

    Hong Xu

    2012-10-01

    Face identity and facial expression are processed in two distinct neural pathways. However, most of the existing face adaptation literature studies them separately, despite the fact that they are two aspects of the same face. The current study conducted a systematic comparison between these two aspects by face adaptation, investigating how top- and bottom-half face parts contribute to the processing of face identity and facial expression. A real face (sad, “Adam”) and its two size-equivalent face parts (top- and bottom-half) were used as the adaptor in separate conditions. For face identity adaptation, the test stimuli were generated by morphing Adam's sad face with another person's sad face (“Sam”). For facial expression adaptation, the test stimuli were created by morphing Adam's sad face with his neutral face and morphing the neutral face with his happy face. In each trial, after exposure to the adaptor, observers indicated the perceived face identity or facial expression of the following test face via a key press. They were also tested in a baseline condition without adaptation. Results show that the top- and bottom-half face each generated a significant face identity aftereffect. However, the aftereffect from top-half face adaptation was much larger than that from the bottom-half face. On the contrary, only the bottom-half face generated a significant facial expression aftereffect. This dissociation of top- and bottom-half face adaptation suggests that face parts play different roles in face identity and facial expression. It thus provides further evidence for distributed systems of face perception.

  2. Identification and intensity of disgust: Distinguishing visual, linguistic and facial expressions processing in Parkinson disease.

    Science.gov (United States)

    Sedda, Anna; Petito, Sara; Guarino, Maria; Stracciari, Andrea

    2017-07-14

    Most studies to date show an impairment in recognizing facial displays of disgust in Parkinson disease. A general impairment in disgust processing in patients with Parkinson disease might adversely affect their social interactions, given the relevance of this emotion for human relations. However, despite the importance of faces, disgust is also expressed through other formats of visual stimuli such as sentences and visual images. The aim of our study was to explore disgust processing in a sample of patients affected by Parkinson disease, by means of various tests tackling not only facial recognition but also other formats of visual stimuli through which disgust can be recognized. Our results confirm that patients are impaired in recognizing facial displays of disgust. Further analyses show that patients are also impaired and slower for other facial expressions, with the sole exception of happiness. Notably, however, patients with Parkinson disease processed visual images and sentences as well as controls did. Our findings show a dissociation between different formats of visual stimuli of disgust, suggesting that Parkinson disease is not characterized by a general compromising of disgust processing, as often suggested. The involvement of the basal ganglia-frontal cortex system might spare some cognitive components of emotional processing, related to memory and culture, at least for disgust. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults.

    Science.gov (United States)

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development-The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions-angry, fearful, sad, happy, surprised, and disgusted-and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  4. The Child Affective Facial Expression (CAFE Set: Validity and Reliability from Untrained Adults

    Directory of Open Access Journals (Sweden)

    Vanessa eLoBue

    2015-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development—The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for 6 emotional facial expressions—angry, fearful, sad, happy, surprised, and disgusted—and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  5. Facial emotion processing in borderline personality disorder: a systematic review and meta-analysis.

    Science.gov (United States)

    Mitchell, Amy E; Dickens, Geoffrey L; Picchioni, Marco M

    2014-06-01

    A body of work has developed over the last 20 years that explores facial emotion perception in Borderline Personality Disorder (BPD). We identified 25 behavioural and functional imaging studies that tested facial emotion processing differences between patients with BPD and healthy controls through a database literature search. Despite methodological differences there is consistent evidence supporting a negative response bias to neutral and ambiguous facial expressions in patients. Findings for negative emotions are mixed with evidence from individual studies of an enhanced sensitivity to fearful expressions and impaired facial emotion recognition of disgust, while meta-analysis revealed no significant recognition impairments between BPD and healthy controls for any negative emotion. Mentalizing studies indicate that BPD patients are accurate at attributing mental states to complex social stimuli. Functional neuroimaging data suggest that the underlying neural substrate involves hyperactivation in the amygdala to affective facial stimuli, and altered activation in the anterior cingulate, inferior frontal gyrus and the superior temporal sulcus particularly during social emotion processing tasks. Future studies must address methodological inconsistencies, particularly variations in patients' key clinical characteristics and in the testing paradigms deployed.

  6. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks, such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation durations on the face, which reflect a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  9. Effects of Orientation on Recognition of Facial Affect

    Science.gov (United States)

    Cohen, M. M.; Mealey, J. B.; Hargens, Alan R. (Technical Monitor)

    1997-01-01

    The ability to discriminate facial features is often degraded when the orientation of the face and/or the observer is altered. Previous studies have shown that gross distortions of facial features can go unrecognized when the image of the face is inverted, as exemplified by the 'Margaret Thatcher' effect. This study examines how quickly erect and supine observers can distinguish between smiling and frowning faces that are presented at various orientations. The effects of orientation are of particular interest in space, where astronauts frequently view one another in orientations other than the upright. Sixteen observers viewed individual facial images of six people on a computer screen; on a given trial, the image was either smiling or frowning. Each image was viewed when it was erect and when it was rotated (rolled) by 45 degrees, 90 degrees, 135 degrees, 180 degrees, 225 degrees and 270 degrees about the line of sight. The observers were required to respond as rapidly and accurately as possible to identify if the face presented was smiling or frowning. Measures of reaction time were obtained when the observers were both upright and supine. Analyses of variance revealed that mean reaction time, which increased with stimulus rotation (F = 18.54, df 7/15, p < 0.001), was 22% longer when the faces were inverted than when they were erect, but that the orientation of the observer had no significant effect on reaction time (F = 1.07, df 1/15, p > .30). These data strongly suggest that the orientation of the image of a face on the observer's retina, but not its orientation with respect to gravity, is important in identifying the expression on the face.

  10. Processing of Facial Emotion in Bipolar Depression and Euthymia.

    Science.gov (United States)

    Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter

    2015-10-01

    Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigate facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities, and the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions, and the Emotional Hexagon test (Young, Perrett, Calder, Sprengelmeyer, & Ekman, 2002). Neither patient group differed significantly from controls on any measure of emotion perception/labeling. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations, including mood state, sample size, and the cognitive demands of the tasks, may contribute significantly to the variability in findings between studies.
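
    The "below the 10th percentile of the controls" criterion used above is simple to compute; the following sketch, on made-up accuracy scores, shows one way to derive the cutoff and the proportion of patients falling under it.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical total emotion-recognition accuracy scores (proportion correct)
controls = rng.normal(0.85, 0.05, 47)
patients = rng.normal(0.83, 0.06, 53)

cutoff = np.percentile(controls, 10)          # 10th percentile of the control distribution
below = np.mean(patients < cutoff) * 100      # % of patients under the control cutoff
print(f"cutoff = {cutoff:.3f}, patients below cutoff = {below:.1f}%")
```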

  11. Featural processing in recognition of emotional facial expressions.

    Science.gov (United States)

    Beaudry, Olivia; Roy-Charland, Annie; Perron, Melanie; Cormier, Isabelle; Tapp, Roxane

    2014-04-01

    The present study aimed to clarify the role played by the eye/brow and mouth areas in the recognition of the six basic emotions. In Experiment 1, accuracy was examined while participants viewed partial and full facial expressions; in Experiment 2, participants viewed full facial expressions while their eye movements were recorded. Recognition rates were consistent with previous research: happiness was highest and fear was lowest. The mouth and eye/brow areas were not equally important for the recognition of all emotions. More precisely, while the mouth was revealed to be important in the recognition of happiness and the eye/brow area of sadness, results are not as consistent for the other emotions. In Experiment 2, consistent with previous studies, the eyes/brows were fixated for longer periods than the mouth for all emotions. Again, variations occurred as a function of the emotions, the mouth having an important role in happiness and the eyes/brows in sadness. The general pattern of results for the other four emotions was inconsistent between the experiments as well as across different measures. The complexity of the results suggests that the recognition process of emotional facial expressions cannot be reduced to a simple feature processing or holistic processing for all emotions.

  12. Facial Affect Recognition Training Through Telepractice: Two Case Studies of Individuals with Chronic Traumatic Brain Injury

    Directory of Open Access Journals (Sweden)

    John Williamson

    2015-07-01

    The use of a modified Facial Affect Recognition (FAR) training to identify emotions was investigated with two case studies of adults with moderate to severe chronic (> five years) traumatic brain injury (TBI). The modified FAR training was administered via telepractice to target social communication skills. Therapy consisted of identifying emotions through static facial expressions, personally reflecting on those emotions, and identifying sarcasm and emotions within social stories and role-play. Pre- and post-therapy measures included static facial photos to identify emotion and the Prutting and Kirchner Pragmatic Protocol for social communication. Both participants with chronic TBI showed gains on identifying facial emotions on the static photos.

  13. Visual Scanning in the Recognition of Facial Affect in Traumatic Brain Injury

    Directory of Open Access Journals (Sweden)

    Suzane Vassallo

    2011-05-01

    We investigated the visual scanning strategy employed by a group of individuals with a severe traumatic brain injury (TBI) during a facial affect recognition task. Four males with a severe TBI were matched for age and gender with 4 healthy controls. Eye movements were recorded while pictures of static emotional faces were viewed (i.e., sad, happy, angry, disgusted, anxious, surprised). Groups were compared with respect to accuracy in labelling the emotional facial expression, reaction time, and the number and duration of fixations to internal (i.e., eyes + nose + mouth) and external (i.e., all remaining) regions of the stimulus. TBI participants demonstrated significantly reduced accuracy and increased latency in facial affect recognition. Further, they demonstrated no significant difference in the number or duration of fixations to internal versus external facial regions. Control participants, however, fixated more frequently and for longer periods of time upon internal facial features. Impaired visual scanning can contribute to inaccurate interpretation of facial expression, and this can disrupt interpersonal communication. The scanning strategy demonstrated by our TBI group appears more 'widespread' than that employed by their normal counterparts. Further work is required to elucidate the nature of the scanning strategy used and its potential variance in TBI.
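
    A minimal sketch of the internal- versus external-region fixation summary described above. The area-of-interest bounding box, column names, and fixation data are hypothetical stand-ins for a real eye-tracking export.

```python
import pandas as pd

# Hypothetical fixation export: one row per fixation with x/y (pixels) and duration (ms)
fixations = pd.DataFrame({
    "group":    ["TBI", "TBI", "control", "control", "control"],
    "x":        [310, 520, 330, 350, 342],
    "y":        [260, 120, 280, 300, 265],
    "duration": [180, 220, 240, 310, 200],
})

# Hypothetical internal-feature AOI (eyes + nose + mouth) as a single bounding box
INTERNAL = {"x_min": 280, "x_max": 420, "y_min": 220, "y_max": 380}

def in_internal(row):
    return (INTERNAL["x_min"] <= row.x <= INTERNAL["x_max"]
            and INTERNAL["y_min"] <= row.y <= INTERNAL["y_max"])

fixations["region"] = fixations.apply(in_internal, axis=1).map({True: "internal", False: "external"})

# Number and mean duration of fixations per group and region
summary = fixations.groupby(["group", "region"])["duration"].agg(["count", "mean"])
print(summary)
```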

  14. Haunted by a doppelgänger: irrelevant facial similarity affects rule-based judgments.

    Science.gov (United States)

    von Helversen, Bettina; Herzog, Stefan M; Rieskamp, Jörg

    2014-01-01

    Judging other people is a common and important task. Every day professionals make decisions that affect the lives of other people when they diagnose medical conditions, grant parole, or hire new employees. To prevent discrimination, professional standards require that decision makers render accurate and unbiased judgments solely based on relevant information. Facial similarity to previously encountered persons can be a potential source of bias. Psychological research suggests that people only rely on similarity-based judgment strategies if the provided information does not allow them to make accurate rule-based judgments. Our study shows, however, that facial similarity to previously encountered persons influences judgment even in situations in which relevant information is available for making accurate rule-based judgments and where similarity is irrelevant for the task and relying on similarity is detrimental. In two experiments in an employment context we show that applicants who looked similar to high-performing former employees were judged as more suitable than applicants who looked similar to low-performing former employees. This similarity effect was found despite the fact that the participants used the relevant résumé information about the applicants by following a rule-based judgment strategy. These findings suggest that similarity-based and rule-based processes simultaneously underlie human judgment.

  15. Facial Expression of Affect in Children with Cornelia de Lange Syndrome

    Science.gov (United States)

    Collis, L.; Moss, J.; Jutley, J.; Cornish, K.; Oliver, C.

    2008-01-01

    Background: Individuals with Cornelia de Lange syndrome (CdLS) have been reported to show comparatively high levels of flat and negative affect but there have been no empirical evaluations. In this study, we use an objective measure of facial expression to compare affect in CdLS with that seen in Cri du Chat syndrome (CDC) and a group of…

  16. Cognitive tasks during expectation affect the congruency ERP effects to facial expressions.

    Science.gov (United States)

    Lin, Huiyan; Schulz, Claudia; Straube, Thomas

    2015-01-01

    Expectancy congruency has been shown to modulate event-related potentials (ERPs) to emotional stimuli, such as facial expressions. However, it is unknown whether the congruency ERP effects to facial expressions can be modulated by cognitive manipulations during stimulus expectation. To this end, electroencephalography (EEG) was recorded while participants viewed (neutral and fearful) facial expressions. Each trial started with a cue, predicting a facial expression, followed by an expectancy interval without any cues and subsequently the face. In half of the trials, participants had to solve a cognitive task in which different letters were presented for target letter detection during the expectancy interval. Furthermore, facial expressions were congruent with the cues in 75% of all trials. ERP results revealed that for fearful faces, the cognitive task during expectation altered the congruency effect in N170 amplitude; congruent compared to incongruent fearful faces evoked larger N170 in the non-task condition but the congruency effect was not evident in the task condition. Regardless of facial expression, the congruency effect was generally altered by the cognitive task during expectation in P3 amplitude; the amplitudes were larger for incongruent compared to congruent faces in the non-task condition but the congruency effect was not shown in the task condition. The findings indicate that cognitive tasks during expectation reduce the processing of expectation and subsequently, alter congruency ERP effects to facial expressions.

  17. How children with facial differences are perceived by non-affected children and adolescents: perceiver effects on stereotypical attitudes.

    Science.gov (United States)

    Masnari, Ornella; Schiestl, Clemens; Weibel, Lisa; Wuttke, Franziska; Landolt, Markus A

    2013-09-01

    Children with a facial difference are presumed to be at risk of social stigmatization. The purposes of this study were twofold: (1) to assess the effect of facial differences on social perceptions by unaffected children and adolescents; and (2) to identify perceiver characteristics that predict stereotypical attitudes toward facial differences. Participants were 344 non-affected children and adolescents, ages 8-17 years. Participants rated digitally altered images of 12 children depicted either with or without a facial difference. Results show that participants attributed less favorable characteristics to children with a facial difference than to those without. Moreover, participants reported less willingness to interact with or befriend a child with a facial difference. Significant predictors of low discriminative attitudes were older participant age and previous contact with someone with a facial difference. Our data call attention to the need for public education programs targeted at reducing negative attitudes toward facial differences.

  18. Design of a Virtual Reality System for Affect Analysis in Facial Expressions (VR-SAAFE); application to schizophrenia.

    Science.gov (United States)

    Bekele, Esubalew; Bian, Dayi; Peterman, Joel; Park, Sohee; Sarkar, Nilanjan

    2016-07-14

    Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier that prevents successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding facial emotion processing impairments that may lead to poor social outcome in schizophrenia. We henceforth call it a VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control facial expressions of an avatar with or without accompanying verbal interaction, with an eye-tracker to quantitatively measure a participant's real-time gaze and a set of physiological sensors to infer his/her affective states, allowing in-depth understanding of the emotion recognition mechanism of patients with schizophrenia based on quantitative metrics. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine processing of the emotional faces. Preliminary results indicated that there were significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. The preliminary results underscore the utility of such a VR-based system that enables precise and quantitative assessment of social skill deficits in patients with schizophrenia.

  19. "MOODY BLUES": Affect Interpretation of Infant Facial Expressions and Negative Affect in Mothers of Preterm and Term Infants

    Directory of Open Access Journals (Sweden)

    Hedwig J.A. van Bakel

    2013-09-01

    Preterm birth places infants at increased risk for adverse developmental outcomes, with self- and affect regulation problems among the most important impairments. However, few studies have empirically examined maternal interpretation of infant affect in mothers of pre- and term infants. The current study examines how negative affect of mothers of preterm and term infants is associated with their interpretation of infant facial expressions. One hundred and sixty-eight mothers with their infants (64 term and 104 preterm) participated. Seven days after birth, mothers completed the UWIST Mood Adjective Checklist (UMACL; Matthews, Jones, & Chamberlain, 1990) to assess maternal negative affect. During a home visit, six months after birth, mothers additionally completed a task developed to measure infant affect interpretation (Interpreting Facial Expressions of Emotions through Looking at Pictures task, IFEEL pictures task; Emde, Osofsky, & Butterfield, 1993). Mothers of preterm infants reported more negative affect than mothers of term infants. However, the relationship between infant birth status (i.e., term vs. preterm) and maternal interpretation of infant facial expressions was moderated by the mother's own negative affectivity. Surprisingly, it was particularly mothers of term infants who also reported high levels of negative affect who interpreted infant affect significantly more negatively. Prematurity itself does not seem to be a dominant factor in determining maternal interpretation of infant affect, though maternal negative mood does. Both theoretical and practical implications of the results are discussed.

  20. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions.

    Science.gov (United States)

    Kujala, Miiamaaria V; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy, in particular, affects both the speed and intensity of rating dogs' emotional facial expressions.

  1. Facial expressions of emotions: recognition accuracy and affective reactions during late childhood.

    Science.gov (United States)

    Mancini, Giacomo; Agnoli, Sergio; Baldaro, Bruno; Bitti, Pio E Ricci; Surcinelli, Paola

    2013-01-01

    The present study examined the development of recognition ability and affective reactions to emotional facial expressions in a large sample of school-aged children (n = 504, ages 8-11 years). Specifically, the study aimed to investigate whether changes in emotion recognition ability and in the affective reactions associated with viewing facial expressions occur during late childhood. Moreover, because small but robust gender differences during late childhood have been proposed, the effects of gender on the development of emotion recognition and affective responses were examined. The results showed an overall increase in emotional face recognition ability from 8 to 11 years of age, particularly for neutral and sad expressions. However, the increase in sadness recognition was primarily due to the development of this recognition in boys. Moreover, our results indicate different developmental trends in males and females regarding the recognition of disgust. Last, developmental changes in affective reactions to emotional facial expressions were found. Whereas recognition ability increased over the developmental time period studied, affective reactions elicited by facial expressions were characterized by a decrease in arousal over the course of late childhood.

  2. Unfakeable facial configurations affect strategic choices in trust games with or without information about past behavior.

    Directory of Open Access Journals (Sweden)

    Constantin Rezlescu

    BACKGROUND: Many human interactions are built on trust, so widespread confidence in first impressions generally favors individuals with trustworthy-looking appearances. However, few studies have explicitly examined: (1) the contribution of unfakeable facial features to trust-based decisions, and (2) how these cues are integrated with information about past behavior. METHODOLOGY/PRINCIPAL FINDINGS: Using highly controlled stimuli and an improved experimental procedure, we show that unfakeable facial features associated with the appearance of trustworthiness attract higher investments in trust games. The facial trustworthiness premium is large for decisions based solely on faces, with trustworthy identities attracting 42% more money (Study 1), and remains significant though reduced to 6% when reputational information is also available (Study 2). The face trustworthiness premium persists with real (rather than virtual) currency and when higher payoffs are at stake (Study 3). CONCLUSIONS/SIGNIFICANCE: Our results demonstrate that cooperation may be affected not only by controllable appearance cues (e.g., clothing, facial expressions), as shown previously, but also by features that are impossible to mimic (e.g., individual facial structure). This unfakeable face trustworthiness effect is not limited to the rare situations where people lack any information about their partners, but survives in richer environments where relevant details about partner past behavior are available.

  3. Appraisals Generate Specific Configurations of Facial Muscle Movements in a Gambling Task: Evidence for the Component Process Model of Emotion.

    Science.gov (United States)

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R

    2015-01-01

    Scherer's Component Process Model provides a theoretical framework for research on the production mechanism of emotion and facial emotional expression. The model predicts that appraisal results drive facial expressions, which unfold sequentially and cumulatively over time. In two experiments, we examined facial muscle activity changes (via facial electromyography recordings over the corrugator, cheek, and frontalis regions) in response to events in a gambling task. These events were experimentally manipulated feedback stimuli which presented simultaneous information directly affecting goal conduciveness (gambling outcome: win, loss, or break-even) and power appraisals (Experiment 1 and 2), as well as control appraisal (Experiment 2). We repeatedly found main effects of goal conduciveness (starting ~600 ms), and power appraisals (starting ~800 ms after feedback onset). Control appraisal main effects were inconclusive. Interaction effects of goal conduciveness and power appraisals were obtained in both experiments (Experiment 1: over the corrugator and cheek regions; Experiment 2: over the frontalis region) suggesting amplified goal conduciveness effects when power was high in contrast to invariant goal conduciveness effects when power was low. Also an interaction of goal conduciveness and control appraisals was found over the cheek region, showing differential goal conduciveness effects when control was high and invariant effects when control was low. These interaction effects suggest that the appraisal of having sufficient control or power affects facial responses towards gambling outcomes. The result pattern suggests that corrugator and frontalis regions are primarily related to cognitive operations that process motivational pertinence, whereas the cheek region would be more influenced by coping implications. Our results provide first evidence demonstrating that cognitive-evaluative mechanisms related to goal conduciveness, control, and power appraisals affect

  4. Facial emotional processing deficits in long-term HIV-suppressed patients

    Science.gov (United States)

    Gonzalez-Baeza, Alicia; Perez-Valero, Ignacio; Carvajal-Molina, Fernando; Bayon, Carmen; Montes-Ramirez, Marisa; Ignacio Bernardino, Jose; Arribas, Jose R

    2014-01-01

    Introduction Emotional processing is basic for social behaviour. We examine, for the first time, facial emotion processing in long-term HIV-suppressed patients. Materials and Methods Cross-sectional study comparing (ANOVA) six facial emotional processing tasks (two discrimination, two memory and two recognition) between HIV-suppressed patients (HIV+) on effective antiretroviral therapy (>2 years) and matched (age, gender) healthy controls (HCs). Accuracy in the recognition of basic emotions (neutral, happiness, sadness, anger and fear) in each recognition task was also compared (Mann–Whitney U test) between HIV+ and HCs. In the subset of HIV+, we evaluated which factors were associated with impaired recognition of basic emotions (accuracy below 50%) by multiple logistic regression analysis. Overall performance in all six emotional tasks was separately compared between neurocognitively impaired and non-impaired HIV+. Results We included 107 HIV+, mainly Caucasian (89%) male (72%) with a mean age of 47.4 years, neurocognitively non-impaired (75.5%), and 40 HCs. Overall discrimination (p=0.38), memory (p=0.65) and recognition tasks (p=0.29) were similar in both groups. However, HIV+ had lower sadness recognition in both recognition tasks and lower sadness, anger and fear recognition in the facial affect selection task (Figure 1). Only estimated pre-morbid functioning (WAIS-III-R vocabulary subtest score) was significantly associated with sadness (1.99 [95% CI 1.18–3.58]; p=0.01) and anger recognition deficits (2.06 [95% CI 1.14–3.45]; p=0.015) in the facial affect selection task. In HIV+ individuals, neurocognitive impairment was associated with worse memory task results (p<0.01, d=0.88; p<0.01, d=1.48). Conclusions We did not find differences in overall emotion processing between HIV+ and HIV- individuals. However, we found particular recognition deficits in the entire HIV+ sample. Estimated pre-morbid functioning was associated with sadness and anger
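
    The multiple logistic regression reported above (impaired recognition, defined as accuracy below 50%, regressed on estimated pre-morbid functioning) can be sketched with statsmodels as below, yielding odds ratios with 95% confidence intervals. The data and predictor names are invented; this illustrates the analysis type, not a reproduction of the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 107
# Hypothetical predictors: standardized vocabulary score and age
vocab = rng.normal(0, 1, n)
age = rng.normal(47, 8, n)
# Hypothetical outcome: impaired sadness recognition (accuracy < 50%), more likely with lower vocabulary
logit_p = -1.0 - 0.7 * vocab
impaired = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(pd.DataFrame({"vocab": vocab, "age": age}))
fit = sm.Logit(impaired, X).fit(disp=False)

odds_ratios = pd.DataFrame({"OR": np.exp(fit.params),
                            "CI 2.5%": np.exp(fit.conf_int()[0]),
                            "CI 97.5%": np.exp(fit.conf_int()[1]),
                            "p": fit.pvalues})
print(odds_ratios)
```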

  5. Facial emotional processing deficits in long-term HIV-suppressed patients

    Directory of Open Access Journals (Sweden)

    Alicia Gonzalez-Baeza

    2014-11-01

    Introduction: Emotional processing is basic for social behaviour. We examine, for the first time, facial emotion processing in long-term HIV-suppressed patients. Materials and Methods: Cross-sectional study comparing (ANOVA) six facial emotional processing tasks (two discrimination, two memory and two recognition) between HIV-suppressed patients (HIV+) on effective antiretroviral therapy (>2 years) and matched (age, gender) healthy controls (HCs). Accuracy in the recognition of basic emotions (neutral, happiness, sadness, anger and fear) in each recognition task was also compared (Mann–Whitney U test) between HIV+ and HCs. In the subset of HIV+, we evaluated which factors were associated with impaired recognition of basic emotions (accuracy below 50%) by multiple logistic regression analysis. Overall performance in all six emotional tasks was separately compared between neurocognitively impaired and non-impaired HIV+. Results: We included 107 HIV+, mainly Caucasian (89%) male (72%) with a mean age of 47.4 years, neurocognitively non-impaired (75.5%), and 40 HCs. Overall discrimination (p=0.38), memory (p=0.65) and recognition tasks (p=0.29) were similar in both groups. However, HIV+ had lower sadness recognition in both recognition tasks and lower sadness, anger and fear recognition in the facial affect selection task (Figure 1). Only estimated pre-morbid functioning (WAIS-III-R vocabulary subtest score) was significantly associated with sadness (1.99 [95% CI 1.18–3.58]; p=0.01) and anger recognition deficits (2.06 [95% CI 1.14–3.45]; p=0.015) in the facial affect selection task. In HIV+ individuals, neurocognitive impairment was associated with worse memory task results (p<0.01, d=0.88; p<0.01, d=1.48). Conclusions: We did not find differences in overall emotion processing between HIV+ and HIV- individuals. However, we found particular recognition deficits in the entire HIV+ sample. Estimated pre-morbid functioning was associated with sadness and anger

  6. Biological Computation Indexes of Brain Oscillations in Unattended Facial Expression Processing Based on Event-Related Synchronization/Desynchronization

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2016-01-01

    Estimation of human emotions from electroencephalogram (EEG) signals plays a vital role in affective Brain Computer Interface (BCI). The present study investigated the different event-related synchronization (ERS) and event-related desynchronization (ERD) patterns of typical brain oscillations in processing facial expressions under a nonattentional condition. The results show that the lower-frequency bands are mainly used to update facial expressions and distinguish the deviant stimuli from the standard ones, whereas the higher-frequency bands are relevant to automatically processing different facial expressions. Accordingly, we relate each brain oscillation to the processing of unattended facial expressions by means of the ERD and ERS measures. This research first reveals the contributions of each frequency band to the comprehension of facial expressions in the preattentive stage. It also evidences that participants have emotional experience under a nonattentional condition. Therefore, the user's emotional state under a nonattentional condition can be recognized in real time from the ERD/ERS computation indexes of the different frequency bands of brain oscillations, which can be used in affective BCI to provide the user with more natural and friendly ways of interacting.
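
    The ERD/ERS index itself is a simple band-power ratio relative to a baseline. Below is a minimal sketch of how such an index is commonly computed per frequency band; the sampling rate, band edges, baseline window, and data are assumptions, not taken from the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # sampling rate (Hz), assumed

def band_power(epochs, low, high, fs=FS):
    """Band-pass filter single-trial epochs (trials x samples), square, average over trials."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    return (filtered ** 2).mean(axis=0)          # mean instantaneous power across trials

def erd_ers_percent(epochs, low, high, baseline, fs=FS):
    """Classic ERD/ERS index: (A - R) / R * 100, with R the mean baseline power."""
    power = band_power(epochs, low, high, fs)
    r = power[baseline[0]:baseline[1]].mean()    # reference power in the baseline window
    return (power - r) / r * 100                 # negative = ERD, positive = ERS

# Hypothetical single-channel data: 40 trials x 500 samples (2 s at 250 Hz),
# with the first 125 samples (0.5 s) treated as the pre-stimulus baseline.
rng = np.random.default_rng(4)
epochs = rng.normal(0, 1, (40, 500))
alpha_index = erd_ers_percent(epochs, 8, 13, baseline=(0, 125))
print(alpha_index.shape, alpha_index[:5])
```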

  7. Facial Features Can Induce Emotion: Evidence from Affective Priming Tasks

    Directory of Open Access Journals (Sweden)

    Chia-Chen Wu

    2011-05-01

    Our previous study found that schematic faces with direct gazes, with mouths, with horizontal oval eyes, or without noses tend to be perceived as expressing negative emotion. In this study we further explored these factors with an affective priming task. Faces were used as primes, and positive or negative words as probes. The task was to judge the valence of the probe. If the faces could induce emotions, a target word with the same emotional valence should be judged faster than one with the opposite valence (the congruency effect). Experiment 1 used the faces rated most positive and most negative in the previous study as primes. The positive faces had vertical oval eyes and no mouth, while the negative faces had horizontal eyes and a mouth. Results from 34 participants showed that those faces indeed elicited congruency effects. Experiment 2 manipulated gaze direction (N = 16). After the task the participants were asked to rate the prime faces. According to their ratings, faces with direct gaze were perceived as positive and elicited a congruency effect with positive words in the affective priming task. Our data thus support the conjecture that the shape of the eyes, the presence of a mouth, and gaze direction can induce emotion.

  8. Motor, affective and cognitive empathy in adolescence: Interrelations between facial electromyography and self-reported trait and state measures.

    Science.gov (United States)

    Van der Graaff, Jolien; Meeus, Wim; de Wied, Minet; van Boxtel, Anton; van Lier, Pol A C; Koot, Hans M; Branje, Susan

    2016-01-01

    This study examined interrelations of trait and state empathy in an adolescent sample. Self-reported affective trait empathy and cognitive trait empathy were assessed during a home visit. During a test session at the university, motor empathy (facial electromyography), and self-reported affective and cognitive state empathy were assessed in response to empathy-inducing film clips portraying happiness and sadness. Adolescents who responded with stronger motor empathy consistently reported higher affective state empathy. Adolescents' motor empathy was also positively related to cognitive state empathy, either directly or indirectly via affective state empathy. Whereas trait empathy was consistently, but modestly, related to state empathy with sadness, for state empathy with happiness few trait-state associations were found. Together, the findings provide support for the notion that empathy is a multi-faceted phenomenon. Motor, affective and cognitive empathy seem to be related processes, each playing a different role in the ability to understand and share others' feelings.

  9. Revisiting the Relationship between the Processing of Gaze Direction and the Processing of Facial Expression

    Science.gov (United States)

    Ganel, Tzvi

    2011-01-01

    There is mixed evidence on the nature of the relationship between the perception of gaze direction and the perception of facial expressions. Major support for shared processing of gaze and expression comes from behavioral studies that showed that observers cannot process expression or gaze and ignore irrelevant variations in the other dimension.…

  10. Reaction Time of Facial Affect Recognition in Asperger's Disorder for Cartoon and Real, Static and Moving Faces

    Science.gov (United States)

    Miyahara, Motohide; Bray, Anne; Tsujii, Masatsugu; Fujita, Chikako; Sugiyama, Toshiro

    2007-01-01

    This study used a choice reaction-time paradigm to test the perceived impairment of facial affect recognition in Asperger's disorder. Twenty teenagers with Asperger's disorder and 20 controls were compared with respect to the latency and accuracy of response to happy or disgusted facial expressions, presented in cartoon or real images and in…

  11. Reaction Time of Facial Affect Recognition in Asperger's Disorder for Cartoon and Real, Static and Moving Faces

    Science.gov (United States)

    Miyahara, Motohide; Bray, Anne; Tsujii, Masatsugu; Fujita, Chikako; Sugiyama, Toshiro

    2007-01-01

    This study used a choice reaction-time paradigm to test the perceived impairment of facial affect recognition in Asperger's disorder. Twenty teenagers with Asperger's disorder and 20 controls were compared with respect to the latency and accuracy of response to happy or disgusted facial expressions, presented in cartoon or real images and in…

  12. Processing of individual items during ensemble coding of facial expressions

    Directory of Open Access Journals (Sweden)

    Huiyun Li

    2016-09-01

    There is growing evidence that human observers are able to extract the mean emotion or other types of information from a set of faces. The most intriguing aspect of this phenomenon is that observers often fail to identify or form a representation for individual faces in a face set. However, most of these results were based on judgments under limited processing resources. We examined a wider range of exposure times and observed how the relationship between the extraction of a mean and the representation of individual facial expressions changed. The results showed that with an exposure time of 50 milliseconds for the faces, observers were more sensitive to the mean representation than to individual representations, replicating the typical findings in the literature. With longer exposure times, however, observers were able to extract both individual and mean representations more accurately. Furthermore, diffusion model analysis revealed that the mean representation is also more prone to suffer from the noise accumulated in redundant processing time and leads to a more conservative decision bias, whereas individual representations seem more resistant to this noise. The results suggest that the encoding of emotional information from multiple faces may take two forms: single-face processing and crowd-face processing.

  13. Priming effects on the N400 in the affective priming paradigm with facial expressions of emotion.

    Science.gov (United States)

    Aguado, Luis; Dieguez-Risco, Teresa; Méndez-Bértolo, Constantino; Pozo, Miguel A; Hinojosa, José A

    2013-06-01

    We studied the effect of facial expression primes on the evaluation of target words through a variant of the affective priming paradigm. In order to make the affective valence of the faces irrelevant to the task, the participants were assigned a double prime-target task in which they were unpredictably asked either to identify the gender of the face or to evaluate whether the word was pleasant or unpleasant. Behavioral and electrophysiological (event-related potential, or ERP) indices of affective priming were analyzed. Temporal and spatial versions of principal components analyses were used to detect and quantify those ERP components associated with affective priming. Although no significant behavioral priming was observed, electrophysiological indices showed a reverse priming effect, in the sense that the amplitude of the N400 was higher in response to congruent than to incongruent negative words. Moreover, a late positive potential (LPP), peaking around 700 ms, was sensitive to affective valence but not to prime-target congruency. This pattern of results is consistent with previous accounts of ERP effects in the affective priming paradigm that have linked the LPP with evaluative priming and the N400 with semantic priming. Our proposed explanation of the N400 priming effects obtained in the present study is based on two assumptions: a double check of affective stimuli in terms of valence and specific emotion content, and the differential specificities of facial expressions of positive and negative emotions.
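
    As a simplified illustration of quantifying an N400 congruency effect (not the temporal/spatial principal-components approach used in the study), one can compare mean amplitudes in an N400 time window between conditions. The window, sampling rate, and data in the sketch below are assumptions.

```python
import numpy as np
from scipy.stats import ttest_rel

FS = 500                 # sampling rate (Hz), assumed
T0 = 0.1                 # epoch starts 100 ms before stimulus onset, assumed
WINDOW = (0.35, 0.45)    # N400 measurement window in seconds, assumed

def mean_amplitude(erp, window, fs=FS, t0=T0):
    """Mean amplitude of subject-level ERPs (subjects x samples) in a latency window."""
    i0 = int((window[0] + t0) * fs)
    i1 = int((window[1] + t0) * fs)
    return erp[:, i0:i1].mean(axis=1)

# Hypothetical subject-average ERPs at a centro-parietal electrode for the two conditions
rng = np.random.default_rng(5)
n_subj, n_samp = 24, 550
congruent = rng.normal(0, 1, (n_subj, n_samp))
incongruent = rng.normal(0, 1, (n_subj, n_samp))

n400_c = mean_amplitude(congruent, WINDOW)
n400_i = mean_amplitude(incongruent, WINDOW)
t, p = ttest_rel(n400_c, n400_i)
print(f"congruent mean = {n400_c.mean():.2f} uV, incongruent mean = {n400_i.mean():.2f} uV, "
      f"t({n_subj - 1}) = {t:.2f}, p = {p:.3f}")
```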

  14. Brain regions involved in processing facial identity and expression are differentially selective for surface and edge information.

    Science.gov (United States)

    Harris, Richard J; Young, Andrew W; Andrews, Timothy J

    2014-08-15

    Although different brain regions are widely considered to be involved in the recognition of facial identity and expression, it remains unclear how these regions process different properties of the visual image. Here, we ask how surface-based reflectance information and edge-based shape cues contribute to the perception and neural representation of facial identity and expression. Contrast-reversal was used to generate images in which normal contrast relationships across the surface of the image were disrupted, but edge information was preserved. In a behavioural experiment, contrast-reversal significantly attenuated judgements of facial identity, but only had a marginal effect on judgements of expression. An fMR-adaptation paradigm was then used to ask how brain regions involved in the processing of identity and expression responded to blocks comprising all normal, all contrast-reversed, or a mixture of normal and contrast-reversed faces. Adaptation in the posterior superior temporal sulcus--a region directly linked with processing facial expression--was relatively unaffected by mixing normal with contrast-reversed faces. In contrast, the response of the fusiform face area--a region linked with processing facial identity--was significantly affected by contrast-reversal. These results offer a new perspective on the reasons underlying the neural segregation of facial identity and expression in which brain regions involved in processing invariant aspects of faces, such as identity, are very sensitive to surface-based cues, whereas regions involved in processing changes in faces, such as expression, are relatively dependent on edge-based cues.

  15. Endogenous testosterone levels are associated with neural activity in men with schizophrenia during facial emotion processing.

    Science.gov (United States)

    Ji, Ellen; Weickert, Cynthia Shannon; Lenroot, Rhoshel; Catts, Stanley V; Vercammen, Ans; White, Christopher; Gur, Raquel E; Weickert, Thomas W

    2015-06-01

    Growing evidence suggests that testosterone may play a role in the pathophysiology of schizophrenia given that testosterone has been linked to cognition and negative symptoms in schizophrenia. Here, we determine the extent to which serum testosterone levels are related to neural activity in affective processing circuitry in men with schizophrenia. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as 32 healthy controls and 26 people with schizophrenia performed a facial emotion identification task. Whole brain analyses were performed to determine regions of differential activity between groups during processing of angry versus non-threatening faces. A follow-up ROI analysis using a regression model in a subset of 16 healthy men and 16 men with schizophrenia was used to determine the extent to which serum testosterone levels were related to neural activity. Healthy controls displayed significantly greater activation than people with schizophrenia in the left inferior frontal gyrus (IFG). There was no significant difference in circulating testosterone levels between healthy men and men with schizophrenia. Regression analyses between activation in the IFG and circulating testosterone levels revealed a significant positive correlation in men with schizophrenia (r=.63, p=.01) and no significant relationship in healthy men. This study provides the first evidence that circulating serum testosterone levels are related to IFG activation during emotion face processing in men with schizophrenia but not in healthy men, which suggests that testosterone levels modulate neural processes relevant to facial emotion processing that may interfere with social functioning in men with schizophrenia.

  16. Reduced capacity in automatic processing of facial expression in restrictive anorexia nervosa and obesity

    NARCIS (Netherlands)

    Cserjesi, Renata; Vermeulen, Nicolas; Lenard, Laszlo; Luminet, Olivier

    2011-01-01

    There is growing evidence that disordered eating is associated with facial expression recognition and emotion processing problems. In this study, we investigated the question of whether anorexia and obesity occur on a continuum of attention bias towards negative facial expressions in comparison with

  17. Discriminative shared Gaussian processes for multiview and view-invariant facial expression recognition.

    Science.gov (United States)

    Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja

    2015-01-01

    Images of facial expressions are often captured from various views as a result of either head movements or variable camera position. Existing methods for multiview and/or view-invariant facial expression recognition typically perform classification of the observed expression using either classifiers learned separately for each view or a single classifier learned for all views. However, these approaches ignore the fact that different views of a facial expression are just different manifestations of the same facial expression. By accounting for this redundancy, we can design more effective classifiers for the target task. To this end, we propose a discriminative shared Gaussian process latent variable model (DS-GPLVM) for multiview and view-invariant classification of facial expressions from multiple views. In this model, we first learn a discriminative manifold shared by multiple views of a facial expression. Subsequently, we perform facial expression classification in the expression manifold. Finally, classification of an observed facial expression is carried out either in the view-invariant manner (using only a single view of the expression) or in the multiview manner (using multiple views of the expression). The proposed model can also be used to perform fusion of different facial features in a principled manner. We validate the proposed DS-GPLVM on both posed and spontaneously displayed facial expressions from three publicly available datasets (MultiPIE, labeled face parts in the wild, and static facial expressions in the wild). We show that this model outperforms the state-of-the-art methods for multiview and view-invariant facial expression classification, and several state-of-the-art methods for multiview learning and feature fusion.
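
    The abstract contrasts two conventional baselines, a separate classifier per view versus a single classifier pooled over views, with the proposed shared-manifold model. The sketch below implements only those two baselines with scikit-learn on made-up features; it is not an implementation of DS-GPLVM.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_per_view, n_feat, n_views, n_classes = 200, 50, 3, 6

# Hypothetical facial-expression features captured from several camera views
X = rng.normal(0, 1, (n_views * n_per_view, n_feat))
y = rng.integers(0, n_classes, n_views * n_per_view)          # expression labels
view = np.repeat(np.arange(n_views), n_per_view)              # which view each sample comes from

# Baseline 1: one classifier trained per view
for v in range(n_views):
    mask = view == v
    acc = cross_val_score(LogisticRegression(max_iter=1000), X[mask], y[mask], cv=5).mean()
    print(f"view {v}: per-view accuracy = {acc:.3f}")

# Baseline 2: a single classifier pooled over all views
acc_pooled = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"pooled-views accuracy = {acc_pooled:.3f}")
```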

  18. Implicit processing of visual emotions is affected by sound-induced affective states and individual affective traits.

    Directory of Open Access Journals (Sweden)

    Tiziana Quarto

    The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. 32 healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence, or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.

  19. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    OpenAIRE

    Johnston, Patrick J.; Kaufman, Jordy; Bajic, Julie; Sercombe, Alicia; Michie, Patricia T.; Karayanidis, Frini

    2011-01-01

    Most developmental studies of emotional face processing to date have focused on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. The three matching tasks were devel...

  20. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    OpenAIRE

    Patrick eJohnston; Jordy eKaufman; Julie eBajic; Alicia eSercombe; Patricia eMichie; Frini eKarayanidis

    2011-01-01

    Most developmental studies of emotional face processing to date have focussed on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. The three matching tasks were de...

  1. Affective engagement for facial expressions and emotional scenes: the influence of social anxiety.

    Science.gov (United States)

    Wangelin, Bethany C; Bradley, Margaret M; Kastner, Anna; Lang, Peter J

    2012-09-01

    Pictures of emotional facial expressions or natural scenes are often used as cues in emotion research. We examined the extent to which these different stimuli engage emotion and attention, and whether the presence of social anxiety symptoms influences responding to facial cues. Sixty participants reporting high or low social anxiety viewed pictures of angry, neutral, and happy faces, as well as violent, neutral, and erotic scenes, while skin conductance and event-related potentials were recorded. Acoustic startle probes were presented throughout picture viewing, and blink magnitude, probe P3 and reaction time to the startle probe also were measured. Results indicated that viewing emotional scenes prompted strong reactions in autonomic, central, and reflex measures, whereas pictures of faces were generally weak elicitors of measurable emotional response. However, higher social anxiety was associated with modest electrodermal changes when viewing angry faces and mild startle potentiation when viewing either angry or smiling faces, compared to neutral. Taken together, pictures of facial expressions do not strongly engage fundamental affective reactions, but these cues appeared to be effective in distinguishing between high and low social anxiety participants, supporting their use in anxiety research. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Neural correlates of affective priming effects based on masked facial emotion: an fMRI study.

    Science.gov (United States)

    Suslow, Thomas; Kugel, Harald; Ohrmann, Patricia; Stuhrmann, Anja; Grotegerd, Dominik; Redlich, Ronny; Bauer, Jochen; Dannlowski, Udo

    2013-03-30

    Affective priming refers to the phenomenon that subliminal presentation of facial emotion biases subsequent evaluation of a neutral object in the direction of the prime. The aim of the present study was to specify the neural correlates of evaluative shifts elicited by facial emotion shown below the threshold of conscious perception. We tested the hypotheses that the amygdala is involved in negative priming and that the nucleus accumbens participates in positive priming. In addition, exploratory whole brain correlation analyses were conducted. During 3T fMRI scanning, pictures of sad, happy, and neutral facial expression masked by neutral faces were presented to 110 healthy adults who had to judge valence of masks on a four-point scale. There was evidence for significant negative priming based on sad faces. A correlation was observed between amygdala activation and negative priming. Activation in medial, middle, and superior frontal and middle temporo-occipital areas, and insula was also associated with negative priming. No significant priming based on happy faces was found. However, nucleus accumbens activation to happy faces correlated with the positive priming score. The present findings confirm that the amygdala but also other brain regions, especially the medial frontal cortex, appear involved in automatically elicited negative evaluative shifts.

  3. Processing facial expressions of emotion: upright vs. inverted images

    Directory of Open Access Journals (Sweden)

    David eBimler

    2013-02-01

    We studied discrimination of briefly presented Upright vs. Inverted emotional facial expressions (FEs), hypothesising that inversion would impair emotion decoding by disrupting holistic FE processing. Stimuli were photographs of seven emotion prototypes, of a male and female poser (Ekman and Friesen, 1976), and eight intermediate morphs in each set. Subjects made speeded Same/Different judgements of emotional content for all Upright (U) or Inverted (I) pairs of FEs, presented for 500 ms, 100 times each pair. Signal Detection Theory revealed the sensitivity measure d' to be slightly but significantly higher for the Upright FEs. In further analysis using multidimensional scaling (MDS), percentages of Same judgements were taken as an index of pairwise perceptual similarity, separately for U and I presentation modes. The outcome was a 4D 'emotion expression space', with FEs represented as points and the dimensions identified as Happy–Sad, Surprise/Fear, Disgust and Anger. The solutions for U and I FEs were compared by means of cophenetic and canonical correlation, Procrustes analysis and weighted-Euclidean analysis of individual differences. Differences in discrimination produced by inverting FE stimuli were found to be small and manifested as minor changes in the MDS structure or weights of the dimensions. Solutions differed substantially more between the two posers, however. Notably, for stimuli containing elements of Happiness (whether U or I), the MDS structure revealed some signs of categorical perception, indicating that mouth curvature – the dominant feature conveying Happiness – is visually salient and receives early processing. The findings suggest that for briefly presented FEs, Same/Different decisions are dominated by low-level visual analysis of abstract patterns of lightness and edge filters, but also reflect emerging featural analysis. These analyses, insensitive to face orientation, enable initial positive/negative Valence
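
    Two analysis steps mentioned above are easy to reproduce in outline: the signal-detection sensitivity d' from hit and false-alarm rates, and a metric MDS embedding of a pairwise "percentage Same" similarity matrix. All numbers in the sketch below are invented, and a 2D solution is used instead of the study's 4D space.

```python
import numpy as np
from scipy.stats import norm
from sklearn.manifold import MDS

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

print(f"d' upright  = {d_prime(0.82, 0.20):.2f}")
print(f"d' inverted = {d_prime(0.79, 0.22):.2f}")

# Hypothetical matrix of 'percentage Same' judgments for 5 facial expressions,
# converted to dissimilarities and embedded in 2D.
same_pct = np.array([
    [100,  60,  20,  15,  10],
    [ 60, 100,  35,  20,  12],
    [ 20,  35, 100,  55,  30],
    [ 15,  20,  55, 100,  48],
    [ 10,  12,  30,  48, 100],
], dtype=float)
dissim = 1 - same_pct / 100.0

coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dissim)
print(coords)
```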

  4. Facial color processing in the face-selective regions: an fMRI study.

    Science.gov (United States)

    Nakajima, Kae; Minami, Tetsuto; Tanabe, Hiroki C; Sadato, Norihiro; Nakauchi, Shigeki

    2014-09-01

    Facial color is important information for social communication, as it provides important clues for recognizing a person's emotion and health condition. Our previous EEG study suggested that the N170 at the left occipito-temporal site is related to facial color processing (Nakajima et al., 2012, Neuropsychologia 50:2499-2505). However, because of the low spatial resolution of EEG, which brain regions are involved in facial color processing remains controversial. In the present study, we examined the neural substrates of facial color processing using functional magnetic resonance imaging (fMRI). We measured brain activity from 25 subjects during the presentation of natural- and bluish-colored faces and their scrambled images. The bilateral fusiform face area (FFA) and occipital face area (OFA) were localized by the contrast of natural-colored faces versus natural-colored scrambled images. Moreover, region of interest (ROI) analysis showed that the left FFA was sensitive to facial color, whereas the right FFA and the right and left OFA were insensitive to facial color. In combination with our previous EEG results, these data suggest that the left FFA may play an important role in facial color processing.

  5. The influence of a working memory task on affective perception of facial expressions.

    Science.gov (United States)

    Lim, Seung-Lark; Bruce, Amanda S; Aupperle, Robin L

    2014-01-01

    In a dual-task paradigm, participants performed a spatial location working memory task and a forced two-choice perceptual decision task (neutral vs. fearful) with gradually morphed emotional faces (neutral ∼ fearful). Task-irrelevant word distractors (negative, neutral, and control) were experimentally manipulated during spatial working memory encoding. We hypothesized that, if affective perception is influenced by concurrent cognitive load using a working memory task, task-irrelevant emotional distractors would bias subsequent perceptual decision-making on ambiguous facial expression. We found that when either neutral or negative emotional words were presented as task-irrelevant working-memory distractors, participants more frequently reported fearful face perception - but only at the higher emotional intensity levels of morphed faces. Also, the affective perception bias due to negative emotional distractors correlated with a decrease in working memory performance. Taken together, our findings suggest that concurrent working memory load by task-irrelevant distractors has an impact on affective perception of facial expressions.

  6. Does gaze direction modulate facial expression processing in children with autism spectrum disorder?

    Science.gov (United States)

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2009-01-01

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9-14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent motivational tendency (i.e., an avoidant facial expression with averted eye gaze) than those with an incongruent motivational tendency. Children with ASD (9-14 years old; n = 14) were not affected by the gaze direction of facial stimuli. This finding was replicated in Experiment 2, which presented only the eye region of the face to typically developing children (n = 10) and children with ASD (n = 10). These results demonstrated that children with ASD do not encode and/or integrate multiple communicative signals based on their affective or motivational tendency.

  7. Recognition of facial emotion and affective prosody in children with ASD (+ADHD) and their unaffected siblings.

    Science.gov (United States)

    Oerlemans, Anoek M; van der Meer, Jolanda M J; van Steijn, Daphne J; de Ruiter, Saskia W; de Bruijn, Yvette G E; de Sonneville, Leo M J; Buitelaar, Jan K; Rommelse, Nanda N J

    2014-05-01

    Autism is a highly heritable and clinically heterogeneous neuropsychiatric disorder that frequently co-occurs with other psychopathologies, such as attention-deficit/hyperactivity disorder (ADHD). One approach to parsing this heterogeneity is to form more homogeneous subgroups of autism spectrum disorder (ASD) patients based on their underlying, heritable cognitive vulnerabilities (endophenotypes). Emotion recognition is a likely endophenotypic candidate for ASD and possibly for ADHD. Therefore, this study aimed to examine whether emotion recognition is a viable endophenotypic candidate for ASD and to assess the impact of comorbid ADHD in this context. A total of 90 children with ASD (43 with and 47 without ADHD), 79 ASD unaffected siblings, and 139 controls aged 6-13 years, were included to test recognition of facial emotion and affective prosody. Our results revealed that the recognition of both facial emotion and affective prosody was impaired in children with ASD and aggravated by the presence of ADHD. The latter could only be partly explained by typical ADHD cognitive deficits, such as inhibitory and attentional problems. The performance of unaffected siblings could overall be considered at an intermediate level, performing somewhat worse than the controls and better than the ASD probands. Our findings suggest that emotion recognition might be a viable endophenotype in ASD and a fruitful target in future family studies of the genetic contribution to ASD and comorbid ADHD. Furthermore, our results suggest that children with comorbid ASD and ADHD are at highest risk for emotion recognition problems.

  8. Comparison of eye-movement patterns in schizophrenic and normal adults during examination of facial affect displays.

    Science.gov (United States)

    Shimizu, T; Shimizu, A; Yamashita, K; Iwase, M; Kajimoto, O; Kawasaki, T

    2000-12-01

    Patients with schizophrenia are known to have deficits in facial affect recognition. Subjects were 25 schizophrenic patients and 25 normal subjects who were shown pairs of slides of laughing faces and asked to compare the intensity of laughter in the two slides. Eye movements were recorded using an infrared scleral reflection technique. Normal subjects efficiently compared the same facial features in the two slides, examining the eyes and mouth, important areas for recognizing laughter, for a longer time than other regions of the face. Schizophrenic patients spent less time examining the eyes and mouth and often examined other regions of the face or areas other than the face. Similar results were obtained for the number of fixation points. That schizophrenic patients may have employed an inefficient strategy with few effective eye movements in facial comparison and recognition may help to explain the deficits in facial recognition observed in schizophrenic patients.

  9. Motor, affective and cognitive empathy in adolescence : Interrelations between facial electromyography and self-reported trait and state measures

    NARCIS (Netherlands)

    Van der Graaff, Jolien; Meeus, W; de Wied, Minet; van Boxtel, Anton; van Lier, Pol A C; Koot, Hans M.; Branje, Susan

    2016-01-01

    This study examined interrelations of trait and state empathy in an adolescent sample. Self-reported affective trait empathy and cognitive trait empathy were assessed during a home visit. During a test session at the university, motor empathy (facial electromyography), and self-reported affective an

  10. Motor, affective and cognitive empathy in adolescence : Interrelations between facial electromyography and self-reported trait and state measures

    NARCIS (Netherlands)

    van der Graaff, J.; Meeus, W.H.J.; de Wied, M.; van Boxtel, A.; van Lier, P.A.C.; Koot, H.M.; Branje, S.T.J.

    2016-01-01

    This study examined interrelations of trait and state empathy in an adolescent sample. Self-reported affective trait empathy and cognitive trait empathy were assessed during a home visit. During a test session at the university, motor empathy (facial electromyography), and self-reported affective

  11. A detailed investigation of facial expression processing in congenital prosopagnosia as compared to acquired prosopagnosia.

    Science.gov (United States)

    Humphreys, Kate; Avidan, Galia; Behrmann, Marlene

    2007-01-01

    Whether the ability to recognize facial expression can be preserved in the absence of the recognition of facial identity remains controversial. The current study reports the results of a detailed investigation of facial expression recognition in three congenital prosopagnosic (CP) participants, in comparison with two patients with acquired prosopagnosia (AP) and a large group of 30 neurologically normal participants, including individually age- and gender-matched controls. Participants completed a fine-grained expression recognition paradigm requiring a six-alternative forced-choice response to continua of morphs of six different basic facial expressions (e.g. happiness and surprise). Accuracy, sensitivity and reaction times were measured. The performance of all three CP individuals was indistinguishable from that of controls, even for the most subtle expressions. In contrast, both individuals with AP displayed pronounced difficulties with the majority of expressions. The results from the CP participants attest to the dissociability of the processing of facial identity and of facial expression. Whether this remarkably good expression recognition is achieved through normal, or compensatory, mechanisms remains to be determined. Either way, this normal level of performance does not extend to include facial identity.

  12. Facial emotion and identity processing development in 5- to 15-year-old children.

    Science.gov (United States)

    Johnston, Patrick J; Kaufman, Jordy; Bajic, Julie; Sercombe, Alicia; Michie, Patricia T; Karayanidis, Frini

    2011-01-01

    Most developmental studies of emotional face processing to date have focused on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. Three matching tasks (facial emotion matching, facial identity matching, and butterfly wing matching) were developed to include stimuli of a similar level of discriminability and were equated for task difficulty in earlier samples of young adults. Ninety-two children aged 5-15 years and a new group of 24 young adults completed these three matching tasks. Young children were highly adept at the butterfly wing task relative to their performance on both face-related tasks. More importantly, in older children, development of facial emotion discrimination ability lagged behind that of facial identity discrimination.

  13. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Patrick eJohnston

    2011-02-01

    Full Text Available Most developmental studies of emotional face processing to date have focussed on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. The three matching tasks were developed (i.e., facial emotion matching, facial identity matching and butterfly wing matching) to include stimuli of similar level of discriminability and to be equated for task difficulty in earlier samples of young adults. Ninety-two children aged 5 to 15 years and a new group of 24 young adults completed these three matching tasks. Young children were highly adept at the butterfly wing task relative to their performance on both face-related tasks. More importantly, in older children, development of facial emotion discrimination ability lagged behind that of facial identity discrimination.

  14. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions

    OpenAIRE

    2016-01-01

    Most previous studies on facial expression recognition have focused on the moderate emotions; to date, few studies have been conducted to investigate the explicit and implicit processes of peak emotions. In the current study, we used transiently peak intense expression images of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed t...

  15. Misperceiving facial affect: effects of laterality and individual differences in susceptibility to visual hallucinations.

    Science.gov (United States)

    Coy, Abbie L; Hutton, Samuel B

    2012-04-30

    It has been suggested that certain types of auditory hallucinations may be the by-product of a perceptual system that has evolved to be oversensitive to threat-related stimuli. People with schizophrenia and high schizotypes experience visual as well as auditory hallucinations, and have deficits in processing facial emotions. We sought to determine the relationship between visual hallucination proneness and the tendency to misattribute threat and non-threat related emotions to neutral faces. Participants completed a questionnaire assessing visual hallucination proneness (the Revised Visual Hallucination Scale - RVHS). High scoring individuals (N=64) were compared to low scoring individuals (N=72) on a novel emotion detection task. The high RVHS group made more false positive errors (ascribing emotions to neutral faces) than the low RVHS group, particularly when detecting threat-related emotions. All participants made more false positives when neutral faces were presented to the right visual field than to the left visual field. Our results support continuum models of visual hallucinatory experience in which tolerance for false positives is highest for potentially threatening emotional stimuli and suggest that lateral asymmetries in face processing extend to the misperception of facial emotion.

  16. Compensatory premotor activity during affective face processing in subclinical carriers of a single mutant Parkin allele.

    Science.gov (United States)

    Anders, Silke; Sack, Benjamin; Pohl, Anna; Münte, Thomas; Pramstaller, Peter; Klein, Christine; Binkofski, Ferdinand

    2012-04-01

    Patients with Parkinson's disease suffer from significant motor impairments and accompanying cognitive and affective dysfunction due to progressive disturbances of basal ganglia-cortical gating loops. Parkinson's disease has a long presymptomatic stage, which indicates a substantial capacity of the human brain to compensate for dopaminergic nerve degeneration before clinical manifestation of the disease. Neuroimaging studies provide evidence that increased motor-related cortical activity can compensate for progressive dopaminergic nerve degeneration in carriers of a single mutant Parkin or PINK1 gene, who show a mild but significant reduction of dopamine metabolism in the basal ganglia in the complete absence of clinical motor signs. However, it is currently unknown whether similar compensatory mechanisms are effective in non-motor basal ganglia-cortical gating loops. Here, we ask whether asymptomatic Parkin mutation carriers show altered patterns of brain activity during processing of facial gestures, and whether this might compensate for latent facial emotion recognition deficits. Current theories in social neuroscience assume that execution and perception of facial gestures are linked by a special class of visuomotor neurons ('mirror neurons') in the ventrolateral premotor cortex/pars opercularis of the inferior frontal gyrus (Brodmann area 44/6). We hypothesized that asymptomatic Parkin mutation carriers would show increased activity in this area during processing of affective facial gestures, replicating the compensatory motor effects that have previously been observed in these individuals. Additionally, Parkin mutation carriers might show altered activity in other basal ganglia-cortical gating loops. Eight asymptomatic heterozygous Parkin mutation carriers and eight matched controls underwent functional magnetic resonance imaging and a subsequent facial emotion recognition task. As predicted, Parkin mutation carriers showed significantly stronger activity in

  17. Neural evidence for the subliminal processing of facial trustworthiness in infancy.

    Science.gov (United States)

    Jessen, Sarah; Grossmann, Tobias

    2017-04-22

    Face evaluation is thought to play a vital role in human social interactions. One prominent aspect is the evaluation of facial signs of trustworthiness, which has been shown to occur reliably, rapidly, and without conscious awareness in adults. Recent developmental work indicates that the sensitivity to facial trustworthiness has early ontogenetic origins as it can already be observed in infancy. However, it is unclear whether infants' sensitivity to facial signs of trustworthiness relies upon conscious processing of a face or, similar to adults, also occurs in response to subliminal faces. To investigate this question, we conducted an event-related brain potential (ERP) study, in which we presented 7-month-old infants with faces varying in trustworthiness. Facial stimuli were presented subliminally (below infants' face visibility threshold) for only 50 ms and then masked by presenting a scrambled face image. Our data revealed that infants' ERP responses to subliminally presented faces differed as a function of trustworthiness. Specifically, untrustworthy faces elicited an enhanced negative slow wave (800-1000 ms) at frontal and central electrodes. The current findings critically extend prior work by showing that, similar to adults, infants' neural detection of facial signs of trustworthiness also occurs in response to subliminal faces. This supports the view that detecting facial trustworthiness is an early developing and automatic process in humans. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Subjective disturbance of perception is related to facial affect recognition in schizophrenia.

    Science.gov (United States)

    Comparelli, Anna; De Carolis, Antonella; Corigliano, Valentina; Romano, Silvia; Kotzalidis, Giorgio D; Campana, Chiara; Ferracuti, Stefano; Tatarelli, Roberto; Girardi, Paolo

    2011-10-01

    To examine the relationship between facial affect recognition (FAR) and subjective perceptual disturbances (SPDs), we assessed SPDs in 82 patients with DSM-IV schizophrenia (44 with first-episode psychosis [FEP] and 38 with multiple episodes [ME]) using two subscales of the Frankfurt Complaint Questionnaire (FCQ), WAS (simple perception) and WAK (complex perception). Emotional judgment ability was assessed using Ekman and Friesen's FAR task. Impaired recognition of emotion correlated with scores on the WAS but not on the WAK. The association was significant in the entire group and in the ME group. FAR was more impaired in the ME than in the FEP group. Our findings suggest that there is a relationship between SPDs and FAR impairment in schizophrenia, particularly in multiple-episode patients.

  19. Long-term academic stress enhances early processing of facial expressions.

    Science.gov (United States)

    Zhang, Liang; Qin, Shaozheng; Yao, Zhuxi; Zhang, Kan; Wu, Jianhui

    2016-11-01

    Exposure to long-term stress can lead to a variety of emotional and behavioral problems. Although widely investigated, the neural basis of how long-term stress impacts emotional processing in humans remains largely elusive. Using event-related brain potentials (ERPs), we investigated the effects of long-term stress on the neural dynamics of emotional facial expression processing. Thirty-nine male college students undergoing preparation for a major examination and twenty-one matched controls performed a gender discrimination task for faces displaying angry, happy, and neutral expressions. The results of the Perceived Stress Scale showed that participants in the stress group perceived higher levels of long-term stress relative to the control group. ERP analyses revealed differential effects of long-term stress on two early stages of facial expression processing: 1) long-term stress generally augmented posterior P1 amplitudes to facial stimuli irrespective of expression valence, suggesting that stress can increase sensitization to visual inputs in general, and 2) long-term stress selectively augmented fronto-central P2 amplitudes for angry but not for neutral or positive facial expressions, suggesting that stress may lead to increased attentional prioritization of negative emotional stimuli. Together, our findings suggest that long-term stress has profound impacts on the early stages of facial expression processing, with an increase at the very early stage of general information inputs and a subsequent attentional bias toward processing emotionally negative stimuli.

  1. Positive emotional priming of facial affect perception in females is diminished by chemosensory anxiety signals.

    Science.gov (United States)

    Pause, Bettina M; Ohrt, Anne; Prehn, Alexander; Ferstl, Roman

    2004-11-01

    Chemosensory communication of anxiety is a common phenomenon in vertebrates and improves perceptual and responsive behaviour in the perceiver in order to optimize ontogenetic survival. A few rating studies reported a similar phenomenon in humans. Here, we investigated whether subliminal face perception changes in the context of chemosensory anxiety signals. Axillary sweat samples were taken from 12 males while they were waiting for an academic examination and while performing ergometric exercise some days later. Sixteen subjects (eight females) participated in an emotional priming study, using happy, fearful and sad facial expressions as primes (11.7 ms) and neutral faces as targets (47 ms). The pooled chemosensory samples were presented before and during picture presentation (920 ms). In the context of chemosensory stimuli derived from sweat samples taken during the sport condition, subjects judged the targets significantly more positively when they were primed by a happy face than when they were primed by the negative facial expressions (P = 0.02). In the context of the chemosensory anxiety signals, the priming effect of the happy faces was diminished in females (P = 0.02), but not in males. It is discussed whether, in socially relevant ambiguous perceptual conditions, chemosensory signals have a processing advantage and dominate visual signals or whether fear signals in general have a stronger behavioural impact than positive signals.

  2. Elementary neurocognitive function, facial affect recognition and social-skills in schizophrenia.

    Science.gov (United States)

    Meyer, Melissa B; Kurtz, Matthew M

    2009-05-01

    Social-skill deficits are pervasive in schizophrenia and negatively impact many key aspects of functioning. Prior studies have found that measures of elementary neurocognition and social cognition are related to social-skills. In the present study we selected a range of neurocognitive measures and examined their relationship with identification of happy and sad faces and performance-based social-skills. Fifty-three patients with schizophrenia or schizoaffective disorder participated. Results revealed that: 1) visual vigilance, problem-solving and affect recognition were related to social-skill; 2) links between problem-solving and social-skill, but not visual vigilance and social-skill, remained significant when estimates of verbal intelligence were controlled; 3) affect recognition deficits explained unique variance in social-skill after neurocognitive variables were controlled; and 4) affect recognition deficits partially mediated the relationship of visual vigilance and social-skill. These results support the conclusion that facial affect recognition deficits are a crucial domain of impairment in schizophrenia that both contribute unique variance to social-skill deficits and may also mediate the relationship between some aspects of neurocognition and social-skill. These findings may help guide the development and refinement of cognitive and social-cognitive remediation methods for social-skill impairment.

  3. Facial affect recognition in body dysmorphic disorder versus obsessive-compulsive disorder: An eye-tracking study.

    Science.gov (United States)

    Toh, Wei Lin; Castle, David J; Rossell, Susan L

    2015-10-01

    Body dysmorphic disorder (BDD) is characterised by repetitive behaviours and/or mental acts occurring in response to preoccupations with perceived defects or flaws in physical appearance (American Psychiatric Association, 2013). This study aimed to investigate facial affect recognition in BDD using an integrated eye-tracking paradigm. Participants were 21 BDD patients, 19 obsessive-compulsive disorder (OCD) patients and 21 healthy controls (HC), who were age-, sex-, and IQ-matched. Stimuli were from the Pictures of Facial Affect (Ekman & Friesen, 1975), and outcome measures were affect recognition accuracy as well as spatial and temporal scanpath parameters. Relative to OCD and HC groups, BDD patients demonstrated significantly poorer facial affect perception and an angry recognition bias. An atypical scanning strategy encompassing significantly more blinks, fewer fixations of extended mean durations, higher mean saccade amplitudes, and less visual attention devoted to salient facial features was found. Patients with BDD were substantially impaired in the scanning of faces, and unable to extract affect-related information, likely indicating deficits in basic perceptual operations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Role of temporal processing stages by inferior temporal neurons in facial recognition

    Directory of Open Access Journals (Sweden)

    Yasuko eSugase-Miyamoto

    2011-06-01

    Full Text Available In this review, we focus on the role of temporal stages of encoded facial information in the visual system, which might enable the efficient determination of species, identity, and expression. Facial recognition is an important function of our brain and is known to be processed in the ventral visual pathway, where visual signals are processed through areas V1, V2, V4, and the inferior temporal (IT) cortex. In the IT cortex, neurons show selective responses to complex visual images such as faces, and at each stage along the pathway the stimulus selectivity of the neural responses becomes sharper, particularly in the later portion of the responses. In the IT cortex of the monkey, facial information is represented by different temporal stages of neural responses, as shown in our previous study: the initial transient response of face-responsive neurons represents information about global categories, i.e., human vs. monkey vs. simple shapes, whilst the later portion of these responses represents information about detailed facial categories, i.e., expression and/or identity. This suggests that the temporal stages of the neuronal firing pattern play an important role in the coding of visual stimuli, including faces. This type of coding may be a plausible mechanism underlying the temporal dynamics of recognition, including the process of detection/categorization followed by the identification of objects. Recent single-unit studies in monkeys have also provided evidence consistent with the important role of the temporal stages of encoded facial information. For example, view-invariant facial identity information is represented in the response at a later period within a region of face-selective neurons. Consistent with these findings, temporally modulated neural activity has also been observed in human studies. These results suggest a close correlation between the temporal processing stages of facial information by IT neurons and the temporal dynamics of

  5. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    Science.gov (United States)

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  6. Perceptual, Categorical, and Affective Processing of Ambiguous Smiling Facial Expressions

    Science.gov (United States)

    Calvo, Manuel G.; Fernandez-Martin, Andres; Nummenmaa, Lauri

    2012-01-01

    Why is a face with a smile but non-happy eyes likely to be interpreted as happy? We used blended expressions in which a smiling mouth was incongruent with the eyes (e.g., angry eyes), as well as genuine expressions with congruent eyes and mouth (e.g., both happy or angry). Tasks involved detection of a smiling mouth (perceptual), categorization of…

  7. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions

    OpenAIRE

    Kujala, Miiamaaria V.; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to the conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people’s perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggr...

  8. Emotion categories and dimensions in the facial communication of affect: An integrated approach.

    Science.gov (United States)

    Mehu, Marc; Scherer, Klaus R

    2015-12-01

    We investigated the role of facial behavior in emotional communication, using both categorical and dimensional approaches. We used a corpus of enacted emotional expressions (GEMEP) in which professional actors are instructed, with the help of scenarios, to communicate a variety of emotional experiences. The results of Study 1 replicated earlier findings showing that only a minority of facial action units are associated with specific emotional categories. Likewise, facial behavior did not show a specific association with particular emotional dimensions. Study 2 showed that facial behavior plays a significant role both in the detection of emotions and in the judgment of their dimensional aspects, such as valence, arousal, dominance, and unpredictability. In addition, a mediation model revealed that the association between facial behavior and recognition of the signaler's emotional intentions is mediated by perceived emotional dimensions. We conclude that, from a production perspective, facial action units convey neither specific emotions nor specific emotional dimensions, but are associated with several emotions and several dimensions. From the perceiver's perspective, facial behavior facilitated both dimensional and categorical judgments, and the former mediated the effect of facial behavior on recognition accuracy. The classification of emotional expressions into discrete categories may, therefore, rely on the perception of more general dimensions such as valence and arousal and, presumably, the underlying appraisals that are inferred from facial movements.

  9. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

    Full Text Available Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  10. Cradling Side Preference Is Associated with Lateralized Processing of Baby Facial Expressions in Females

    Science.gov (United States)

    Huggenberger, Harriet J.; Suter, Susanne E.; Reijnen, Ester; Schachinger, Hartmut

    2009-01-01

    Women's cradling side preference has been related to contralateral hemispheric specialization for processing emotional signals, but not for processing babies' facial expressions. Therefore, 46 nulliparous female volunteers were characterized as left or non-left holders (HG) during a doll holding task. During a signal detection task they were then…

  11. Emotions over time: synchronicity and development of subjective, physiological, and facial affective reactions to music.

    Science.gov (United States)

    Grewe, Oliver; Nagel, Frederik; Kopiez, Reinhard; Altenmüller, Eckart

    2007-11-01

    Most people are able to identify basic emotions expressed in music and experience affective reactions to music. But does music generally induce emotion? Does it elicit subjective feelings, physiological arousal, and motor reactions reliably in different individuals? In this interdisciplinary study, measurements of skin conductance, facial muscle activity, and self-monitoring were synchronized with musical stimuli. A group of 38 participants listened to classical, rock, and pop music and reported their feelings in a two-dimensional emotion space during listening. The first entrance of a solo voice or choir and the beginning of new sections were found to elicit interindividual changes in subjective feelings and physiological arousal. Quincy Jones' "Bossa Nova" motivated movement and laughing in more than half of the participants. Bodily reactions such as "goose bumps" and "shivers" could be stimulated by the "Tuba Mirum" from Mozart's Requiem in 7 of 38 participants. In addition, the authors repeated the experiment seven times with one participant to examine intraindividual stability of effects. This exploratory combination of approaches throws new light on the astonishing complexity of affective music listening.

  12. Callous-unemotional traits and empathy deficits: Mediating effects of affective perspective-taking and facial emotion recognition.

    Science.gov (United States)

    Lui, Joyce H L; Barry, Christopher T; Sacco, Donald F

    2016-09-01

    Although empathy deficits are thought to be associated with callous-unemotional (CU) traits, findings remain equivocal and little is known about what specific abilities may underlie these purported deficits. Affective perspective-taking (APT) and facial emotion recognition may be implicated, given their independent associations with both empathy and CU traits. The current study examined how CU traits relate to cognitive and affective empathy and whether APT and facial emotion recognition mediate these relations. Participants were 103 adolescents (70 males) aged 16-18 attending a residential programme. CU traits were negatively associated with cognitive and affective empathy to a similar degree. The association between CU traits and affective empathy was partially mediated by APT. Results suggest that assessing mechanisms that may underlie empathic deficits, such as perspective-taking, may be important for youth with CU traits and may inform targets of intervention.

  13. Neural correlates of automatic perceptual sensitivity to facial affect in posttraumatic stress disorder subjects who survived L'Aquila earthquake of April 6, 2009.

    Science.gov (United States)

    Mazza, Monica; Catalucci, Alessia; Mariano, Melania; Pino, Maria Chiara; Tripaldi, Simona; Roncone, Rita; Gallucci, Massimo

    2012-09-01

    The "Emotional Numbing" (EN) constitutes one of the core symptoms in PTSD although its exact nature remains elusive. This disorder shows an abnormal response of cortical and limbic regions which are normally involved in understanding emotions since the very earliest stages of the development of processing ability. The aim of our study, which included ten physically healthy subjects with PTSD, diagnosed according to DSM-IV-TR, who survived L'Aquila earthquake of April 6, 2009, and 10 healthy controls matching for age, sex and education, was to examine automatic perceptual sensitivity to facial affect in PTSD, through an affective priming task that was administered during functional magnetic resonance (fMRI). Behavioural data revealed in the PTSD group a higher sensitivity to negative facial affect on an automatic processing level. FMRI data analysis revealed that PTSD subjects showed a significantly higher activation in right insula and left amygdala that we did not observe in healthy subjects; on the contrary, healthy controls showed a greater activation of left lingual gyrus. Our data support the hypothesis that PTSD appears to be sensitive to negative affect on an automatic processing level and correlates with the activation of specific areas involved in processing emotions. An elevated activation of these areas may underlie the emotion dysregulation in PTSD and could explain the Emotional Numbing symptom associated with this disorder. The present study suffers of a number of limitations, for instance, the relatively small sample size did not allow the application of alternative statistical models.

  14. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  15. The Thatcher illusion reveals orientation dependence in brain regions involved in processing facial expressions.

    Science.gov (United States)

    Psalta, Lilia; Young, Andrew W; Thompson, Peter; Andrews, Timothy J

    2014-01-01

    Although the processing of facial identity is known to be sensitive to the orientation of the face, it is less clear whether orientation sensitivity extends to the processing of facial expressions. To address this issue, we used functional MRI (fMRI) to measure the neural response to the Thatcher illusion. This illusion involves a local inversion of the eyes and mouth in a smiling face: when the face is upright, the inverted features make it appear grotesque, but when the face is inverted, the inversion is no longer apparent. Using an fMRI-adaptation paradigm, we found a release from adaptation in the superior temporal sulcus, a region directly linked to the processing of facial expressions, when the images were upright and they changed from a normal to a Thatcherized configuration. However, this release from adaptation was not evident when the faces were inverted. These results show that regions involved in processing facial expressions display a pronounced orientation sensitivity.

  16. [Peripheral facial nerve palsy].

    Science.gov (United States)

    Pons, Y; Ukkola-Pons, E; Ballivet de Régloix, S; Champagne, C; Raynal, M; Lepage, P; Kossowski, M

    2013-06-01

    Facial palsy can be defined as a decrease in function of the facial nerve, the primary motor nerve of the facial muscles. When the facial palsy is peripheral, it affects both the superior and inferior areas of the face as opposed to central palsies, which affect only the inferior portion. The main cause of peripheral facial palsies is Bell's palsy, which remains a diagnosis of exclusion. The prognosis is good in most cases. In cases with significant cosmetic sequelae, a variety of surgical procedures are available (such as hypoglossal-facial anastomosis, temporalis myoplasty and Tenzel external canthopexy) to rehabilitate facial aesthetics and function.

  17. Facial Injuries and Disorders

    Science.gov (United States)

    Face injuries and disorders can cause pain and affect how you look. In severe cases, they can affect sight, ... your nose, cheekbone and jaw, are common facial injuries. Certain diseases also lead to facial disorders. For ...

  18. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    Science.gov (United States)

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

  19. Does format matter for comprehension of a facial affective scale and a numeric scale for pain by adults with Down syndrome?

    NARCIS (Netherlands)

    N.C. de Knegt; H.M. Evenhuis; F. Lobbezoo; C. Schuengel; E.J.A. Scherder

    2013-01-01

    People with intellectual disabilities are at high risk for pain and have communication difficulties. Facial and numeric scales for self-report may aid pain identification. It was examined whether the comprehension of a facial affective scale and a numeric scale for pain in adults with Down syndrome

  1. The Relationship of the Facial Nerve to the Condylar Process: A Cadaveric Study with Implications for Open Reduction Internal Fixation.

    Science.gov (United States)

    Barham, H P; Collister, P; Eusterman, V D; Terella, A M

    2015-01-01

    Introduction. The mandibular condyle is the most common site of mandibular fracture. Surgical treatment of condylar fractures by open reduction and internal fixation (ORIF) demands direct visualization of the fracture. This project aimed to investigate the anatomic relationship of the tragus to the facial nerve and condylar process. Materials and Methods. Twelve fresh hemicadaver heads were used. An extended retromandibular/preauricular approach was utilized, with the incision being based parallel to the posterior edge of the ramus. Measurements were obtained from the tragus to the facial nerve and condylar process. Results. The temporozygomatic division of the facial nerve was encountered during each approach, crossing the mandible at the condylar neck. The mean tissue depth separating the facial nerve from the condylar neck was 5.5 mm (range: 3.5 mm-7 mm, SD 1.2 mm). The upper division of the facial nerve crossed the posterior border of the condylar process on average 2.31 cm (SD 0.10 cm) anterior to the tragus. Conclusions. This study suggests that the temporozygomatic division of the facial nerve will be encountered in most approaches to the condylar process. As visualization of the relationship of the facial nerve to the condyle is often limited, recognition that, on average, 5.5 mm of tissue separates the condylar process from the nerve should help reduce the incidence of facial nerve injury during this procedure.

  2. The Relationship of the Facial Nerve to the Condylar Process: A Cadaveric Study with Implications for Open Reduction Internal Fixation

    Directory of Open Access Journals (Sweden)

    H. P. Barham

    2015-01-01

    Full Text Available Introduction. The mandibular condyle is the most common site of mandibular fracture. Surgical treatment of condylar fractures by open reduction and internal fixation (ORIF) demands direct visualization of the fracture. This project aimed to investigate the anatomic relationship of the tragus to the facial nerve and condylar process. Materials and Methods. Twelve fresh hemicadaver heads were used. An extended retromandibular/preauricular approach was utilized, with the incision being based parallel to the posterior edge of the ramus. Measurements were obtained from the tragus to the facial nerve and condylar process. Results. The temporozygomatic division of the facial nerve was encountered during each approach, crossing the mandible at the condylar neck. The mean tissue depth separating the facial nerve from the condylar neck was 5.5 mm (range: 3.5 mm–7 mm, SD 1.2 mm). The upper division of the facial nerve crossed the posterior border of the condylar process on average 2.31 cm (SD 0.10 cm) anterior to the tragus. Conclusions. This study suggests that the temporozygomatic division of the facial nerve will be encountered in most approaches to the condylar process. As visualization of the relationship of the facial nerve to the condyle is often limited, recognition that, on average, 5.5 mm of tissue separates the condylar process from the nerve should help reduce the incidence of facial nerve injury during this procedure.

  3. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder

    Directory of Open Access Journals (Sweden)

    Xiaozhe Peng

    2017-06-01

    Full Text Available Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad–neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy–neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy–neutral expressions context, as well as sad and neutral expressions in the sad–neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy–neutral expressions context. Highlights: • The present study investigated whether the unconscious processing of facial expressions is influenced by

  4. Emotional processing affects movement speed.

    Science.gov (United States)

    Hälbig, Thomas D; Borod, Joan C; Frisina, Pasquale G; Tse, Winona; Voustianiouk, Andrei; Olanow, C Warren; Gracies, Jean-Michel

    2011-09-01

    Emotions can affect various aspects of human behavior. The impact of emotions on behavior is traditionally thought to occur at central, cognitive and motor preparation stages. Using EMG to measure the effects of emotion on movement, we found that emotional stimuli differing in valence and arousal elicited highly specific effects on peripheral movement time. This result has conceptual implications for the emotion-motion link and potentially practical implications for neurorehabilitation and professional environments where fast motor reactions are critical.

  5. Emotional Processing, Recognition, Empathy and Evoked Facial Expression in Eating Disorders: An Experimental Study to Map Deficits in Social Cognition

    National Research Council Canada - National Science Library

    Cardi, Valentina; Corfield, Freya; Leppanen, Jenni; Rhind, Charlotte; Deriziotis, Stephanie; Hadjimichalis, Alexandra; Hibbs, Rebecca; Micali, Nadia; Treasure, Janet

    2015-01-01

    .... The aim of this study is to examine distinct processes of social-cognition in this patient group, including attentional processing and recognition, empathic reaction and evoked facial expression...

  6. Rapid influence of emotional scenes on encoding of facial expressions: An ERP study

    National Research Council Canada - National Science Library

    Righart, R. G. R; de Gelder, B

    2008-01-01

    ... to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made...

  7. Facial emotion processing in major depression: a systematic review of neuroimaging findings

    Directory of Open Access Journals (Sweden)

    Stuhrmann Anja

    2011-11-01

    Full Text Available Background: Cognitive models of depression suggest that major depression is characterized by biased facial emotion processing, making facial stimuli particularly valuable for neuroimaging research on the neurobiological correlates of depression. The present review provides an overview of functional neuroimaging studies on abnormal facial emotion processing in major depression. Our main objective was to describe neurobiological differences between depressed patients with major depressive disorder (MDD) and healthy controls (HCs) regarding brain responsiveness to facial expressions and, furthermore, to delineate altered neural activation patterns associated with mood-congruent processing bias and to integrate these data with recent functional connectivity results. We further discuss methodological aspects potentially explaining the heterogeneity of results. Methods: A Medline search was performed up to August 2011 in order to identify studies on emotional face processing in acutely depressed patients compared with HCs. A total of 25 studies using functional magnetic resonance imaging were reviewed. Results: The analysis of neural activation data showed abnormalities in MDD patients in a common face processing network, pointing to mood-congruent processing bias (hyperactivation to negative and hypoactivation to positive stimuli), particularly in the amygdala, insula, parahippocampal gyrus, fusiform face area, and putamen. Furthermore, abnormal activation patterns were repeatedly found in parts of the cingulate gyrus and the orbitofrontal cortex, which are extended by investigations implementing functional connectivity analysis. However, despite several converging findings, some inconsistencies are observed, particularly in prefrontal areas, probably caused by heterogeneities in paradigms and patient samples. Conclusions: Further studies in remitted patients and high-risk samples are required to discern whether the described abnormalities represent

  8. Neurocognition and symptoms identify links between facial recognition and emotion processing in schizophrenia: meta-analytic findings.

    Science.gov (United States)

    Ventura, Joseph; Wood, Rachel C; Jimenez, Amy M; Hellemann, Gerhard S

    2013-12-01

    In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? A meta-analysis of 102 studies (combined n=4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r=.51). In addition, the relationship between FR and EP through voice prosody (r=.58) is as strong as the relationship between FR and EP based on facial stimuli (r=.53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality (facial stimuli or voice prosody). The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits. © 2013 Elsevier B.V. All rights reserved.
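
    For readers unfamiliar with how a pooled coefficient such as r=.51 is typically obtained, the sketch below shows one common aggregation scheme, a Fisher r-to-z transform with sample-size weighting. It is a generic illustration rather than the exact procedure of this meta-analysis, and the study values in it are invented.

      # Generic sketch of pooling correlations across studies (not the authors'
      # exact method): transform each r to Fisher z, weight by n - 3, average,
      # and transform back. The (r, n) pairs below are made up.
      import math

      studies = [(0.55, 120), (0.48, 80), (0.50, 200)]

      z_sum = sum((n - 3) * math.atanh(r) for r, n in studies)
      weight_sum = sum(n - 3 for _, n in studies)
      mean_r = math.tanh(z_sum / weight_sum)
      print(f"sample-size-weighted mean correlation: {mean_r:.2f}")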

  9. Can the usage of human growth hormones affect facial appearance and the accuracy of face recognition systems?

    Science.gov (United States)

    Rose, Jake; Martin, Michael; Bourlai, Thirimachos

    2014-06-01

    In law enforcement and security applications, the acquisition of face images is critical in producing key trace evidence for the successful identification of potential threats. The goal of the study is to demonstrate that steroid usage significantly affects human facial appearance and hence, the performance of commercial and academic face recognition (FR) algorithms. In this work, we evaluate the performance of state-of-the-art FR algorithms on two unique face image datasets of subjects before (gallery set) and after (probe set) steroid (or human growth hormone) usage. For the purpose of this study, datasets of 73 subjects were created from multiple sources found on the Internet, containing images of men and women before and after steroid usage. Next, we geometrically pre-processed all images of both face datasets. Then, we applied image restoration techniques on the same face datasets, and finally, we applied FR algorithms in order to match the pre-processed face images of our probe datasets against the face images of the gallery set. Experimental results demonstrate that only a specific set of FR algorithms obtain the most accurate results (in terms of the rank-1 identification rate). This is because there are several factors that influence the efficiency of face matchers including (i) the time lapse between the before and after image pre-processing and restoration face photos, (ii) the usage of different drugs (e.g. Dianabol, Winstrol, and Decabolan), (iii) the usage of different cameras to capture face images, and finally, (iv) the variability of standoff distance, illumination and other noise factors (e.g. motion noise). All of the previously mentioned complicated scenarios make clear that cross-scenario matching is a very challenging problem and, thus, further investigation is required.
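
    The rank-1 identification rate used above to score face matchers is the proportion of probe images whose top-ranked gallery match is the correct identity. The sketch below is a minimal, hypothetical illustration of that computation; the similarity matrix and ground-truth indices are invented and unrelated to the study's data.

      # Minimal sketch: rank-1 identification rate from a probe-by-gallery
      # similarity matrix (higher score = more similar). true_gallery_idx[i]
      # holds the gallery column that truly matches probe i. Values are made up.
      import numpy as np

      def rank1_rate(similarity: np.ndarray, true_gallery_idx: np.ndarray) -> float:
          best_match = similarity.argmax(axis=1)  # top-ranked gallery entry per probe
          return float((best_match == true_gallery_idx).mean())

      sim = np.array([[0.9, 0.1, 0.3, 0.2],
                      [0.2, 0.4, 0.8, 0.1],
                      [0.5, 0.6, 0.4, 0.3]])
      print(rank1_rate(sim, np.array([0, 2, 1])))  # prints 1.0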

  10. Facial Feedback Affects Perceived Intensity but Not Quality of Emotional Expressions.

    Science.gov (United States)

    Lobmaier, Janek S; Fischer, Martin H

    2015-08-26

    Motivated by conflicting evidence in the literature, we re-assessed the role of facial feedback when detecting quantitative or qualitative changes in others' emotional expressions. Fifty-three healthy adults observed self-paced morph sequences where the emotional facial expression either changed quantitatively (i.e., sad-to-neutral, neutral-to-sad, happy-to-neutral, neutral-to-happy) or qualitatively (i.e. from sad to happy, or from happy to sad). Observers held a pen in their own mouth to induce smiling or frowning during the detection task. When morph sequences started or ended with neutral expressions we replicated a congruency effect: Happiness was perceived longer and sooner while smiling; sadness was perceived longer and sooner while frowning. Interestingly, no such congruency effects occurred for transitions between emotional expressions. These results suggest that facial feedback is especially useful when evaluating the intensity of a facial expression, but less so when we have to recognize which emotion our counterpart is expressing.

  11. Facial Feedback Affects Perceived Intensity but Not Quality of Emotional Expressions

    Directory of Open Access Journals (Sweden)

    Janek S. Lobmaier

    2015-08-01

    Full Text Available Motivated by conflicting evidence in the literature, we re-assessed the role of facial feedback when detecting quantitative or qualitative changes in others’ emotional expressions. Fifty-three healthy adults observed self-paced morph sequences where the emotional facial expression either changed quantitatively (i.e., sad-to-neutral, neutral-to-sad, happy-to-neutral, neutral-to-happy or qualitatively (i.e. from sad to happy, or from happy to sad. Observers held a pen in their own mouth to induce smiling or frowning during the detection task. When morph sequences started or ended with neutral expressions we replicated a congruency effect: Happiness was perceived longer and sooner while smiling; sadness was perceived longer and sooner while frowning. Interestingly, no such congruency effects occurred for transitions between emotional expressions. These results suggest that facial feedback is especially useful when evaluating the intensity of a facial expression, but less so when we have to recognize which emotion our counterpart is expressing.

  12. [OsiriX, a useful tool for processing tomographic images in patients with facial fracture].

    Science.gov (United States)

    Sierra-Martínez, Eduardo; Cienfuegos-Monroy, Ricardo; Fernández-Sobrino, Gerardo

    2009-01-01

    OsiriX, a Mac OS X-based open source program, is presented as a useful tool for processing tomographic images for diagnosis and preoperative planning in patients with facial fractures. CT scans were performed on 124 patients with facial fractures treated at the Department of Maxillofacial and Reconstructive Surgery of the Hospital de Traumatología y Ortopedia "Lomas Verdes" in Mexico City. The information obtained was recorded in DICOM format on CDs and processed on a Macintosh laptop with the OsiriX software, performing multiplanar and 3D reconstructions. Surgical findings were compared with the images obtained by the software. Of the surgical findings, 96.5% matched the OsiriX images. Only 3.5% of the OsiriX images were not consistent, because of distortion or artifacts in the CT due to firearm projectiles and Erich arch bars near the involved area. Based on these results, the authors consider the OsiriX software a useful tool for diagnosis and preoperative planning in patients with facial fractures. Furthermore, it prevents the loss of information due to the process of image selection by the radiology staff.
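
    OsiriX performs the multiplanar and 3D reconstructions internally, but the underlying idea (stacking the DICOM slices of a CT series into a volume and re-slicing it along other planes) can be sketched with pydicom and NumPy. This is only an illustration of the general workflow under an assumed file layout, not the software or data used in the study.

```python
import numpy as np
import pydicom
from pathlib import Path

def load_ct_volume(dicom_dir):
    """Stack a CT series (one DICOM file per slice) into a 3D volume ordered by slice position."""
    slices = [pydicom.dcmread(str(p)) for p in Path(dicom_dir).glob("*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))  # sort along the z axis
    volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
    return volume  # shape: (n_slices, rows, cols)

def multiplanar_views(volume):
    """Return axial, coronal and sagittal mid-slices of the volume (the core of an MPR view)."""
    z, y, x = volume.shape
    return volume[z // 2], volume[:, y // 2, :], volume[:, :, x // 2]
```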

  13. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception: An Event-Related Potential Study.

    Directory of Open Access Journals (Sweden)

    Tongran Liu

    Full Text Available The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50-130 ms), the high IQ group showed more negative vMMN amplitudes than the average IQ group in the happy condition. For the late vMMN (320-450 ms), the high IQ group had greater vMMN responses than the average IQ group over frontal and occipito-temporal areas in the fearful condition, and the average IQ group evoked larger vMMN amplitudes than the high IQ group over occipito-temporal areas in the happy condition. The present study elucidated the close relationships between fluid intelligence and pre-attentive change detection on social-emotional information.

  14. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception: An Event-Related Potential Study.

    Science.gov (United States)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan; Shi, Jiannong

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50-130 ms), the high IQ group showed more negative vMMN amplitudes than the average IQ group in the happy condition. For the late vMMN (320-450 ms), the high IQ group had greater vMMN responses than the average IQ group over frontal and occipito-temporal areas in the fearful condition, and the average IQ group evoked larger vMMN amplitudes than the high IQ group over occipito-temporal areas in the happy condition. The present study elucidated the close relationships between fluid intelligence and pre-attentive change detection on social-emotional information.
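
    The vMMN analysed in both of these records is a difference wave: the average ERP to deviant expressions minus the average ERP to standard (neutral) expressions, summarised within the reported 50-130 ms and 320-450 ms windows. A schematic computation over baseline-corrected epoch arrays is sketched below; the array layout and time units are assumptions for illustration, not the authors' analysis code.

```python
import numpy as np

def mean_amplitude(erp, times, window):
    """Mean amplitude of an ERP waveform within a (start, end) window given in seconds."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[..., mask].mean(axis=-1)

def vmmn(deviant_epochs, standard_epochs, times):
    """Visual mismatch negativity: deviant-minus-standard difference wave.

    *_epochs: arrays of shape (n_trials, n_channels, n_samples), baseline-corrected.
    times:    (n_samples,) array of sample times in seconds.
    """
    diff = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)
    early = mean_amplitude(diff, times, (0.050, 0.130))   # early vMMN, 50-130 ms
    late = mean_amplitude(diff, times, (0.320, 0.450))    # late vMMN, 320-450 ms
    return diff, early, late
```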

  15. Facial anatomy.

    Science.gov (United States)

    Marur, Tania; Tuna, Yakup; Demirci, Selman

    2014-01-01

    Dermatologic problems of the face affect both function and aesthetics, which are based on complex anatomical features. Treating dermatologic problems while preserving the aesthetics and functions of the face requires knowledge of normal anatomy. To perform invasive procedures on the face successfully, it is essential to understand its underlying topographic anatomy. This chapter presents the anatomy of the facial musculature and neurovascular structures in a systematic way, with some clinically important aspects. We describe the attachments of the mimetic and masticatory muscles and emphasize their functions and nerve supply. We highlight clinically relevant facial topographic anatomy by explaining the course and location of the sensory and motor nerves of the face and the facial vasculature with their relations. Additionally, this chapter reviews the recent nomenclature of the branching pattern of the facial artery.

  16. Affect, Behavioural Schemas and the Proving Process

    Science.gov (United States)

    Selden, Annie; McKee, Kerry; Selden, John

    2010-01-01

    In this largely theoretical article, we discuss the relation between a kind of affect, behavioural schemas and aspects of the proving process. We begin with affect as described in the mathematics education literature, but soon narrow our focus to a particular kind of affect--nonemotional cognitive feelings. We then mention the position of feelings…

  17. Neurocognitive processing of emotion facial expressions in individuals with self-reported depressive symptoms: the role of personality and anxiety.

    Science.gov (United States)

    Mardaga, S; Iakimova, G

    2014-11-01

    Neurocognition may constitute one of the numerous factors that mediate the reciprocal influences between personality and depression. The present study explored the influence of personality and anxiety traits on the neurocognitive processing of emotional faces and specifically focused on personal characteristics related to negative (harm avoidance - HA) and positive affectivity (self-directedness - SD) and to anxiety. Twenty participants with self-reported depressive symptoms and 18 control participants were selected based on their BDI-II scores. Personality (TCI-R), anxiety and attention were measured, and event-related potentials (ERPs) were recorded during an implicit emotional face perception task (fear, sadness, happiness, neutrality). The participants who self-reported depressive symptoms had higher HA, lower SD and higher anxiety compared to controls. Controls showed enhanced P300 and LPP amplitudes for fear. Individuals with self-reported depression showed reduced ERP amplitudes for happiness. HA did not account for the difference between the groups, but high HA and high anxiety were positively correlated with enhanced P300 amplitude for fear in participants with depressive symptoms. In contrast, SD accounted for the difference between the groups but was not correlated with the ERP component amplitudes recorded for facial expressions. Other personality dimensions (reward dependence, cooperativeness) influenced the ERPs recorded for facial emotions. Personality dimensions influence the neurocognitive processing of emotional faces in individuals with self-reported depressive symptoms, which may constitute a cognitive vulnerability to depression. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  18. Faces in context: a review and systematization of contextual influences on affective face processing.

    Science.gov (United States)

    Wieser, Matthias J; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  19. Faces in context: A review and systematization of contextual influences on affective face processing

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2012-11-01

    Full Text Available Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, decontextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  20. [Research on the correlation between the temperature asymmetry at acupoints of healthy and affected side and the severity index of facial paralysis].

    Science.gov (United States)

    Wu, Zhen-Ying; Liu, Xu-Long; Hong, Wen-Xue; Zhang, Dong

    2010-11-01

    To observe the change trend of the temperature asymmetry coefficient at acupoints between the healthy side and the affected side in patients with facial paralysis, to study the correlation between the temperature asymmetry and the Facial Disability Index (FDI), and to provide scientific guidance for the application of infrared thermography in examining the severity of facial paralysis. Infrared thermography was used to observe the temperature asymmetry at acupoints; the temperature asymmetry coefficient between the healthy side and the affected side was calculated, and its correlation with the FDI was analyzed. The correlation between the temperature asymmetry coefficient and the FDI was statistically significant at the acupoints Yangbai (GB 14), Cuanzhu (BL 2), Dicang (ST 4), Yuyao (EX-HN 4), Quanliao (SI 18) and Jiache (ST 6), suggesting that the temperature asymmetry at these acupoints reflects the severity of facial paralysis and supporting the use of infrared thermography in its assessment.

  1. Speech Signal and Facial Image Processing for Obstructive Sleep Apnea Assessment

    Directory of Open Access Journals (Sweden)

    Fernando Espinoza-Cuadros

    2015-01-01

    Full Text Available Obstructive sleep apnea (OSA) is a common sleep disorder characterized by recurring breathing pauses during sleep caused by a blockage of the upper airway (UA). OSA is generally diagnosed through a costly procedure requiring an overnight stay of the patient at the hospital. This has led to proposing less costly procedures based on the analysis of patients’ facial images and voice recordings to help in OSA detection and severity assessment. In this paper we investigate the use of both image and speech processing to estimate the apnea-hypopnea index, AHI (which describes the severity of the condition), over a population of 285 male Spanish subjects suspected of suffering from OSA and referred to a Sleep Disorders Unit. Photographs and voice recordings were collected in a supervised but not highly controlled way, trying to test a scenario close to an OSA assessment application running on a mobile device (i.e., smartphones or tablets). Spectral information in speech utterances is modeled by a state-of-the-art low-dimensional acoustic representation, called an i-vector. A set of local craniofacial features related to OSA are extracted from images after detecting facial landmarks using Active Appearance Models (AAMs). Support vector regression (SVR) is applied on facial features and i-vectors to estimate the AHI.
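
    The final estimation step described above is a support vector regression over the concatenated speech (i-vector) and craniofacial feature vectors. A minimal scikit-learn sketch of that step is shown below, assuming both feature sets have already been extracted; the feature dimensions and SVR hyperparameters are placeholders rather than values from the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_predict

def estimate_ahi(ivectors, facial_features, ahi):
    """Predict the apnea-hypopnea index from concatenated speech and facial features.

    ivectors:        (n_subjects, d_speech) i-vectors from the speech recordings.
    facial_features: (n_subjects, d_face) craniofacial measurements from AAM landmarks.
    ahi:             (n_subjects,) ground-truth AHI values from polysomnography.
    """
    X = np.hstack([ivectors, facial_features])
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1.0))
    predicted = cross_val_predict(model, X, ahi, cv=5)   # out-of-fold AHI estimates
    return predicted
```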

  2. Metacognitive Awareness of Facial Affect in Higher-Functioning Children and Adolescents with Autism Spectrum Disorder.

    Science.gov (United States)

    McMahon, Camilla M; Henderson, Heather A; Newell, Lisa; Jaime, Mark; Mundy, Peter

    2016-03-01

    Higher-functioning participants with and without autism spectrum disorder (ASD) viewed a series of face stimuli, made decisions regarding the affect of each face, and indicated their confidence in each decision. Confidence significantly predicted accuracy across all participants, but this relation was stronger for participants with typical development than participants with ASD. In the hierarchical linear modeling analysis, there were no differences in face processing accuracy between participants with and without ASD, but participants with ASD were more confident in their decisions. These results suggest that individuals with ASD have metacognitive impairments and are overconfident in face processing. Additionally, greater metacognitive awareness was predictive of better face processing accuracy, suggesting that metacognition may be a pivotal skill to teach in interventions.

  3. Holistic gaze strategy to categorize facial expression of varying intensities.

    Directory of Open Access Journals (Sweden)

    Kun Guo

    Full Text Available Using faces representing exaggerated emotional expressions, recent behaviour and eye-tracking studies have suggested a dominant role of individual facial features in transmitting diagnostic cues for decoding facial expressions. Considering that in everyday life we frequently view low-intensity expressive faces in which local facial cues are more ambiguous, we probably need to combine expressive cues from more than one facial feature to reliably decode naturalistic facial affects. In this study we applied a morphing technique to systematically vary the intensities of six basic facial expressions of emotion, and employed a self-paced expression categorization task to measure participants' categorization performance and associated gaze patterns. The analysis of pooled data from all expressions showed that increasing expression intensity improved categorization accuracy, shortened reaction time and reduced the number of fixations directed at faces. The proportion of fixations and viewing time directed at internal facial features (eyes, nose and mouth region), however, was not affected by varying levels of intensity. Further comparison between individual facial expressions revealed that although proportional gaze allocation at individual facial features was quantitatively modulated by the viewed expressions, the overall gaze distribution in face viewing was qualitatively similar across different facial expressions and different intensities. It seems that we adopt a holistic viewing strategy to extract expressive cues from all internal facial features in the processing of naturalistic facial expressions.
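
    Expression intensity in morph-based designs like this one is typically controlled by interpolating between a neutral and a fully expressive image of the same identity. The sketch below shows the simplest pixel-level blend under the assumption that the two images are already spatially aligned; real morphing pipelines additionally warp facial landmarks, which is omitted here.

```python
import numpy as np

def morph_intensity(neutral, expressive, alpha):
    """Blend an aligned neutral and expressive face image.

    alpha = 0.0 returns the neutral face, 1.0 the full expression,
    and intermediate values give graded expression intensities.
    """
    neutral = neutral.astype(np.float32)
    expressive = expressive.astype(np.float32)
    blended = (1.0 - alpha) * neutral + alpha * expressive
    return np.clip(blended, 0, 255).astype(np.uint8)

# e.g. a 7-step intensity continuum for one expression:
# continuum = [morph_intensity(neutral_img, happy_img, a) for a in np.linspace(0, 1, 7)]
```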

  4. Characterization of p75+ ectomesenchymal stem cells from rat embryonic facial process tissue

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Xiujie; Liu, Luchuan; Deng, Manjing; Zhang, Li; Liu, Rui; Xing, Yongjun; Zhou, Xia [Department of Stomatology, Daping Hospital and Research Institute of Surgery, Third Military Medical University, Chongqing 400042 (China); Nie, Xin, E-mail: dr.xinnie@gmail.com [Department of Stomatology, Daping Hospital and Research Institute of Surgery, Third Military Medical University, Chongqing 400042 (China)

    2012-10-12

    Highlights: ► Ectomesenchymal stem cells (EMSCs) were found to migrate to rat facial processes at E11.5. ► We successfully sorted p75NTR-positive EMSCs (p75+ EMSCs). ► p75+ EMSCs up to nine passages showed relatively stable proliferative activity. ► We examined the in vitro multilineage potential of p75+ EMSCs. ► p75+ EMSCs provide an in vitro model for tooth morphogenesis. -- Abstract: Several populations of stem cells, including those from the dental pulp and periodontal ligament, have been isolated from different parts of the tooth and periodontium. The characteristics of such stem cells have been reported as well. However, as a common progenitor of these cells, ectomesenchymal stem cells (EMSCs) derived from the cranial neural crest have yet to be fully characterized. The aim of this study was to better understand the characteristics of EMSCs isolated from rat embryonic facial processes. Immunohistochemical staining showed that EMSCs had migrated to rat facial processes at E11.5, while the absence of epithelial invagination or tooth-like epithelium suggested that any epithelial-mesenchymal interactions were limited at this stage. The p75 neurotrophin receptor (p75NTR), a typical neural crest marker, was used to select p75NTR-positive EMSCs (p75+ EMSCs), which were found to show a homogeneous fibroblast-like morphology and little change in the growth curve, proliferation capacity, and cell phenotype during cell passage. They also displayed the capacity to differentiate into diverse cell types under chemically defined conditions in vitro. p75+ EMSCs proved to be homogeneous, stable in vitro and potentially capable of multiple lineages, suggesting their potential for application in dental or orofacial tissue engineering.

  5. Dissimilar processing of emotional facial expressions in human and monkey temporal cortex.

    Science.gov (United States)

    Zhu, Qi; Nelissen, Koen; Van den Stock, Jan; De Winter, François-Laurent; Pauwels, Karl; de Gelder, Beatrice; Vanduffel, Wim; Vandenbulcke, Mathieu

    2013-02-01

    Emotional facial expressions play an important role in social communication across primates. Despite major progress in our understanding of categorical information processing, such as for objects and faces, little is known about how the primate brain evolved to process emotional cues. In this study, we used functional magnetic resonance imaging (fMRI) to compare the processing of emotional facial expressions between monkeys and humans. We used a 2×2×2 factorial design with species (human and monkey), expression (fear and chewing) and configuration (intact versus scrambled) as factors. At the whole-brain level, neural responses to conspecific emotional expressions were anatomically confined to the superior temporal sulcus (STS) in humans. Within the human STS, we found functional subdivisions, with a face-selective right posterior STS area that also responded to emotional expressions of other species and a more anterior area in the right middle STS that responded specifically to human emotions. Hence, we argue that the latter region does not show a mere emotion-dependent modulation of activity but is primarily driven by human emotional facial expressions. Conversely, in monkeys, emotional responses appeared in earlier visual cortex and outside face-selective regions in inferior temporal cortex that also responded to multiple visual categories. Within monkey IT, we also found areas that were more responsive to conspecific than to non-conspecific emotional expressions, but these responses were not as specific as in human middle STS. Overall, our results indicate that human STS may have developed unique properties to deal with social cues such as emotional expressions.

  6. Aggressive osteoblastoma in mastoid process of temporal bone with facial palsy

    Directory of Open Access Journals (Sweden)

    Manoj Jain

    2013-01-01

    Full Text Available Osteoblastoma is an uncommon primary bone tumor with a predilection for the posterior elements of the spine. Its occurrence in the temporal bone and middle ear is extremely rare. Clinical symptoms are non-specific, and cranial nerve involvement is uncommon. The cytomorphological features of osteoblastoma are not very well defined, and experience is limited to only a few reports. We report an interesting and rare case of aggressive osteoblastoma, with progressive hearing loss and facial palsy, involving the mastoid process of the temporal bone and middle ear, along with a description of its cytomorphological features.

  7. Shape-constrained Gaussian Process Regression for Facial-point-based Head-pose Normalization

    NARCIS (Netherlands)

    Rudovic, Ognjen; Pantic, Maja

    2011-01-01

    Given the facial points extracted from an image of a face in an arbitrary pose, the goal of facial-point-based head-pose normalization is to obtain the corresponding facial points in a predefined pose (e.g., frontal). This involves inference of complex and high-dimensional mappings due to the large n
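
    As a rough illustration of the mapping described above, the sketch below fits a plain Gaussian-process regression from flattened non-frontal facial points to their frontal counterparts with scikit-learn. It does not reproduce the shape constraint proposed in the paper, and the kernel choice and data shapes are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_pose_normalizer(points_nonfrontal, points_frontal):
    """Learn a mapping from facial points in an arbitrary pose to the frontal pose.

    points_*: arrays of shape (n_faces, 2 * n_landmarks), i.e. flattened (x, y) coordinates.
    """
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(points_nonfrontal, points_frontal)   # multi-output regression over landmark coords
    return gp

# frontal_estimate = fit_pose_normalizer(X_train, Y_train).predict(x_new.reshape(1, -1))
```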

  8. Affective processing in bilingual speakers: disembodied cognition?

    Science.gov (United States)

    Pavlenko, Aneta

    2012-01-01

    A recent study by Keysar, Hayakawa, and An (2012) suggests that "thinking in a foreign language" may reduce decision biases because a foreign language provides a greater emotional distance than a native tongue. The possibility of such "disembodied" cognition is of great interest for theories of affect and cognition and for many other areas of psychological theory and practice, from clinical and forensic psychology to marketing, but first this claim needs to be properly evaluated. The purpose of this review is to examine the findings of clinical, introspective, cognitive, psychophysiological, and neuroimaging studies of affective processing in bilingual speakers in order to identify converging patterns of results, to evaluate the claim about "disembodied cognition," and to outline directions for future inquiry. The findings to date reveal two interrelated processing effects. First-language (L1) advantage refers to increased automaticity of affective processing in the L1 and heightened electrodermal reactivity to L1 emotion-laden words. Second-language (L2) advantage refers to decreased automaticity of affective processing in the L2, which reduces interference effects and lowers electrodermal reactivity to negative emotional stimuli. The differences in L1 and L2 affective processing suggest that in some bilingual speakers, in particular late bilinguals and foreign language users, respective languages may be differentially embodied, with the later learned language processed semantically but not affectively. This difference accounts for the reduction of framing biases in L2 processing in the study by Keysar et al. (2012). The follow-up discussion identifies the limits of the findings to date in terms of participant populations, levels of processing, and types of stimuli, puts forth alternative explanations of the documented effects, and articulates predictions to be tested in future research.

  9. Short-term visual deprivation reduces interference effects of task-irrelevant facial expressions on affective prosody judgments

    Directory of Open Access Journals (Sweden)

    Ineke Fengler

    2015-04-01

    Full Text Available Several studies have suggested that neuroplasticity can be triggered by short-term visual deprivation in healthy adults. Specifically, these studies have provided evidence that visual deprivation reversibly affects basic perceptual abilities. The present study investigated the long-lasting effects of short-term visual deprivation on emotion perception. To this aim, we visually deprived a group of young healthy adults, age-matched with a group of non-deprived controls, for 3 hours and tested them before and after visual deprivation (i.e., after 8 h on average) and at 4-week follow-up on an audio-visual (i.e., faces and voices) emotion discrimination task. To observe changes at the level of basic perceptual skills, we additionally employed a simple audio-visual (i.e., tone bursts and light flashes) discrimination task and two unimodal (one auditory and one visual) perceptual threshold measures. During the 3 h period, both groups performed a series of auditory tasks. To exclude the possibility that changes in emotion discrimination may emerge as a consequence of the exposure to auditory stimulation during the 3 h stay in the dark, we visually deprived an additional group of age-matched participants who concurrently performed tasks unrelated (i.e., tactile) to the later tested abilities. The two visually deprived groups showed enhanced affective prosodic discrimination abilities in the context of incongruent facial expressions following the period of visual deprivation; this effect was partially maintained until follow-up. By contrast, no changes were observed in affective facial expression discrimination or in the basic perception tasks in any group. These findings suggest that short-term visual deprivation per se triggers a reweighting of visual and auditory emotional cues, which may prevail for longer durations.

  10. Fixation to features and neural processing of facial expressions in a gender discrimination task.

    Science.gov (United States)

    Neath, Karly N; Itier, Roxane J

    2015-10-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. Whether this sensitivity varies with facial expressions of emotion, and whether it can also be seen on other ERP components such as the P1 and EPN, was investigated. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect appearing at ∼150 ms and lasting until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times.

  11. Facial beauty affects implicit and explicit learning of men and women differently

    Directory of Open Access Journals (Sweden)

    Eleni Ziori

    2015-08-01

    Full Text Available The present work explores the unconscious and/or conscious nature of learning attractive faces of same and opposite sex, that is, of stimuli that experimental and neuroimaging research has shown to be rewarding and thus highly motivating. To this end, we examined performance of men and women while classifying strings of average and attractive faces for grammaticality in the experimental task of artificial grammar learning (AGL), which reflects both conscious and unconscious processes. Subjective measures were used to assess participants’ conscious and unconscious knowledge. It was found that female attractiveness impaired performance in male participants. In particular, male participants demonstrated the lowest accuracy while classifying beautiful faces of women. Conversely, female attractiveness facilitated performance in female participants. The pattern was similar for conscious and unconscious knowledge. Presumably, objects with high incentive salience, as are beautiful faces, captured resources, which were used in task relevant versus task irrelevant ways by women versus men. The present findings shed light on the relation of conscious and unconscious processing with affective and reward-related stimuli, as well as on gender differences underlying this relation.

  12. Facial beauty affects implicit and explicit learning of men and women differently

    Science.gov (United States)

    Ziori, Eleni; Dienes, Zoltán

    2015-01-01

    The present work explores the unconscious and/or conscious nature of learning attractive faces of same and opposite sex, that is, of stimuli that experimental and neuroimaging research has shown to be rewarding and thus highly motivating. To this end, we examined performance of men and women while classifying strings of average and attractive faces for grammaticality in the experimental task of artificial grammar learning (AGL), which reflects both conscious and unconscious processes. Subjective measures were used to assess participants’ conscious and unconscious knowledge. It was found that female attractiveness impaired performance in male participants. In particular, male participants demonstrated the lowest accuracy while classifying beautiful faces of women. Conversely, female attractiveness facilitated performance in female participants. The pattern was similar for conscious and unconscious knowledge. Presumably, objects with high incentive salience, as are beautiful faces, captured resources, which were used in task relevant versus task irrelevant ways by women versus men. The present findings shed light on the relation of conscious and unconscious processing with affective and reward-related stimuli, as well as on gender differences underlying this relation. PMID:26300819

  13. Surgical-Allogeneic Facial Reconstruction: Facial Transplants

    OpenAIRE

    Marcelo Coelho Goiato; Daniela Micheline Dos Santos; Lisiane Cristina Bannwart; Marcela Filié Haddad; Leonardo Viana Pereira; Aljomar José Vechiato Filho

    2014-01-01

    Several factors including cancer, malformations and traumas may cause large facial mutilation. These functional and aesthetic deformities negatively affect the psychological perspectives and quality of life of the mutilated patient. Conventional treatments are prone to fail aesthetically and functionally. The recent introduction of the composite tissue allotransplantation (CTA), which uses transplanted facial tissues of healthy donors to recover the damaged or non-existent facial tissue of mu...

  14. Facial Expression Analysis

    NARCIS (Netherlands)

    Pantic, Maja; Li, S.; Jain, A.

    2009-01-01

    Facial expression recognition is a process performed by humans or computers, which consists of: 1. Locating faces in the scene (e.g., in an image; this step is also referred to as face detection), 2. Extracting facial features from the detected face region (e.g., detecting the shape of facial components

  15. Facial Expression Recognition

    NARCIS (Netherlands)

    Pantic, Maja; Li, S.; Jain, A.

    2009-01-01

    Facial expression recognition is a process performed by humans or computers, which consists of: 1. Locating faces in the scene (e.g., in an image; this step is also referred to as face detection), 2. Extracting facial features from the detected face region (e.g., detecting the shape of facial components

  16. A comparison of facial emotion processing in neurological and psychiatric conditions

    Directory of Open Access Journals (Sweden)

    Benoit Bediou

    2012-04-01

    Full Text Available Investigating the relative severity of emotion recognition deficits across different clinical and high-risk populations has potential implications not only for the prevention, diagnosis and treatment of these diseases, but also for our understanding of the neurobiological mechanisms of emotion perception itself. We reanalyzed data from 4 studies in which we examined facial expression and gender recognition using the same tasks and stimuli. We used a standardized and bias-corrected measure of effect size (Cohen's d) to assess the extent of impairments in frontotemporal dementia (FTD), Parkinson's disease treated with L-DOPA (PD-ON) or not (PD-OFF), amnestic mild cognitive impairment (aMCI), Alzheimer's disease at the mild dementia stage (AD), major depressive disorder (MDD), remitted schizophrenia (SCZ-rem), first-episode schizophrenia before (SCZ-OFF) and after (SCZ-ON) medication, as well as unaffected siblings of patients with schizophrenia (SIB). Analyses revealed a pattern of differential impairment of emotion (but not gender) recognition, consistent with the extent of impairment of the fronto-temporal neural networks involved in the processing of faces and facial expressions. Our transnosographic approach, combining clinical and high-risk populations with the impact of medication, brings new information on the trajectory of impaired emotion perception in neuropsychiatric conditions, and on the neural networks and neurotransmitter systems subserving emotion perception.
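
    The bias-corrected standardized effect size mentioned above (Cohen's d with a small-sample correction, commonly reported as Hedges' g) can be computed as follows; this is the textbook formula, offered for illustration, not code from the reanalysis itself.

```python
import numpy as np

def hedges_g(patients, controls):
    """Bias-corrected standardized mean difference between two groups."""
    patients, controls = np.asarray(patients, float), np.asarray(controls, float)
    n1, n2 = len(patients), len(controls)
    pooled_sd = np.sqrt(((n1 - 1) * np.var(patients, ddof=1) +
                         (n2 - 1) * np.var(controls, ddof=1)) / (n1 + n2 - 2))
    d = (np.mean(patients) - np.mean(controls)) / pooled_sd    # Cohen's d
    correction = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)           # small-sample bias correction
    return d * correction
```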

  17. Neural processing of facial identity and emotion in infants at high risk for autism spectrum disorders

    Directory of Open Access Journals (Sweden)

    Sharon Elizabeth Fox

    2013-04-01

    Full Text Available Deficits in face processing and social impairment are core characteristics of autism spectrum disorder. The present work examined 7-month-old infants at high risk for developing autism and typically developing controls at low risk, using a face perception task designed to differentiate between the effects of face identity and facial emotions on neural response using functional near-infrared spectroscopy (fNIRS). In addition, we employed independent component analysis (ICA), as well as a novel method of condition-related component selection and classification, to identify group differences in hemodynamic waveforms and response distributions associated with face and emotion processing. The results indicate similarities of waveforms, but differences in the magnitude, spatial distribution, and timing of responses between groups. These early differences in local cortical regions and the hemodynamic response may, in turn, contribute to differences in patterns of functional connectivity.

  18. The influence of indirect and direct emotional processing on memory for facial expressions.

    Science.gov (United States)

    Patel, Ronak; Girard, Todd A; Green, Robin E A

    2012-01-01

    We used the remember-know procedure (Tulving, 1985) to test the behavioural expression of memory following indirect and direct forms of emotional processing at encoding. Participants (N=32) viewed a series of facial expressions (happy, fearful, angry, and neutral) while performing tasks involving either indirect (gender discrimination) or direct (emotion discrimination) emotion processing. After a delay, participants completed a surprise recognition memory test. Our results revealed that indirect encoding of emotion produced enhanced memory for fearful faces whereas direct encoding of emotion produced enhanced memory for angry faces. In contrast, happy faces were better remembered than neutral faces after both indirect and direct encoding tasks. These findings suggest that fearful and angry faces benefit from a recollective advantage when they are encoded in a way that is consistent with the predictive nature of their threat. We propose that the broad memory advantage for happy faces may reflect a form of cognitive flexibility that is specific to positive emotions.

  19. Interaction between facial expression and color

    OpenAIRE

    Kae Nakajima; Tetsuto Minami; Shigeki Nakauchi

    2017-01-01

    Facial color varies depending on emotional state, and emotions are often described in relation to facial color. In this study, we investigated whether the recognition of facial expressions was affected by facial color and vice versa. In the facial expression task, expression morph continua were employed: fear-anger and sadness-happiness. The morphed faces were presented in three different facial colors (bluish, neutral, and reddish color). Participants identified a facial expression between t...

  20. Facial blindsight

    Directory of Open Access Journals (Sweden)

    Marco Solcà

    2015-09-01

    Full Text Available Blindsight denotes unconscious residual visual capacities in the context of an inability to consciously recollect or identify visual information. It has been described for color and shape discrimination, movement, and facial emotion recognition. The present study investigates a patient suffering from cortical blindness who nonetheless maintained selective residual abilities in face detection. Our patient presented the capacity to distinguish between jumbled and normal faces, known and unknown faces, or categories of famous people, although he failed to explicitly recognize or describe them. Conversely, performance was at chance level when he was asked to categorize non-facial stimuli. Our results provide clinical evidence for the notion that some aspects of facial processing can occur without perceptual awareness, possibly using direct tracts from the thalamus to associative visual cortex, bypassing the primary visual cortex.

  1. Parallel Processing of Affective Visual Stimuli

    Science.gov (United States)

    Peyk, Peter; Schupp, Harald T.; Keil, Andreas; Elbert, Thomas; Junghöfer, Markus

    2009-01-01

    Event-related potential (ERP) studies of affective picture processing have demonstrated an early posterior negativity (EPN) for emotionally arousing pictures that are embedded in a rapid visual stream. The present study examined the selective processing of emotional pictures while systematically varying picture presentation rates between 1 and 16 Hz. Previous results with presentation rates up to 5 Hz were replicated in that emotional compared to neutral pictures were associated with a greater EPN. Discrimination among emotional and neutral contents was maintained up to 12 Hz. To explore the notion of parallel processing, convolution analysis was used: EPNs generated by linear superposition of slow rate ERPs explained 70-93% of the variance of measured EPNs, giving evidence for an impressive capacity of parallel affective discrimination in rapid serial picture presentation. PMID:19055507
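
    The convolution analysis referred to above predicts the response at fast presentation rates by linearly superimposing the single-picture ERP at each stimulus onset. A schematic version of that superposition is sketched below; the sampling rate and array shapes are assumed for illustration and do not reproduce the authors' analysis code.

```python
import numpy as np

def superpose_erp(single_erp, rate_hz, duration_s, sfreq=500):
    """Predict the response to rapid serial presentation by linear superposition.

    single_erp: ERP to a single picture (1D array), sampled at sfreq Hz.
    rate_hz:    picture presentation rate (e.g. 12 for 12 Hz).
    duration_s: length of the predicted waveform in seconds.
    """
    n_samples = int(duration_s * sfreq)
    onsets = np.zeros(n_samples)
    onsets[::int(sfreq / rate_hz)] = 1.0          # impulse train at stimulus onsets
    predicted = np.convolve(onsets, single_erp)   # sum of shifted copies of the single-item ERP
    return predicted[:n_samples]
```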

  2. Mutual regulation between infant facial affect and maternal touch in depressed and nondepressed dyads

    DEFF Research Database (Denmark)

    Egmose, Ida; Cordes, Katharina; Smith-Nielsen, Johanne

    2017-01-01

    Research suggests that touch is an important means through which parents regulate their infants’ affects. Also, previous research has shown that post-partum depressed (PPD) mothers and nonclinical mothers differ in their touching behaviors when interacting with their infants. We examined the affect…

  3. Facial-paralysis diagnostic system based on 3D reconstruction

    Science.gov (United States)

    Khairunnisaa, Aida; Basah, Shafriza Nisha; Yazid, Haniza; Basri, Hassrizal Hassan; Yaacob, Sazali; Chin, Lim Chee

    2015-05-01

    The diagnostic process of facial paralysis currently relies on qualitative assessment for classification and treatment planning, which results in inconsistent assessments that can potentially affect treatment planning. We developed a facial-paralysis diagnostic system based on 3D reconstruction of RGB and depth data using a standard structured-light camera - Kinect 360 - and an implementation of Active Appearance Models (AAM). We also proposed a quantitative assessment for facial paralysis based on a triangular model. In this paper, we report on the design and development process, including preliminary experimental results. Our preliminary experimental results demonstrate the feasibility of our quantitative assessment system for diagnosing facial paralysis.
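
    The quantitative assessment mentioned above is based on a triangular model over facial landmarks. Without reproducing that exact model, the general idea of scoring left/right asymmetry from AAM landmark positions can be sketched as follows; the landmark pairing and the vertical-midline assumption are illustrative choices, not the authors' method.

```python
import numpy as np

def asymmetry_index(landmarks, left_idx, right_idx):
    """Crude facial-asymmetry score from 2D landmarks.

    landmarks: (n_points, 2) array of (x, y) positions from an AAM fit.
    left_idx / right_idx: indices of anatomically corresponding left/right points.
    """
    midline_x = landmarks[:, 0].mean()                 # assume an approximately vertical facial midline
    left = landmarks[left_idx].astype(float)
    right = landmarks[right_idx].astype(float)
    right[:, 0] = 2 * midline_x - right[:, 0]          # mirror the right side onto the left
    return float(np.mean(np.linalg.norm(left - right, axis=1)))
```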

  4. Fast Facial Detection by Depth Map Analysis

    Directory of Open Access Journals (Sweden)

    Ming-Yuan Shieh

    2013-01-01

    Full Text Available In order to obtain correct facial recognition results, one needs to adopt appropriate facial detection techniques. Moreover, the effectiveness of facial detection is usually affected by environmental conditions such as background, illumination, and the complexity of objects. In this paper, the proposed facial detection scheme, which is based on depth map analysis, aims to improve the effectiveness of facial detection and recognition under different environmental illumination conditions. The proposed procedure consists of scene depth determination, outline analysis, Haar-like classification, and related image processing operations. Since infrared light sources can be used to increase visibility in the dark, active infrared visual images captured by a structured-light sensory device such as the Kinect are less influenced by environmental lighting, which benefits the accuracy of facial detection. Therefore, the proposed system first detects the human subject and face and obtains their relative position by structured-light analysis; the face is then confirmed by image processing operations. The experimental results demonstrate that the proposed scheme not only improves facial detection under varying light conditions but also benefits facial recognition.
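
    The combination of depth gating and Haar-like classification described above can be approximated with OpenCV: the depth map restricts the search to regions at a plausible standoff distance before the cascade runs on the RGB image. The depth thresholds and the bundled cascade file used below are illustrative assumptions rather than the parameters of the proposed system.

```python
import cv2
import numpy as np

CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces_with_depth(bgr, depth_mm, near=400, far=1500):
    """Run Haar-like face detection only where the depth map lies within [near, far] mm."""
    mask = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.bitwise_and(gray, gray, mask=mask)        # suppress out-of-range background
    faces = CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces  # array of (x, y, w, h) rectangles
```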

  5. Arginine vasopressin 1a receptor RS3 promoter microsatellites in schizophrenia: a study of the effect of the "risk" allele on clinical symptoms and facial affect recognition.

    Science.gov (United States)

    Golimbet, Vera; Alfimova, Margarita; Abramova, Lilia; Kaleda, Vasily; Gritsenko, Inga

    2015-02-28

    We studied the AVPR1A RS3 polymorphism in patients with schizophrenia and controls. AVPR1A RS3 was not associated with schizophrenia. The 327-bp allele, previously implicated in autism and social behavior, was associated with negative symptoms and tended to be linked to patients' facial affect recognition, suggesting an impact on social phenotypes of schizophrenia.

  6. Stress modulation of cognitive and affective processes

    Science.gov (United States)

    CAMPEAU, SERGE; LIBERZON, ISRAEL; MORILAK, DAVID; RESSLER, KERRY

    2012-01-01

    This review summarizes the major discussion points of a symposium on stress modulation of cognitive and affective processes, which was held during the 2010 workshop on the neurobiology of stress (Boulder, CO, USA). The four discussants addressed a number of specific cognitive and affective factors that are modulated by exposure to acute or repeated stress. Dr David Morilak discussed the effects of various repeated stress situations on cognitive flexibility, as assessed with a rodent model of attentional set-shifting task, and how performance on slightly different aspects of this test is modulated by different prefrontal regions through monoaminergic neurotransmission. Dr Serge Campeau summarized the findings of several studies exploring a number of factors and brain regions that regulate habituation of various autonomic and neuroendocrine responses to repeated audiogenic stress exposures. Dr Kerry Ressler discussed a body of work exploring the modulation and extinction of fear memories in rodents and humans, especially focusing on the role of key neurotransmitter systems including excitatory amino acids and brain-derived neurotrophic factor. Dr Israel Liberzon presented recent results on human decision-making processes in response to exogenous glucocorticoid hormone administration. Overall, these discussions are casting a wider framework on the cognitive/affective processes that are distinctly regulated by the experience of stress and some of the brain regions and neurotransmitter systems associated with these effects. PMID:21790481

  7. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    Science.gov (United States)

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  8. The Effect of Gaze Direction on the Processing of Facial Expressions in Children with Autism Spectrum Disorder: An ERP Study

    Science.gov (United States)

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2010-01-01

    This study investigated the neural basis of the effect of gaze direction on facial expression processing in children with and without ASD, using event-related potentials (ERP). Children with ASD (10-17-year-olds) and typically developing (TD) children (9-16-year-olds) were asked to determine the emotional expressions (angry or fearful) of a facial…

  10. A genome-screen experiment to detect quantitative trait loci affecting resistance to facial eczema disease in sheep.

    Science.gov (United States)

    Phua, S H; Dodds, K G; Morris, C A; Henry, H M; Beattie, A E; Garmonsway, H G; Towers, N R; Crawford, A M

    2009-02-01

    Facial eczema (FE) is a secondary photosensitization disease arising from liver cirrhosis caused by the mycotoxin sporidesmin. The disease affects sheep, cattle, deer and goats, and costs the New Zealand sheep industry alone an estimated NZ$63M annually. A long-term sustainable solution to this century-old FE problem is to breed for disease-resistant animals by marker-assisted selection. As a step towards finding a diagnostic DNA test for FE sensitivity, we have conducted a genome-scan experiment to screen for quantitative trait loci (QTL) affecting this trait in Romney sheep. Four F(1) sires, obtained from reciprocal matings of FE resistant and susceptible selection-line animals, were used to generate four outcross families. The resulting half-sib progeny were artificially challenged with sporidesmin to phenotype their FE traits measured in terms of their serum levels of liver-specific enzymes, namely gamma-glutamyl transferase and glutamate dehydrogenase. In a primary screen using selective genotyping on extreme progeny of each family, a total of 244 DNA markers uniformly distributed over all 26 ovine autosomes (with an autosomal genome coverage of 79-91%) were tested for linkage to the FE traits. Data were analysed using Haley-Knott regression. The primary screen detected one significant and one suggestive QTL on chromosomes 3 and 8 respectively. Both the significant and suggestive QTL were followed up in a secondary screen where all progeny were genotyped and analysed; the QTL on chromosome 3 was significant in this analysis.
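
    Haley-Knott regression, used in the analysis above, regresses the phenotype on the expected QTL genotype (conditional on flanking-marker genotypes) at each tested position and compares the fit against a no-QTL model. A stripped-down, single-position version for a half-sib design is sketched below; the inheritance probabilities are assumed to come from a separate marker-based step, and the function is illustrative rather than the software used in the study.

```python
import numpy as np

def haley_knott_lod(phenotype, prob_allele, covariates=None):
    """LOD score for a putative QTL position via Haley-Knott regression.

    phenotype:   (n,) trait values (e.g. log-transformed serum enzyme levels).
    prob_allele: (n,) probability that each half-sib inherited the sire's first
                 QTL allele, computed from flanking marker genotypes elsewhere.
    """
    phenotype = np.asarray(phenotype, float)
    n = len(phenotype)
    X0 = np.ones((n, 1))                                   # null model: mean only
    X1 = np.column_stack([np.ones(n), prob_allele])        # QTL model: mean + allele effect
    if covariates is not None:
        X0 = np.column_stack([X0, covariates])
        X1 = np.column_stack([X1, covariates])
    rss0 = np.sum((phenotype - X0 @ np.linalg.lstsq(X0, phenotype, rcond=None)[0]) ** 2)
    rss1 = np.sum((phenotype - X1 @ np.linalg.lstsq(X1, phenotype, rcond=None)[0]) ** 2)
    return (n / 2.0) * np.log10(rss0 / rss1)               # LOD score at this position
```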

  11. Shared Gaussian Process Latent Variable Model for Multi-view Facial Expression Recognition

    NARCIS (Netherlands)

    Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja

    2013-01-01

    Facial-expression data often appear in multiple views either due to head-movements or the camera position. Existing methods for multi-view facial expression recognition perform classification of the target expressions either by using classifiers learned separately for each view or by using a single

  12. Does Gaze Direction Modulate Facial Expression Processing in Children with Autism Spectrum Disorder?

    Science.gov (United States)

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2009-01-01

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9-14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent…

  14. Discriminative shared Gaussian processes for multi-view and view-invariant facial expression recognition

    NARCIS (Netherlands)

    Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja

    2015-01-01

    Images of facial expressions are often captured from various views as a result of either head movements or variable camera position. Existing methods for multiview and/or view-invariant facial expression recognition typically perform classification of the observed expression using either classifiers

  15. Employing Textual and Facial Emotion Recognition to Design an Affective Tutoring System

    Science.gov (United States)

    Lin, Hao-Chiang Koong; Wang, Cheng-Hung; Chao, Ching-Ju; Chien, Ming-Kuan

    2012-01-01

    Emotional expression in artificial intelligence has gained considerable attention in recent years; affective computing has been applied not only to enhance the interaction between computers and humans, but also to make computers more humane. In this study, emotional expressions were applied to an intelligent tutoring system, where learners'…

  16. Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.

    Science.gov (United States)

    Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha

    2015-04-01

    According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  17. The factors affecting the recarburization process indicators

    Directory of Open Access Journals (Sweden)

    K. Janerka

    2011-07-01

    Full Text Available The article presents the factors affecting the recarburization results obtained (rate and efficiency) during the process of melting cast iron. The analysis includes the recarburizer type (anthracite, natural and synthetic graphite, petroleum coke) and particle size. Further factors considered in the work are the method of recarburization (introduction of the recarburizer into the solid charge or onto the surface of the metal bath) and the parameters of the melt (temperature and chemical composition). The analysis is based on experiments performed, the results of computer simulations and literature data.

  18. Subliminal and Supraliminal Processing of Facial Expression of Emotions: Brain Oscillation in the Left/Right Frontal Area

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-03-01

    Full Text Available The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, although it remains questionable whether it is possible to assign a specific function to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotions (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate the alpha power. Moreover, an increased right frontal activity was found for negative emotions vs. an increased left response for positive emotions. The significance of facial expressions was adduced to elucidate these different cortical responses to emotional types.
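
    Alpha-band modulation of the kind analysed above is commonly quantified as band power at left and right frontal electrodes, often summarised as a frontal asymmetry score. A Welch-PSD sketch with SciPy is given below; the 8-13 Hz band, the log-ratio index, and the input layout are conventional assumptions rather than the exact parameters of the study.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(signal, sfreq, band=(8.0, 13.0)):
    """Alpha-band power of a single-channel EEG segment via Welch's method."""
    freqs, psd = welch(signal, fs=sfreq, nperseg=int(2 * sfreq))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])   # integrate the PSD over the alpha band

def frontal_alpha_asymmetry(left_channel, right_channel, sfreq):
    """ln(right) - ln(left) frontal alpha power, a common asymmetry index."""
    return np.log(alpha_power(right_channel, sfreq)) - np.log(alpha_power(left_channel, sfreq))
```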

  19. Happy facial expression processing with different social interaction cues: an fMRI study of individuals with schizotypal personality traits.

    Science.gov (United States)

    Huang, Jia; Wang, Yi; Jin, Zhen; Di, Xin; Yang, Tao; Gur, Ruben C; Gur, Raquel E; Shum, David H K; Cheung, Eric F C; Chan, Raymond C K

    2013-07-01

    In daily life facial expressions change rapidly and the direction of change provides important clues about social interaction. The aim of conducting this study was to elucidate the dynamic happy facial expression processing with different social interaction cues in individuals with (n=14) and without (n=14) schizotypal personality disorder (SPD) traits. Using functional magnetic resonance imaging (fMRI), dynamic happy facial expression processing was examined by presenting video clips depicting happiness appearing and disappearing under happiness inducing ('praise') or reducing ('blame') interaction cues. The happiness appearing condition consistently elicited more brain activations than the happiness disappearing condition in the posterior cingulate bilaterally in all participants. Further analyses showed that the SPD group was less deactivated than the non-SPD group in the right anterior cingulate cortex in the happiness appearing-disappearing contrast. The SPD group deactivated more than the non-SPD group in the left posterior cingulate and right superior temporal gyrus in the praise-blame contrast. Moreover, the incongruence of cues and facial expression activated the frontal-thalamus-caudate-parietal network, which is involved in emotion recognition and conflict resolution. These results shed light on the neural basis of social interaction deficits in individuals with schizotypal personality traits.

  20. Through the eyes of a child: preschoolers' identification of emotional expressions from the child affective facial expression (CAFE) set.

    Science.gov (United States)

    LoBue, Vanessa; Baker, Lewis; Thrasher, Cat

    2017-08-10

    Researchers have been interested in the perception of human emotional expressions for decades. Importantly, most empirical work in this domain has relied on controlled stimulus sets of adults posing for various emotional expressions. Recently, the Child Affective Facial Expression (CAFE) set was introduced to the scientific community, featuring a large validated set of photographs of preschool aged children posing for seven different emotional expressions. Although the CAFE set was extensively validated using adult participants, the set was designed for use with children. It is therefore necessary to verify that adult validation applies to child performance. In the current study, we examined 3- to 4-year-olds' identification of a subset of children's faces in the CAFE set, and compared it to adult ratings cited in previous research. Our results demonstrate an exceptionally strong relationship between adult ratings of the CAFE photos and children's ratings, suggesting that the adult validation of the set can be applied to preschool-aged participants. The results are discussed in terms of methodological implications for the use of the CAFE set with children, and theoretical implications for using the set to study the development of emotion perception in early childhood.

  1. Mental processes affecting the piano performance

    Directory of Open Access Journals (Sweden)

    Buğra Gültek

    2013-02-01

    Full Text Available The aim of this study is to examine the mental processes that affect piano performance. Efficient performance and teaching of the piano depend on a complex, systematic structure combining various musical and technical challenges. To succeed in piano playing, one must overcome these difficulties systematically over several years. Naturally, this process depends on many variables, and the correct organization of the central nervous system plays a very important role in it. A complete understanding of the subject can therefore guide both piano performance and piano education. Historically, playing the piano was initially seen as a purely mechanical activity; at the beginning of the 20th century, however, scholars began to consider its physiological and mental elements as well, a trend that has gained importance steadily up to the present. In contemporary piano performance and education, mental processes are no longer treated in isolation from the subject as a whole. In this study, the importance of practice methods based on the correct organization of the central nervous system is set forth, and some practice examples are given to increase the efficiency of piano performance.

  2. Surgical-allogeneic facial reconstruction: facial transplants.

    Directory of Open Access Journals (Sweden)

    Marcelo Coelho Goiato

    2014-12-01

    Full Text Available Several factors, including cancer, malformations and trauma, may cause extensive facial mutilation. These functional and aesthetic deformities negatively affect the psychological perspectives and quality of life of the mutilated patient. Conventional treatments are prone to fail aesthetically and functionally. The recent introduction of composite tissue allotransplantation (CTA), which uses facial tissues transplanted from healthy donors to restore the damaged or missing facial tissue of mutilated patients, has produced better clinical results. Therefore, the present study aims to conduct a literature review on the relevance and effectiveness of facial transplants in mutilated subjects. It was observed that facial transplants restored both the aesthetics and the function of these patients and consequently improved their quality of life.

  3. The light-makeup advantage in facial processing: Evidence from event-related potentials

    Science.gov (United States)

    Tagai, Keiko; Shimakura, Hitomi; Isobe, Hiroko; Nittono, Hiroshi

    2017-01-01

    The effects of makeup on attractiveness have been evaluated using mainly subjective measures. In this study, event-related brain potentials (ERPs) were recorded from a total of 45 Japanese women (n = 23 and n = 22 for Experiment 1 and 2, respectively) to examine the neural processing of faces with no makeup, light makeup, and heavy makeup. To have the participants look at each face carefully, an identity judgement task was used: they were asked to judge whether the two faces presented in succession were of the same person or not. The ERP waveforms in response to the first faces were analyzed. In two experiments with different stimulus probabilities, the amplitudes of N170 and vertex positive potential (VPP) were smaller for faces with light makeup than for faces with heavy makeup or no makeup. The P1 amplitude did not differ between facial types. In a subsequent rating phase, faces with light makeup were rated as more attractive than faces with heavy makeup and no makeup. The results suggest that the processing fluency of faces with light makeup is one of the reasons why light makeup is preferred to heavy makeup and no makeup in daily life. PMID:28234959
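
    The N170/VPP comparisons reported above come down to measuring mean ERP amplitudes in fixed time windows at selected electrodes, separately for each makeup condition. The sketch below shows one conventional way to compute such a value from epoched data held in a NumPy array; the window bounds, baseline interval, channel indices and sampling rate are illustrative assumptions rather than parameters taken from this study.

      import numpy as np

      def mean_component_amplitude(epochs, fs, t_start, window, channels):
          """Mean amplitude of an ERP component for one condition.

          epochs   -- array (n_trials, n_channels, n_samples), e.g. in microvolts
          fs       -- sampling rate in Hz
          t_start  -- time of the first sample relative to stimulus onset, in s (e.g. -0.2)
          window   -- (start, end) of the component window in s, e.g. (0.13, 0.20) for N170
          channels -- indices of the electrodes to average over
          """
          times = t_start + np.arange(epochs.shape[-1]) / fs
          # Baseline-correct each trial with the mean of its pre-stimulus interval.
          baseline = epochs[..., times < 0.0].mean(axis=-1, keepdims=True)
          corrected = epochs - baseline
          in_window = (times >= window[0]) & (times <= window[1])
          # Average over the time window, the chosen channels and all trials.
          return corrected[:, channels, :][..., in_window].mean()

    Amplitudes obtained this way for the no-makeup, light-makeup and heavy-makeup conditions can then be compared with the usual repeated-measures statistics.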

  4. Brain lateralization of holistic versus analytic processing of emotional facial expressions.

    Science.gov (United States)

    Calvo, Manuel G; Beltrán, David

    2014-05-15

    This study investigated the neurocognitive mechanisms underlying the role of the eye and the mouth regions in the recognition of facial happiness, anger, and surprise. To this end, face stimuli were shown in three formats (whole face, upper half visible, and lower half visible) and behavioral categorization, computational modeling, and ERP (event-related potentials) measures were combined. N170 (150-180 ms post-stimulus; right hemisphere) and EPN (early posterior negativity; 200-300 ms; mainly, right hemisphere) were modulated by expression of whole faces, but not by separate halves. This suggests that expression encoding (N170) and emotional assessment (EPN) require holistic processing, mainly in the right hemisphere. In contrast, the mouth region of happy faces enhanced left temporo-occipital activity (150-180 ms), and also the LPC (late positive complex; centro-parietal) activity (350-450 ms) earlier than the angry eyes (450-600 ms) or other face regions. Relatedly, computational modeling revealed that the mouth region of happy faces was also visually salient by 150 ms following stimulus onset. This suggests that analytical or part-based processing of the salient smile occurs early (150-180 ms) and lateralized (left), and is subsequently used as a shortcut to identify the expression of happiness (350-450 ms). This would account for the happy face advantage in behavioral recognition tasks when the smile is visible.

  5. Conscious and unconscious processing of facial expressions: evidence from two split-brain patients.

    Science.gov (United States)

    Prete, Giulia; D'Ascenzo, Stefania; Laeng, Bruno; Fabri, Mara; Foschi, Nicoletta; Tommasi, Luca

    2015-03-01

    We investigated how the brain's hemispheres process explicit and implicit facial expressions in two 'split-brain' patients (one with a complete and one with a partial anterior resection). Photographs of faces expressing positive, negative or neutral emotions were shown either centrally or bilaterally. The task consisted of judging the friendliness of each person in the photographs. Half of the photograph stimuli were 'hybrid faces', that is, an amalgamation of filtered images which contained emotional information only in the low range of spatial frequencies, blended with a neutral expression of the same individual in the remaining spatial frequencies. The other half of the images contained unfiltered faces. With the hybrid faces, the patients and a matched control group were more influenced in their social judgements by the emotional expression of the face shown in the left visual field (LVF). When the expressions were shown explicitly, that is, without filtering, the control group and the partially callosotomized patient based their judgement on the face shown in the LVF, whereas the complete split-brain patient based his ratings mainly on the face presented in the right visual field. We conclude that the processing of implicit emotions does not require the integrity of callosal fibres and can take place within subcortical routes lateralized in the right hemisphere.

  6. Alexithymia and the processing of emotional facial expressions (EFEs): systematic review, unanswered questions and further perspectives.

    Science.gov (United States)

    Grynberg, Delphine; Chang, Betty; Corneille, Olivier; Maurage, Pierre; Vermeulen, Nicolas; Berthoz, Sylvie; Luminet, Olivier

    2012-01-01

    Alexithymia is characterized by difficulties in identifying, differentiating and describing feelings. A high prevalence of alexithymia has often been observed in clinical disorders characterized by low social functioning. This review aims to assess the association between alexithymia and the ability to decode emotional facial expressions (EFEs) within clinical and healthy populations. More precisely, this review has four main objectives: (1) to assess if alexithymia is a better predictor of the ability to decode EFEs than the diagnosis of clinical disorder; (2) to assess the influence of comorbid factors (depression and anxiety disorder) on the ability to decode EFE; (3) to investigate if deficits in decoding EFEs are specific to some levels of processing or task types; (4) to investigate if the deficits are specific to particular EFEs. Twenty four studies (behavioural and neuroimaging) were identified through a computerized literature search of Psycinfo, PubMed, and Web of Science databases from 1990 to 2010. Data on methodology, clinical characteristics, and possible confounds were analyzed. The review revealed that: (1) alexithymia is associated with deficits in labelling EFEs among clinical disorders, (2) the level of depression and anxiety partially account for the decoding deficits, (3) alexithymia is associated with reduced perceptual abilities, and is likely to be associated with impaired semantic representations of emotional concepts, and (4) alexithymia is associated with neither specific EFEs nor a specific valence. These studies are discussed with respect to processes involved in the recognition of EFEs. Future directions for research on emotion perception are also discussed.

  7. Alexithymia and the processing of emotional facial expressions (EFEs: systematic review, unanswered questions and further perspectives.

    Directory of Open Access Journals (Sweden)

    Delphine Grynberg

    Full Text Available Alexithymia is characterized by difficulties in identifying, differentiating and describing feelings. A high prevalence of alexithymia has often been observed in clinical disorders characterized by low social functioning. This review aims to assess the association between alexithymia and the ability to decode emotional facial expressions (EFEs) within clinical and healthy populations. More precisely, this review has four main objectives: (1) to assess if alexithymia is a better predictor of the ability to decode EFEs than the diagnosis of clinical disorder; (2) to assess the influence of comorbid factors (depression and anxiety disorder) on the ability to decode EFE; (3) to investigate if deficits in decoding EFEs are specific to some levels of processing or task types; (4) to investigate if the deficits are specific to particular EFEs. Twenty four studies (behavioural and neuroimaging) were identified through a computerized literature search of Psycinfo, PubMed, and Web of Science databases from 1990 to 2010. Data on methodology, clinical characteristics, and possible confounds were analyzed. The review revealed that: (1) alexithymia is associated with deficits in labelling EFEs among clinical disorders, (2) the level of depression and anxiety partially account for the decoding deficits, (3) alexithymia is associated with reduced perceptual abilities, and is likely to be associated with impaired semantic representations of emotional concepts, and (4) alexithymia is associated with neither specific EFEs nor a specific valence. These studies are discussed with respect to processes involved in the recognition of EFEs. Future directions for research on emotion perception are also discussed.

  8. Mathematical problems in the application of multilinear models to facial emotion processing experiments

    Science.gov (United States)

    Andersen, Anders H.; Rayens, William S.; Li, Ren-Cang; Blonder, Lee X.

    2000-10-01

    In this paper we describe the enormous potential that multilinear models hold for the analysis of data from neuroimaging experiments that rely on functional magnetic resonance imaging (MRI) or other imaging modalities. A case is made for why one might fully expect that the successful introduction of these models to the neuroscience community could define the next generation of structure-seeking paradigms in the area. In spite of the potential for immediate application, there is much to do from the perspective of statistical science. That is, although multilinear models have already been particularly successful in chemistry and psychology, relatively little is known about their statistical properties. To that end, our research group at the University of Kentucky has made significant progress. In particular, we are in the process of developing formal influence measures for multilinear methods as well as associated classification models and effective implementations. We believe that these problems will be among the most important and useful to the scientific community. Details are presented herein and an application is given in the context of facial emotion processing experiments.
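
    Multilinear (e.g., PARAFAC/Tucker) models factor a multi-way data array (for instance, subjects × conditions × time in a facial emotion processing experiment) into a small set of components, one factor matrix per mode. Purely to illustrate this model class, the sketch below implements a bare-bones PARAFAC/CP decomposition by alternating least squares in NumPy; the array layout, rank and iteration count are assumptions for the toy example, and a real analysis would use a dedicated tensor library.

      import numpy as np

      def unfold(X, mode):
          """Matricize a 3-way array along the given mode."""
          return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

      def khatri_rao(U, V):
          """Column-wise Kronecker product of two factor matrices (I x R, J x R -> IJ x R)."""
          return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

      def parafac_als(X, rank, n_iter=200, seed=0):
          """Minimal PARAFAC/CP decomposition of a 3-way array by alternating least squares."""
          rng = np.random.default_rng(seed)
          A = rng.standard_normal((X.shape[0], rank))
          B = rng.standard_normal((X.shape[1], rank))
          C = rng.standard_normal((X.shape[2], rank))
          for _ in range(n_iter):
              # Update each factor matrix in turn while holding the other two fixed.
              A = unfold(X, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
              B = unfold(X, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
              C = unfold(X, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
          return A, B, C

      # Toy check: a noiseless rank-2 array (10 subjects x 6 conditions x 50 time points)
      # is approximately recovered by the alternating least squares iterations.
      rng = np.random.default_rng(1)
      A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (10, 6, 50))
      X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
      A, B, C = parafac_als(X, rank=2)
      X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
      print("relative reconstruction error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))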

  9. The light-makeup advantage in facial processing: Evidence from event-related potentials.

    Science.gov (United States)

    Tagai, Keiko; Shimakura, Hitomi; Isobe, Hiroko; Nittono, Hiroshi

    2017-01-01

    The effects of makeup on attractiveness have been evaluated using mainly subjective measures. In this study, event-related brain potentials (ERPs) were recorded from a total of 45 Japanese women (n = 23 and n = 22 for Experiment 1 and 2, respectively) to examine the neural processing of faces with no makeup, light makeup, and heavy makeup. To have the participants look at each face carefully, an identity judgement task was used: they were asked to judge whether the two faces presented in succession were of the same person or not. The ERP waveforms in response to the first faces were analyzed. In two experiments with different stimulus probabilities, the amplitudes of N170 and vertex positive potential (VPP) were smaller for faces with light makeup than for faces with heavy makeup or no makeup. The P1 amplitude did not differ between facial types. In a subsequent rating phase, faces with light makeup were rated as more attractive than faces with heavy makeup and no makeup. The results suggest that the processing fluency of faces with light makeup is one of the reasons why light makeup is preferred to heavy makeup and no makeup in daily life.

  10. Surface Facial Electromyography Reactions to Light-Relevant and Season-Relevant Stimuli in Seasonal Affective Disorder

    Science.gov (United States)

    2005-01-01

    Only fragmentary excerpts of this report's text are indexed: "…psychophysiological responding in the nonseasonal depression literature (e.g., Cacioppo, Bush, & Tassinary, 1992; Greden, Genero, Price, Feinberg…"; "…with Schwartz et al. (1978) and Carney et al. (1981), Greden, Price, Genero, Feinberg, and Levine (1984) found that higher baseline levels of facial…"; "…York: Academic Press. Greden, J. F., Price, H. L., Genero, N., Feinberg, M., & Levine, S. (1984). Facial EMG activity levels predict treatment…"

  11. Differences in the dynamics of affective and cognitive processing - An ERP study.

    Science.gov (United States)

    Mueller, Christina J; Fritsch, Nathalie; Hofmann, Markus J; Kuchinke, Lars

    2017-01-15

    A controversy in emotion research concerns the question of whether affective or cognitive primacy is evident in processing affective stimuli and which factors contribute to each alternative. Using electrophysiological recordings in an adapted visual oddball paradigm allowed tracking the dynamics of affective and cognitive effects. Stimuli consisted of face pictures displaying affective expressions, with rare oddballs differing from frequent stimuli in affective expression, in structure (while frequent stimuli were shown frontally, these deviants were turned sideways), or on both dimensions, i.e. in affective expression and structure. Results revealed a defined sequence of differences in ERP amplitudes: for stimuli deviating in their affective expression only, P1 modulations at ~100 ms were evident, while affective differences of structure deviants were not evident before the N170 time window. All three types of deviants differed in P300 amplitudes, indicating integration of affective and structural information. These results encompass evidence for both cognitive and affective primacy, depending on stimulus properties. Specifically, affective primacy is only visible when the respective facial features can be extracted with ease. When structural differences make face processing harder, however, cognitive primacy is brought forward.

  12. Impaired social brain network for processing dynamic facial expressions in autism spectrum disorders

    Directory of Open Access Journals (Sweden)

    Sato Wataru

    2012-08-01

    Full Text Available Abstract Background: Impairment of social interaction via facial expressions represents a core clinical feature of autism spectrum disorders (ASD). However, the neural correlates of this dysfunction remain unidentified. Because this dysfunction is manifested in real-life situations, we hypothesized that the observation of dynamic, compared with static, facial expressions would reveal abnormal brain functioning in individuals with ASD. We presented dynamic and static facial expressions of fear and happiness to individuals with high-functioning ASD and to age- and sex-matched typically developing controls and recorded their brain activities using functional magnetic resonance imaging (fMRI). Results: Regional analysis revealed reduced activation of several brain regions in the ASD group compared with controls in response to dynamic versus static facial expressions, including the middle temporal gyrus (MTG), fusiform gyrus, amygdala, medial prefrontal cortex, and inferior frontal gyrus (IFG). Dynamic causal modeling analyses revealed that bi-directional effective connectivity involving the primary visual cortex–MTG–IFG circuit was enhanced in response to dynamic as compared with static facial expressions in the control group. Group comparisons revealed that all these modulatory effects were weaker in the ASD group than in the control group. Conclusions: These results suggest that weak activity and connectivity of the social brain network underlie the impairment in social interaction involving dynamic facial expressions in individuals with ASD.

  13. Contemporary facial reanimation.

    Science.gov (United States)

    Bhama, Prabhat K; Hadlock, Tessa A

    2014-04-01

    The facial nerve is the most commonly paralyzed nerve in the human body. Facial paralysis affects aesthetic appearance, and it has a profound effect on function and quality of life. Management of patients with facial paralysis requires a multidisciplinary approach, including otolaryngologists, plastic surgeons, ophthalmologists, and physical therapists. Regardless of etiology, patients with facial paralysis should be evaluated systematically, with initial efforts focused upon establishing proper diagnosis. Management should proceed with attention to facial zones, including the brow and periocular region, the midface and oral commissure, the lower lip and chin, and the neck. To effectively compare contemporary facial reanimation strategies, it is essential to employ objective intake assessment methods, and standard reassessment schemas during the entire management period.

  14. Disconnection mechanism and regional cortical atrophy contribute to impaired processing of facial expressions and theory of mind in multiple sclerosis: a structural MRI study.

    Directory of Open Access Journals (Sweden)

    Andrea Mike

    Full Text Available Successful socialization requires the ability to understand others' mental states. This ability, called mentalization (Theory of Mind), may become deficient and contribute to everyday life difficulties in multiple sclerosis. We aimed to explore the impact of brain pathology on mentalization performance in multiple sclerosis. Mentalization performance of 49 patients with multiple sclerosis was compared to 24 age- and gender-matched healthy controls. T1- and T2-weighted three-dimensional brain MRI images were acquired at 3 Tesla from patients with multiple sclerosis and 18 gender- and age-matched healthy controls. We assessed overall brain cortical thickness in patients with multiple sclerosis and the scanned healthy controls, and measured the total and regional T1 and T2 white matter lesion volumes in patients with multiple sclerosis. Performances in tests of recognition of mental states and emotions from facial expressions and eye gazes correlated with both total T1-lesion load and regional T1-lesion load of association fiber tracts interconnecting cortical regions related to visual and emotion processing (genu and splenium of the corpus callosum, right inferior longitudinal fasciculus, right inferior fronto-occipital fasciculus, uncinate fasciculus). Both of these tests showed correlations with specific cortical areas involved in emotion recognition from facial expressions (right and left fusiform face area, frontal eye field), processing of emotions (right entorhinal cortex) and socially relevant information (left temporal pole). Thus, both a disconnection mechanism due to white matter lesions and cortical thinning of specific brain areas may result in cognitive deficits in multiple sclerosis, affecting emotion and mental state processing from facial expressions and contributing to the everyday and social life difficulties of these patients.

  15. Simultaneous recording of EEG and fNIRS during visuo-spatial and facial expression processing in a dual task paradigm.

    Science.gov (United States)

    Herrmann, Martin J; Neueder, Dorothea; Troeller, Anna K; Schulz, Stefan M

    2016-11-01

    Emotional processing is probably the most crucial tool for orienting oneself in our everyday social life and has been considered to be highly automatic for a long time. Dual task (DT) research shows that information competing for working memory resources impairs the identification of emotional facial expressions. Effects of cognitive load in DT paradigms have been confirmed in numerous neuroimaging studies. However, interference occurring during a DT comprised of decoding emotional facial expressions and a visuo-spatial working memory task has yet to be visualized. To investigate the DT interference effect on brain areas associated not only with working memory, but also with emotional and visuo-spatial processing, we recorded brain activation within the prefrontal cortex and parietal-occipital sensory areas using functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG) simultaneously. Our study consisted of N = 36 participants (27 female) performing the following tasks: a) Corsi blocks, b) identification of emotional facial expressions, or c) a DT comprising tasks a) and b). We predicted higher activation of the prefrontal cortex during DT and corresponding reduced P100 and P300 amplitudes. As expected, fNIRS measurements revealed significantly higher neuronal activation within the prefrontal cortex in the DT condition. When comparing DT to the single tasks, the P100 amplitude was reduced, but the P300 amplitude did not show the expected reduction. Our findings underline that at least some aspects of emotional processing are not entirely automatic, but depend on prefrontal control and are therefore affected by cognitive load, in particular visuo-spatial working memory resources.

  16. Psychopathy and facial emotion recognition ability in patients with bipolar affective disorder with or without delinquent behaviors.

    Science.gov (United States)

    Demirel, Husrev; Yesilbas, Dilek; Ozver, Ismail; Yuksek, Erhan; Sahin, Feyzi; Aliustaoglu, Suheyla; Emul, Murat

    2014-04-01

    It is well known that patients with bipolar disorder are more prone to violence and have more criminal behaviors than the general population. A strong relationship between criminal behavior and the inability to empathize and to perceive other persons' feelings and facial expressions increases the risk of delinquent behaviors. In this study, we aimed to investigate deficits of facial emotion recognition ability in euthymic bipolar patients who had committed an offense and to compare them with non-delinquent euthymic patients with bipolar disorder. Fifty-five euthymic patients with delinquent behaviors and 54 non-delinquent euthymic bipolar patients as a control group were included in the study. Ekman's Facial Emotion Recognition Test, sociodemographic data, the Hare Psychopathy Checklist, the Hamilton Depression Rating Scale and the Young Mania Rating Scale were applied to both groups. There were no significant differences between the case and control groups in terms of average age, gender, level of education, mean age at onset of disease and suicide attempts (p>0.05). The three most commonly committed delinquent behaviors in patients with euthymic bipolar disorder were injury (30.8%), threat or insult (20%) and homicide (12.7%). The most accurately identified facial emotion was "happy" (>99% in both groups), while the most frequently misidentified facial emotion was "fear" in both groups; recognition of fear expressions was significantly worse in the case group than in the control group, and response times to fear, disgusted and angry expressions were significantly longer in the case group than in the control group. Thus, delinquent euthymic bipolar patients appear to have impairments in recognizing fearful and, to a more modest degree, angry facial emotions, and to need some more time to respond to facial emotions even in remission.

  17. Magnetoencephalographic study on facial movements

    Directory of Open Access Journals (Sweden)

    Kensaku eMiki

    2014-07-01

    Full Text Available In this review, we introduced our three studies that focused on facial movements. In the first study, we examined the temporal characteristics of neural responses elicited by viewing mouth movements, and assessed differences between the responses to mouth opening and closing movements and an averting eyes condition. Our results showed that the occipitotemporal area, the human MT/V5 homologue, was active in the perception of both mouth and eye motions. Viewing mouth and eye movements did not elicit significantly different activity in the occipitotemporal area, which indicated that perception of the movement of facial parts may be processed in the same manner, and this is different from motion in general. In the second study, we investigated whether early activity in the occipitotemporal region evoked by eye movements was influenced by a face contour and/or features such as the mouth. Our results revealed specific information processing for eye movements in the occipitotemporal region, and this activity was significantly influenced by whether movements appeared with the facial contour and/or features, in other words, whether the eyes moved, even if the movement itself was the same. In the third study, we examined the effects of inverting the facial contour (hair and chin) and features (eyes, nose, and mouth) on processing for static and dynamic face perception. Our results showed the following: (1) In static face perception, activity in the right fusiform area was affected more by the inversion of features while that in the left fusiform area was affected more by a disruption in the spatial relationship between the contour and features, and (2) In dynamic face perception, activity in the right occipitotemporal area was affected by the inversion of the facial contour.

  18. Disconnection mechanism and regional cortical atrophy contribute to impaired processing of facial expressions and theory of mind in multiple sclerosis

    DEFF Research Database (Denmark)

    Mike, Andrea; Strammer, Erzsebet; Aradi, Mihaly

    2013-01-01

    healthy controls. We assessed overall brain cortical thickness in patients with multiple sclerosis and the scanned healthy controls, and measured the total and regional T1 and T2 white matter lesion volumes in patients with multiple sclerosis. Performances in tests of recognition of mental states...... and emotions from facial expressions and eye gazes correlated with both total T1-lesion load and regional T1-lesion load of association fiber tracts interconnecting cortical regions related to visual and emotion processing (genu and splenium of corpus callosum, right inferior longitudinal fasciculus, right...... inferior fronto-occipital fasciculus, uncinate fasciculus). Both of these tests showed correlations with specific cortical areas involved in emotion recognition from facial expressions (right and left fusiform face area, frontal eye filed), processing of emotions (right entorhinal cortex) and socially...

  19. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research.

    Science.gov (United States)

    Tang, Qingting; Chen, Xu; Hu, Jia; Liu, Ying

    2017-01-01

    Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions, the secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants' reaction times in terms of responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual's processing of positive emotional faces; for instance, the presentation of the partner's name was associated with stronger activities in a wide range of brain regions and faster reaction times for positive facial expressions in the subjects. The current finding of higher activity in the left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of the activation of a positive emotional state. However, the difference in brain activity during processing of both, positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of attachment secure priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and early-stage information processing system (attention), given the increased attention toward the effects of secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the following study has applications in providing

  20. The Relationship of the Facial Nerve to the Condylar Process: A Cadaveric Study with Implications for Open Reduction Internal Fixation

    OpenAIRE

    Barham, H. P.; Collister, P.; V. D. Eusterman; Terella, A. M.

    2015-01-01

    Introduction. The mandibular condyle is the most common site of mandibular fracture. Surgical treatment of condylar fractures by open reduction and internal fixation (ORIF) demands direct visualization of the fracture. This project aimed to investigate the anatomic relationship of the tragus to the facial nerve and condylar process. Materials and Methods. Twelve fresh hemicadaver heads were used. An extended retromandibular/preauricular approach was utilized, with the incision being based pa...

  1. Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults.

    Directory of Open Access Journals (Sweden)

    Nicholas Robert Cooper

    2013-04-01

    Full Text Available The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing, including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders. In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and, if so, whether this would differ according to the individual level of autistic traits (high versus low AQ score). Participants were presented with 3-second films of actors opening and closing their hands (the classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta ERD to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism.

  2. Does format matter for comprehension of a facial affective scale and a numeric scale for pain by adults with Down syndrome?

    Science.gov (United States)

    de Knegt, N C; Evenhuis, H M; Lobbezoo, F; Schuengel, C; Scherder, E J A

    2013-10-01

    People with intellectual disabilities are at high risk for pain and have communication difficulties. Facial and numeric scales for self-report may aid pain identification. It was examined whether the comprehension of a facial affective scale and a numeric scale for pain in adults with Down syndrome (DS) varies with presentation format. Adults with DS were included (N=106, mild to severe ID, mean age 37 years), both with (N=57) and without (N=49) physical conditions that may cause pain or discomfort. The Facial Affect Scale (FAS) and a numeric rating scale (NRS) were compared. One subgroup of participants (N=50) had to choose the two items within each format to indicate 'least pain' and 'most pain'. The other subgroup of participants (N=56) had to order three faces of the FAS from 'least pain' to 'most pain', and to answer questions about the magnitude of numbers for the NRS. Comprehension percentages were compared between two subgroups. More participants understood the FAS than the NRS, irrespective of the presentation format. The comprehension percentage for the FAS did not differ between the least-most extremities format and the ordering/magnitude format. In contrast, comprehension percentages for the NRS differed significantly between the least-most extremities format (61%) and the ordering/magnitude format (32%). The inclusion of ordering and magnitude in a presentation format is essential to assess thorough comprehension of facial and numeric scales for self-reported pain. The use of this format does not influence the number of adults with DS who pass the comprehension test for the FAS, but reduces the number of adults with DS who pass the comprehension test for the NRS.

  3. Progressive facial hemiatrophy : a complex disorder not only affecting the face. A report in a monozygotic male twin pair

    NARCIS (Netherlands)

    Hulzebos, C V; de Vries, T W; Armbrust, W; Sauer, P J J; Kerstjens-Frederikse, W S

    2004-01-01

    Progressive facial hemiatrophy (PFH) is a rare disease, characterized by hyperpigmentation of the skin followed by unilateral craniofacial atrophy of subcutaneous tissues, including fat, muscle and bone. Hereditary factors have been postulated to be involved in the aetiology of PFH. Yet, the o

  4. Neural Correlates of Explicit versus Implicit Facial Emotion Processing in ASD

    Science.gov (United States)

    Luckhardt, Christina; Kröger, Anne; Cholemkery, Hannah; Bender, Stephan; Freitag, Christine M.

    2017-01-01

    The underlying neural mechanisms of implicit and explicit facial emotion recognition (FER) were studied in children and adolescents with autism spectrum disorder (ASD) compared to matched typically developing controls (TDC). EEG was obtained from N = 21 ASD and N = 16 TDC. Task performance, visual (P100, N170) and cognitive (late positive…

  5. The Role of Facial Microexpression State (FMES) Change in the Process of Conceptual Conflict

    Science.gov (United States)

    Chiu, Mei-Hung; Chou, Chin-Cheng; Wu, Wen-Lung; Liaw, Hongming

    2014-01-01

    This paper explores whether facial microexpression state (FMES) changes can be used to identify moments of conceptual conflict, one of the pathways to conceptual change. It is known that when the preconditions of conceptual conflicts are met and conceptual conflicts are detected in students, it is then possible for conceptual change to take place.…

  6. Statistical processing of facial electromyography (EMG) signals in emotional film scenes

    NARCIS (Netherlands)

    Westerink, Joyce; van den Broek, Egon; van Herk, Jan; Tuinenbreijer, Kees; Schut, Marleen

    To improve human-computer interaction, computers need to recognize and respond properly to their users’ emotional state. As a first step to such systems, we investigated how emotional experiences are expressed in various statistical parameters of facial EMG signals. 22 Subjects were presented with 8
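
    As a rough illustration of the kind of statistical parameters such an analysis works with, the sketch below band-pass filters and rectifies a single facial EMG channel and returns a few descriptive statistics for one film scene. The filter band, the sampling rate (assumed to be well above the upper band edge, e.g. 1 kHz or more) and the particular statistics are assumptions for the example, not the parameters used in this study.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def emg_features(raw, fs, band=(20.0, 450.0)):
          """Descriptive statistics of one facial EMG channel for one scene.

          raw -- 1-D array of raw EMG samples
          fs  -- sampling rate in Hz (must exceed twice the upper band edge)
          """
          nyq = fs / 2.0
          b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="bandpass")
          filtered = filtfilt(b, a, raw)   # suppress drift and out-of-band noise
          rectified = np.abs(filtered)     # full-wave rectification
          return {
              "mean": rectified.mean(),
              "std": rectified.std(),
              "rms": np.sqrt(np.mean(filtered ** 2)),
              "p90": np.percentile(rectified, 90),
          }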

  7. The MPI facial expression database--a validated database of emotional and conversational facial expressions.

    Directory of Open Access Journals (Sweden)

    Kathrin Kaulard

    Full Text Available The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural

  8. Odontogenic Facial Cellulitis

    Directory of Open Access Journals (Sweden)

    Yordany Boza Mejias

    2012-11-01

    Full Text Available Background: odontogenic facial cellulitis is an acute inflammatory process manifested in very different ways, with a variable scale of clinical presentation ranging from harmless, well-defined processes to diffuse and progressive ones that may develop complications leading the patient to a critical condition, even risking their lives. Objective: To characterize the behavior of odontogenic facial cellulitis. Methods: A descriptive case series study was conducted at the dental clinic of Aguada de Pasajeros, Cienfuegos, from September 2010 to March 2011. It included 56 patients who met the inclusion criteria. Variables analyzed included: sex, age, teeth and regions affected, causes of cellulitis and prescribed treatment. Results: no sex predilection was observed; the lower molars and the submandibular anatomical region were the most affected (50% and 30.4%, respectively), with tooth decay being the main cause of this condition (51.7%). The opening access was not performed for all the patients in the emergency service. Extraction of the causal tooth was not commonly done early, according to the prescribed antibiotic group. Thermotherapy with warm fomentation and saline mouthwash was the most frequently prescribed treatment, and the most widely used group of antibiotics was the penicillins. Conclusions: dental caries were the major cause of odontogenic cellulitis. There are still difficulties with the implementation of opening access.

  9. Factors affecting multifunctional teams in innovation processes

    OpenAIRE

    Shen, Xin

    2002-01-01

    Structuring the innovation process and managing multifunctional teams is a basic prerequisite for successful innovation. A well-structured process makes it possible to implement effective multifunctional teamwork. Meanwhile, multifunctional teamwork helps to optimise and accomplish the innovation process. Organizational support is necessary to achieve effective teamwork. Designing or changing the organizational structures for multifunctional collaboration is an important issue. Changing the s...

  10. Facial paralysis

    Science.gov (United States)

    ... this page: //medlineplus.gov/ency/article/003028.htm Facial paralysis occurs when a person is no longer able ...

  11. Facial Identification in Observers with Colour-Grapheme Synaesthesia

    DEFF Research Database (Denmark)

    Sørensen, Thomas Alrik

    2013-01-01

    suggested as an explanation of a neural substrate of synaesthesia. The present study does not have a strong point on this view. However, as the fusiform gyrus also have been proposed to play a crucial role in the processing of facial features for identification [e.g. Kanwisher et al, 1997, The Journal...... of Neuroscience, 17(11), 4302–4311], increased colour-word form representations in observers with colour-grapheme synaesthesia may affect facial identification in people with synaesthesia. This study investigates the ability to process facial features for identification in observers with colour...

  12. The association between chronic exposure to video game violence and affective picture processing: an ERP study.

    Science.gov (United States)

    Bailey, Kira; West, Robert; Anderson, Craig A

    2011-06-01

    Exposure to video game violence (VGV) is known to result in desensitization to violent material and may alter the processing of positive emotion related to facial expressions. The present study was designed to address three questions: (1) Does the association between VGV and positive emotion extend to stimuli other than faces, (2) is the association between VGV and affective picture processing observed with a single presentation of the stimuli, and (3) is the association between VGV and the response to violent stimuli sensitive to the relevance of emotion for task performance? The data revealed that transient modulations of the event-related potentials (ERPs) related to attentional orienting and sustained modulations of the ERPs related to evaluative processing were sensitive to VGV exposure.

  13. Facial Cosmetics Exert a Greater Influence on Processing of the Mouth Relative to the Eyes: Evidence from the N170 Event-Related Potential Component

    OpenAIRE

    Hideaki Tanaka

    2016-01-01

    Cosmetic makeup significantly influences facial perception. Because faces consist of similar physical structures, cosmetic makeup is typically used to highlight individual features, particularly those of the eyes (i.e., eye shadow) and mouth (i.e., lipstick). Though event-related potentials have been utilized to study various aspects of facial processing, the influence of cosmetics on specific ERP components remains unclear. The present study aimed to investigate the relationship between the ...

  14. Facial Cosmetics Exert a Greater Influence on Processing of the Mouth Relative to the Eyes: Evidence from the N170 Event-Related Potential Component

    Science.gov (United States)

    Tanaka, Hideaki

    2016-01-01

    Cosmetic makeup significantly influences facial perception. Because faces consist of similar physical structures, cosmetic makeup is typically used to highlight individual features, particularly those of the eyes (i.e., eye shadow) and mouth (i.e., lipstick). Though event-related potentials have been utilized to study various aspects of facial processing, the influence of cosmetics on specific ERP components remains unclear. The present study aimed to investigate the relationship between the application of cosmetic makeup and the amplitudes of the P1 and N170 event-related potential components during facial perception tasks. Moreover, the influence of visual perception on N170 amplitude, was evaluated under three makeup conditions: Eye Shadow, Lipstick, and No Makeup. Electroencephalography was used to monitor 17 participants who were exposed to visual stimuli under each these three makeup conditions. The results of the present study subsequently demonstrated that the Lipstick condition elicited a significantly greater N170 amplitude than the No Makeup condition, while P1 amplitude was unaffected by any of the conditions. Such findings indicate that the application of cosmetic makeup alters general facial perception but exerts no influence on the perception of low-level visual features. Collectively, these results support the notion that the application of makeup induces subtle alterations in the processing of facial stimuli, with a particular effect on the processing of specific facial components (i.e., the mouth), as reflected by changes in N170 amplitude. PMID:27656161

  15. Facial Cosmetics Exert a Greater Influence on Processing of the Mouth Relative to the Eyes: Evidence from the N170 Event-Related Potential Component.

    Science.gov (United States)

    Tanaka, Hideaki

    2016-01-01

    Cosmetic makeup significantly influences facial perception. Because faces consist of similar physical structures, cosmetic makeup is typically used to highlight individual features, particularly those of the eyes (i.e., eye shadow) and mouth (i.e., lipstick). Though event-related potentials have been utilized to study various aspects of facial processing, the influence of cosmetics on specific ERP components remains unclear. The present study aimed to investigate the relationship between the application of cosmetic makeup and the amplitudes of the P1 and N170 event-related potential components during facial perception tasks. Moreover, the influence of visual perception on N170 amplitude, was evaluated under three makeup conditions: Eye Shadow, Lipstick, and No Makeup. Electroencephalography was used to monitor 17 participants who were exposed to visual stimuli under each these three makeup conditions. The results of the present study subsequently demonstrated that the Lipstick condition elicited a significantly greater N170 amplitude than the No Makeup condition, while P1 amplitude was unaffected by any of the conditions. Such findings indicate that the application of cosmetic makeup alters general facial perception but exerts no influence on the perception of low-level visual features. Collectively, these results support the notion that the application of makeup induces subtle alterations in the processing of facial stimuli, with a particular effect on the processing of specific facial components (i.e., the mouth), as reflected by changes in N170 amplitude.

  16. Facial Cosmetics Exert a Greater Influence on Processing of the Mouth Relative to the Eyes: Evidence from the N170 Event-related Potential Component

    Directory of Open Access Journals (Sweden)

    Hideaki Tanaka

    2016-09-01

    Full Text Available Cosmetic makeup significantly influences facial perception. Because faces consist of similar physical structures, cosmetic makeup is typically used to highlight individual features, particularly those of the eyes (i.e., eye shadow) and mouth (i.e., lipstick). Though event-related potentials have been utilized to study various aspects of facial processing, the influence of cosmetics on specific ERP components remains unclear. The present study aimed to investigate the relationship between the application of cosmetic makeup and the amplitudes of the P1 and N170 event-related potential components during facial perception tasks. Moreover, the influence of visual perception on N170 amplitude, was evaluated under three makeup conditions: Eye Shadow, Lipstick, and No Makeup. Electroencephalography was used to monitor 17 participants who were exposed to visual stimuli under each these three makeup conditions. The results of the present study subsequently demonstrated that the Lipstick condition elicited a significantly greater N170 amplitude than the No Makeup condition, while P1 amplitude was unaffected by any of the conditions. Such findings indicate that the application of cosmetic makeup alters general facial perception but exerts no influence on the perception of low-level visual features. Collectively, these results support the notion that the application of makeup induces subtle alterations in the processing of facial stimuli, with a particular effect on the processing of specific facial components (i.e., the mouth), as reflected by changes in N170 amplitude.

  17. Do unconscious processes affect educational institutions?

    Science.gov (United States)

    Hinshelwood, R D

    2009-10-01

    In this article I discuss the way that aspects of school and teaching have unconscious roots. Where anxiety about the process, for teachers and children, is high then there is the risk that unconscious defensive processes may occur resulting in institutionalized phenomena. These take the form of cultural attitudes and common practices which may not necessarily enhance the work and in some cases may actively interfere.

  18. Conditions and processes affecting radionuclide transport

    Science.gov (United States)

    Simmons, Ardyth M.; Neymark, Leonid A.

    2012-01-01

    Characteristics of host rocks, secondary minerals, and fluids would affect the transport of radionuclides from a previously proposed repository at Yucca Mountain, Nevada. Minerals in the Yucca Mountain tuffs that are important for retarding radionuclides include clinoptilolite and mordenite (zeolites), clay minerals, and iron and manganese oxides and hydroxides. Water compositions along flow paths beneath Yucca Mountain are controlled by dissolution reactions, silica and calcite precipitation, and ion-exchange reactions. Radionuclide concentrations along flow paths from a repository could be limited by (1) low waste-form dissolution rates, (2) low radionuclide solubility, and (3) radionuclide sorption onto geological media.

  19. Transcranial Electrical Stimulation over Dorsolateral Prefrontal Cortex Modulates Processing of Social Cognitive and Affective Information.

    Science.gov (United States)

    Conson, Massimiliano; Errico, Domenico; Mazzarella, Elisabetta; Giordano, Marianna; Grossi, Dario; Trojano, Luigi

    2015-01-01

    Recent neurofunctional studies suggested that lateral prefrontal cortex is a domain-general cognitive control area modulating computation of social information. Neuropsychological evidence reported dissociations between cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on explicit recognition of emotional facial expressions (affective task), and on one cognitive task assessing the ability to adopt another person's visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over DLPFC. Results showed that only in male participants explicit recognition of fearful facial expressions was significantly faster after anodal right/cathodal left stimulation with respect to anodal left/cathodal right and sham stimulations. In the visual perspective taking task, instead, anodal right/cathodal left stimulation negatively affected both male and female participants' tendency to adopt another's point of view. These findings demonstrated that concurrent facilitation of right and inhibition of left lateral prefrontal cortex can speed-up males' responses to threatening faces whereas it interferes with the ability to adopt another's viewpoint independently from gender. Thus, stimulation of cognitive control areas can lead to different effects on social cognitive skills depending on the affective vs. cognitive nature of the task, and on the gender-related differences in neural organization of emotion processing.

  1. Sound Affects the Speed of Visual Processing

    Science.gov (United States)

    Keetels, Mirjam; Vroomen, Jean

    2011-01-01

    The authors examined the effects of a task-irrelevant sound on visual processing. Participants were presented with revolving clocks at or around central fixation and reported the hand position of a target clock at the time an exogenous cue (1 clock turning red) or an endogenous cue (a line pointing toward 1 of the clocks) was presented. A…

  2. Oocyte Maturation Process and Affecting Factors

    Directory of Open Access Journals (Sweden)

    Yurdun Kuyucu

    2009-08-01

    Full Text Available Normal female fertility depends on normally occurring oogenesis and maturation processes. Oogenesis and folliculogenesis are different processes but occur in harmony and at the same time. Oogenesis comprises the events through which a mature ovum is produced from primordial germ cells, whereas folliculogenesis comprises the primordial, primary, secondary, and mature (Graafian) follicle stages under the influence of gonadotropins and local growth factors. During oocyte maturation, meiosis is arrested until puberty. Under LH influence it resumes, and the first meiotic division is completed before ovulation. Oocyte maturation can be regarded as the process by which the oocyte progresses from prophase I to metaphase II at puberty and can be studied as nuclear and cytoplasmic maturation. Meiosis is completed when fertilization occurs and the zygote is formed. In this article, oogenesis, folliculogenesis, and the oocyte maturation process are summarized with reference to related studies and reviews. [Archives Medical Review Journal 2009; 18(4): 227-240]

  3. Perception of facial expression and facial identity in subjects with social developmental disorders.

    Science.gov (United States)

    Hefter, Rebecca L; Manoach, Dara S; Barton, Jason J S

    2005-11-22

    It has been hypothesized that the social dysfunction in social developmental disorders (SDDs), such as autism, Asperger disorder, and the socioemotional processing disorder, impairs the acquisition of normal face-processing skills. The authors investigated whether this purported perceptual deficit was generalized to both facial expression and facial identity or whether these different types of facial perception were dissociated in SDDs. They studied 26 adults with a variety of SDD diagnoses, assessing their ability to discriminate famous from anonymous faces, their perception of emotional expression from facial and nonfacial cues, and the relationship between these abilities. They also compared the performance of two defined subgroups of subjects with SDDs on expression analysis: one with normal and one with impaired recognition of facial identity. While perception of facial expression was related to the perception of nonfacial expression, the perception of facial identity was not related to either facial or nonfacial expression. Likewise, subjects with SDDs with impaired facial identity processing perceived facial expression as well as those with normal facial identity processing. The processing of facial identity and that of facial expression are dissociable in social developmental disorders. Deficits in perceiving facial expression may be related to emotional processing more than face processing. Dissociations between the perception of facial identity and facial emotion are consistent with current cognitive models of face processing. The results argue against hypotheses that the social dysfunction in social developmental disorder causes a generalized failure to acquire face-processing skills.

  4. [Facial palsy].

    Science.gov (United States)

    Cavoy, R

    2013-09-01

    Facial palsy is a daily challenge for clinicians. Determining whether facial nerve palsy is peripheral or central is a key step in the diagnosis. Central nervous system lesions can cause facial palsy that may be easily differentiated from peripheral palsy. The next question is whether the peripheral facial paralysis is idiopathic or symptomatic. A good knowledge of the anatomy of the facial nerve is helpful. A structured approach is given to identify additional features that distinguish symptomatic facial palsy from the idiopathic form. The main cause of peripheral facial palsy is the idiopathic form, or Bell's palsy, which remains a diagnosis of exclusion. The most common cause of symptomatic peripheral facial palsy is Ramsay Hunt syndrome. Early identification of symptomatic facial palsy is important because of its often worse outcome and different management. The prognosis of Bell's palsy is on the whole favorable and is improved with a prompt tapering course of prednisone. In Ramsay Hunt syndrome, an antiviral therapy is added along with prednisone. We also discuss current treatment recommendations and review short- and long-term complications of peripheral facial palsy.

  5. Cloud Processed CCN Affect Cloud Microphysics

    Science.gov (United States)

    Hudson, J. G.; Noble, S. R., Jr.; Tabor, S. S.

    2015-12-01

    Variations in the bimodality/monomodality of CCN spectra (Hudson et al. 2015) exert opposite effects on cloud microphysics in two aircraft field projects. The figure shows two examples, droplet concentration, Nc, and drizzle liquid water content, Ld, against classification of CCN spectral modality. Low ratings go to balanced, separated bimodal spectra; high ratings go to single-mode, strictly monomodal spectra (rating 8); intermediate ratings go to merged modes, e.g., one mode forming a shoulder of another. Bimodality is caused by mass or hygroscopicity increases that go only to CCN that made activated cloud droplets. In the Ice in Clouds Experiment-Tropical (ICE-T), small cumuli with lower Nc and greater droplet mean diameters, MD, effective radii, re, spectral widths, σ, cloud liquid water contents, Lc, and Ld were closer to more bimodal (lower modal ratings) below-cloud CCN spectra, whereas clouds with higher Nc and smaller MD, re, σ, and Ld were closer to more monomodal CCN (higher modal ratings). In polluted stratus clouds of the MArine Stratus/Stratocumulus Experiment (MASE), clouds that had greater Nc and smaller MD, re, σ, Lc, and Ld were closer to more bimodal CCN spectra, whereas clouds with lower Nc and greater MD, re, σ, Lc, and Ld were closer to more monomodal CCN. These relationships are opposite because the dominant ICE-T cloud processing was coalescence, whereas chemical transformations (e.g., SO2 to SO4) were dominant in MASE. Coalescence reduces Nc and thus also CCN concentrations (NCCN) when droplets evaporate. In subsequent clouds the reduced competition increases MD and σ, which further enhance coalescence and drizzle. Chemical transformations do not change Nc, but added sulfate enhances droplet and CCN solubility. Thus, lower critical supersaturation (S) CCN can produce more cloud droplets in subsequent cloud cycles, especially for the low W and effective S of stratus. The increased competition reduces MD, re, and σ, which inhibit coalescence and thus reduce drizzle.

  6. Crossmodal interactions during affective picture processing.

    Directory of Open Access Journals (Sweden)

    Vera Ferrari

    Full Text Available "Natural" crossmodal correspondences, such as the spontaneous tendency to associate high pitches with high spatial locations, are often hypothesized to occur preattentively and independently of task instructions (top-down attention. Here, we investigate bottom-up attentional engagement by using emotional scenes that are known to naturally and reflexively engage attentional resources. We presented emotional (pleasant and unpleasant or neutral pictures either below or above a fixation cross, while participants were required to discriminate between a high or a low pitch tone (experiment 1. Results showed that despite a robust crossmodal attentional capture of task-irrelevant emotional pictures, the general advantage in classifying the tones for congruent over incongruent visual-auditory stimuli was similar for emotional and neutral pictures. On the other hand, when picture position was task-relevant (experiment 2, task-irrelevant tones did not interact with pictures with regard to their combination of pitch and visual vertical spatial position, but instead they were effective in minimizing the interference effect of emotional picture processing on the ongoing task. These results provide constraints on our current understanding of natural crossmodal correspondences.

  7. Studies of dynamical processes affecting global climate

    Energy Technology Data Exchange (ETDEWEB)

    Keller, C.; Cooper, D.; Eichinger, W. [and others]

    1998-12-31

    This is the final report of a three-year, Laboratory Directed Research and Development project at the Los Alamos National Laboratory (LANL). The main objective was, by a combined theoretical and observational approach, to develop improved models of dynamic processes in the oceans and atmosphere and to incorporate them into large climate codes, chiefly in four main areas: numerical physics, chemistry, water vapor, and ocean-atmosphere interactions. Main areas of investigation included studies of: cloud parameterizations for global climate codes, Lidar and the planetary boundary layer, chemistry, climate variability using coupled ocean-atmospheric models, and numerical physical methods. This project employed a unique approach that included participation of a number of University of California faculty, postdoctoral fellows and graduate students who collaborated with Los Alamos research staff on specific tasks, thus greatly enhancing the research output. Overall accomplishments were: (1) first two- and three-dimensional remote sensing of the atmospheric planetary boundary layer using Lidars, (2) modeling of a 20-year cycle in both pressure and sea surface temperatures in the North Pacific, (3) modeling of low frequency internal variability, (4) addition of aerosols to stratosphere to simulate Pinatubo effect on ozone, (5) development of fast, comprehensive chemistry in the troposphere for urban pollution studies, (6) new prognostic cloud parameterization in global atmospheric code remedied problems with North Pacific atmospheric circulation and excessive equatorial precipitation, (7) development of a unique aerosol analysis technique, the aerosol time-of-flight mass spectrometer (ATOFMS), which allows real-time analysis of the size and chemical composition of individual aerosol particles, and (8) numerical physics applying Approximate Inertial Manifolds to ocean circulation. 14 refs., 6 figs.

  8. The Effect of Affective Context on Visuocortical Processing of Neutral Faces in Social Anxiety.

    Science.gov (United States)

    Wieser, Matthias J; Moscovitch, David A

    2015-01-01

    It has been demonstrated that verbal context information alters the neural processing of ambiguous faces such as faces with no apparent facial expression. In social anxiety, neutral faces may be implicitly threatening for socially anxious individuals due to their ambiguous nature, but even more so if these neutral faces are put in self-referential negative contexts. Therefore, we measured event-related brain potentials (ERPs) in response to neutral faces which were preceded by affective verbal information (negative, neutral, positive). Participants with low social anxiety (LSA; n = 23) and high social anxiety (HSA; n = 21) were asked to watch and rate valence and arousal of the respective faces while continuous EEG was recorded. ERP analysis revealed that HSA showed elevated P100 amplitudes in response to faces, but reduced structural encoding of faces as indexed by reduced N170 amplitudes. In general, affective context led to an enhanced early posterior negativity (EPN) for negative compared to neutral facial expressions. Moreover, HSA compared to LSA showed enhanced late positive potentials (LPP) to negatively contextualized faces, whereas in LSA this effect was found for faces in positive contexts. Also, HSA rated faces in negative contexts as more negative compared to LSA. These results point at enhanced vigilance for neutral faces regardless of context in HSA, while structural encoding seems to be diminished (avoidance). Interestingly, later components of sustained processing (LPP) indicate that LSA show enhanced visuocortical processing for faces in positive contexts (happy bias), whereas this seems to be the case for negatively contextualized faces in HSA (threat bias). Finally, our results add further new evidence that top-down information in interaction with individual anxiety levels can influence early-stage aspects of visual perception.

  9. Cognitive control modulates preferential sensory processing of affective stimuli.

    Science.gov (United States)

    Steinhauser, Marco; Flaisch, Tobias; Meinzer, Marcus; Schupp, Harald T

    2016-10-01

    Adaptive human behavior crucially relies on the ability of the brain to allocate resources automatically to emotionally significant stimuli. This ability has consistently been demonstrated by studies showing preferential processing of affective stimuli in sensory cortical areas. It is still unclear, however, whether this putatively automatic mechanism can be modulated by cognitive control processes. Here, we use functional magnetic resonance imaging (fMRI) to investigate whether preferential processing of an affective face distractor is suppressed when an affective distractor has previously elicited a response conflict in a word-face Stroop task. We analyzed this for three consecutive stages in the ventral stream of visual processing for which preferential processing of affective stimuli has previously been demonstrated: the striate area (BA 17), category-unspecific extrastriate areas (BA 18/19), and the fusiform face area (FFA). We found that response conflict led to a selective suppression of affective face processing in category-unspecific extrastriate areas and the FFA, and this effect was accompanied by changes in functional connectivity between these areas and the rostral anterior cingulate cortex. In contrast, preferential processing of affective face distractors was unaffected in the striate area. Our results indicate that cognitive control processes adaptively suppress preferential processing of affective stimuli under conditions where affective processing is detrimental because it elicits response conflict.

  10. The Facial Expressive Action Stimulus Test. A test battery for the assessment of face memory, face and object perception, configuration processing, and facial expression recognition.

    Science.gov (United States)

    de Gelder, Beatrice; Huis In 't Veld, Elisabeth M J; Van den Stock, Jan

    2015-01-01

    There are many ways to assess face perception skills. In this study, we describe a novel task battery FEAST (Facial Expressive Action Stimulus Test) developed to test recognition of identity and expressions of human faces as well as stimulus control categories. The FEAST consists of a neutral and emotional face memory task, a face and shoe identity matching task, a face and house part-to-whole matching task, and a human and animal facial expression matching task. The identity and part-to-whole matching tasks contain both upright and inverted conditions. The results provide reference data of a healthy sample of controls in two age groups for future users of the FEAST.

  11. Temporal factors affecting somatosensory-auditory interactions in speech processing

    Directory of Open Access Journals (Sweden)

    Takayuki eIto

    2014-11-01

    Full Text Available Speech perception is known to rely on both auditory and visual information. However, sound-specific somatosensory input has been shown also to influence speech perceptual processing (Ito et al., 2009). In the present study we addressed further the relationship between somatosensory information and speech perceptual processing by testing the hypothesis that the temporal relationship between orofacial movement and sound processing contributes to somatosensory-auditory interaction in speech perception. We examined the changes in event-related potentials in response to multisensory synchronous (simultaneous) and asynchronous (90 ms lag and lead) somatosensory and auditory stimulation compared to individual unisensory auditory and somatosensory stimulation alone. We used a robotic device to apply facial skin somatosensory deformations that were similar in timing and duration to those experienced in speech production. Following synchronous multisensory stimulation the amplitude of the event-related potential was reliably different from the two unisensory potentials. More importantly, the magnitude of the event-related potential difference varied as a function of the relative timing of the somatosensory-auditory stimulation. Event-related activity change due to stimulus timing was seen between 160-220 ms following somatosensory onset, mostly around the parietal area. The results demonstrate a dynamic modulation of somatosensory-auditory convergence and suggest that the contribution of somatosensory information to speech processing is dependent on the specific temporal order of sensory inputs in speech production.

  12. Facial Recognition

    Directory of Open Access Journals (Sweden)

    Mihalache Sergiu

    2014-05-01

    Full Text Available During their lifetime, people learn to recognize thousands of faces that they interact with. Face perception refers to an individual's understanding and interpretation of the face, particularly the human face, especially in relation to the associated information processing in the brain. The proportions and expressions of the human face are important to identify origin, emotional tendencies, health qualities, and some social information. From birth, faces are important in the individual's social interaction. Face perceptions are very complex as the recognition of facial expressions involves extensive and diverse areas in the brain. Our main goal is to put emphasis on presenting specialized studies of human faces, and also to highlight the importance of attractiveness in their retention. We will see that there are many factors that influence face recognition.

  13. Recognition of facial expressions of different emotional intensities in patients with frontotemporal lobar degeneration

    NARCIS (Netherlands)

    Kessels, Roy P. C.; Gerritsen, Lotte; Montagne, Barbara; Ackl, Nibal; Diehl, Janine; Danek, Adrian

    2007-01-01

    Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). Also, FTLD patients show impairments in emotion processing. Specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more d

  14. Extremely Preterm-Born Infants Demonstrate Different Facial Recognition Processes at 6-10 Months of Corrected Age.

    Science.gov (United States)

    Frie, Jakob; Padilla, Nelly; Ådén, Ulrika; Lagercrantz, Hugo; Bartocci, Marco

    2016-05-01

    To compare cortical hemodynamic responses to known and unknown facial stimuli between infants born extremely preterm and term-born infants, and to correlate the responses of the extremely preterm-born infants to regional cortical volumes at term-equivalent age. We compared 27 infants born extremely preterm (infrared spectroscopy. In the preterm group, we also performed structural brain magnetic resonance imaging and correlated regional cortical volumes to hemodynamic responses. The preterm-born infants demonstrated different cortical face recognition processes than the term-born infants. They had a significantly smaller hemodynamic response in the right frontotemporal areas while watching their mother's face (0.13 μmol/L vs 0.63 μmol/L; P recognition process compared with term-born infants. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Emotion recognition in pictures of facial affect: Is there a difference between forensic and non-forensic patients with schizophrenia?

    Directory of Open Access Journals (Sweden)

    Wiebke Wolfkühler

    Full Text Available Background and Objectives: Abundant research has demonstrated that patients with schizophrenia have difficulties in recognizing the emotional content in facial expressions. However, there is a paucity of studies on emotion recognition in schizophrenia patients with a history of violent behavior compared to patients without a criminal record. Methods: Emotion recognition skills were examined in thirty-three forensic patients with schizophrenia. In addition, executive function and psychopathology were assessed. Results were compared to a group of 38 schizophrenia patients in regular psychiatric care and to a healthy control group. Results: Both patient groups performed more poorly on almost all tasks compared to controls. However, in the forensic group the recognition of the expression of disgust was preserved. When the excitement factor of the Positive and Negative Syndrome Scale was co-varied out, forensic patients outperformed the non-forensic patient group on emotion recognition across modalities. Conclusions: The superior recognition of disgust could be uniquely associated with delinquent behavior.

  16. Affective priming during the processing of news articles

    NARCIS (Netherlands)

    Baumgartner, S.E.; Wirth, W.

    2012-01-01

    The present study investigates the role of affective priming during the processing of news articles. It is assumed that the valence of the affective response to a news article will influence the processing of subsequent news articles. More specifically, it is hypothesized that participants who read

  17. Exogenous cortisol shifts a motivated bias from fear to anger in spatial working memory for facial expressions.

    NARCIS (Netherlands)

    Putman, P.; Hermans, E.J.; Honk, J. van

    2007-01-01

    Studies assessing processing of facial expressions have established that cortisol levels, emotional traits, and affective disorders predict selective responding to these motivationally relevant stimuli in expression specific manners. For instance, increased attentional processing of fearful faces

  18. Facial swelling in a sickle cell patient.

    Science.gov (United States)

    DeBlieux, Tyler K; Jackson, Neal; Jeyakumar, Anita; Townsend, Janice A; Naik, Bijal V

    2014-01-01

    Sickle cell disease (SCD) is characterized as a chronic hemolytic anemia with vaso-occlusive crises that result in multisystem organ damage. Bone marrow is one of the more common sites of these crises, presumably due to marrow hypercellularity that impairs blood flow, leading to regional hypoxia and subsequent infarction. Infarcts of facial bones are considered an uncommon complication of SCD. When infarcts occur in facial bones, the mandible and orbital bones are the most commonly affected. Overall, the clinical presentation of facial bone infarctions may mimic an infectious process, such as cellulitis, an abscess, or, more commonly, osteomyelitis. The purpose of this paper was to present the case of a patient with a confluence of symptoms in the face as a result of her sickle cell disease.

  19. Live facial feature extraction

    Institute of Scientific and Technical Information of China (English)

    ZHAO JieYu

    2008-01-01

    Precise facial feature extraction is essential to high-level face recognition and expression analysis. This paper presents a novel method for real-time geometric facial feature extraction from live video. In this paper, the input image is viewed as a weighted graph. The segmentation of the pixels corresponding to the edges of facial components of the mouth, eyes, brows, and nose is implemented by means of random walks on the weighted graph. The graph has an 8-connected lattice structure and the weight value associated with each edge reflects the likelihood that a random walker will cross that edge. The random walks simulate an anisotropic diffusion process that filters out the noise while preserving the facial expression pixels. The seeds for the segmentation are obtained from a color and motion detector. The segmented facial pixels are represented with linked lists in the original geometric form and grouped into different parts corresponding to facial components. For the convenience of implementing high-level vision, the geometric description of facial component pixels is further decomposed into shape and registration information. Shape is defined as the geometric information that is invariant under the registration transformation, such as translation, rotation, and isotropic scale. Statistical shape analysis is carried out to capture global facial features where the Procrustes shape distance measure is adopted. A Bayesian approach is used to incorporate high-level prior knowledge of face structure. Experimental results show that the proposed method is capable of real-time extraction of precise geometric facial features from live video. The feature extraction is robust against illumination changes, scale variation, head rotations, and hand interference.
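
    A minimal sketch of the random-walk segmentation idea described above, assuming scikit-image's generic random_walker solver as a stand-in for the paper's custom anisotropic-diffusion formulation; the HSV brightness thresholds used for seeding are hypothetical placeholders for the paper's color-and-motion seed detector.

```python
# Sketch: random-walk segmentation of dark facial-component pixels vs. skin.
import numpy as np
from skimage import io, color
from skimage.segmentation import random_walker

def segment_facial_features(image_path, dark_seed_v=0.25, skin_seed_v=0.75):
    """Label dark facial-component pixels (brows/eyes/mouth edges) vs. skin."""
    rgb = io.imread(image_path)
    hsv = color.rgb2hsv(rgb[..., :3])        # drop alpha channel if present
    value = hsv[..., 2]                      # brightness channel

    # Seed map: 1 = facial-component seed (dark pixels), 2 = skin seed
    # (bright pixels), 0 = unlabeled pixels resolved by the random walk.
    markers = np.zeros(value.shape, dtype=np.uint8)
    markers[value < dark_seed_v] = 1
    markers[value > skin_seed_v] = 2

    # Edge weights derive from intensity differences, so the walk diffuses
    # within smooth regions and is blocked at facial-component boundaries.
    labels = random_walker(value, markers, beta=130, mode='bf')
    return labels == 1                       # boolean mask of component pixels

if __name__ == "__main__":
    mask = segment_facial_features("face.jpg")   # hypothetical input image
    print("facial-component pixels:", int(mask.sum()))
```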

  1. Facial swelling

    Science.gov (United States)

    ... help reduce facial swelling. When to Contact a Medical Professional Call your health care provider if you have: Sudden, painful, or severe facial ... or if you have breathing problems. The health care provider will ask about your medical and personal history. This helps determine treatment or ...

  2. The Processing Difference between Positive and Negative Valence Facial Expression Pictures

    Institute of Scientific and Technical Information of China (English)

    隋雪; 纪雅婷; 陈欣; 任桂琴

    2015-01-01

    The experiment adopted an independent presentation paradigm to show positive and negative valence facial expression pictures and explore differences in their recognition. Presentation time and the position of the cue were controlled, and an eye tracker was used to record eye movement indexes during the identification process. Results were as follows: (1) In recognition speed and accuracy, the processing of positive facial expression pictures exceeded that of negative facial expression pictures, showing a positive expression advantage. (2) Presentation time did not change the difference between the processing of positive and negative valence facial expression pictures. (3) There was a cue position effect: cues around the mouth contributed to facial expression recognition. (4) Facial expression recognition followed the "eye-to-mouth-to-eye" principle. The results suggest that the processing mechanisms of different facial expressions differ, and that the effect of processing depth is smaller than that of cue position.

  3. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
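
    A toy, lexicon-seeded illustration of scoring affective relationships between co-mentioned entities, loosely in the spirit of the approach described above; the seed terms, the two affect categories, and the counting rule are hypothetical placeholders rather than the TEAMSTER algorithm or its 22-category lexicon.

```python
# Sketch: seed-lexicon affect scoring over entity pairs that co-occur in text.
from collections import defaultdict
from itertools import combinations

# Hypothetical seeds for two of the many affect categories; a real system
# would use far richer, curated term lists and propagation over the graph.
SEED_LEXICON = {
    "hostility": {"attacked", "condemned", "betrayed"},
    "approval":  {"praised", "supported", "endorsed"},
}

def affect_edges(sentences, entities):
    """Aggregate per-category affect counts for entity pairs that co-occur."""
    edges = defaultdict(lambda: defaultdict(int))
    for sentence in sentences:
        tokens = set(sentence.lower().split())
        present = [e for e in entities if e.lower() in tokens]
        for category, seeds in SEED_LEXICON.items():
            hits = len(tokens & seeds)
            if hits == 0:
                continue
            # Credit every co-mentioned entity pair with the affect hits.
            for pair in combinations(sorted(present), 2):
                edges[pair][category] += hits
    return edges

if __name__ == "__main__":
    docs = ["Senator Smith condemned Governor Jones over the budget deal",
            "Governor Jones praised Senator Smith after the vote"]
    for pair, scores in affect_edges(docs, ["Smith", "Jones"]).items():
        print(pair, dict(scores))
```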

  4. Characterization of patients admitted with cervicofacial inflammatory processes in the maxillofacial surgery service.

    Directory of Open Access Journals (Sweden)

    Julio Romero Rodríguez

    2005-12-01

    Full Text Available

    Background: Cervicofacial inflammatory processes constitute an important health problem, and failure to treat them correctly from their onset can cause associated complications that may be fatal for the patient. Objective: To characterize the patients with inflammatory processes who were admitted to the maxillofacial surgery service in Cienfuegos province. Method: Descriptive study that included all patients admitted for cervicofacial inflammatory processes from January 2001 to December 2003 in Cienfuegos. The variables under study were: age, sex, data related to the disease under study, signs and symptoms on admission, applied treatment, complications, follow-up, and hospital stay. Results: Odontogenic processes were the most frequent, particularly those caused by caries. Spread of the process to other regions, fever, malaise, and trismus were the predominant symptoms. The main cause of admission was inadequate treatment in the initial period of the disease.


  5. Retinotopy of facial expression adaptation.

    Science.gov (United States)

    Matsumiya, Kazumichi

    2014-01-01

    The face aftereffect (FAE; the illusion of faces after adaptation to a face) has been reported to occur without retinal overlap between adaptor and test, but recent studies revealed that the FAE is not constant across all test locations, which suggests that the FAE is also retinotopic. However, it remains unclear whether the characteristic of the retinotopy of the FAE for one facial aspect is the same as that of the FAE for another facial aspect. In the research reported here, an examination of the retinotopy of the FAE for facial expression indicated that the facial expression aftereffect occurs without retinal overlap between adaptor and test, and depends on the retinal distance between them. Furthermore, the results indicate that, although dependence of the FAE on adaptation-test distance is similar between facial expression and facial identity, the FAE for facial identity is larger than that for facial expression when a test face is presented in the opposite hemifield. On the basis of these results, I discuss adaptation mechanisms underlying facial expression processing and facial identity processing for the retinotopy of the FAE.

  6. Discrimination of gender using facial image with expression change

    Science.gov (United States)

    Kuniyada, Jun; Fukuda, Takahiro; Terada, Kenji

    2005-12-01

    By carrying out marketing research, the managers of large department stores or small convenience stores obtain information such as the ratio of male to female visitors and their age groups, and use it to improve their management plans. However, this work is carried out manually and becomes a heavy burden for small stores. In this paper, the authors propose a method for discriminating between men and women by extracting differences in facial expression change from color facial images. There are many methods for automatic recognition of individuals using moving or still facial images in the field of image processing, but it is very difficult to discriminate gender under the influence of hairstyle, clothes, and so on. Therefore, we propose a method which is not affected by individual characteristics such as the size and position of facial parts, by paying attention to the change of an expression. In this method, it is necessary to obtain two facial images: one with an expression and one expressionless. First, the facial surface region and the regions of facial parts such as the eyes, nose, and mouth are extracted from the facial image using hue and saturation information in the HSV color system and emphasized edge information. Next, features are extracted by calculating the rate of change of each facial part generated by the expression change. In the last step, the feature values are compared between the input data and the database, and the gender is discriminated. In this paper, experiments were carried out for laughing and smiling expressions, and good results were obtained for discriminating gender.
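
    A minimal sketch of the expression-change idea described above, assuming pre-cropped, aligned grayscale images; the facial-part bounding boxes, the reference database, and the plain nearest-neighbor comparison are illustrative simplifications, not the authors' pipeline.

```python
# Sketch: per-region change features between neutral and expressive images,
# compared against labeled reference vectors by nearest neighbor.
import numpy as np

# Hypothetical facial-part boxes (row_min, row_max, col_min, col_max) for
# images already cropped and aligned to the face.
REGIONS = {"eyes": (60, 100, 30, 170), "nose": (100, 140, 80, 120),
           "mouth": (140, 190, 60, 140)}

def change_features(neutral, expressive):
    """Mean absolute intensity change in each facial region, as a vector."""
    diff = np.abs(expressive.astype(float) - neutral.astype(float))
    return np.array([diff[r0:r1, c0:c1].mean()
                     for r0, r1, c0, c1 in REGIONS.values()])

def classify_gender(neutral, expressive, database):
    """database: list of (feature_vector, 'male'/'female') reference pairs."""
    query = change_features(neutral, expressive)
    vec, label = min(database, key=lambda item: np.linalg.norm(item[0] - query))
    return label

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    neutral = rng.integers(0, 255, (200, 200))   # stand-in expressionless face
    smiling = rng.integers(0, 255, (200, 200))   # stand-in expressive face
    db = [(change_features(neutral, smiling), "female"),
          (rng.uniform(0, 50, 3), "male")]
    print(classify_gender(neutral, smiling, db))
```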

  7. [Prosopagnosia and facial expression recognition].

    Science.gov (United States)

    Koyama, Shinichi

    2014-04-01

    This paper reviews clinical neuropsychological studies that have indicated that the recognition of a person's identity and the recognition of facial expressions are processed by different cortical and subcortical areas of the brain. The fusiform gyrus, especially the right fusiform gyrus, plays an important role in the recognition of identity. The superior temporal sulcus, amygdala, and medial frontal cortex play important roles in facial-expression recognition. Both facial recognition and facial-expression recognition are highly intellectual processes that involve several regions of the brain.

  8. Processing of affective prosody in boys suffering from attention deficit hyperactivity disorder: A near-infrared spectroscopy study.

    Science.gov (United States)

    Köchel, Angelika; Schöngaßner, Florian; Feierl-Gsodam, Silke; Schienle, Anne

    2015-01-01

    Neurobiological studies on facial affect recognition have demonstrated reduced response amplitudes to anger cues in patients suffering from attention deficit hyperactivity disorder (ADHD). It is still unclear whether a similar deficit exists in the auditory domain. Therefore, this near-infrared spectroscopy study focused on neuronal correlates of affective prosody processing. Fourteen boys suffering from ADHD and fourteen healthy boys were exposed to emotionally intoned, standardized sentences of the categories anger, sadness, happiness, and to affectively neutral sentences. Relative to controls, the patients displayed a diminished activation of the right superior temporal gyrus (STG) when processing anger prosody, which was correlated with aggressive behavior. There were no group differences for the other emotions. Additionally, the ADHD group showed increased supramarginal gyrus (SMG) activation in the anger condition. This might mirror compensatory attention allocation. In summary, we identified a selectively lowered STG activation to auditory anger cues in ADHD patients. Consequently, STG recruitment during anger exposure might be used for evaluation of psychotherapy effects.

  9. The Process of Physiotherapy for Idiopathic Facial Paralysis

    Institute of Scientific and Technical Information of China (English)

    李志银; 代蓉

    2015-01-01

    Physiotherapy includes exercise therapy, manipulation, and physical agent therapies. This article summarizes the progress of physiotherapy for idiopathic facial paralysis; the treatment plan is chosen according to the specific circumstances of the patient in order to promote the recovery of the facial nerve and facial muscles.

  10. Embodied simulation as part of affective evaluation processes: task dependence of valence concordant EMG activity.

    Science.gov (United States)

    Weinreich, André; Funcke, Jakob Maria

    2014-01-01

    Drawing on recent findings, this study examines whether valence concordant electromyography (EMG) responses can be explained as an unconditional effect of mere stimulus processing or as somatosensory simulation driven by task-dependent processing strategies. While facial EMG over the Corrugator supercilii and the Zygomaticus major was measured, each participant performed two tasks with pictures of album covers. One task was an affective evaluation task and the other was to attribute the album covers to one of five decades. The Embodied Emotion Account predicts that valence concordant EMG is more likely to occur if the task necessitates a somatosensory simulation of the evaluative meaning of stimuli. Results support this prediction with regard to Corrugator supercilii in that valence concordant EMG activity was only present in the affective evaluation task but not in the non-evaluative task. Results for the Zygomaticus major were ambiguous. Our findings are in line with the view that EMG activity is an embodied part of the evaluation process and not a mere physical outcome.

  11. Integrative Processing of Touch and Affect in Social Perception: An fMRI Study.

    Science.gov (United States)

    Ebisch, Sjoerd J H; Salone, Anatolia; Martinotti, Giovanni; Carlucci, Leonardo; Mantini, Dante; Perrucci, Mauro G; Saggino, Aristide; Romani, Gian Luca; Di Giannantonio, Massimo; Northoff, Georg; Gallese, Vittorio

    2016-01-01

    Social perception commonly employs multiple sources of information. The present study aimed at investigating the integrative processing of affective social signals. Task-related and task-free functional magnetic resonance imaging was performed in 26 healthy adult participants during a social perception task concerning dynamic visual stimuli simultaneously depicting facial expressions of emotion and tactile sensations that could be either congruent or incongruent. Confounding effects due to affective valence, inhibitory top-down influences, cross-modal integration, and conflict processing were minimized. The results showed that the perception of congruent, compared to incongruent stimuli, elicited enhanced neural activity in a set of brain regions including left amygdala, bilateral posterior cingulate cortex (PCC), and left superior parietal cortex. These congruency effects did not differ as a function of emotion or sensation. A complementary task-related functional interaction analysis preliminarily suggested that amygdala activity depended on previous processing stages in fusiform gyrus and PCC. The findings provide support for the integrative processing of social information about others' feelings from manifold bodily sources (sensory-affective information) in amygdala and PCC. Given that the congruent stimuli were also judged as being more self-related and more familiar in terms of personal experience in an independent sample of participants, we speculate that such integrative processing might be mediated by the linking of external stimuli with self-experience. Finally, the prediction of task-related responses in amygdala by intrinsic functional connectivity between amygdala and PCC during a task-free state implies a neuro-functional basis for an individual predisposition for the integrative processing of social stimulus content.

  12. Unintentionality of affective attention across visual processing stages.

    Science.gov (United States)

    Uusberg, Andero; Uibo, Helen; Kreegipuu, Kairi; Tamm, Maria; Raidvee, Aire; Allik, Jüri

    2013-01-01

    Affective attention involves bottom-up perceptual selection that prioritizes motivationally significant stimuli. To clarify the extent to which this process is automatic, we investigated the dependence of affective attention on the intention to process emotional meaning. Affective attention was manipulated by presenting affective images with variable arousal and intentionality by requiring participants to make affective and non-affective evaluations. Polytomous rather than binary decisions were required from the participants in order to elicit relatively deep emotional processing. The temporal dynamics of prioritized processing were assessed using early posterior negativity (EPN, 175-300 ms) as well as P3-like (P3, 300-500 ms) and slow wave (SW, 500-1500 ms) portions of the late positive potential. All analyzed components were differentially sensitive to stimulus categories suggesting that they indeed reflect distinct stages of motivational significance encoding. The intention to perceive emotional meaning had no effect on EPN, an additive effect on P3, and an interactive effect on SW. We concluded that affective attention went from completely unintentional during the EPN to partially unintentional during P3 and SW where top-down signals, respectively, complemented and modulated bottom-up differences in stimulus prioritization. The findings were interpreted in light of two-stage models of visual perception by associating the EPN with large-capacity initial relevance detection and the P3 as well as SW with capacity-limited consolidation and elaboration of affective stimuli.
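
    A minimal sketch of how mean ERP amplitudes are typically extracted in latency windows such as those analyzed above (EPN, P3-like, slow wave); the epochs array layout, the 500 Hz sampling rate, and the 200 ms baseline are assumptions for illustration, not details taken from the study.

```python
# Sketch: mean amplitude per latency window from epoched EEG data.
import numpy as np

SFREQ = 500.0                      # assumed sampling rate in Hz
BASELINE_S = 0.2                   # assumed 200 ms pre-stimulus baseline
WINDOWS_MS = {"EPN": (175, 300), "P3": (300, 500), "SW": (500, 1500)}

def window_means(epochs, windows=WINDOWS_MS, sfreq=SFREQ, baseline_s=BASELINE_S):
    """epochs: array (n_trials, n_channels, n_samples), time-locked to stimulus.

    Returns {window_name: (n_trials, n_channels) mean amplitudes}.
    """
    means = {}
    for name, (start_ms, end_ms) in windows.items():
        i0 = int((baseline_s + start_ms / 1000.0) * sfreq)
        i1 = int((baseline_s + end_ms / 1000.0) * sfreq)
        means[name] = epochs[:, :, i0:i1].mean(axis=-1)
    return means

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    fake_epochs = rng.normal(size=(40, 64, int((BASELINE_S + 1.5) * SFREQ)))
    for name, amps in window_means(fake_epochs).items():
        print(name, amps.shape, round(float(amps.mean()), 3))
```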

  13. Guidelines for Affective Signal Processing (ASP): From lab to life

    NARCIS (Netherlands)

    van den Broek, Egon; Janssen, Joris H.; Westerink, Joyce H.D.M.; Cohn, J.; Nijholt, Antinus; Pantic, Maja

    2009-01-01

    This article presents the rationale behind ACII2009’s special session: Guidelines for Affective Signal Processing (ASP): From lab to life. Although affect is embraced by both science and engineering, its recognition has not reached a satisfying level. Through a concise overview of ASP and the automa

  14. Cephalometric soft tissue facial analysis.

    Science.gov (United States)

    Bergman, R T

    1999-10-01

    My objective is to present a cephalometric-based facial analysis to correlate with an article that was published previously in the American Journal of Orthodontics and Dentofacial Orthopedics. Eighteen facial or soft tissue traits are discussed in this article. All of them are significant in successful orthodontic outcome, and none of them depend on skeletal landmarks for measurement. Orthodontic analysis most commonly relies on skeletal and dental measurement, placing far less emphasis on facial feature measurement, particularly their relationship to each other. Yet, a thorough examination of the face is critical for understanding the changes in facial appearance that result from orthodontic treatment. A cephalometric approach to facial examination can also benefit the diagnosis and treatment plan. Individual facial traits and their balance with one another should be identified before treatment. Relying solely on skeletal analysis, assuming that the face will balance if the skeletal/dental cephalometric values are normalized, may not yield the desired outcome. Good occlusion does not necessarily mean good facial balance. Orthodontic norms for facial traits can permit their measurement. Further, with a knowledge of standard facial traits and the patient's soft tissue features, an individualized norm can be established for each patient to optimize facial attractiveness. Four questions should be asked regarding each facial trait before treatment: (1) What is the quality and quantity of the trait? (2) How will future growth affect the trait? (3) How will orthodontic tooth movement affect the existing trait (positively or negatively)? (4) How will surgical bone movement to correct the bite affect the trait (positively or negatively)?

  15. Processes affecting the remediation of chromium-contaminated sites.

    OpenAIRE

    Palmer, C.D.; Wittbrodt, P R

    1991-01-01

    The remediation of chromium-contaminated sites requires knowledge of the processes that control the migration and transformation of chromium. Advection, dispersion, and diffusion are physical processes affecting the rate at which contaminants can migrate in the subsurface. Heterogeneity is an important factor that affects the contribution of each of these mechanisms to the migration of chromium-laden waters. Redox reactions, chemical speciation, adsorption/desorption phenomena, and precipitat...

  16. Neural processing of emotional facial and semantic expressions in euthymic bipolar disorder (BD) and its association with theory of mind (ToM).

    Directory of Open Access Journals (Sweden)

    Agustin Ibanez

    Full Text Available BACKGROUND: Adults with bipolar disorder (BD) have cognitive impairments that affect face processing and social cognition. However, it remains unknown whether these deficits in euthymic BD are accompanied by impaired brain markers of emotional processing. METHODOLOGY/PRINCIPAL FINDINGS: We recruited twenty-six participants, 13 control subjects and an equal number of euthymic BD participants. We used an event-related potential (ERP) assessment of a dual valence task (DVT), in which faces (angry and happy), words (pleasant and unpleasant), and face-word simultaneous combinations are presented to test the effects of stimulus type (face vs. word) and valence (positive vs. negative). All participants received clinical, neuropsychological and social cognition evaluations. ERP analysis revealed that both groups showed N170 modulation of stimulus type effects (face > word). BD patients exhibited reduced and enhanced N170 to facial and semantic valence, respectively. The neural source estimation of N170 was a posterior section of the fusiform gyrus (FG), including the face fusiform area (FFA). Neural generators of N170 for faces (FG and FFA) were reduced in BD. In these patients, N170 modulation was associated with social cognition (theory of mind). CONCLUSIONS/SIGNIFICANCE: This is the first report of euthymic BD exhibiting abnormal N170 emotional discrimination associated with theory of mind impairments.

  17. Classifying Facial Actions

    Science.gov (United States)

    Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.

    2010-01-01

    The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
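
    A minimal sketch of the Gabor-wavelet representation that the study reports among its best-performing approaches, paired here with a linear SVM; the kernel parameters, image size, and classifier choice are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: Gabor filter-bank features for facial action classification.
import numpy as np
import cv2
from sklearn.svm import LinearSVC

def gabor_bank(ksize=21, sigmas=(2.0, 4.0), thetas=4, lambd=8.0, gamma=0.5):
    """Build a small bank of Gabor kernels over orientations and scales."""
    kernels = []
    for sigma in sigmas:
        for k in range(thetas):
            theta = k * np.pi / thetas
            kernels.append(cv2.getGaborKernel((ksize, ksize), sigma, theta,
                                              lambd, gamma, 0, ktype=cv2.CV_32F))
    return kernels

def gabor_features(gray_face, kernels):
    """Concatenate downsampled magnitudes of each filter response."""
    feats = []
    for kern in kernels:
        resp = cv2.filter2D(gray_face.astype(np.float32), cv2.CV_32F, kern)
        feats.append(cv2.resize(np.abs(resp), (16, 16)).ravel())
    return np.concatenate(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    kernels = gabor_bank()
    # Stand-in data: random "face" crops with random action-unit labels.
    X = np.stack([gabor_features(rng.integers(0, 255, (64, 64)), kernels)
                  for _ in range(20)])
    y = rng.integers(0, 2, 20)          # e.g. action unit present / absent
    clf = LinearSVC(max_iter=5000).fit(X, y)
    print("training accuracy:", clf.score(X, y))
```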

  18. The functional profile of the human amygdala in affective processing: insights from intracranial recordings.

    Science.gov (United States)

    Murray, Ryan J; Brosch, Tobias; Sander, David

    2014-11-01

    The amygdala is suggested to serve as a key structure in the emotional brain, implicated in diverse affective processes. Still, the bulk of existing neuroscientific investigations of the amygdala relies on conventional neuroimaging techniques such as fMRI, which are very useful but subject to limitations. These limitations pertain particularly to their temporal resolution, but also to their spatial precision at a very fine-grained level. Here, we review studies investigating the functional profile of the human amygdala using intracranial electroencephalography (iEEG), an invasive technique with high temporal and spatial precision. We conducted a systematic literature review of 47 iEEG studies investigating the human amygdala, and we focus on two content-related domains and one process-related domain: (1) memory formation and retrieval; (2) affective processing; and (3) latency components. This review reveals the human amygdala to engage in invariant semantic encoding and recognition of specific objects and individuals, independent of context or visuospatial attributes, and to discriminate between familiar and novel stimuli. The review highlights the amygdala's role in emotion processing, as witnessed in differential treatment of social-affective facial cues, differential neuronal firing to relevant novel stimuli, and habituation to familiar affective stimuli. Overall, the review suggests the amygdala plays a key role in the processing of affective relevance. Finally, this review delineates effects on amygdala neuronal activity into three time latency windows (post-stimulus onset). The early window (∼50-290 msec) subsumes effects respective to exogenous stimulus-driven affective processing of faces and emotion. The intermediate window (∼270-470 msec) comprises effects related to explicit attention to novel task-relevant stimuli, irrespective of sensory modality. The late window (∼600-1400 msec) subsumes effects from tasks soliciting semantic associations and working

  19. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: A fixation-to-feature approach.

    Science.gov (United States)

    Neath-Tavares, Karly N; Itier, Roxane J

    2016-09-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100-120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms.

  1. Facial tics

    Science.gov (United States)

    Tic - facial; Mimic spasm ... Tics may involve repeated, uncontrolled spasm-like muscle movements, such as eye blinking, grimacing, mouth twitching, nose wrinkling, and squinting. Repeated throat clearing or grunting may also be ...

  2. Facial Recognition

    National Research Council Canada - National Science Library

    Mihalache Sergiu; Stoica Mihaela-Zoica

    2014-01-01

    .... From birth, faces are important in the individual's social interaction. Face perceptions are very complex as the recognition of facial expressions involves extensive and diverse areas in the brain...

  3. Unintentionality of affective attention across visual processing stages

    Directory of Open Access Journals (Sweden)

    Andero eUusberg

    2013-12-01

    Full Text Available Affective attention involves bottom-up perceptual selection that prioritizes motivationally significant stimuli. To clarify the extent to which this process is automatic, we investigated the dependence of affective attention on the intention to process emotional meaning. Affective attention was manipulated by presenting IAPS images with variable arousal and intentionality by requiring participants to make affective and non-affective evaluations. Polytomous rather than binary decisions were required from the participants in order to elicit relatively deep emotional processing. The temporal dynamics of prioritized processing were assessed using Early Posterior Negativity (EPN, 175-300 ms) as well as P3-like (P3, 300-500 ms) and Slow Wave (SW, 500-1500 ms) portions of the Late Positive Potential. All analysed components were differentially sensitive to stimulus categories suggesting that they indeed reflect distinct stages of motivational significance encoding. The intention to perceive emotional meaning had no effect on EPN, an additive effect on P3, and an interactive effect on SW. We concluded that affective attention went from completely unintentional during the EPN to partially unintentional during P3 and SW where top-down signals, respectively, complemented and modulated bottom-up differences in stimulus prioritization. The findings were interpreted in light of two-stage models of visual perception by associating the EPN with large-capacity initial relevance detection and the P3 as well as SW with capacity-limited consolidation and elaboration of affective stimuli.

  4. Computer Aided Facial Prosthetics Manufacturing System

    Directory of Open Access Journals (Sweden)

    Peng H.K.

    2016-01-01

    Full Text Available Facial deformities can impose a burden on the patient. There are many solutions for facial deformities, such as plastic surgery and facial prosthetics. However, the current fabrication method for facial prosthetics is costly and time-consuming. This study aimed to identify a new method to construct a customized facial prosthesis. A 3D scanner, computer software and a 3D printer were used in this study. Results showed that the newly developed method can be used to produce customized facial prostheses. The advantages of the developed method over the conventional process are low cost and reduced material waste and pollution, in line with the green concept.

  5. Brain asymmetry and facial attractiveness: facial beauty is not simply in the eye of the beholder.

    Science.gov (United States)

    Chen, A C; German, C; Zaidel, D W

    1997-04-01

    We recently reported finding asymmetry in the appearance of beauty on the face [Zaidel et al., Neuropsychologia, Vol. 33, pp. 649-655, 1995]. Here, we investigated whether facial beauty is a stable characteristic (on the owner's very face) or is in the perceptual space of the observer. We call the question 'the owner vs observer hypothesis'. We compared identity judgements and attractiveness ratings of observers. Subjects viewed left-left and right-right composites of faces and decided which most resembled the normal face (Experiment 1). Identity judgements (resemblance) are known to be associated with perceptual factors in the observer. Another group viewed the same normal faces and rated them on attractiveness (Experiment 2). In each experiment, there were two separate viewing conditions, original and reversed (mirror-image). Lateral reversal did affect the results of Experiment 1 (confirming previous findings [Bennett et al., Neuropsychologia, Vol. 25, pp. 681-687, 1987; Gilbert and Bakan, Journal of Anatomy, Vol. 183, pp. 593-600, 1993]) but did not affect the results of Experiment 2. The fact that lateral reversal did not affect the results of Experiment 2 suggests that facial attractiveness is more dependent on physiognomy (of the owner) and less dependent on an asymmetrical perceptual process (in the observer) than is facial identity. The results are discussed in the context of beauty's biological significance and facial processing in the brain.

  6. Similar exemplar pooling processes underlie the learning of facial identity and handwriting style: Evidence from typical observers and individuals with Autism.

    Science.gov (United States)

    Ipser, Alberta; Ring, Melanie; Murphy, Jennifer; Gaigg, Sebastian B; Cook, Richard

    2016-05-01

    Considerable research has addressed whether the cognitive and neural representations recruited by faces are similar to those engaged by other types of visual stimuli. For example, research has examined the extent to which objects of expertise recruit holistic representation and engage the fusiform face area. Little is known, however, about the domain-specificity of the exemplar pooling processes thought to underlie the acquisition of familiarity with particular facial identities. In the present study we sought to compare observers' ability to learn facial identities and handwriting styles from exposure to multiple exemplars. Crucially, while handwritten words and faces differ considerably in their topographic form, both learning tasks share a common exemplar pooling component. In our first experiment, we find that typical observers' ability to learn facial identities and handwriting styles from exposure to multiple exemplars correlates closely. In our second experiment, we show that observers with Autism Spectrum Disorder (ASD) are impaired at both learning tasks. Our findings suggest that similar exemplar pooling processes are recruited when learning facial identities and handwriting styles. Models of exemplar pooling originally developed to explain face learning, may therefore offer valuable insights into exemplar pooling across a range of domains, extending beyond faces. Aberrant exemplar pooling, possibly resulting from structural differences in the inferior longitudinal fasciculus, may underlie difficulties recognising familiar faces often experienced by individuals with ASD, and leave observers overly reliant on local details present in particular exemplars.

  7. Age and gender modulate the neural circuitry supporting facial emotion processing in adults with major depressive disorder.

    Science.gov (United States)

    Briceño, Emily M; Rapport, Lisa J; Kassel, Michelle T; Bieliauskas, Linas A; Zubieta, Jon-Kar; Weisenbach, Sara L; Langenecker, Scott A

    2015-03-01

    Emotion processing, supported by frontolimbic circuitry known to be sensitive to the effects of aging, is a relatively understudied cognitive-emotional domain in geriatric depression. Some evidence suggests that the neurophysiological disruption observed in emotion processing among adults with major depressive disorder (MDD) may be modulated by both gender and age. Therefore, the present study investigated the effects of gender and age on the neural circuitry supporting emotion processing in MDD. Cross-sectional comparison of fMRI signal during performance of an emotion processing task. Outpatient university setting. One hundred adults recruited by MDD status, gender, and age. Participants underwent fMRI while completing the Facial Emotion Perception Test. They viewed photographs of faces and categorized the emotion perceived. Contrast for fMRI was of face perception minus animal identification blocks. Effects of depression were observed in precuneus and effects of age in a number of frontolimbic regions. Three-way interactions were present between MDD status, gender, and age in regions pertinent to emotion processing, including frontal, limbic, and basal ganglia. Young women with MDD and older men with MDD exhibited hyperactivation in these regions compared with their respective same-gender healthy comparison (HC) counterparts. In contrast, older women and younger men with MDD exhibited hypoactivation compared to their respective same-gender HC counterparts. This is the first study to report gender- and age-specific differences in emotion processing circuitry in MDD. Gender-differential mechanisms may underlie cognitive-emotional disruption in older adults with MDD. The present findings have implications for improved probes into the heterogeneity of the MDD syndrome. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  8. Three-year-olds' rapid facial electromyographic responses to emotional facial expressions and body postures.

    Science.gov (United States)

    Geangu, Elena; Quadrelli, Ermanno; Conte, Stefania; Croci, Emanuela; Turati, Chiara

    2016-04-01

    Rapid facial reactions (RFRs) to observed emotional expressions are proposed to be involved in a wide array of socioemotional skills, from empathy to social communication. Two of the most persuasive theoretical accounts propose RFRs to rely either on motor resonance mechanisms or on more complex mechanisms involving affective processes. Previous studies demonstrated that presentation of facial and bodily expressions can generate rapid changes in adult and school-age children's muscle activity. However, to date there is little to no evidence to suggest the existence of emotional RFRs from infancy to preschool age. To investigate whether RFRs are driven by motor mimicry or could also be a result of emotional appraisal processes, we recorded facial electromyographic (EMG) activation from the zygomaticus major and frontalis medialis muscles to presentation of static facial and bodily expressions of emotions (i.e., happiness, anger, fear, and neutral) in 3-year-old children. Results showed no specific EMG activation in response to bodily emotion expressions. However, observing others' happy faces led to increased activation of the zygomaticus major and decreased activation of the frontalis medialis, whereas observing others' angry faces elicited the opposite pattern of activation. This study suggests that RFRs are the result of complex mechanisms in which both affective processes and motor resonance may play an important role. Copyright © 2015 Elsevier Inc. All rights reserved.
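
    A baseline-corrected change score is a common way to quantify facial EMG responses of the kind described above. The sketch below assumes rectified single-trial EMG arrays for one muscle (for example, the zygomaticus major) and uses hypothetical time windows and simulated data; it illustrates the general approach rather than the study's exact pipeline.

        import numpy as np

        def emg_change(trials, times, base=(-1.0, 0.0), resp=(0.0, 2.0)):
            """Percent change from baseline in rectified EMG.

            trials: (n_trials, n_samples) rectified EMG for one muscle
            """
            b = trials[:, (times >= base[0]) & (times < base[1])].mean(axis=1)
            r = trials[:, (times >= resp[0]) & (times < resp[1])].mean(axis=1)
            return 100 * (r - b) / b

        # Hypothetical data: 20 trials per condition, 1000 Hz, -1 to 2 s around stimulus onset
        rng = np.random.default_rng(2)
        times = np.arange(-1.0, 2.0, 0.001)
        zygo_happy = np.abs(rng.normal(1.0, 0.2, size=(20, times.size)))
        zygo_angry = np.abs(rng.normal(1.0, 0.2, size=(20, times.size)))

        # The pattern reported above would appear as a larger positive change
        # for happy than for angry faces on the zygomaticus major.
        print(emg_change(zygo_happy, times).mean(),
              emg_change(zygo_angry, times).mean())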

  9. A REVIEW ON FACIAL NEURALGIAS

    OpenAIRE

    Solanki, Gaurav

    2010-01-01

    Facial neuralgias are produced by a change in neurological structure or function. This type of neuropathic pain affects the mental health as well as the quality of life of patients. There are different types of neuralgias affecting the oral and maxillofacial region. These unusual pains have been linked to several possible mechanisms. Various diagnostic tests are performed to identify the underlying cause of facial neuralgia, and medical or surgical treatment is then provided accordingly to relieve the patient.

  10. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Full Text Available Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and permanent deficits in many areas of interpersonal relations, such as communication, social interaction and emotional responsiveness. Patients with autism have deficits in face recognition, eye contact and recognition of emotional expression. Both face recognition and the recognition of emotional facial expressions rely on face processing. Structural and functional impairment of the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions leads to deficits in the recognition of faces and facial emotion. Studies therefore suggest that face processing deficits result in problems in the areas of social interaction and emotion in autism. Studies have revealed that children with autism have problems recognizing facial expressions and use the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotions. In autism, deficits related to various stages of face processing, such as gaze detection, face identity and recognition of emotional expression, have been identified so far. Social interaction impairments in autistic spectrum disorders originate from face processing deficits during infancy, childhood and adolescence. Recognition of faces and of emotional facial expressions could be affected either automatically, by orienting towards faces after birth, or by "learning" processes for identity and emotion processing across developmental periods. This article aimed to review the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  11. Facial Sports Injuries

    Science.gov (United States)

    ... should receive immediate medical attention. Prevention of facial sports injuries: the best way to treat facial sports injuries ...

  12. Children and Facial Trauma

    Science.gov (United States)

    ... What is facial trauma? The term facial trauma means any injury to ...

  13. Facial Cosmetic Surgery

    Science.gov (United States)

    ... Facial Cosmetic Surgery: Extensive education and training in surgical procedures ... to find out more.

  14. Facial Scar Revision: Understanding Facial Scar Treatment

    Science.gov (United States)

    ... When the skin is injured from a cut or tear, the body heals by forming scar tissue. The appearance of the scar can range from ...

  15. Affective value and associative processing share a cortical substrate.

    Science.gov (United States)

    Shenhav, Amitai; Barrett, Lisa Feldman; Bar, Moshe

    2013-03-01

    The brain stores information in an associative manner so that contextually related entities are connected in memory. Such associative representations mediate the brain's ability to generate predictions about which other objects and events to expect in a given context. Likewise, the brain encodes and is able to rapidly retrieve the affective value of stimuli in our environment. That both contextual associations and affect serve as building blocks of numerous mental functions often makes interpretation of brain activation ambiguous. A critical brain region where such activation has often resulted in equivocal interpretation is the medial orbitofrontal cortex (mOFC), which has been implicated separately in both affective and associative processing. To characterize its role more unequivocally, we tested whether activity in the mOFC was most directly attributable to affective processing, associative processing, or a combination of both. Subjects performed an object recognition task while undergoing fMRI scans. Objects varied independently in their affective valence and in their degree of association with other objects (associativity). Analyses revealed an overlapping sensitivity whereby the left mOFC responded both to increasingly positive affective value and to stronger associativity. These two properties individually accounted for mOFC response, even after controlling for their interrelationship. The role of the mOFC is either general enough to encompass associations that link stimuli both with reinforcing outcomes and with other stimuli or abstract enough to use both valence and associativity in conjunction to inform downstream processes related to perception and action. These results may further point to a fundamental relationship between associativity and positive affect.
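
    The key analytic idea above, that valence and associativity each accounted for the mOFC response after controlling for their interrelationship, corresponds to entering both predictors simultaneously in a regression. The following sketch uses simulated, hypothetical per-object values and ordinary least squares to show how each coefficient then reflects a predictor's unique contribution; it is not the authors' actual model.

        import numpy as np

        # Hypothetical per-object values: mOFC response, rated valence, associativity
        rng = np.random.default_rng(3)
        n = 80
        valence = rng.normal(size=n)
        associativity = 0.4 * valence + rng.normal(size=n)   # predictors are correlated
        mofc = 0.5 * valence + 0.5 * associativity + rng.normal(size=n)

        # Entering both predictors at once gives each one's unique contribution,
        # i.e. its effect after controlling for the other.
        X = np.column_stack([np.ones(n), valence, associativity])
        coef, *_ = np.linalg.lstsq(X, mofc, rcond=None)
        print(dict(zip(["intercept", "valence", "associativity"], coef.round(2))))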

  16. Plastic Changes of Synapses and Excitatory Neurotransmitter Receptors in Facial Nucleus Following Facial-facial Anastomosis

    Institute of Scientific and Technical Information of China (English)

    Pei CHEN; Jun SONG; Linghui LUO; Shusheng GONG

    2008-01-01

    The remodeling of synapses and neurotransmitter receptors in the facial nucleus was observed. Models were set up by facial-facial anastomosis in the rat. At post-surgery day (PSD) 0, 7, 21 and 60, synaptophysin (p38), NMDA receptor subunit 2A (NMDAR2A) and AMPA receptor subunit 2 (GluR2) were examined by immunohistochemistry and semi-quantitative RT-PCR, respectively. Meanwhile, the synaptic structure of the facial motoneurons was observed under a transmission electron microscope (TEM). The intensity of p38 immunoreactivity decreased, reaching its lowest value at PSD 7, and then increased slightly at PSD 21. Ultrastructurally, the number of synapses in the nucleus of the operated side decreased, consistent with the change in p38 immunoreactivity. NMDAR2A mRNA was significantly down-regulated in the facial nucleus after the operation. The synaptic innervation and the expression of NMDAR2A and AMPAR2 mRNA in the facial nucleus might be modified to suit the new motor tasks following facial-facial anastomosis, and to influence facial nerve regeneration and recovery.

  17. Neural activities during affective processing in people with Alzheimer's disease

    NARCIS (Netherlands)

    Lee, Tatia M. C.; Sun, Delin; Leung, Mei-Kei; Chu, Leung-Wing; Keysers, Christian

    2013-01-01

    This study examined brain activities in people with Alzheimer's disease when viewing happy, sad, and fearful facial expressions of others. A functional magnetic resonance imaging and a voxel-based morphometry methodology together with a passive viewing of emotional faces paradigm were employed to co

  18. Emotional facial expression processing in depression: data from behavioral and event-related potential studies.

    Science.gov (United States)

    Delle-Vigne, D; Wang, W; Kornreich, C; Verbanck, P; Campanella, S

    2014-04-01

    Behavioral literature investigating emotional processes in depressive populations (i.e., unipolar and bipolar depression) states that, compared to healthy controls, depressive subjects exhibit disrupted emotional processing, indexed by lower performance and/or delayed response latencies. The development of brain imaging techniques, such as functional magnetic resonance imaging (fMRI), provided the possibility to visualize the brain regions engaged in emotional processes and how they fail to interact in psychiatric diseases. However, fMRI suffers from poor temporal resolution and cognitive function involves various steps and cognitive stages (serially or in parallel) to give rise to a normal performance. Thus, the origin of a behavioral deficit may result from the alteration of a cognitive stage differently situated along the information-processing stream, outlining the importance of access to this dynamic "temporal" information. In this paper, we will illustrate, through depression, the role that should be attributed to cognitive event-related potentials (ERPs). Indeed, owing to their optimal temporal resolution, ERPs can monitor the neural processes engaged in disrupted cognitive function and provide crucial information for its treatment, training of the impaired cognitive functions and guidelines for clinicians in the choice and monitoring of appropriate medication for the patient. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  19. Verbal bias in recognition of facial emotions in children with Asperger syndrome.

    Science.gov (United States)

    Grossman, J B; Klin, A; Carter, A S; Volkmar, F R

    2000-03-01

    Thirteen children and adolescents with diagnoses of Asperger syndrome (AS) were matched with 13 nonautistic control children on chronological age and verbal IQ. They were tested on their ability to recognize simple facial emotions, as well as facial emotions paired with matching, mismatching, or irrelevant verbal labels. There were no differences between the groups at recognizing simple emotions but the Asperger group performed significantly worse than the control group at recognizing emotions when faces were paired with mismatching words (but not with matching or irrelevant words). The results suggest that there are qualitative differences from nonclinical populations in how children with AS process facial expressions. When presented with a more demanding affective processing task, individuals with AS showed a bias towards visual-verbal over visual-affective information (i.e., words over faces). Thus, children with AS may be utilizing compensatory strategies, such as verbal mediation, to process facial expressions of emotion.

  20. Rejuvenecimiento facial

    Directory of Open Access Journals (Sweden)

    L. Daniel Jacubovsky, Dr.

    2010-01-01

    Full Text Available Facial aging is a process unique and particular to each individual, governed chiefly by genetic makeup. The facelift is a complex technique, developed in our specialty since the beginning of the century, to reverse the principal signs of this process. Because many secondary factors contribute to facial aging, the rhytidectomies or cervicofacial lifts that have been described seek to correct the physiognomic changes of aging by working, as described, in all of the tissue planes involved. This surgery therefore demands thorough knowledge of the surgical anatomy, as well as skill and experience, in order to reduce complications, surgical stigmata and secondary revisions. Facial rhytidectomy has evolved toward a simpler procedure, with shorter incisions and less extensive dissections. Muscle suspensions have changed in their execution, and the vectors of lift and skin resection are crucial to the aesthetic results of cervicofacial surgery. Today these vectors apply more vertical traction. Correction of flaccidity is accompanied by an interest in restoring volume to the surface of the face, especially the middle third. Surgical rejuvenation techniques, especially the facelift, require planning for each patient. Techniques adjunct to the lift, such as blepharoplasty, mentoplasty, neck liposuction, facial implants and others, have also evolved positively toward reduced risk and better aesthetic outcomes.

  1. Asymmetry in infants’ selective attention to facial features during visual processing of infant-directed speech

    Directory of Open Access Journals (Sweden)

    Nicholas A Smith

    2013-09-01

    Full Text Available Two experiments used eye tracking to examine how infant and adult observers distribute their eye gaze on videos of a mother producing infant- and adult-directed speech. Both groups showed greater attention to the eyes than to the nose and mouth, as well as an asymmetrical focus on the talker’s right eye for infant-directed speech stimuli. Observers continued to look more at the talker’s apparent right eye when the video stimuli were mirror flipped, suggesting that the asymmetry reflects a perceptual processing bias rather than a stimulus artifact, which may be related to cerebral lateralization of emotion processing.
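
    Gaze distributions of the kind reported above are typically summarized as the proportion of samples falling in areas of interest (AOIs) such as the eyes and mouth, plus an asymmetry index contrasting the two eye regions. The sketch below uses hypothetical AOI boxes and simulated gaze samples; the coordinates and the asymmetry formula are illustrative assumptions, not the study's published method.

        import numpy as np

        def aoi_proportion(x, y, box):
            """Proportion of gaze samples falling inside a rectangular AOI.

            box = (x_min, x_max, y_min, y_max) in screen pixels.
            """
            inside = (x >= box[0]) & (x <= box[1]) & (y >= box[2]) & (y <= box[3])
            return inside.mean()

        # Hypothetical gaze samples and AOI boxes (screen pixels) for one observer and video
        rng = np.random.default_rng(4)
        x, y = rng.uniform(0, 1024, 3000), rng.uniform(0, 768, 3000)
        left_eye, right_eye = (380, 470, 250, 320), (550, 640, 250, 320)
        mouth = (430, 590, 430, 500)

        p_left = aoi_proportion(x, y, left_eye)
        p_right = aoi_proportion(x, y, right_eye)
        # Positive values indicate relatively more gaze to the AOI labeled right_eye
        asymmetry = (p_right - p_left) / max(p_right + p_left, 1e-9)
        print(p_left, p_right, aoi_proportion(x, y, mouth), asymmetry)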

  2. How current ginning processes affect fiber length uniformity index

    Science.gov (United States)

    There is a need to develop cotton ginning methods that improve fiber characteristics that are compatible with the newer and more efficient spinning technologies. A literature search produced recent studies that described how current ginning processes affect HVI fiber length uniformity index. Resul...

  3. Developing Worksheet Based on Science Process Skills: Factors Affecting Solubility

    Science.gov (United States)

    Karsli, Fethiye; Sahin, Cigdem

    2009-01-01

    The purpose of this study is to develop a worksheet about the factors affecting solubility, which could be useful for the prospective science teachers (PST) to remind and regain their science process skills (SPS). The pilot study of the WS was carried out with 32 first grade PST during the 2007-2008 academic year in the education department at…

  4. The kinetics of chemical processes affecting acidity in the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Pienaar, J.J.; Helas, G. [Potchefstroom University of Christian Higher Education, Potchefstroom (South Africa). Atmospheric Chemistry Research Group

    1996-03-01

    The dominant chemical reactions affecting atmospheric pollution chemistry and in particular, those leading to the formation of acid rain are outlined. The factors controlling the oxidation rate of atmospheric pollutants as well as the rate laws describing these processes are discussed in the light of our latest results and the current literature.

  5. Design and Implementation of Technology Enabled Affective Learning Using Fusion of Bio-Physical and Facial Expression

    Science.gov (United States)

    Ray, Arindam; Chakrabarti, Amlan

    2016-01-01

    Technology Enabled Learning is a cognitive, constructive, systematic, collaborative learning procedure, which transforms teaching-learning pedagogy, where the role of emotion is very often neglected. Emotion plays a significant role in human cognitive processes, so the transformation is incomplete without capturing the learner's emotional…

  6. Facial Paralysis Reconstruction.

    Science.gov (United States)

    Razfar, Ali; Lee, Matthew K; Massry, Guy G; Azizzadeh, Babak

    2016-04-01

    Facial nerve paralysis is a devastating condition arising from several causes with severe functional and psychological consequences. Given the complexity of the disease process, management involves a multispecialty, team-oriented approach. This article provides a systematic approach in addressing each specific sequela of this complex problem.

  7. The Role of the Amygdala in Facial Trustworthiness Processing: A Systematic Review and Meta-Analyses of fMRI Studies.

    Science.gov (United States)

    Santos, Sara; Almeida, Inês; Oliveiros, Bárbara; Castelo-Branco, Miguel

    2016-01-01

    Faces play a key role in signaling social cues such as signals of trustworthiness. Although several studies identify the amygdala as a core brain region in social cognition, quantitative approaches evaluating its role are scarce. This review aimed to assess the role of the amygdala in the processing of facial trustworthiness, by analyzing its amplitude BOLD response polarity to untrustworthy versus trustworthy facial signals under fMRI tasks through a Meta-analysis of effect sizes (MA). Activation Likelihood Estimation (ALE) analyses were also conducted. Articles were retrieved from MEDLINE, ScienceDirect and Web-of-Science in January 2016. Following the PRISMA statement guidelines, a systematic review of original research articles in English language using the search string "(face OR facial) AND (trustworthiness OR trustworthy OR untrustworthy OR trustee) AND fMRI" was conducted. The MA concerned amygdala responses to facial trustworthiness for the contrast Untrustworthy vs. trustworthy faces, and included whole-brain and ROI studies. To prevent potential bias, results were considered even when at the single study level they did not survive correction for multiple comparisons or provided non-significant results. ALE considered whole-brain studies, using the same methodology to prevent bias. A summary of the methodological options (design and analysis) described in the articles was finally used to get further insight into the characteristics of the studies and to perform a subgroup analysis. Data were extracted by two authors and checked independently. Twenty fMRI studies were considered for systematic review. An MA of effect sizes with 11 articles (12 studies) showed high heterogeneity between studies [Q(11) = 265.68, p trustworthiness. Six articles/studies showed that posterior cingulate and medial frontal gyrus present positive correlations with increasing facial trustworthiness levels. Significant effects considering subgroup analysis based on methodological
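
    The heterogeneity statistic quoted above (a Q value with 11 degrees of freedom) is the standard Cochran's Q test on study effect sizes. The sketch below computes Q and the derived I-squared index for a set of invented effect sizes and variances; it shows the formula only and does not reproduce the review's data.

        import numpy as np

        def cochran_q(effects, variances):
            """Cochran's Q and I^2 for k study effect sizes with known variances."""
            w = 1.0 / np.asarray(variances)
            y = np.asarray(effects)
            pooled = np.sum(w * y) / np.sum(w)       # fixed-effect pooled estimate
            q = np.sum(w * (y - pooled) ** 2)         # heterogeneity statistic, df = k - 1
            i2 = max(0.0, (q - (len(y) - 1)) / q) * 100
            return pooled, q, i2

        # Hypothetical standardized effects (amygdala response, untrustworthy vs
        # trustworthy faces) and their sampling variances for 12 studies
        effects = [0.8, 0.3, 1.1, -0.2, 0.6, 0.9, 0.4, 1.3, 0.1, 0.7, 0.5, 1.0]
        variances = [0.05, 0.08, 0.04, 0.10, 0.06, 0.05, 0.09, 0.04, 0.07, 0.06, 0.08, 0.05]
        print(cochran_q(effects, variances))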

  8. Positive affect and psychobiological processes relevant to health.

    Science.gov (United States)

    Steptoe, Andrew; Dockray, Samantha; Wardle, Jane

    2009-12-01

    Empirical evidence suggests that there are marked associations between positive psychological states and health outcomes, including reduced cardiovascular disease risk and increased resistance to infection. These observations have stimulated the investigation of behavioral and biological processes that might mediate protective effects. Evidence linking positive affect with health behaviors has been mixed, though recent cross-cultural research has documented associations with exercising regularly, not smoking, and prudent diet. At the biological level, cortisol output has been consistently shown to be lower among individuals reporting positive affect, and favorable associations with heart rate, blood pressure, and inflammatory markers such as interleukin-6 have also been described. Importantly, these relationships are independent of negative affect and depressed mood, suggesting that positive affect may have distinctive biological correlates that can benefit health. At the same time, positive affect is associated with protective psychosocial factors such as greater social connectedness, perceived social support, optimism, and preference for adaptive coping responses. Positive affect may be part of a broader profile of psychosocial resilience that reduces risk of adverse physical health outcomes.

  9. Facial, vocal and cross-modal emotion processing in early-onset schizophrenia spectrum disorders.

    Science.gov (United States)

    Giannitelli, Marianna; Xavier, Jean; François, Anne; Bodeau, Nicolas; Laurent, Claudine; Cohen, David; Chaby, Laurence

    2015-10-01

    Recognition of emotional expressions plays an essential role in children's healthy development. Anomalies in these skills may result in empathy deficits, social interaction difficulties and premorbid emotional problems in children and adolescents with schizophrenia. Twenty-six subjects with early onset schizophrenia spectrum (EOSS) disorders and twenty-eight matched healthy controls (HC) were instructed to identify five basic emotions and a neutral expression. The assessment entailed presenting visual, auditory and congruent cross-modal stimuli. Using a generalized linear mixed model, we found no significant association for handedness, age or gender. However, significant associations emerged for emotion type, perception modality, and group. EOSS patients performed worse than HC in uni- and cross-modal emotional tasks with a specific negative emotion processing impairment pattern. There was no relationship between emotion identification scores and positive or negative symptoms, self-reported empathy traits or a positive history of developmental disorders. However, we found a significant association between emotional identification scores and nonverbal communication impairments. We conclude that cumulative dysfunctions in both nonverbal communication and emotion processing contribute to the social vulnerability and morbidity found in youths who display EOSS disorder.

  10. Facial Asymmetry and Emotional Expression

    CERN Document Server

    Pickin, Andrew

    2011-01-01

    This report is about facial asymmetry, its connection to emotional expression, and methods of measuring facial asymmetry in videos of faces. The research was motivated by two factors: firstly, there was a real opportunity to develop a novel measure of asymmetry that required minimal human involvement and that improved on earlier measures in the literature; and secondly, the study of the relationship between facial asymmetry and emotional expression is both interesting in its own right, and important because it can inform neuropsychological theory and answer open questions concerning emotional processing in the brain. The two aims of the research were: first, to develop an automatic frame-by-frame measure of facial asymmetry in videos of faces that improved on previous measures; and second, to use the measure to analyse the relationship between facial asymmetry and emotional expression, and connect our findings with previous research of the relationship.
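
    The report describes an automatic frame-by-frame asymmetry measure, but the record does not give its algorithm, so the sketch below shows one simple, commonly used definition as an assumed stand-in: the mean absolute difference between an aligned face image and its horizontal mirror, computed per video frame. The frame size, alignment and data are hypothetical.

        import numpy as np

        def frame_asymmetry(frame):
            """Mean absolute difference between a face image and its mirror.

            frame: 2-D grayscale array, pre-aligned so the facial midline is the
            vertical center of the image. Returns 0 for a perfectly symmetric face.
            """
            mirrored = frame[:, ::-1]
            return np.abs(frame.astype(float) - mirrored.astype(float)).mean()

        # Hypothetical video: 100 aligned 128x128 grayscale frames
        rng = np.random.default_rng(5)
        video = rng.integers(0, 256, size=(100, 128, 128))
        asymmetry_trace = np.array([frame_asymmetry(f) for f in video])
        print(asymmetry_trace.mean(), asymmetry_trace.max())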

  11. The appraisal of facial beauty is rapid but not mandatory.

    Science.gov (United States)

    Schacht, Annekathrin; Werheid, Katja; Sommer, Werner

    2008-06-01

    Facial attractiveness is an important source of social affective information. Here, we studied the time course and task dependence of evaluating attractive faces from a viewer's perspective. Event-related brain potentials (ERPs) were recorded while participants classified color portraits of unfamiliar persons according to gender and facial attractiveness. During attractiveness classification, enhanced ERP amplitudes for attractive and nonattractive faces relative to faces of intermediate attractiveness were found for an early component around 150 msec and for the late positive complex (LPC). Whereas LPC enhancement conforms to previous studies employing various types of affective stimuli, the finding of an early effect extends earlier research on rapid emotion processing to the dimension of facial attractiveness. Dipole source localization of this early ERP effect revealed a scalp distribution suggesting activation of posterior extrastriate areas. Importantly, attractiveness-related modulations of brain responses were only marginal during the gender decision task, arguing against the automaticity of attractiveness appraisal.

  12. Microflora of Processed Cheese and the Factors Affecting It.

    Science.gov (United States)

    Buňková, Leona; Buňka, František

    2015-09-11

    The basic raw material for the production of processed cheese is natural cheese, which is treated by heat with the addition of emulsifying salts. From the point of view of the melting temperatures used (and the pH-value of the product), the course of processed cheese production can be considered "pasteurisation of cheese". During the melting process, the majority of vegetative forms of microorganisms, including bacteria of the family Enterobacteriaceae, are inactivated. The melting temperatures are not sufficient to kill endospores, which survive the process but are often weakened. From a microbiological point of view, the biggest contamination problem of processed cheese is caused by gram-positive spore-forming rod-shaped bacteria of the genera Bacillus, Geobacillus and Clostridium. Other factors affecting the shelf-life and quality of processed cheese are mainly the microbiological quality of the raw materials used and strict hygienic conditions during the manufacturing process, as well as the type of packaging materials and storage conditions. The quality of processed cheese depends not only on the ingredients used but also on other parameters, such as the water activity of the processed cheese, its pH-value, the presence of salts and emulsifying salts, and the amount of fat in the product.

  13. Electrophysiological differences in the processing of affect misattribution.

    Science.gov (United States)

    Hashimoto, Yohei; Minami, Tetsuto; Nakauchi, Shigeki

    2012-01-01

    The affect misattribution procedure (AMP) was proposed as a technique to measure an implicit attitude to a prime image [1]. In the AMP, neutral symbols (e.g., a Chinese pictograph, called the target) are presented, following an emotional stimulus (known as the prime). Participants often misattribute the positive or negative affect of the priming images to the targets in spite of receiving an instruction to ignore the primes. The AMP effect has been investigated using behavioral measures; however, it is difficult to identify when the AMP effect occurs in emotional processing-whether the effect may occur in the earlier attention allocation stage or in the later evaluation stage. In this study, we examined the neural correlates of affect misattribution, using event-related potential (ERP) dividing the participants into two groups based on their tendency toward affect misattribution. The ERP results showed that the amplitude of P2 was larger for the prime at the parietal location in participants showing a low tendency to misattribution than for those showing a high tendency, while the effect of judging neutral targets amiss according to the primes was reflected in the late processing of targets (LPP). In addition, the topographic pattern analysis revealed that EPN-like component to targets was correlated with the difference of AMP tendency as well as P2 to primes and LPP to targets. Taken together, the mechanism of the affective misattribution was closely related to the attention allocation processing. Our findings provide neural evidence that evaluations of neutral targets are misattributed to emotional primes.

  14. Electrophysiological differences in the processing of affect misattribution.

    Directory of Open Access Journals (Sweden)

    Yohei Hashimoto

    Full Text Available The affect misattribution procedure (AMP) was proposed as a technique to measure an implicit attitude to a prime image [1]. In the AMP, neutral symbols (e.g., a Chinese pictograph, called the target) are presented, following an emotional stimulus (known as the prime). Participants often misattribute the positive or negative affect of the priming images to the targets in spite of receiving an instruction to ignore the primes. The AMP effect has been investigated using behavioral measures; however, it is difficult to identify when the AMP effect occurs in emotional processing-whether the effect may occur in the earlier attention allocation stage or in the later evaluation stage. In this study, we examined the neural correlates of affect misattribution, using event-related potential (ERP) dividing the participants into two groups based on their tendency toward affect misattribution. The ERP results showed that the amplitude of P2 was larger for the prime at the parietal location in participants showing a low tendency to misattribution than for those showing a high tendency, while the effect of judging neutral targets amiss according to the primes was reflected in the late processing of targets (LPP). In addition, the topographic pattern analysis revealed that EPN-like component to targets was correlated with the difference of AMP tendency as well as P2 to primes and LPP to targets. Taken together, the mechanism of the affective misattribution was closely related to the attention allocation processing. Our findings provide neural evidence that evaluations of neutral targets are misattributed to emotional primes.

  15. Clinical and experimental study on facial paralysis in temporal bone fracture

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Objective: To study the main prognostic factors and the significance of facial nerve decompression for facial paralysis in temporal bone fracture. Methods: The main prognostic factors of 64 patients with facial paralysis were analyzed. An experimental model of facial paralysis was made. The expansion rates of the facial nerve in the facial canal opening group and the facial canal non-opening group were measured and observed under an electron microscope. Results: The main factors affecting the prognosis were facial nerve decompression and the timing of surgery. The expansion rate of the facial nerve in the facial canal opening group was significantly higher than that of the facial canal non-opening group (t=7.53, P<0.01). The injury to the nerve fibers in the facial canal non-opening group was severe. Conclusions: Early facial nerve decompression is beneficial to restoration of the facial nerve function.

  16. Spelling-to-sound correspondences affect acronym recognition processes.

    Science.gov (United States)

    Playfoot, David; Izura, Cristina

    2015-01-01

    A large body of research has examined the factors that affect the speed with which words are recognized in lexical decision tasks. Nothing has yet been reported concerning the factors that are important in differentiating acronyms (e.g., BBC, HIV, NASA) from nonwords. This task appears to pose little problem for skilled readers, despite the fact that acronyms have uncommon, even illegal, spellings in English. We used regression techniques to examine the role of a number of lexical and nonlexical variables known to be important in word processing in relation to lexical decision for acronym targets. Findings indicated that acronym recognition is affected by age of acquisition and imageability. In a departure from findings in word recognition, acronym recognition was not affected by frequency. Lexical decision responses for acronyms were also affected by the relationship between spelling and sound, a pattern not usually observed in word recognition. We argue that the complexity of acronym recognition means that the process draws on phonological information in addition to semantics.

  17. Contributions of feature shapes and surface cues to the recognition of facial expressions.

    Science.gov (United States)

    Sormaz, Mladen; Young, Andrew W; Andrews, Timothy J

    2016-10-01

    Theoretical accounts of face processing often emphasise feature shapes as the primary visual cue to the recognition of facial expressions. However, changes in facial expression also affect the surface properties of the face. In this study, we investigated whether this surface information can also be used in the recognition of facial expression. First, participants identified facial expressions (fear, anger, disgust, sadness, happiness) from images that were manipulated such that they varied mainly in shape or mainly in surface properties. We found that the categorization of facial expression is possible in either type of image, but that different expressions are relatively dependent on surface or shape properties. Next, we investigated the relative contributions of shape and surface information to the categorization of facial expressions. This employed a complementary method that involved combining the surface properties of one expression with the shape properties from a different expression. Our results showed that the categorization of facial expressions in these hybrid images was equally dependent on the surface and shape properties of the image. Together, these findings provide a direct demonstration that both feature shape and surface information make significant contributions to the recognition of facial expressions.

  18. Separation of mouse embryonic facial ectoderm and mesenchyme.

    Science.gov (United States)

    Li, Hong; Williams, Trevor

    2013-04-12

    Orofacial clefts are the most frequent craniofacial defects, which affect 1.5 in 1,000 newborns worldwide. Orofacial clefting is caused by abnormal facial development. In human and mouse, initial growth and patterning of the face relies on several small buds of tissue, the facial prominences. The face is derived from six main prominences: paired frontal nasal processes (FNP), maxillary prominences (MxP) and mandibular prominences (MdP). These prominences consist of swellings of mesenchyme that are encased in an overlying epithelium. Studies in multiple species have shown that signaling crosstalk between facial ectoderm and mesenchyme is critical for shaping the face. Yet, mechanistic details concerning the genes involved in these signaling relays are lacking. One way to gain a comprehensive understanding of gene expression, transcription factor binding, and chromatin marks associated with the developing facial ectoderm and mesenchyme is to isolate and characterize the separated tissue compartments. Here we present a method for separating facial ectoderm and mesenchyme at embryonic day (E) 10.5, a critical developmental stage in mouse facial formation that precedes fusion of the prominences. Our method is adapted from the approach we have previously used for dissecting facial prominences. In this earlier study we had employed inbred C57BL/6 mice as this strain has become a standard for genetics, genomics and facial morphology. Here, though, due to the more limited quantities of tissue available, we have utilized the outbred CD-1 strain that is cheaper to purchase, more robust for husbandry, and tending to produce more embryos (12-18) per litter than any inbred mouse strain. Following embryo isolation, neutral protease Dispase II was used to treat the whole embryo. Then, the facial prominences were dissected out, and the facial ectoderm was separated from the mesenchyme. This method keeps both the facial ectoderm and mesenchyme intact. The samples obtained using this

  19. The effect of affective context on visuocortical processing of neutral faces in social anxiety - An ERP study

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2015-11-01

    Full Text Available It has been demonstrated that verbal context information alters the neural processing of ambiguous faces such as faces with no apparent facial expression. In social anxiety, neutral faces may be implicitly threatening for socially anxious individuals due to their ambiguous nature, but even more so if these neutral faces are put in self-referential negative contexts. Therefore, we measured event-related brain potentials (ERPs) in response to neutral faces which were preceded by affective verbal information (negative, neutral, positive). Participants with low social anxiety (LSA; n = 23) and high social anxiety (HSA; n = 21) were asked to watch and rate valence and arousal of the respective faces while continuous EEG was recorded. ERP analysis revealed that HSA showed elevated P100 amplitudes in response to faces, but reduced structural encoding of faces as indexed by reduced N170 amplitudes. In general, affective context led to an enhanced early posterior negativity (EPN) for negative compared to neutral facial expressions. Moreover, HSA compared to LSA showed enhanced late positive potentials (LPP) to negatively contextualized faces, whereas in LSA this effect was found for faces in positive contexts. Also, HSA rated faces in negative contexts as more negative compared to LSA. These results point at enhanced vigilance for neutral faces regardless of context in HSA, while structural encoding seems to be diminished (avoidance). Interestingly, later components of sustained processing (LPP) indicate that LSA show enhanced visuocortical processing for faces in positive contexts (happy bias), whereas this seems to be the case for negatively contextualized faces in HSA (threat bias). Finally, our results add further new evidence that top-down information in interaction with individual anxiety levels can influence early-stage aspects of visual perception.

  20. Oil sand process-affected water treatment using coke adsorption

    Energy Technology Data Exchange (ETDEWEB)

    Gamal El-Din, M.; Pourrezaei, P.; Chelme-Ayala, P.; Zubot, W. [Alberta Univ., Edmonton, AB (Canada). Dept. of Civil and Environmental Engineering

    2010-07-01

    Oil sands operations generate an array of oil sands process-affected water (OSPW) that will eventually be released to the environment. This water must be evaluated for treatment with conventional and advanced water treatment technologies. Water management strategies propose options for increased reuse and recycling of water from settling ponds, as well as safe discharge. This presentation outlined the typical composition of OSPW. Constituents of concern in OSPW include suspended solids, hydrocarbons, salts, ammonia, trace metals, and dissolved organics such as naphthenic acids (NAs). Petroleum coke is one of the by-products generated from bitumen extraction in the oil sands industry and can be used in one of the possible treatment processes for the removal of organic compounds found in OSPW. Activated carbon adsorption is an effective process, able to adsorb organic substances such as oils, radioactive compounds, petroleum hydrocarbons, polyaromatic hydrocarbons and various halogenated compounds. The objectives of this study were to evaluate the production of activated carbon from petroleum coke using steam as the activation medium; to determine the factors affecting the adsorption of NAs; and to evaluate the activated coke adsorption capacity for the reduction of NAs and dissolved organic carbon present in OSPW. It was concluded that non-activated petroleum coke has the ability to decrease COD, alkalinity, and NA concentration. tabs., figs.
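
    The record does not state how adsorption capacity was quantified; one conventional way to do so from batch equilibrium data is to fit a Langmuir isotherm and read off the maximum capacity. The sketch below does this for invented concentration data and should be read as an assumed illustration, not the study's analysis.

        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(ce, q_max, k_l):
            """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce)."""
            return q_max * k_l * ce / (1.0 + k_l * ce)

        # Hypothetical batch-adsorption data: equilibrium NA concentration (mg/L)
        # and amount adsorbed per gram of activated coke (mg/g)
        ce = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 80.0])
        qe = np.array([1.8, 3.1, 4.6, 6.0, 6.6, 6.9])

        (q_max, k_l), _ = curve_fit(langmuir, ce, qe, p0=(8.0, 0.05))
        print(f"estimated capacity q_max = {q_max:.2f} mg/g, K_L = {k_l:.3f} L/mg")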

  1. Agricultural management affects evolutionary processes in a migratory songbird.

    Science.gov (United States)

    Perlut, Noah G; Freeman-Gallant, Corey R; Strong, Allan M; Donovan, Therese M; Kilpatrick, C William; Zalik, Nathan J

    2008-03-01

    Hay harvests have detrimental ecological effects on breeding songbirds, as harvesting results in nest failure. Importantly, whether harvesting also affects evolutionary processes is not known. We explored how hay harvest affected social and genetic mating patterns, and thus, the overall opportunity for sexual selection and evolutionary processes for a ground-nesting songbird, the Savannah sparrow (Passerculus sandwichensis). On an unharvested field, 55% of females were in polygynous associations, and social polygyny was associated with greater rates of extra-pair paternity (EPP). In this treatment, synchrony explained variation in EPP rates, as broods by more synchronous females had more EPP than broods by asynchronous females. In contrast, on a harvested field, simultaneous nest failure caused by haying dramatically decreased the overall incidence of EPP by increasing the occurrence of social monogamy and, apparently, the ability of polygynous males to maintain paternity in their own nests. Despite increased social and genetic monogamy, these haying-mediated changes in mating systems resulted in greater than twofold increase in the opportunity for sexual selection. This effect arose, in part, from a 30% increase in the variance associated with within-pair fertilization success, relative to the unharvested field. This effect was caused by a notable increase (+110%) in variance associated with the quality of social mates following simultaneous nest failure. Because up to 40% of regional habitat is harvested by early June, these data may demonstrate a strong population-level effect on mating systems, sexual selection, and consequently, evolutionary processes.

  2. Agricultural management affects evolutionary processes in a migratory songbird

    Science.gov (United States)

    Perlut, N.G.; Freeman-Gallant, C. R.; Strong, A.M.; Donovan, T.M.; Kilpatrick, C.W.; Zalik, N.J.

    2008-01-01

    Hay harvests have detrimental ecological effects on breeding songbirds, as harvesting results in nest failure. Importantly, whether harvesting also affects evolutionary processes is not known. We explored how hay harvest affected social and genetic mating patterns, and thus, the overall opportunity for sexual selection and evolutionary processes for a ground-nesting songbird, the Savannah sparrow (Passerculus sandwichensis). On an unharvested field, 55% of females were in polygynous associations, and social polygyny was associated with greater rates of extra-pair paternity (EPP). In this treatment, synchrony explained variation in EPP rates, as broods by more synchronous females had more EPP than broods by asynchronous females. In contrast, on a harvested field, simultaneous nest failure caused by haying dramatically decreased the overall incidence of EPP by increasing the occurrence of social monogamy and, apparently, the ability of polygynous males to maintain paternity in their own nests. Despite increased social and genetic monogamy, these haying-mediated changes in mating systems resulted in greater than twofold increase in the opportunity for sexual selection. This effect arose, in part, from a 30% increase in the variance associated with within-pair fertilization success, relative to the unharvested field. This effect was caused by a notable increase (+110%) in variance associated with the quality of social mates following simultaneous nest failure. Because up to 40% of regional habitat is harvested by early June, these data may demonstrate a strong population-level effect on mating systems, sexual selection, and consequently, evolutionary processes. © 2008 The Authors.
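
    The "opportunity for sexual selection" referred to in both copies of this record is conventionally computed as the variance in relative mating or fertilization success. The sketch below applies that standard formula to invented per-male counts; the numbers and grouping are hypothetical and only illustrate how a treatment difference in this quantity would be calculated.

        import numpy as np

        def opportunity_for_selection(success):
            """I = variance of relative success (each male's success / mean success)."""
            success = np.asarray(success, dtype=float)
            relative = success / success.mean()
            return relative.var()

        # Hypothetical numbers of fertilizations sired by males on each field
        unharvested = [0, 1, 1, 2, 2, 3, 3, 4, 5, 9]
        harvested = [0, 0, 0, 1, 1, 2, 3, 5, 8, 10]
        print(opportunity_for_selection(unharvested),
              opportunity_for_selection(harvested))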

  3. Analysis methods for facial motion

    Directory of Open Access Journals (Sweden)

    Katsuaki Mishima

    2009-05-01

    Full Text Available Objective techniques to evaluate facial movement are indispensable for the contemporary treatment of patients with motor disorders such as facial paralysis, cleft lip and postoperative head and neck cancer. Recently, computer-assisted, video-based techniques have been devised and reported as measuring systems in which facial movements can be evaluated quantitatively. Commercially available motion analysis systems, which use a stereo-measuring technique with multiple cameras and markers to facilitate matching among the images from all cameras, are also utilized in many video-based measuring systems. The key questions are how problems of facial movement can be extracted precisely, and how useful information for diagnosis and decision-making can be derived from analyses of facial movement. It is therefore important to discuss which facial animations should be examined, and whether fixation of the head and markers attached to the face can hamper natural facial movement.

  4. Pediatric facial nerve rehabilitation.

    Science.gov (United States)

    Banks, Caroline A; Hadlock, Tessa A

    2014-11-01

    Facial paralysis is a rare but severe condition in the pediatric population. Impaired facial movement has multiple causes and varied presentations, therefore individualized treatment plans are essential for optimal results. Advances in facial reanimation over the past 4 decades have given rise to new treatments designed to restore balance and function in pediatric patients with facial paralysis. This article provides a comprehensive review of pediatric facial rehabilitation and describes a zone-based approach to assessment and treatment of impaired facial movement.

  5. Development of brain mechanisms for processing affective touch

    Directory of Open Access Journals (Sweden)

    Malin Bjornsdotter

    2014-02-01

    Full Text Available Affective tactile stimulation plays a key role in the maturation of neural circuits, but the development of brain mechanisms processing touch is poorly understood. We therefore used functional magnetic resonance imaging (fMRI) to study brain responses to soft brush stroking of both glabrous (palm) and hairy (forearm) skin in healthy children (5-13 years), adolescents (14-17 years) and adults (25-35 years). Adult-defined regions-of-interest in the primary somatosensory cortex (SI), secondary somatosensory cortex (SII), insular cortex and right posterior superior temporal sulcus (pSTS) were significantly and similarly activated in all age groups. Whole-brain analyses revealed that responses in the ipsilateral SII were positively correlated with age in both genders, and that responses in bilateral regions near the pSTS correlated significantly and strongly with age in females but not in males. These results suggest that brain mechanisms associated with both sensory-discriminative and affective-motivational aspects of touch are largely established in school-aged children, and that there is a general continuing maturation of SII and a female-specific increase in pSTS sensitivity with age. Our work establishes a groundwork for future comparative studies of tactile processing in developmental disorders characterized by disrupted social perception such as autism.

  6. Affective and executive network processing associated with persuasive antidrug messages.

    Science.gov (United States)

    Ramsay, Ian S; Yzer, Marco C; Luciana, Monica; Vohs, Kathleen D; MacDonald, Angus W

    2013-07-01

    Previous research has highlighted brain regions associated with socioemotional processes in persuasive message encoding, whereas cognitive models of persuasion suggest that executive brain areas may also be important. The current study aimed to identify lateral prefrontal brain areas associated with persuasive message viewing and understand how activity in these executive regions might interact with activity in the amygdala and medial pFC. Seventy adolescents were scanned using fMRI while they watched 10 strongly convincing antidrug public service announcements (PSAs), 10 weakly convincing antidrug PSAs, and 10 advertisements (ads) unrelated to drugs. Antidrug PSAs compared with nondrug ads more strongly elicited arousal-related activity in the amygdala and medial pFC. Within antidrug PSAs, those that were prerated as strongly persuasive versus weakly persuasive showed significant differences in arousal-related activity in executive processing areas of the lateral pFC. In support of the notion that persuasiveness involves both affective and executive processes, functional connectivity analyses showed greater coactivation between the lateral pFC and amygdala during PSAs known to be strongly (vs. weakly) convincing. These findings demonstrate that persuasive messages elicit activation in brain regions responsible for both emotional arousal and executive control and represent a crucial step toward a better understanding of the neural processes responsible for persuasion and subsequent behavior change.

  7. Facial symmetry evaluation after experimentally displaced condylar process fracture in methotrexate treated rats

    Directory of Open Access Journals (Sweden)

    Samantha Cristine Santos Xisto Braga Cavalcanti

    2012-03-01

    Full Text Available PURPOSE: To investigate the facial symmetry of rats treated with high- and low-dose methotrexate (MTX) and submitted to experimentally displaced mandibular condyle fracture, through the recording of cephalometric measurements. METHODS: One hundred male Wistar rats underwent surgery using an experimental model of right condylar fracture. Animals were divided into four groups: A - saline solution (1 mL/week); B - dexamethasone (DEX) (0.15 mg/kg); C - low-dose MTX (3 mg/kg/week); D - high-dose MTX (30 mg/kg). Animals were sacrificed at 1, 7, 15, 30 and 90 days postoperatively (n=5). Body weight was recorded. Specimens were submitted to axial radiographic incidence, and cephalometric measurements were made using a computer system. Linear measurements of the skull and mandible, as well as angular measurements of mandibular deviation, were taken. Data were subjected to statistical analyses among the groups, periods of sacrifice and between the sides in each group (α=0.05). RESULTS: Animals regained body weight over time, except in group D. There was a reduction in mandibular length, changes in the maxilla, and progressive deviation of the mandible in relation to the skull base in group D. CONCLUSION: Treatment with high-dose methotrexate had a deleterious effect on the facial symmetry of rats submitted to experimentally displaced condylar process fracture.
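
    The angular cephalometric measurements of mandibular deviation described above reduce to computing an angle between two landmark-defined lines. The sketch below shows that calculation for hypothetical 2-D landmark coordinates; the landmark names and values are illustrative assumptions, not data from the study.

        import numpy as np

        def angle_between(p_ref1, p_ref2, p1, p2):
            """Angle (degrees) between line p_ref1->p_ref2 and line p1->p2."""
            u = np.asarray(p_ref2, float) - np.asarray(p_ref1, float)
            v = np.asarray(p2, float) - np.asarray(p1, float)
            cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

        # Hypothetical 2-D landmark coordinates (mm) from an axial radiograph:
        # skull-base midline (anterior, posterior) and two mandibular midline points
        skull_ant, skull_post = (0.0, 0.0), (0.0, 40.0)
        menton, mand_post = (1.5, 2.0), (3.0, 41.0)
        deviation = angle_between(skull_ant, skull_post, menton, mand_post)
        print(f"mandibular deviation: {deviation:.1f} deg")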

  8. The impact of blocking the activation of facial muscles in the processing of subsequent emotional information, and its mechanism

    OpenAIRE

    Domingos, Ana Maria Basílio Cabral

    2012-01-01

    Thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Psychology, in the area of Cognitive Psychology. In this thesis we focus on how previous activation of the representation of an emotional state impacts the processing of subsequent emotional information (within a priming paradigm). Our approach is guided by an embodied perspective on cognition. According to embodied cognition theories, affective representations are partial simulations of emotional experien...

  9. Facial Scar Revision: Understanding Facial Scar Treatment

    Science.gov (United States)

    ... more to fully heal and achieve maximum improved appearance. Facial plastic surgery makes it possible to correct facial flaws that can undermine self-confidence. Changing how your scar looks can help change ...

  10. Facial attractiveness: General patterns of facial preferences

    National Research Council Canada - National Science Library

    Kościński, Krzysztof

    2007-01-01

    This review covers universal patterns in facial preferences. Facial attractiveness has fascinated thinkers since antiquity, but has been the subject of intense scientific study for only the last quarter of a century...

  11. Facial rehabilitation: a neuromuscular reeducation, patient-centered approach.

    Science.gov (United States)

    Vanswearingen, Jessie

    2008-05-01

    Individuals with facial paralysis and distorted facial expressions and movements secondary to a facial neuromotor disorder experience substantial physical, psychological, and social disability. Previously, facial rehabilitation has not been widely available or considered to be of much benefit. An emerging rehabilitation science of neuromuscular reeducation and evidence for the efficacy of facial neuromuscular reeducation, a process of facilitating the return of intended facial movement patterns and eliminating unwanted patterns of facial movement and expression, may provide patients with disorders of facial paralysis or facial movement control opportunity for the recovery of facial movement and function. We provide a brief overview of the scientific rationale for facial neuromuscular reeducation in the structure and function of the facial neuromotor system, the neuropsychology of facial expression, and relations among expressions, movement, and emotion. The primary purpose is to describe principles of neuromuscular reeducation, assessment and outcome measures, approach to treatment, the process, including surface-electromyographic biofeedback as an adjunct to reeducation, and the goal of enhancing the recovery of facial expression and function in a patient-centered approach to facial rehabilitation.

  12. [Surgical facial reanimation after persisting facial paralysis].

    Science.gov (United States)

    Pasche, Philippe

    2011-10-01

    Facial reanimation following persistent facial paralysis can be managed with surgical procedures of varying complexity. The choice of the technique is mainly determined by the cause of facial paralysis, the age and desires of the patient. The techniques most commonly used are nerve grafts (VII-VII, XII-VII, cross-facial graft), dynamic muscle transfers (temporal myoplasty, free muscle transfer) and static suspensions. Intensive rehabilitation through specific exercises after all procedures is essential to achieve good results.

  13. Coagulation-flocculation pretreatment of oil sands process affected water

    Energy Technology Data Exchange (ETDEWEB)

    Pourrezaei, P.; El-Din, M.G. [Alberta Univ., Edmonton, AB (Canada). Dept. of Civil and Environmental Engineering

    2008-07-01

    This presentation addressed the issue of water use in the oil sands industry and efforts to use this limited resource more efficiently. Three wastewater treatment schemes for oil sands tailings ponds were proposed, notably primary, secondary and tertiary treatment. Primary treatment involves the removal of suspended solids using physical-chemical treatments. Secondary treatment involves the removal of dissolved solids and organics using chemical oxidation, ultrafiltration or nanofiltration. Tertiary treatment involves removal of residual organics/solids using biological activated carbon filtration, sand filtration or reverse osmosis. The composition of oil sands process water (OSPW) was also discussed with reference to suspended solids, salts, hydrocarbons, other dissolved organics (such as naphthenic acids and phenols), ammonia, inorganic compounds and trace elements. The conventional coagulation/flocculation process is essential in industrial wastewater treatment. It is cost effective, easy to operate and energy efficient. The process is used because small suspended and colloidal particles and dissolved constituents cannot be removed quickly by sedimentation. A chemical method must be used. Coagulation/flocculation brings small suspended and colloidal particles into contact so that they collide, stick and grow to a size that settles readily. Alum is the predominant and least expensive water treatment coagulant used for the coagulation/flocculation process. It provides positively charged ions to neutralize the negative charge of colloidal particles resulting in aggregation. It creates big settling flocs that enmesh colloids as it settles. The factors affecting the process include pH, chemical type, chemical concentration, rapid mixing intensity, slow mixing intensity and time. tabs., figs.
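
    The abstract lists the main operating variables of the coagulation/flocculation step (pH, coagulant type and dose, mixing intensity and time). Purely as an illustrative sketch of how jar-test results might be screened for an efficient alum dose, the snippet below uses invented turbidity numbers; the doses, readings and selection criterion are assumptions, not data from the presentation.

```python
# Illustrative jar-test analysis: pick the alum dose giving the best
# turbidity removal per unit of coagulant (all numbers hypothetical).
doses_mg_per_L = [10, 20, 30, 40, 60, 80]          # alum doses tested
residual_ntu   = [42.0, 18.5, 7.2, 4.1, 3.8, 3.9]  # settled-water turbidity
raw_ntu = 65.0                                      # raw-water turbidity

def removal_fraction(raw, residual):
    """Fraction of turbidity removed by coagulation/flocculation."""
    return (raw - residual) / raw

# Rank doses by turbidity removal achieved per mg/L of coagulant added.
best_dose, best_residual = max(
    zip(doses_mg_per_L, residual_ntu),
    key=lambda d_r: removal_fraction(raw_ntu, d_r[1]) / d_r[0],
)
print(f"Most efficient dose: {best_dose} mg/L "
      f"({removal_fraction(raw_ntu, best_residual):.0%} turbidity removal)")
```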

  14. Facial porokeratosis.

    Science.gov (United States)

    Carranza, Dafnis C; Haley, Jennifer C; Chiu, Melvin

    2008-01-01

    A 34-year-old man from El Salvador was referred to our clinic with a 10-year history of a pruritic erythematous facial eruption. He reported increased pruritus and scaling of lesions when exposed to the sun. He worked as a construction worker and admitted to frequent sun exposure. Physical examination revealed well-circumscribed erythematous to violaceous papules with raised borders and atrophic centers localized to the nose (Figure 1). He did not have lesions on the arms or legs. He did not report a family history of similar lesions. A biopsy specimen was obtained from the edge of a lesion on the right ala. Histologic examination of the biopsy specimen showed acanthosis of the epidermis with focal invagination of the corneal layer and a homogeneous column of parakeratosis in the center of that layer consistent with a cornoid lamella (Figure 2). Furthermore, the granular layer was absent at the cornoid lamella base. The superficial dermis contained a sparse, perivascular lymphocytic infiltrate. No evidence of dysplasia or malignancy was seen. These findings supported a diagnosis of porokeratosis. The patient underwent a trial of cryotherapy with moderate improvement of the facial lesions.

  15. Impaired perception of facial emotion in developmental prosopagnosia.

    Science.gov (United States)

    Biotti, Federica; Cook, Richard

    2016-08-01

    Developmental prosopagnosia (DP) is a neurodevelopmental condition characterised by difficulties recognising faces. Despite severe difficulties recognising facial identity, expression recognition is typically thought to be intact in DP; case studies have described individuals who are able to correctly label photographic displays of facial emotion, and no group differences have been reported. This pattern of deficits suggests a locus of impairment relatively late in the face processing stream, after the divergence of expression and identity analysis pathways. To date, however, there has been little attempt to investigate emotion recognition systematically in a large sample of developmental prosopagnosics using sensitive tests. In the present study, we describe three complementary experiments that examine emotion recognition in a sample of 17 developmental prosopagnosics. In Experiment 1, we investigated observers' ability to make binary classifications of whole-face expression stimuli drawn from morph continua. In Experiment 2, observers judged facial emotion using only the eye-region (the rest of the face was occluded). Analyses of both experiments revealed diminished ability to classify facial expressions in our sample of developmental prosopagnosics, relative to typical observers. Imprecise expression categorisation was particularly evident in those individuals exhibiting apperceptive profiles, associated with problems encoding facial shape accurately. Having split the sample of prosopagnosics into apperceptive and non-apperceptive subgroups, only the apperceptive prosopagnosics were impaired relative to typical observers. In our third experiment, we examined observers' ability to classify the emotion present within segments of vocal affect. Despite difficulties judging facial emotion, the prosopagnosics exhibited excellent recognition of vocal affect. Contrary to the prevailing view, our results suggest that many prosopagnosics do experience difficulties

  16. Freestyle Local Perforator Flaps for Facial Reconstruction

    Directory of Open Access Journals (Sweden)

    Jun Yong Lee

    2015-01-01

    Full Text Available For the successful reconstruction of facial defects, various perforator flaps have been used in single-stage surgery, where tissues are moved to adjacent defect sites. Our group successfully performed perforator flap surgery on 17 patients with small to moderate facial defects that affected the functional and aesthetic features of their faces. Of four complicated cases, three developed venous congestion, which resolved in the subacute postoperative period, and one patient with partial necrosis underwent minor revision. We reviewed the literature on freestyle perforator flaps for facial defect reconstruction and focused on English articles published in the last five years. With the advance of knowledge regarding the vascular anatomy of pedicled perforator flaps in the face, we found that some perforator flaps can improve functional and aesthetic reconstruction for the facial defects. We suggest that freestyle facial perforator flaps can serve as alternative, safe, and versatile treatment modalities for covering small to moderate facial defects.

  17. Freestyle Local Perforator Flaps for Facial Reconstruction.

    Science.gov (United States)

    Lee, Jun Yong; Kim, Ji Min; Kwon, Ho; Jung, Sung-No; Shim, Hyung Sup; Kim, Sang Wha

    2015-01-01

    For the successful reconstruction of facial defects, various perforator flaps have been used in single-stage surgery, where tissues are moved to adjacent defect sites. Our group successfully performed perforator flap surgery on 17 patients with small to moderate facial defects that affected the functional and aesthetic features of their faces. Of four complicated cases, three developed venous congestion, which resolved in the subacute postoperative period, and one patient with partial necrosis underwent minor revision. We reviewed the literature on freestyle perforator flaps for facial defect reconstruction and focused on English articles published in the last five years. With the advance of knowledge regarding the vascular anatomy of pedicled perforator flaps in the face, we found that some perforator flaps can improve functional and aesthetic reconstruction for the facial defects. We suggest that freestyle facial perforator flaps can serve as alternative, safe, and versatile treatment modalities for covering small to moderate facial defects.

  18. When age matters: differences in facial mimicry and autonomic responses to peers' emotions in teenagers and adults

    National Research Council Canada - National Science Library

    Ardizzi, Martina; Sestito, Mariateresa; Martini, Francesca; Umiltà, Maria Alessandra; Ravera, Roberto; Gallese, Vittorio

    2014-01-01

    .... In this study we investigated whether age-group membership could also affect implicit physiological responses, such as facial mimicry and autonomic regulation, to the observation of emotional facial expressions...

  19. Breathing and affective picture processing across the adult lifespan.

    Science.gov (United States)

    Gomez, Patrick; Filippou, Dimitra; Pais, Bruno; von Gunten, Armin; Danuser, Brigitta

    2016-09-01

    The present study investigated differences between healthy younger, middle-aged, and older adults in their respiratory responses to pictures of different valence and arousal. Expiratory time shortened and end-tidal PCO2 decreased with increasing arousal in all age groups; yet, compared to younger adults, older adults' overall change from baseline was smaller for expiratory time and larger for end-tidal PCO2. Contrary to their younger counterparts, older adults' inspiratory time did not shorten with increasing arousal. Inspiratory duty cycle did not covary with affective ratings for younger adults, increased with unpleasantness for middle-aged adults, and increased with arousal for older adults. Thoracic breathing increased with increasing unpleasantness only among older adults. Age had no effects on mean inspiratory flow and minute ventilation, which both augmented as arousal increased. We discuss how age effects on respiratory response magnitude and pattern may depend on age-associated biological changes or reflect age-related differences in emotional processing.

  20. The Musical Emotional Bursts: A validated set of musical affect bursts to investigate auditory affective processing.

    Directory of Open Access Journals (Sweden)

    Sébastien ePaquette

    2013-08-01

    Full Text Available The Musical Emotional Bursts (MEB) consist of 80 brief musical executions expressing basic emotional states (happiness, sadness and fear) and neutrality. These musical bursts were designed to be the musical analogue of the Montreal Affective Voices (MAV) – a set of brief non-verbal affective vocalizations portraying different basic emotions. The MEB consist of short (mean duration: 1.6 sec) improvisations on a given emotion or of imitations of a given MAV stimulus, played on a violin (n:40) or a clarinet (n:40). The MEB arguably represent a primitive form of music emotional expression, just like the MAV represent a primitive form of vocal, nonlinguistic emotional expression. To create the MEB, stimuli were recorded from 10 violinists and 10 clarinetists, and then evaluated by 60 participants. Participants evaluated 240 stimuli (30 stimuli x 4 [3 emotions + neutral] x 2 instruments) by performing either a forced-choice emotion categorization task, a valence rating task or an arousal rating task (20 subjects per task); 40 MAVs were also used in the same session with similar task instructions. Recognition accuracy of emotional categories expressed by the MEB (n:80) was lower than for the MAVs but still very high with an average percent correct recognition score of 80.4%. Highest recognition accuracies were obtained for happy clarinet (92.0%) and fearful or sad violin (88.0% each) MEB stimuli. The MEB can be used to compare the cerebral processing of emotional expressions in music and vocal communication, or used for testing affective perception in patients with communication problems.

  1. The "Musical Emotional Bursts": a validated set of musical affect bursts to investigate auditory affective processing.

    Science.gov (United States)

    Paquette, Sébastien; Peretz, Isabelle; Belin, Pascal

    2013-01-01

    The Musical Emotional Bursts (MEB) consist of 80 brief musical executions expressing basic emotional states (happiness, sadness and fear) and neutrality. These musical bursts were designed to be the musical analog of the Montreal Affective Voices (MAV)-a set of brief non-verbal affective vocalizations portraying different basic emotions. The MEB consist of short (mean duration: 1.6 s) improvisations on a given emotion or of imitations of a given MAV stimulus, played on a violin (10 stimuli × 4 [3 emotions + neutral]), or a clarinet (10 stimuli × 4 [3 emotions + neutral]). The MEB arguably represent a primitive form of music emotional expression, just like the MAV represent a primitive form of vocal, non-linguistic emotional expression. To create the MEB, stimuli were recorded from 10 violinists and 10 clarinetists, and then evaluated by 60 participants. Participants evaluated 240 stimuli [30 stimuli × 4 (3 emotions + neutral) × 2 instruments] by performing either a forced-choice emotion categorization task, a valence rating task or an arousal rating task (20 subjects per task); 40 MAVs were also used in the same session with similar task instructions. Recognition accuracy of emotional categories expressed by the MEB (n:80) was lower than for the MAVs but still very high with an average percent correct recognition score of 80.4%. Highest recognition accuracies were obtained for happy clarinet (92.0%) and fearful or sad violin (88.0% each) MEB stimuli. The MEB can be used to compare the cerebral processing of emotional expressions in music and vocal communication, or used for testing affective perception in patients with communication problems.
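
    Since the MEB validation relied on a forced-choice emotion categorization task, a small sketch of how per-category recognition accuracy could be tallied from such responses may be useful; the response data below are randomly generated stand-ins, not the published ratings.

```python
import numpy as np

# Hypothetical forced-choice data: each stimulus has an intended category
# and a category chosen by the listener.
categories = ["happy", "sad", "fear", "neutral"]
rng = np.random.default_rng(4)
intended = rng.integers(0, 4, size=240)
# Simulate ~80% correct responses; errors pick a random category.
chosen = np.where(rng.random(240) < 0.8, intended, rng.integers(0, 4, size=240))

# Confusion matrix: rows = intended category, columns = chosen category.
conf = np.zeros((4, 4), dtype=int)
for i, c in zip(intended, chosen):
    conf[i, c] += 1

accuracy = conf.diagonal() / conf.sum(axis=1)
for cat, acc in zip(categories, accuracy):
    print(f"{cat}: {acc:.1%} correct")
print(f"overall: {(intended == chosen).mean():.1%}")
```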

  2. Guiding atypical facial growth back to normal. Part 1: Understanding facial growth.

    Science.gov (United States)

    Galella, Steve; Chow, Daniel; Jones, Earl; Enlow, Donald; Masters, Ari

    2011-01-01

    Many practitioners find the complexity of facial growth overwhelming and thus merely observe and accept the clinical features of atypical growth without comprehending the long-term consequences. Facial growth and development is a strictly controlled biological process. Normal growth involves ongoing bone remodeling and positional displacement. Atypical growth begins when this biological balance is disturbed. With an understanding of these processes, clinicians can adequately assess patients, determine the causes of these atypical facial growth patterns, and design effective treatment plans. This is the first of a series of articles which addresses normal facial growth, atypical facial growth, patient assessment, causes of atypical facial growth, and guiding facial growth back to normal.

  3. The Role of the Amygdala in Facial Trustworthiness Processing: A Systematic Review and Meta-Analyses of fMRI Studies

    Science.gov (United States)

    Oliveiros, Bárbara

    2016-01-01

    Background: Faces play a key role in signaling social cues such as signals of trustworthiness. Although several studies identify the amygdala as a core brain region in social cognition, quantitative approaches evaluating its role are scarce. Objectives: This review aimed to assess the role of the amygdala in the processing of facial trustworthiness, by analyzing the polarity of its BOLD amplitude response to untrustworthy versus trustworthy facial signals in fMRI tasks through a meta-analysis of effect sizes (MA). Activation Likelihood Estimation (ALE) analyses were also conducted. Data sources: Articles were retrieved from MEDLINE, ScienceDirect and Web-of-Science in January 2016. Following the PRISMA statement guidelines, a systematic review of original research articles in English language using the search string “(face OR facial) AND (trustworthiness OR trustworthy OR untrustworthy OR trustee) AND fMRI” was conducted. Study selection and data extraction: The MA concerned amygdala responses to facial trustworthiness for the contrast untrustworthy vs. trustworthy faces, and included whole-brain and ROI studies. To prevent potential bias, results were considered even when at the single-study level they did not survive correction for multiple comparisons or provided non-significant results. ALE considered whole-brain studies, using the same methodology to prevent bias. A summary of the methodological options (design and analysis) described in the articles was finally used to get further insight into the characteristics of the studies and to perform a subgroup analysis. Data were extracted by two authors and checked independently. Data synthesis: Twenty fMRI studies were considered for systematic review. An MA of effect sizes with 11 articles (12 studies) showed high heterogeneity between studies [Q(11) = 265.68, p < 0.001] in amygdala responses to facial trustworthiness. Six articles/studies showed that the posterior cingulate and medial frontal gyrus present positive correlations with increasing facial
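
    The review reports a heterogeneity statistic of Q(11) = 265.68 for its meta-analysis of effect sizes. As a minimal sketch of how Cochran's Q and the I² index are computed from inverse-variance-weighted study effects, the snippet below uses hypothetical effect sizes and variances rather than the reviewed studies' values.

```python
import numpy as np

# Hypothetical per-study effect sizes (e.g., standardized amygdala response
# differences for untrustworthy > trustworthy faces) and their variances.
effects   = np.array([0.45, 0.10, 0.80, -0.05, 0.60, 0.30])
variances = np.array([0.02, 0.05, 0.03,  0.04, 0.02, 0.06])

weights = 1.0 / variances                      # inverse-variance weights
pooled  = np.sum(weights * effects) / np.sum(weights)

# Cochran's Q and the I^2 heterogeneity index.
Q  = np.sum(weights * (effects - pooled) ** 2)
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100

print(f"pooled effect = {pooled:.2f}, Q({df}) = {Q:.2f}, I2 = {I2:.0f}%")
```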

  4. Understanding the impact of 5-HTTLPR, antidepressants, and acute tryptophan depletion on brain activation during facial emotion processing: A review of the imaging literature.

    Science.gov (United States)

    Raab, Kyeon; Kirsch, Peter; Mier, Daniela

    2016-12-01

    Detecting and evaluating emotional information from facial expressions as a basis for behavioural adaption belong to the core social-cognitive abilities of mankind. Dysfunctions in emotional face processing are observed in several major psychiatric disorders like depression and schizophrenia. In search for psychiatric disease biomarkers using the imaging genetics approach, serotonergic gene polymorphisms have been associated with altered brain circuit activation during emotional face processing. Especially the 5-HTTLPR gene polymorphism has been extensively investigated in association with emotion regulation processes. In this article, imaging genetics literature on emotional face processing, reporting genetic effects of 5-HTTLPR in healthy volunteers is reviewed. Additionally, these results are regarded in relation to pharmacologic challenge (antidepressants, acute tryptophan depletion) imaging studies and discussed in light of recent neurobiological evidence with a focus on serotonin (5-HT1A, 5-HT2C, 5-HT2A) receptor findings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Neuroticism delays detection of facial expressions

    OpenAIRE

    Sawada, Reiko; Sato, Wataru; Uono, Shota; Kochiyama, Takanori; Kubota, Yasutaka; Yoshimura, Sayaka; Toichi, Motomi

    2016-01-01

    The rapid detection of emotional signals from facial expressions is fundamental for human social interaction. The personality factor of neuroticism modulates the processing of various types of emotional facial expressions; however, its effect on the detection of emotional facial expressions remains unclear. In this study, participants with high- and low-neuroticism scores performed a visual search task to detect normal expressions of anger and happiness, and their anti-expressions within a cr...

  6. Measuring Facial Movement

    Science.gov (United States)

    Ekman, Paul; Friesen, Wallace V.

    1976-01-01

    The Facial Action Code (FAC) was derived from an analysis of the anatomical basis of facial movement. The development of the method is explained, contrasting it to other methods of measuring facial behavior. An example of how facial behavior is measured is provided, and ideas about research applications are discussed. (Author)

  7. How processing digital elevation models can affect simulated water budgets.

    Science.gov (United States)

    Kuniansky, Eve L; Lowery, Mark A; Campbell, Bruce G

    2009-01-01

    For regional models, the shallow water table surface is often used as a source/sink boundary condition, as model grid scale precludes simulation of the water table aquifer. This approach is appropriate when the water table surface is relatively stationary. Since water table surface maps are not readily available, the elevation of the water table used in model cells is estimated via a two-step process. First, a regression equation is developed using existing land and water table elevations from wells in the area. This equation is then used to predict the water table surface for each model cell using land surface elevation available from digital elevation models (DEM). Two methods of processing DEM for estimating the land surface for each cell are commonly used (value nearest the cell centroid or mean value in the cell). This article demonstrates how these two methods of DEM processing can affect the simulated water budget. For the example presented, approximately 20% more total flow through the aquifer system is simulated if the centroid value rather than the mean value is used. This is due to the one-third greater average ground water gradients associated with the centroid value than the mean value. The results will vary depending on the particular model area topography and cell size. The use of the mean DEM value in each model cell will result in a more conservative water budget and is more appropriate because the model cell water table value should be representative of the entire cell area, not the centroid of the model cell.
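
    The article contrasts two ways of assigning a land-surface elevation to each model cell (the DEM value nearest the cell centroid versus the mean DEM value over the cell) before regressing land surface onto water-table elevation. The sketch below illustrates that workflow on a synthetic DEM; the grid size, regression coefficients and noise level are assumptions for demonstration only.

```python
import numpy as np

# Hypothetical DEM (metres); each model cell covers a block of DEM pixels.
rng = np.random.default_rng(0)
dem = 100 + rng.normal(0, 5, size=(120, 120))   # land-surface elevation
block = 30                                      # DEM pixels per model-cell side

def cell_values(dem, block, method):
    """Assign one land-surface value per model cell from the DEM."""
    n = dem.shape[0] // block
    out = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            tile = dem[i*block:(i+1)*block, j*block:(j+1)*block]
            if method == "mean":
                out[i, j] = tile.mean()              # mean of all DEM pixels
            else:
                out[i, j] = tile[block//2, block//2]  # value nearest the centroid
    return out

# Hypothetical regression from land surface to water-table elevation,
# e.g. water_table = a + b * land_surface fitted from well data.
a, b = 5.0, 0.9
wt_mean     = a + b * cell_values(dem, block, "mean")
wt_centroid = a + b * cell_values(dem, block, "centroid")

# Steeper cell-to-cell gradients from centroid sampling translate into
# larger simulated flow through the aquifer system.
grad = lambda wt: np.abs(np.diff(wt, axis=1)).mean()
print(f"mean-based gradient:     {grad(wt_mean):.3f} m/cell")
print(f"centroid-based gradient: {grad(wt_centroid):.3f} m/cell")
```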

  8. Facial Electromyographic Responses to Emotional Information from Faces and Voices in Individuals with Pervasive Developmental Disorder

    Science.gov (United States)

    Magnee, Maurice J. C. M.; de Gelder, Beatrice; van Engeland, Herman; Kemner, Chantal

    2007-01-01

    Background: Despite extensive research, it is still debated whether impairments in social skills of individuals with pervasive developmental disorder (PDD) are related to specific deficits in the early processing of emotional information. We aimed to test both automatic processing of facial affect as well as the integration of auditory and visual…

  9. Automated Facial Action Coding System for dynamic analysis of facial expressions in neuropsychiatric disorders.

    Science.gov (United States)

    Hamm, Jihun; Kohler, Christian G; Gur, Ruben C; Verma, Ragini

    2011-09-15

    Facial expression is widely used to evaluate emotional impairment in neuropsychiatric disorders. Ekman and Friesen's Facial Action Coding System (FACS) encodes movements of individual facial muscles from distinct momentary changes in facial appearance. Unlike facial expression ratings based on categorization of expressions into prototypical emotions (happiness, sadness, anger, fear, disgust, etc.), FACS can encode ambiguous and subtle expressions, and therefore is potentially more suitable for analyzing the small differences in facial affect. However, FACS rating requires extensive training, and is time consuming and subjective thus prone to bias. To overcome these limitations, we developed an automated FACS based on advanced computer science technology. The system automatically tracks faces in a video, extracts geometric and texture features, and produces temporal profiles of each facial muscle movement. These profiles are quantified to compute frequencies of single and combined Action Units (AUs) in videos, and they can facilitate a statistical study of large populations in disorders known to impact facial expression. We derived quantitative measures of flat and inappropriate facial affect automatically from temporal AU profiles. Applicability of the automated FACS was illustrated in a pilot study, by applying it to data of videos from eight schizophrenia patients and controls. We created temporal AU profiles that provided rich information on the dynamics of facial muscle movements for each subject. The quantitative measures of flatness and inappropriateness showed clear differences between patients and the controls, highlighting their potential in automatic and objective quantification of symptom severity. Copyright © 2011 Elsevier B.V. All rights reserved.
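
    The automated FACS described here converts temporal Action Unit (AU) profiles into AU frequencies and summary measures of flat and inappropriate affect. The following sketch shows one plausible way to derive AU frequencies and a crude flatness index from per-frame AU intensities; the AU set, threshold and index definition are illustrative assumptions, not the authors' published measures.

```python
import numpy as np

# Hypothetical temporal profiles: rows = video frames, columns = Action Units,
# values = estimated AU intensity per frame (0-1).
rng = np.random.default_rng(1)
au_profiles = rng.random((300, 5)) * np.array([0.9, 0.2, 0.6, 0.1, 0.3])
au_names = ["AU6", "AU12", "AU4", "AU9", "AU15"]

threshold = 0.5                                  # intensity counted as "active"
active = au_profiles > threshold

# Frequency of each AU = proportion of frames in which it is active.
frequencies = active.mean(axis=0)

# A crude "flat affect" index: 1 minus the overall rate of any AU activity.
flatness = 1.0 - active.any(axis=1).mean()

for name, f in zip(au_names, frequencies):
    print(f"{name}: active in {f:.0%} of frames")
print(f"flatness index: {flatness:.2f}")
```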

  10. Positive affect and psychosocial processes related to health.

    Science.gov (United States)

    Steptoe, Andrew; O'Donnell, Katie; Marmot, Michael; Wardle, Jane

    2008-05-01

    Positive affect is associated with longevity and favourable physiological function. We tested the hypothesis that positive affect is related to health-protective psychosocial characteristics independently of negative affect and socio-economic status. Both positive and negative affect were measured by aggregating momentary samples collected repeatedly over 1 day, and health-related psychosocial factors were assessed by questionnaire in a sample of 716 men and women aged 58-72 years. Positive affect was associated with greater social connectedness, emotional and practical support, optimism and adaptive coping responses, and lower depression, independently of age, gender, household income, paid employment, smoking status, and negative affect. Negative affect was independently associated with negative relationships, greater exposure to chronic stress, depressed mood, pessimism, and avoidant coping. Positive affect may be beneficial for health outcomes in part because it is a component of a profile of protective psychosocial characteristics.

  11. Magnetic resonance imaging of facial muscles

    Energy Technology Data Exchange (ETDEWEB)

    Farrugia, M.E. [Department of Clinical Neurology, University of Oxford, Radcliffe Infirmary, Oxford (United Kingdom)], E-mail: m.e.farrugia@doctors.org.uk; Bydder, G.M. [Department of Radiology, University of California, San Diego, CA 92103-8226 (United States); Francis, J.M.; Robson, M.D. [OCMR, Department of Cardiovascular Medicine, University of Oxford, John Radcliffe Hospital, Oxford (United Kingdom)

    2007-11-15

    Facial and tongue muscles are commonly involved in patients with neuromuscular disorders. However, these muscles are not as easily accessible for biopsy and pathological examination as limb muscles. We have previously investigated myasthenia gravis patients with MuSK antibodies for facial and tongue muscle atrophy using different magnetic resonance imaging sequences, including ultrashort echo time techniques and image analysis tools that allowed us to obtain quantitative assessments of facial muscles. This imaging study had shown that facial muscle measurement is possible and that useful information can be obtained using a quantitative approach. In this paper we aim to review in detail the methods that we applied to our study, to enable clinicians to study these muscles within the domain of neuromuscular disease, oncological or head and neck specialties. Quantitative assessment of the facial musculature may be of value in improving the understanding of pathological processes occurring within facial muscles in certain neuromuscular disorders.

  12. [Association between intelligence development and facial expression recognition ability in children with autism spectrum disorder].

    Science.gov (United States)

    Pan, Ning; Wu, Gui-Hua; Zhang, Ling; Zhao, Ya-Fen; Guan, Han; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2017-03-01

    To investigate the features of intelligence development, facial expression recognition ability, and the association between them in children with autism spectrum disorder (ASD). A total of 27 ASD children aged 6-16 years (ASD group, full intelligence quotient >70) and age- and gender-matched normally developed children (control group) were enrolled. Wechsler Intelligence Scale for Children Fourth Edition and Chinese Static Facial Expression Photos were used for intelligence evaluation and facial expression recognition test. Compared with the control group, the ASD group had significantly lower scores of full intelligence quotient, verbal comprehension index, perceptual reasoning index (PRI), processing speed index (PSI), and working memory index (WMI) (P<0.05). ASD children have delayed intelligence development compared with normally developed children and impaired expression recognition ability. Perceptual reasoning and working memory abilities are positively correlated with expression recognition ability, which suggests that insufficient perceptual reasoning and working memory abilities may be important factors affecting facial expression recognition ability in ASD children.

  13. Brain Potentials During Affective Picture Processing in Children

    Science.gov (United States)

    Hajcak, Greg; Dennis, Tracy A.

    2008-01-01

    In adults, emotional (i.e., both unpleasant and pleasant) compared to neutral pictures elicit an increase in the early posterior negativity (EPN) and the late positive potential (LPP); modulation of these ERP components is thought to reflect the facilitated processing of, and increased attention to, motivationally salient stimuli. To determine whether the EPN and LPP are sensitive to emotional content in children, high-density EEG was recorded from 18 children who were 5 to 8 years of age (mean age = 77 months, SD = 11 months) while they viewed developmentally appropriate pictures selected from the International Affective Picture System. Self-reported ratings of valence and arousal were also obtained. An EPN was not evident following emotional compared to neutral pictures; however, a positivity maximal at occipital-parietal recording sites was increased from 500 to 1,000 ms following pleasant pictures and from 500 to 1,500 ms following unpleasant pictures. Comparisons between the EPN and LPP observed in children and adults, and implications for developmental studies of emotion, are discussed. PMID:19103249
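
    The LPP effects reported here are mean-amplitude differences in fixed post-stimulus windows (e.g., 500-1,000 ms at occipital-parietal sites). A minimal sketch of that computation on a synthetic epoch array is given below; the sampling rate, channel count and data are placeholders, not the study's recordings.

```python
import numpy as np

# Hypothetical epoched EEG: (trials, channels, samples), 250 Hz sampling,
# epochs running from -200 ms to +1500 ms relative to picture onset.
rng = np.random.default_rng(2)
epochs = rng.normal(0, 10, size=(40, 4, 425))    # microvolts
sfreq, t_start = 250.0, -0.2
times = t_start + np.arange(epochs.shape[-1]) / sfreq

def mean_amplitude(epochs, times, tmin, tmax):
    """Average amplitude over trials, channels, and a time window."""
    window = (times >= tmin) & (times < tmax)
    return epochs[:, :, window].mean()

lpp = mean_amplitude(epochs, times, 0.5, 1.0)    # 500-1000 ms window
print(f"LPP (500-1000 ms) mean amplitude: {lpp:.2f} microvolts")
```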

  14. Social Use of Facial Expressions in Hylobatids.

    Directory of Open Access Journals (Sweden)

    Linda Scheider

    Full Text Available Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social) contexts the duration of facial expressions was significantly longer when gibbons were facing another individual compared to non-facing situations. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than non-social contexts where facial expressions were produced regardless of the attentional state of the partner. Also, facial expressions were more likely 'responded to' by the partner's facial expressions when facing another individual than non-facing. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with other conspecifics.

  15. Drug effects on responses to emotional facial expressions: recent findings.

    Science.gov (United States)

    Miller, Melissa A; Bershad, Anya K; de Wit, Harriet

    2015-09-01

    Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally.

  16. Holistic Processing of Static and Moving Faces

    Science.gov (United States)

    Zhao, Mintao; Bülthoff, Isabelle

    2017-01-01

    Humans' face-processing ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect one core aspect of this ability--holistic face processing--remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based…

  17. Abnormalities in early visual processes are linked to hypersociability and atypical evaluation of facial trustworthiness: An ERP study with Williams syndrome.

    Science.gov (United States)

    Shore, Danielle M; Ng, Rowena; Bellugi, Ursula; Mills, Debra L

    2017-07-06

    Accurate assessment of trustworthiness is fundamental to successful and adaptive social behavior. Initially, people assess trustworthiness from facial appearance alone. These assessments then inform critical approach or avoid decisions. Individuals with Williams syndrome (WS) exhibit a heightened social drive, especially toward strangers. This study investigated the temporal dynamics of facial trustworthiness evaluation in neurotypic adults (TD) and individuals with WS. We examined whether differences in neural activity during trustworthiness evaluation may explain increased approach motivation in WS compared to TD individuals. Event-related potentials were recorded while participants appraised faces previously rated as trustworthy or untrustworthy. TD participants showed increased sensitivity to untrustworthy faces within the first 65-90 ms, indexed by the negative-going rise of the P1 onset (oP1). The amplitude of the oP1 difference to untrustworthy minus trustworthy faces was correlated with lower approachability scores. In contrast, participants with WS showed increased N170 amplitudes to trustworthy faces. The N170 difference to low-high-trust faces was correlated with low approachability in TD and high approachability in WS. The findings suggest that hypersociability associated with WS may arise from abnormalities in the timing and organization of early visual brain activity during trustworthiness evaluation. More generally, the study provides support for the hypothesis that impairments in low-level perceptual processes can have a cascading effect on social cognition.

  18. Rapid influence of emotional scenes on encoding of facial expressions: an ERP study.

    Science.gov (United States)

    Righart, Ruthger; de Gelder, Beatrice

    2008-09-01

    In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized by using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 for facial expressions is modulated by the emotional context, it was not clear whether individuals use context information on this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes and expressed in left-occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing.

  19. Rapid influence of emotional scenes on encoding of facial expressions: an ERP study

    Science.gov (United States)

    Righart, Ruthger

    2008-01-01

    In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized by using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 for facial expressions is modulated by the emotional context, it was not clear whether individuals use context information on this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes and expressed in left-occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing. PMID:19015119

  20. Sources and Processes Affecting Particulate Matter Pollution over North China

    Science.gov (United States)

    Zhang, L.; Shao, J.; Lu, X.; Zhao, Y.; Gong, S.; Henze, D. K.

    2015-12-01

    Severe fine particulate matter (PM2.5) pollution over North China has received broad attention worldwide in recent years. A better understanding of the sources and processes controlling pollution over this region is of great importance, with urgent implications for air quality policy. We will present a four-dimensional variational (4D-Var) data assimilation system using the GEOS-Chem chemical transport model and its adjoint model at 0.25° × 0.3125° horizontal resolution, and apply it to analyze the factors affecting PM2.5 concentrations over North China. Hourly surface observations of PM2.5 and sulfur dioxide (SO2) from the China National Environmental Monitoring Center (CNEMC) can be assimilated into the model to evaluate and constrain aerosol (primary and precursor) emissions. Application of the data assimilation system to the APEC period (the Asia-Pacific Economic Cooperation summit; 5-11 November 2014) shows that 46% of the PM2.5 pollution reduction during APEC (“The APEC Blue”) can be attributed to meteorological conditions and the remaining 54% to emission reductions due to strict emission controls. Ammonia emissions are shown to significantly contribute to PM2.5 over North China in the fall. By converting sulfuric acid and nitric acid to longer-lived ammonium sulfate and ammonium nitrate aerosols, ammonia plays an important role in promoting their regional transport influences. We will also discuss the pathways and mechanisms of external long-range transport influences on PM2.5 pollution over North China.
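
    The 4D-Var system described here minimizes a cost function balancing departures from prior emissions against model-observation mismatches accumulated over time. The toy sketch below writes out that cost function for a few emission scaling factors; the linear "transport" operator, error covariances and observations are invented placeholders and bear no relation to GEOS-Chem or the CNEMC data.

```python
import numpy as np

# Toy 4D-Var cost function over emission scaling factors x:
#   J(x) = 0.5*(x - xb)^T B^-1 (x - xb)
#        + 0.5 * sum_t (H(x, t) - y_t)^T R^-1 (H(x, t) - y_t)
xb = np.ones(3)                       # prior scaling factors for 3 sources
B_inv = np.diag([4.0, 4.0, 4.0])      # inverse prior-error covariance
R_inv = np.eye(2) / 25.0              # inverse observation-error covariance

def H(x, t):
    """Toy linear 'transport' operator mapping emissions to 2 sites at time t."""
    M = np.array([[1.0 + 0.1 * t, 0.5, 0.2],
                  [0.3, 0.8, 0.4 + 0.05 * t]])
    return M @ x

truth = np.array([1.4, 0.7, 1.1])                 # synthetic "true" scalings
obs = {t: H(truth, t) for t in range(5)}          # perfect synthetic observations

def cost(x):
    background = 0.5 * (x - xb) @ B_inv @ (x - xb)
    mismatch = sum(0.5 * (H(x, t) - y) @ R_inv @ (H(x, t) - y)
                   for t, y in obs.items())
    return background + mismatch

print(f"J(prior) = {cost(xb):.2f}, J(truth) = {cost(truth):.2f}")
```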

  1. Addiction Motivation Reformulated: An Affective Processing Model of Negative Reinforcement

    Science.gov (United States)

    Baker, Timothy B.; Piper, Megan E.; McCarthy, Danielle E.; Majeskie, Matthew R.; Fiore, Michael C.

    2004-01-01

    This article offers a reformulation of the negative reinforcement model of drug addiction and proposes that the escape and avoidance of negative affect is the prepotent motive for addictive drug use. The authors posit that negative affect is the motivational core of the withdrawal syndrome and argue that, through repeated cycles of drug use and…

  2. Affective Signal Processing (ASP): Unraveling the mystery of emotions

    NARCIS (Netherlands)

    Broek, van den Egon L.

    2011-01-01

    Slowly computers are being dressed and becoming huggable and tangible. They are being personalized and are expected to understand more of their users' feelings, emotions, and moods: This we refer to as affective computing. The work and experiences from 50+ publications on affective computing is coll

  3. Toward affective dialogue management using partially observable Markov decision processes

    NARCIS (Netherlands)

    Bui, Trung Huu

    2008-01-01

    Designing and developing affective dialogue systems have recently received much interest from the dialogue research community. A distinctive feature of these systems is affect modeling. Previous work was mainly focused on showing system's emotions to the user in order to achieve the designer's goal
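
    At the core of a POMDP-based dialogue manager is a Bayesian belief update over hidden user (here, affective) states after each system action and observation. The sketch below shows that standard update for a two-state toy model; the state space, transition and observation probabilities are invented for illustration and are not taken from the thesis.

```python
import numpy as np

# Hypothetical user affective states tracked by the dialogue manager.
states = ["calm", "frustrated"]

# T[a][s, s']: transition probabilities given the system action a.
T = {"empathic": np.array([[0.9, 0.1],
                           [0.6, 0.4]]),
     "neutral":  np.array([[0.8, 0.2],
                           [0.2, 0.8]])}

# O[a][s', o]: probability of observing cue o in new state s' after action a.
O = {"empathic": np.array([[0.7, 0.3],
                           [0.2, 0.8]]),
     "neutral":  np.array([[0.7, 0.3],
                           [0.2, 0.8]])}
observations = ["positive_cue", "negative_cue"]

def belief_update(b, action, obs_idx):
    """POMDP belief update: b'(s') is proportional to O(s',o) * sum_s T(s,s') b(s)."""
    predicted = b @ T[action]                 # sum_s T(s, s') * b(s)
    updated = predicted * O[action][:, obs_idx]
    return updated / updated.sum()

b = np.array([0.5, 0.5])                      # uniform initial belief
b = belief_update(b, "empathic", observations.index("negative_cue"))
print(dict(zip(states, b.round(3))))
```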

  4. RELATIONSHIP BETWEEN TYPES OF FACIAL PSORIASIS WITH DLQI AND SEVERITY OF PSORIASIS : A STUDY

    Directory of Open Access Journals (Sweden)

    Murugan

    2015-08-01

    Full Text Available Psoriasis is a chronic papulosquamous disorder involving any skin site. Involvement of exposed areas is associated with significant stigma. Facial involvement in psoriasis causes considerable cosmetic imbalance and psychosocial stress to the affected individual. Facial psoriasis has been described as severe psoriasis. KEYWORDS: DLQI, facial psoriasis, centrofacial, periorofacial.

  5. [Therapy for atypical facial pain].

    Science.gov (United States)

    Ishida, Satoshi; Kimura, Hiroko

    2009-09-01

    Atypical facial pain is pain in the head, neck and face without organic causes. It is treated in departments of physical medicine such as dental, oral and maxillofacial surgery, otolaryngology, neurosurgery, or head and neck surgery. In primary care it is considered a medically unexplained symptom (MUS), and psychiatrists regard it as a somatoform disorder, such as somatization within a functional somatic syndrome (FSS). Usually, patients consult departments of physical medicine complaining of physical pain, so physicians in these departments should examine patients from a holistic perspective and identify organic diseases. As atypical facial pain becomes chronic, complications beyond the physical pain, such as depression, may develop. Moreover, physical, psychological, and social factors affect the symptoms by interacting with one another. Therefore, in examining atypical facial pain, doctors specializing in dental, oral and maxillofacial medicine are required to provide psychosomatic treatment based on integrated knowledge.

  6. Compound facial expressions of emotion.

    Science.gov (United States)

    Du, Shichuan; Tao, Yong; Martinez, Aleix M

    2014-04-15

    Understanding the different categories of facial expressions of emotion regularly used by us is essential to gain insights into human cognition and affect as well as for the design of computational models and perceptual interfaces. Past research on facial expressions of emotion has focused on the study of six basic categories--happiness, surprise, anger, sadness, fear, and disgust. However, many more facial expressions of emotion exist and are used regularly by humans. This paper describes an important group of expressions, which we call compound emotion categories. Compound emotions are those that can be constructed by combining basic component categories to create new ones. For instance, happily surprised and angrily surprised are two distinct compound emotion categories. The present work defines 21 distinct emotion categories. Sample images of their facial expressions were collected from 230 human subjects. A Facial Action Coding System analysis shows the production of these 21 categories is different but consistent with the subordinate categories they represent (e.g., a happily surprised expression combines muscle movements observed in happiness and surprise). We show that these differences are sufficient to distinguish between the 21 defined categories. We then use a computational model of face perception to demonstrate that most of these categories are also visually discriminable from one another.
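
    The paper builds compound categories by combining basic emotion categories (e.g., happily surprised). Purely as an illustration of that combinatorial idea, the snippet below enumerates pairwise combinations of six basic labels; it is a toy enumeration, not the paper's actual 21-category taxonomy.

```python
from itertools import combinations

basic = ["happy", "sad", "fearful", "angry", "surprised", "disgusted"]

# Toy enumeration: every pair of basic categories yields a candidate
# compound label such as "happy+surprised" (15 pairs for 6 basics).
compounds = [f"{a}+{b}" for a, b in combinations(basic, 2)]

print(len(compounds), "pairwise compounds, e.g.:", compounds[:3])
```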

  7. Aging and facial changes--documenting clinical signs, part 1: clinical changes of the aging face.

    Science.gov (United States)

    Nkengne, Alex; Bertin, Christiane

    2012-01-01

    The process of aging induces the transformation of the face with changes that are usually classified as either chronological or photo-induced and that affect the shape, the texture, and the color of the face. Facial shape is mainly transformed by the evolution of bones and soft tissues (muscles, fat, and skin) in addition to noticeable effects of gravity. Skin texture is mainly determined by wrinkles, which arise from atrophy of the skin layers, elastosis, and facial expressions. Skin color is related to the distribution of skin chromophores and the structure of the dermis, which affects light scattering. All facial changes are dependent on sex, ethnicity, and lifestyle. They affect self-perception and social interactions and can sometimes be slowed down or reversed using appropriate clinical procedures (e.g., dermatological, surgical, and cosmetic interventions).

  8. Neuroticism Delays Detection of Facial Expressions.

    Science.gov (United States)

    Sawada, Reiko; Sato, Wataru; Uono, Shota; Kochiyama, Takanori; Kubota, Yasutaka; Yoshimura, Sayaka; Toichi, Motomi

    2016-01-01

    The rapid detection of emotional signals from facial expressions is fundamental for human social interaction. The personality factor of neuroticism modulates the processing of various types of emotional facial expressions; however, its effect on the detection of emotional facial expressions remains unclear. In this study, participants with high- and low-neuroticism scores performed a visual search task to detect normal expressions of anger and happiness, and their anti-expressions within a crowd of neutral expressions. Anti-expressions contained an amount of visual changes equivalent to those found in normal expressions compared to neutral expressions, but they were usually recognized as neutral expressions. Subjective emotional ratings in response to each facial expression stimulus were also obtained. Participants with high-neuroticism showed an overall delay in the detection of target facial expressions compared to participants with low-neuroticism. Additionally, the high-neuroticism group showed higher levels of arousal to facial expressions compared to the low-neuroticism group. These data suggest that neuroticism modulates the detection of emotional facial expressions in healthy participants; high levels of neuroticism delay overall detection of facial expressions and enhance emotional arousal in response to facial expressions.

  9. Emotional voices in context: A neurobiological model of multimodal affective information processing

    Science.gov (United States)

    Brück, Carolin; Kreifelts, Benjamin; Wildgruber, Dirk

    2011-12-01

    Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds - their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review will focus primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e. the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data will be summarized that allows to outline cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g. co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model will be introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals both in isolation and in context of accompanying (facial and verbal) emotional cues.

  10. Facial Image Analysis Based on Local Binary Patterns: A Survey

    NARCIS (Netherlands)

    Huang, D.; Shan, C.; Ardebilian, M.; Chen, L.

    2011-01-01

    Facial image analysis, including face detection, face recognition, facial expression analysis, facial demographic classification, and so on, is an important and interesting research topic in the computer vision and image processing area, which has many important applications such as human-computer
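
    Local binary patterns summarize an image as a histogram of local texture codes. The sketch below extracts a uniform-LBP histogram from a synthetic grayscale patch using scikit-image's local_binary_pattern; the image, neighbourhood parameters and bin choice are illustrative defaults rather than settings from the surveyed work.

```python
import numpy as np
from skimage.feature import local_binary_pattern

# Synthetic stand-in for a grayscale face crop (values 0-255).
rng = np.random.default_rng(3)
face = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)

# Uniform LBP with 8 neighbours at radius 1 yields P + 2 = 10 pattern codes.
P, R = 8, 1
lbp = local_binary_pattern(face, P, R, method="uniform")

# Normalized histogram of pattern codes serves as the feature vector.
hist, _ = np.histogram(lbp, bins=np.arange(P + 3), density=True)
print("LBP histogram feature vector:", np.round(hist, 3))
```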

  12. Benthic processes affecting contaminant transport in Upper Klamath Lake, Oregon

    Science.gov (United States)

    Kuwabara, James S.; Topping, Brent R.; Carter, James L.; Carlson, Rick A; Parchaso, Francis; Fend, Steven V.; Stauffer-Olsen, Natalie; Manning, Andrew J.; Land, Jennie M.

    2016-09-30

    Executive Summary: Multiple sampling trips during calendar years 2013 through 2015 were coordinated to provide measurements of interdependent benthic processes that potentially affect contaminant transport in Upper Klamath Lake (UKL), Oregon. The measurements were motivated by recognition that such internal processes (for example, solute benthic flux, bioturbation and solute efflux by benthic invertebrates, and physical groundwater-surface water interactions) were not integrated into existing management models for UKL. Up until 2013, all of the benthic-flux studies generally had been limited spatially to a number of sites in the northern part of UKL and limited temporally to 2–3 samplings per year. All of the benthic invertebrate studies also had been limited to the northern part of the lake; however, intensive temporal (weekly) studies had previously been completed independent of benthic-flux studies. Therefore, knowledge of both the spatial and temporal variability in benthic flux and benthic invertebrate distributions for the entire lake was lacking. To address these limitations, we completed a lakewide spatial study during 2013 and a coordinated temporal study with weekly sampling of benthic flux and benthic invertebrates during 2014. Field design of the spatially focused study in 2013 involved 21 sites sampled three times as the summer cyanobacterial bloom developed (that is, May 23, June 13, and July 3, 2013). Results of the 27-week, temporally focused study of one site in 2014 were summarized and partitioned into three periods (referred to herein as pre-bloom, bloom and post-bloom periods), each period involving 9 weeks of profiler deployments, water column and benthic sampling. Partitioning of the pre-bloom, bloom, and post-bloom periods was based on water-column chlorophyll concentrations and involved the following date intervals, respectively: April 15 through June 10, June 17 through August 13, and August 20 through October 16, 2014. To examine

  13. Priming emotional facial expressions as evidenced by event-related brain potentials.

    Science.gov (United States)

    Werheid, Katja; Alpay, Gamze; Jentzsch, Ines; Sommer, Werner

    2005-02-01

    As human faces are important social signals in everyday life, processing of facial affect has recently entered into the focus of neuroscientific research. In the present study, priming of faces showing the same emotional expression was measured with the help of event-related potentials (ERPs) in order to investigate the temporal characteristics of processing facial expressions. Participants classified portraits of unfamiliar persons according to their emotional expression (happy or angry). The portraits were either preceded by the face of a different person expressing the same affect (primed) or the opposite affect (unprimed). ERPs revealed both early and late priming effects, independent of stimulus valence. The early priming effect was characterized by attenuated frontal ERP amplitudes between 100 and 200 ms in response to primed targets. Its dipole sources were localised in the inferior occipitotemporal cortex, possibly related to the detection of expression-specific facial configurations, and in the insular cortex, considered to be involved in affective processes. The late priming effect, an enhancement of the late positive potential (LPP) following unprimed targets, may evidence greater relevance attributed to a change of emotional expressions. Our results (i) point to the view that a change of affect-related facial configuration can be detected very early during face perception and (ii) support previous findings on the amplitude of the late positive potential being rather related to arousal than to the specific valence of an emotional signal.

  14. Perception of global facial geometry is modulated through experience

    Directory of Open Access Journals (Sweden)

    Meike Ramon

    2015-03-01

    Full Text Available Identification of personally familiar faces is highly efficient across various viewing conditions. While the presence of robust facial representations stored in memory is considered to aid this process, the mechanisms underlying invariant identification remain unclear. Two experiments tested the hypothesis that facial representations stored in memory are associated with differential perceptual processing of the overall facial geometry. Subjects who were personally familiar or unfamiliar with the identities presented discriminated between stimuli whose overall facial geometry had been manipulated to maintain or alter the original facial configuration (see Barton, Zhao & Keenan, 2003). The results demonstrate that familiarity gives rise to more efficient processing of global facial geometry, and are interpreted in terms of increased holistic processing of facial information that is maintained across viewing distances.

  15. Positive, but Not Negative, Facial Expressions Facilitate 3-Month-Olds' Recognition of an Individual Face

    Science.gov (United States)

    Brenna, Viola; Proietti, Valentina; Montirosso, Rosario; Turati, Chiara

    2013-01-01

    The current study examined whether and how the presence of a positive or a negative emotional expression may affect the face recognition process at 3 months of age. Using a familiarization procedure, Experiment 1 demonstrated that positive (i.e., happiness), but not negative (i.e., fear and anger) facial expressions facilitate infants' ability to…

  16. Dynamic modelling of processes in rivers affected by precipitation runoff

    DEFF Research Database (Denmark)

    Jacobsen, Judith L.

    1997-01-01

    In this thesis, models for the dynamics of oxygen and organic matter in receiving waters (such as rivers and creeks), which are affected by rain, are developed. A time series analysis framework is used, but presented with special emphasis on continuous time state space models. Also, the concept...
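
    The thesis itself is only summarized above, so the following is not its model. It is a minimal sketch of a classical two-state description of organic matter (BOD) and dissolved-oxygen deficit, written as a linear state-space system and integrated with an explicit Euler step; the rate constants, time step, and rain-driven load pulse are illustrative assumptions.

        import numpy as np

        # States: L = organic matter (BOD, mg/L), D = dissolved-oxygen deficit (mg/L).
        # Classical linear dynamics: dL/dt = -k1*L + u(t),  dD/dt = k1*L - k2*D,
        # where u(t) is an organic load pulse, for example from precipitation runoff.
        k1, k2 = 0.3, 0.5            # decay and reaeration rates per day (assumed)
        dt = 0.01                    # time step in days
        times = np.arange(0.0, 10.0, dt)

        A = np.array([[-k1, 0.0],
                      [ k1, -k2]])
        x = np.array([2.0, 0.0])     # initial BOD and oxygen deficit (assumed)

        trajectory = []
        for t in times:
            u = 5.0 if 1.0 <= t <= 1.5 else 0.0       # runoff pulse on day 1
            x = x + dt * (A @ x + np.array([u, 0.0])) # explicit Euler step
            trajectory.append(x.copy())

        trajectory = np.array(trajectory)
        print("peak oxygen deficit (mg/L):", round(trajectory[:, 1].max(), 2))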

  17. Facial paralysis in children.

    Science.gov (United States)

    Reddy, Sashank; Redett, Richard

    2015-04-01

    Facial paralysis can have devastating physical and psychosocial consequences. These are particularly severe in children in whom loss of emotional expressiveness can impair social development and integration. The etiologies of facial paralysis, prospects for spontaneous recovery, and functions requiring restoration differ in children as compared with adults. Here we review contemporary management of facial paralysis with a focus on special considerations for pediatric patients.

  18. Facial Action and Emotional Language: ERP Evidence that Blocking Facial Feedback Selectively Impairs Sentence Comprehension.

    Science.gov (United States)

    Davis, Joshua D; Winkielman, Piotr; Coulson, Seana

    2015-11-01

    There is a lively and theoretically important debate about whether, how, and when embodiment contributes to language comprehension. This study addressed these questions by testing how interference with facial action impacts the brain's real-time response to emotional language. Participants read sentences about positive and negative events (e.g., "She reached inside the pocket of her coat from last winter and found some (cash/bugs) inside it.") while ERPs were recorded. Facial action was manipulated within participants by asking participants to hold chopsticks in their mouths using a position that allowed or blocked smiling, as confirmed by EMG. Blocking smiling did not influence ERPs to the valenced words (e.g., cash, bugs) but did influence ERPs to final words of sentences describing positive events. Results show that affectively positive sentences can evoke smiles and that such facial action can facilitate the semantic processing indexed by the N400 component. Overall, this study offers causal evidence that embodiment impacts some aspects of high-level comprehension, presumably involving the construction of the situation model.

  19. Facial aging: A clinical classification

    Directory of Open Access Journals (Sweden)

    Shiffman Melvin

    2007-01-01

    Full Text Available The purpose of this classification of facial aging is to have a simple clinical method to determine the severity of the aging process in the face. This allows a quick estimate as to the types of procedures that the patient would need to have the best results. Procedures that are presently used for facial rejuvenation include laser, chemical peels, suture lifts, fillers, modified facelift and full facelift. The physician is already using his best judgment to determine which procedure would be best for any particular patient. This classification may help to refine these decisions.

  1. Inhibitory repetitive transcranial magnetic stimulation (rTMS) of the dorsolateral prefrontal cortex modulates early affective processing.

    Science.gov (United States)

    Zwanzger, Peter; Steinberg, Christian; Rehbein, Maimu Alissa; Bröckelmann, Ann-Kathrin; Dobel, Christian; Zavorotnyy, Maxim; Domschke, Katharina; Junghöfer, Markus

    2014-11-01

    The dorsolateral prefrontal cortex (dlPFC) has often been suggested as a key modulator of emotional stimulus appraisal and regulation. Therefore, in clinical trials, it is one of the most frequently targeted regions for non-invasive brain stimulation such as repetitive transcranial magnetic stimulation (rTMS). In spite of various encouraging reports that demonstrate beneficial effects of rTMS in anxiety disorders, psychophysiological studies exploring the underlying neural mechanisms are sparse. Here we investigated how inhibitory rTMS influences early affective processing when applied over the right dlPFC. Before and after rTMS or sham stimulation, subjects viewed faces with fearful or neutral expressions while whole-head magnetoencephalography (MEG) was recorded. Due to the disrupted functioning of the right dlPFC, visual processing in bilateral parietal, temporal, and occipital areas was amplified starting at around 90 ms after stimulus onset. Moreover, increased fear-specific activation was found in the right TPJ area in a time-interval between 110 and 170 ms. These neurophysiological effects were reflected in slowed reaction times for fearful, but not for neutral faces in a facial expression identification task while there was no such effect on a gender discrimination control task. Our study confirms the specific and important role of the dlPFC in regulation of early emotional attention and encourages future clinical research to use minimally invasive methods such as transcranial magnetic stimulation (TMS) or transcranial direct current stimulation (tDCS). Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Exploring facial emotion perception in schizophrenia using transcranial magnetic stimulation and spatial filtering.

    Science.gov (United States)

    Rassovsky, Yuri; Lee, Junghee; Nori, Poorang; Wu, Allan D; Iacoboni, Marco; Breitmeyer, Bruno G; Hellemann, Gerhard; Green, Michael F

    2014-11-01

    Schizophrenia patients have difficulty extracting emotional information from facial expressions. Perception of facial emotion can be examined by systematically altering the spatial frequency of stimuli and suppressing visual processing with temporal precision using transcranial magnetic stimulation (TMS). In the present study, we compared 25 schizophrenia patients and 27 healthy controls using a facial emotion identification task. Spatial processing was examined by presenting facial photographs that contained either high (HSF), low (LSF), or broadband/unfiltered (BSF) spatial frequencies. Temporal processing was manipulated using a single-pulse TMS delivered to the visual cortex either before (forward masking) or after (backward masking) photograph presentation. Consistent with previous studies, schizophrenia patients performed significantly below controls across all three spatial frequencies. A spatial frequency by forward/backward masking interaction effect demonstrated reduced performance in the forward masking component in the BSF condition and a reversed performance pattern in the HSF condition, with no significant differences between forward and backward masking in the LSF condition. However, the group by spatial frequency interaction was not significant. These findings indicate that manipulating visual suppression of emotional information at the level of the primary visual cortex results in comparable effects on both groups. This suggests that patients' deficits in facial emotion identification are not explained by low-level processes in the retino-geniculo-striate projection, but may rather depend on deficits of affect perception occurring at later integrative processing stages.
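
    The HSF/LSF/BSF manipulation described above amounts to spatial-frequency filtering of the face photographs. As a rough illustration (the cutoff, expressed here as a Gaussian sigma, and the image are assumptions rather than the study's exact filters), the low-pass component can be obtained with a Gaussian blur and the high-pass component as the residual; adding the two back together recovers the broadband image.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def split_spatial_frequencies(image, sigma=4.0):
            # Gaussian blur keeps low spatial frequencies (LSF); the residual
            # contains the high spatial frequencies (HSF). sigma is illustrative.
            image = image.astype(float)
            lsf = gaussian_filter(image, sigma=sigma)
            hsf = image - lsf
            return lsf, hsf

        # Hypothetical 256 x 256 grayscale face photograph (random pixels stand in).
        face = np.random.default_rng(1).uniform(0, 255, size=(256, 256))
        lsf, hsf = split_spatial_frequencies(face)
        broadband = lsf + hsf                 # BSF: recombination recovers the original
        print(np.allclose(broadband, face))   # True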

  3. Amblyopia Associated with Congenital Facial Nerve Paralysis.

    Science.gov (United States)

    Iwamura, Hitoshi; Kondo, Kenji; Sawamura, Hiromasa; Baba, Shintaro; Yasuhara, Kazuo; Yamasoba, Tatsuya

    2016-01-01

    The association between congenital facial paralysis and visual development has not been thoroughly studied. Of 27 pediatric cases of congenital facial paralysis, we identified 3 patients who developed amblyopia, a visual acuity decrease caused by abnormal visual development, as comorbidity. These 3 patients had facial paralysis in the periocular region and developed amblyopia on the paralyzed side. They started treatment by wearing an eye patch immediately after diagnosis and before the critical visual developmental period; all patients responded to the treatment. Our findings suggest that the incidence of amblyopia in the cases of congenital facial paralysis, particularly the paralysis in the periocular region, is higher than that in the general pediatric population. Interestingly, 2 of the 3 patients developed anisometropic amblyopia due to the hyperopia of the affected eye, implying that the periocular facial paralysis may have affected the refraction of the eye through yet unspecified mechanisms. Therefore, the physicians who manage facial paralysis should keep this pathology in mind, and when they see pediatric patients with congenital facial paralysis involving the periocular region, they should consult an ophthalmologist as soon as possible. © 2016 S. Karger AG, Basel.

  4. Process Formulations And Curing Conditions That Affect Saltstone Properties

    Energy Technology Data Exchange (ETDEWEB)

    Reigel, M. M.; Pickenheim, B. R.; Daniel, W. E.

    2012-09-28

    The first objective of this study was to analyze saltstone fresh properties to determine the feasibility of reducing the formulation water to premix (w/p) ratio while varying the amount of extra water and admixtures used during processing at the Saltstone Production Facility (SPF). The second part of this study was to provide information for understanding the impact of curing conditions (cure temperature, relative humidity (RH)) and processing formulation on the performance properties of cured saltstone.

  5. Benthic processes affecting contaminant transport in Upper Klamath Lake, Oregon

    Science.gov (United States)

    Kuwabara, James S.; Topping, Brent R.; Carter, James L.; Carlson, Rick A; Parchaso, Francis; Fend, Steven V.; Stauffer-Olsen, Natalie; Manning, Andrew J.; Land, Jennie M.

    2016-09-30

    Executive Summary: Multiple sampling trips during calendar years 2013 through 2015 were coordinated to provide measurements of interdependent benthic processes that potentially affect contaminant transport in Upper Klamath Lake (UKL), Oregon. The measurements were motivated by recognition that such internal processes (for example, solute benthic flux, bioturbation and solute efflux by benthic invertebrates, and physical groundwater-surface water interactions) were not integrated into existing management models for UKL. Up until 2013, benthic-flux studies generally had been limited spatially to a number of sites in the northern part of UKL and limited temporally to 2–3 samplings per year. Benthic invertebrate studies had likewise been limited to the northern part of the lake; however, intensive temporal (weekly) studies had previously been completed independent of benthic-flux studies. Therefore, knowledge of both the spatial and temporal variability in benthic flux and benthic invertebrate distributions for the entire lake was lacking. To address these limitations, we completed a lakewide spatial study during 2013 and a coordinated temporal study with weekly sampling of benthic flux and benthic invertebrates during 2014. Field design of the spatially focused study in 2013 involved 21 sites sampled three times as the summer cyanobacterial bloom developed (that is, May 23, June 13, and July 3, 2013). Results of the 27-week, temporally focused study of one site in 2014 were summarized and partitioned into three periods (referred to herein as the pre-bloom, bloom, and post-bloom periods), each period involving 9 weeks of profiler deployments and water-column and benthic sampling. Partitioning of the pre-bloom, bloom, and post-bloom periods was based on water-column chlorophyll concentrations and involved the following date intervals, respectively: April 15 through June 10, June 17 through August 13, and August 20 through October 16, 2014. To examine
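
    The partitioning of the 2014 weekly series into pre-bloom, bloom, and post-bloom periods follows mechanically from the date intervals quoted above. The pandas sketch below uses those published boundaries; the weekly sample dates and chlorophyll values themselves are placeholders, not the report's data.

        import pandas as pd

        # Placeholder weekly samples for the 2014 temporal study; the period
        # boundaries are the intervals quoted in the report.
        samples = pd.DataFrame({
            "date": pd.date_range("2014-04-15", periods=27, freq="7D"),
            "chlorophyll_ug_L": range(27),   # placeholder water-column chlorophyll
        })

        bins = pd.to_datetime(["2014-04-15", "2014-06-10", "2014-08-13", "2014-10-16"])
        labels = ["pre-bloom", "bloom", "post-bloom"]
        samples["period"] = pd.cut(samples["date"], bins=bins, labels=labels,
                                   include_lowest=True)

        # With exactly weekly placeholder dates, each period holds 9 of the 27 samplings.
        print(samples.groupby("period", observed=True).size())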

  6. Psychological issues in acquired facial trauma

    Science.gov (United States)

    De Sousa, Avinash

    2010-01-01

    The face is a vital component of one’s personality and body image. There are a vast number of variables that influence recovery and rehabilitation from acquired facial trauma many of which are psychological in nature. The present paper presents the various psychological issues one comes across in facial trauma patients. These may range from body image issues to post-traumatic stress disorder symptoms accompanied by anxiety and depression. Issues related to facial and body image affecting social life and general quality of life are vital and the plastic surgeon should be aware of such issues and competent to deal with them in patients and families. PMID:21217982

  7. Razi's description and treatment of facial paralysis.

    Science.gov (United States)

    Tabatabaei, Seyed Mahmood; Kalantar Hormozi, Abdoljalil; Asadi, Mohsen

    2011-01-01

    In the modern medical era, facial paralysis is linked with the name of Charles Bell. This disease, usually a unilateral peripheral facial palsy, causes facial muscle weakness on the affected side. Bell gave a complete description of the disease, but other physicians had described it several hundred years earlier; their accounts were ignored for various reasons, such as the difficulty of the language of the original texts. The first and most famous of these physicians was Mohammad Ibn Zakaryya Razi (Rhazes). In this article, we discuss his account of the disease and its treatment.

  8. Robust facial expression recognition via compressive sensing.

    Science.gov (United States)

    Zhang, Shiqing; Zhao, Xiaoming; Lei, Bicheng

    2012-01-01

    Recently, compressive sensing (CS) has attracted increasing attention in the areas of signal processing, computer vision and pattern recognition. In this paper, a new method based on the CS theory is presented for robust facial expression recognition. The CS theory is used to construct a sparse representation classifier (SRC). The effectiveness and robustness of the SRC method are investigated on clean and occluded facial expression images. Three typical facial features, i.e., the raw pixels, Gabor wavelets representation and local binary patterns (LBP), are extracted to evaluate the performance of the SRC method. Experimental results on the popular Cohn-Kanade facial expression database demonstrate that the SRC method outperforms the nearest neighbor (NN), linear support vector machine (SVM) and nearest subspace (NS) classifiers and shows stronger robustness to corruption and occlusion in facial expression recognition tasks.
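
    A sparse representation classifier of the kind described resolves a test image as a sparse linear combination of all training samples and assigns the class whose samples best reconstruct it. The sketch below is a generic illustration of that idea using Lasso for the sparse coding step; the solver choice, regularization strength, and toy data are assumptions, not the paper's implementation.

        import numpy as np
        from sklearn.linear_model import Lasso

        def src_predict(train_X, train_y, test_x, alpha=0.01):
            # Sparse representation classification (SRC), in outline:
            # build a dictionary with one normalised column per training sample,
            # sparse-code the test sample, then pick the class with the smallest
            # class-restricted reconstruction residual.
            D = train_X.T / (np.linalg.norm(train_X, axis=1) + 1e-12)
            coder = Lasso(alpha=alpha, max_iter=5000, fit_intercept=False)
            coder.fit(D, test_x)
            coef = coder.coef_

            residuals = {}
            for label in np.unique(train_y):
                coef_c = np.where(train_y == label, coef, 0.0)
                residuals[label] = np.linalg.norm(test_x - D @ coef_c)
            return min(residuals, key=residuals.get)

        # Toy example: two "expression" classes of random feature vectors.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (20, 50)), rng.normal(3, 1, (20, 50))])
        y = np.array([0] * 20 + [1] * 20)
        print(src_predict(X, y, X[25]))   # expected to print 1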

  9. A Content Analysis of Factors Affecting New Product Development Process

    Directory of Open Access Journals (Sweden)

    Eda Atilgan-Inan

    2010-07-01

    Full Text Available The objective of this study is to review the international marketing literature on the new product development process and compare the changes in the important factors in the process with the changes in management approaches. For this purpose, the articles in three international marketing journals were selected and "new product development" and "new product performance" were searched for in the abstracts. After grouping the variables in the process, they were compared with the perspectives of management in the related periods. The results indicated that organizational factors have always been important for the new product development process, which is in line with the nature of the innovation process. However, the emphasis on internal factors has increased in the 21st century, which is congruent with the shift in management perspective toward the resource-based view. The study differs from similar literature reviews in that it addresses the topic from an international marketing perspective. Therefore, R&D and other marketing studies are not included in the review, and the study identifies the important factors from international firms' point of view.

  10. Flawed processing of airborne EM data affecting hydrogeological interpretation.

    Science.gov (United States)

    Viezzoli, Andrea; Jørgensen, Flemming; Sørensen, Camilla

    2013-03-01

    Airborne electromagnetics (AEM) is increasingly being used across the globe as a tool for groundwater and environmental management. Focus is on ensuring the quality of the source data, their processing and modeling, and the integration of results with ancillary information to generate accurate and relevant products. Accurate processing and editing of raw AEM data, the topic of this article, is one of the crucial steps in obtaining quantitative information for groundwater modeling and management. In this article, we examine the consequences that different levels of processing of helicopter transient electromagnetic method data have on the resulting electrical models and subsequently on hydrogeological models. We focus on different approaches used in the industry for processing of the raw data and show how the electrical resistivity-depth models, which are the end "geophysical" product (after data inversion) of an AEM survey, change with different levels of processing of the raw data. We then extend the study to show the impact on some of the hydrogeological parameters or models that can be derived from the geophysical results. The consequences of improper handling of raw data for groundwater and environmental management can be significant and expensive.

  11. Immunobiology of Facial Nerve Repair and Regeneration

    Institute of Scientific and Technical Information of China (English)

    QUAN Shi-ming; GAO Zhi-qiang

    2006-01-01

    Immunobiological study is a key to revealing the important basis of facial nerve repair and regeneration for both research and the development of clinical treatments. The microenvironmental changes around an injured facial motoneuron, i.e., the aggregation and expression of various types of immune cells and molecules in a dynamic equilibrium, are present from the start to the end of the repair of an injured facial nerve. The concept of an "immune microenvironment for facial nerve repair and regeneration" mainly concerns the dynamic exchange between expression and regulation networks and a variety of immune cells and immune molecules in the process of facial nerve repair and regeneration, for the maintenance of an immune microenvironment favorable for nerve repair. Investigations of microglial activation and recruitment, T cell behavior, cytokine networks, and immunological cellular and molecular signaling pathways in facial nerve repair and regeneration are the current hot spots in research on the immunobiology of facial nerve injury. The current paper provides a comprehensive review of the above-mentioned issues. Research on these issues will eventually make immunological interventions practicable treatments for facial nerve injury in the clinic.

  12. The Facial Adipose Tissue: A Revision.

    Science.gov (United States)

    Kruglikov, Ilja; Trujillo, Oscar; Kristen, Quick; Isac, Kerelos; Zorko, Julia; Fam, Maria; Okonkwo, Kasie; Mian, Asima; Thanh, Hyunh; Koban, Konstantin; Sclafani, Anthony P; Steinke, Hanno; Cotofana, Sebastian

    2016-12-01

    Recent advances in the anatomical understanding of the face have turned the focus toward the subcutaneous and deep facial fat compartments. During facial aging, these fat-filled compartments undergo substantial changes along with other structures in the face. Soft tissue fillers and fat grafting are valid methods to fight the signs of facial aging, but little is known about their precise effect on the facial fat. This narrative review summarizes the current knowledge about the facial fat compartments in terms of anatomical location, histologic appearance, immune-histochemical characteristics, cellular interactions, and therapeutic options. Three different types of facial adipose tissue can be identified, which are located either superficially (dermal white adipose tissue) or deep (subcutaneous white adipose tissue): fibrous (perioral locations), structural (major parts of the midface), and deposit (buccal fat pad and deep temporal fat pad). These various fat types differ in the size of the adipocytes and the collagenous composition of their extracellular matrix and thus in their mechanical properties. Minimally invasive (e.g., soft tissue fillers or fat grafting) and surgical interventions aiming to restore the youthful face have to account for the different fat properties in various facial areas. However, little is known about the macro- and microscopic characteristics of the facial fat tissue in different compartments, and future studies are needed to reveal new insights to better understand the process of aging and how to fight its signs best.

  13. Enzymatic biodiesel synthesis. Key factors affecting efficiency of the process

    Energy Technology Data Exchange (ETDEWEB)

    Szczesna Antczak, Miroslawa; Kubiak, Aneta; Antczak, Tadeusz; Bielecki, Stanislaw [Institute of Technical Biochemistry, Faculty of Biotechnology and Food Sciences, Technical University of Lodz, Stefanowskiego 4/10, 90-924 Lodz (Poland)

    2009-05-15

    Chemical processes of biodiesel production are energy-consuming and generate undesirable by-products such as soaps and polymeric pigments that retard separation of pure methyl or ethyl esters of fatty acids from glycerol and di- and monoacylglycerols. Enzymatic, lipase-catalyzed biodiesel synthesis has no such drawbacks. Comprehension of the latter process and appreciable progress in the production of robust lipase preparations may soon result in the replacement of chemical catalysts with enzymes in biodiesel synthesis. Engineering of enzymatic biodiesel synthesis processes requires optimization of such factors as: molar ratio of substrates (triacylglycerols: alcohol), temperature, type of organic solvent (if any) and water activity. All of these are correlated with the properties of the lipase preparation. This paper reports on the interplay between the crucial parameters of the lipase-catalyzed reactions carried out in non-aqueous systems and the yield of biodiesel synthesis. (author)

  14. The Influence of Parameters Affecting Boron Removal by Electrocoagulation Process

    KAUST Repository

    Zeboudji, B.

    2013-04-01

    Boron removal in seawater desalination presents a particular challenge. In seawater reverse osmosis (SWRO) systems boron removal at low concentration (<0.5 mg/L) is usually achieved by a second pass using brackish water RO membranes. However, this process requires chemical addition and entails substantial additional investment, operation and maintenance, and energy costs. The electrocoagulation (EC) process can be used to achieve such low boron concentrations. In this work, the removal of boron from aqueous solution was carried out by the EC process using aluminum and iron electrodes. The effects of several operating parameters on the removal efficiency, such as initial pH, current density, initial boron ion concentration, feed concentration, gap between electrodes, and electrode material, were investigated. In the case of bipolar electrocoagulation (BEC), an optimum removal efficiency of 96% corresponding to a final boron concentration of 0.4 mg/L was achieved at a current density of 6 mA/cm2 and pH = 8 using aluminum electrodes. The concentration of NaCl was 2,500 mg/L and the gap between the electrodes was 0.5 cm. Furthermore, a comparison between monopolar electrocoagulation (MEC) and BEC using both aluminum and iron electrodes was carried out. Results showed that the BEC process reduced the current density required to obtain a high level of boron removal in a short reaction time compared to the MEC process. The high performance of EC showed that the process could be used to reduce boron concentrations to acceptable levels at low cost and in a more environmentally friendly way. © 2013 Copyright Taylor and Francis Group, LLC.
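
    Two small calculations implicit in results like these are the removal efficiency and the theoretical coagulant dose from Faraday's law. The sketch below back-calculates the initial boron concentration implied by the quoted 96% efficiency and 0.4 mg/L final concentration, and adds an illustrative Faraday estimate in which the electrode area and treatment time are assumptions not given in the abstract.

        # Removal efficiency: E = (C0 - Cf) / C0, so C0 = Cf / (1 - E).
        c_final = 0.4                  # mg/L, reported final boron concentration
        efficiency = 0.96              # reported removal efficiency
        c_initial = c_final / (1 - efficiency)
        print(f"implied initial boron concentration: {c_initial:.1f} mg/L")

        # Theoretical aluminium released at the anode (Faraday's law): m = I*t*M / (z*F).
        current_density = 6e-3         # A/cm^2 (reported)
        area_cm2 = 100.0               # electrode area, assumed
        t_s = 30 * 60                  # treatment time of 30 min, assumed
        M_al = 26.98                   # g/mol, molar mass of aluminium
        z, F = 3, 96485.0              # electrons per Al(III) ion, Faraday constant (C/mol)

        current = current_density * area_cm2       # total current in amperes
        m_al = current * t_s * M_al / (z * F)      # grams of Al dissolved
        print(f"theoretical Al dose: {m_al * 1000:.0f} mg")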

  15. Psychometric Characteristics of the EEAA (Scale of Affective Strategies in the Learning Process)

    Science.gov (United States)

    Villardón-Gallego, Lourdes; Yániz, Concepción

    2014-01-01

    Introduction: Affective strategies for coping with affective states linked to the learning process may be oriented toward controlling emotions or toward controlling motivation. Both types affect performance, directly and indirectly. The objective of this research was to design an instrument for measuring the affective strategies used by university…

  16. Stimulus Characteristics Affect Humor Processing in Individuals with Asperger Syndrome

    Science.gov (United States)

    Samson, Andrea C.; Hegenloh, Michael

    2010-01-01

    The present paper aims to investigate whether individuals with Asperger syndrome (AS) show global humor processing deficits or whether humor comprehension and appreciation depends on stimulus characteristics. Non-verbal visual puns, semantic and Theory of Mind cartoons were rated on comprehension, funniness and the punchlines were explained. AS…

  17. Distal Prosodic Context Affects Word Segmentation and Lexical Processing

    Science.gov (United States)

    Dilley, Laura C.; McAuley, J. Devin

    2008-01-01

    Three experiments investigated the role of distal (i.e., nonlocal) prosody in word segmentation and lexical processing. In Experiment 1, prosodic characteristics of the initial five syllables of eight-syllable sequences were manipulated; the final portions of these sequences were lexically ambiguous (e.g., "note bookworm", "notebook worm"). Distal…

  18. Can Process Portfolios Affect Students' Writing Self-Efficacy?

    Science.gov (United States)

    Nicolaidou, Iolie

    2012-01-01

    Can process portfolios that support students in goal setting, reflection, self-evaluation and feedback have a positive impact on students' writing self-efficacy? This article presents the findings of a yearlong study conducted in three 4th grade elementary classes in Cyprus where paper-based and web-based portfolios were implemented to help…

  19. Sensitivity analysis on parameters and processes affecting vapor intrusion risk

    NARCIS (Netherlands)

    Picone, S.; Valstar, J.R.; Gaans, van P.; Grotenhuis, J.T.C.; Rijnaarts, H.H.M.

    2012-01-01

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the v

  20. Reading faces: differential lateral gaze bias in processing canine and human facial expressions in dogs and 4-year-old children.

    Directory of Open Access Journals (Sweden)

    Anaïs Racca

    Full Text Available Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children, and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions.

  1. Freestyle Local Perforator Flaps for Facial Reconstruction

    OpenAIRE

    Jun Yong Lee; Ji Min Kim; Ho Kwon; Sung-No Jung; Hyung Sup Shim; Sang Wha Kim

    2015-01-01

    For the successful reconstruction of facial defects, various perforator flaps have been used in single-stage surgery, where tissues are moved to adjacent defect sites. Our group successfully performed perforator flap surgery on 17 patients with small to moderate facial defects that affected the functional and aesthetic features of their faces. Of four complicated cases, three developed venous congestion, which resolved in the subacute postoperative period, and one patient with partial necrosi...

  2. Right Hemispheric Dominance in Processing of Unconscious Negative Emotion

    Science.gov (United States)

    Sato, Wataru; Aoki, Satoshi

    2006-01-01

    Right hemispheric dominance in unconscious emotional processing has been suggested, but remains controversial. This issue was investigated using the subliminal affective priming paradigm combined with unilateral visual presentation in 40 normal subjects. In either left or right visual fields, angry facial expressions, happy facial expressions, or…

  3. How gender-expectancy affects the processing of "them".

    Science.gov (United States)

    Doherty, Alice; Conklin, Kathy

    2017-04-01

    How sensitive is pronoun processing to expectancies based on real-world knowledge and language usage? The current study links research on the integration of gender stereotypes and number-mismatch to explore this question. It focuses on the use of them to refer to antecedents of different levels of gender-expectancy (low-cyclist, high-mechanic, known-spokeswoman). In a rating task, them is considered increasingly unnatural with greater gender-expectancy. However, participants might not be able to differentiate high-expectancy and gender-known antecedents online because they initially search for plural antecedents (e.g., Sanford & Filik), and they make all-or-nothing gender inferences. An eye-tracking study reveals early differences in the processing of them with antecedents of high gender-expectancy compared with gender-known antecedents. This suggests that participants have rapid access to the expected gender of the antecedent and the level of that expectancy.

  4. Identity information content depends on the type of facial movement

    Science.gov (United States)

    Dobs, Katharina; Bülthoff, Isabelle; Schultz, Johannes

    2016-09-01

    Facial movements convey information about many social cues, including identity. However, how much information about a person’s identity is conveyed by different kinds of facial movements is unknown. We addressed this question using a recent motion capture and animation system, with which we animated one avatar head with facial movements of three types: (1) emotional, (2) emotional in social interaction and (3) conversational, all recorded from several actors. In a delayed match-to-sample task, observers were best at matching actor identity across conversational movements, worse with emotional movements in social interactions, and at chance level with emotional facial expressions. Model observers performing this task showed similar performance profiles, indicating that performance variation was due to differences in information content, rather than processing. Our results suggest that conversational facial movements transmit more dynamic identity information than emotional facial expressions, thus suggesting different functional roles and processing mechanisms for different types of facial motion.

  5. Understanding processes affecting mineral deposits in humid environments

    Science.gov (United States)

    Seal, Robert R., II; Ayuso, Robert A.

    2011-01-01

    Recent interdisciplinary studies by the U.S. Geological Survey have resulted in substantial progress toward understanding the influence that climate and hydrology have on the geochemical signatures of mineral deposits and the resulting mine wastes in the eastern United States. Specific areas of focus include the release, transport, and fate of acid, metals, and associated elements from inactive mines in temperate coastal areas and of metals from unmined mineral deposits in tropical to subtropical areas; the influence of climate, geology, and hydrology on remediation options for abandoned mines; and the application of radiogenic isotopes to uniquely apportion source contributions that distinguish natural from mining sources and extent of metal transport. The environmental effects of abandoned mines and unmined mineral deposits result from a complex interaction of a variety of chemical and physical factors. These include the geology of the mineral deposit, the hydrologic setting of the mineral deposit and associated mine wastes, the chemistry of waters interacting with the deposit and associated waste material, the engineering of a mine as it relates to the reactivity of mine wastes, and climate, which affects such factors as temperature and the amounts of precipitation and evapotranspiration; these factors, in turn, influence the environmental behavior of mineral deposits. The role of climate is becoming increasingly important in environmental investigations of mineral deposits because of the growing concerns about climate change.

  6. Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores

    Science.gov (United States)

    Flores, Lorea; Bailey, R. A.; Elosegi, Arturo; Larrañaga, Aitor; Reiss, Julia

    2016-01-01

    Habitat complexity can influence predation rates (e.g. by providing refuge) but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants), in microcosms, influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants); and 3. as the spatial configuration of structures (measured as fractal dimension). The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology). We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, ‘habitat complexity’ by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems. PMID:27802267

  7. Altered cortical activation from the hand after facial botulinum toxin treatment.

    Science.gov (United States)

    Haenzi, Sara; Stefanics, Gabor; Lanaras, Tatjana; Calcagni, Maurizio; Ghosh, Arko

    2014-01-01

    Plastic interactions between face and hand cortical tactile circuits occur after severe injuries that affect the hand such as in amputation or spinal cord injury. However, whether loss of facial movements alters the cortical circuits involved in processing tactile inputs from the hand remains unknown. In this prospective observational study we used electroencephalography (EEG) to measure cortical activity evoked by tactile stimulation of the hands before and after botulinum toxin-A-induced facial paralysis. We found a reduction in the tactile event-related potentials (ERPs) 6 weeks after the treatment. This suggests that the limited paralysis of facial muscles induced during cosmetic interventions designed to smooth lines and wrinkles on the face is sufficient to alter the cortical processing of tactile inputs from the hand.

  8. Suprasegmental information affects processing of talking faces at birth.

    Science.gov (United States)

    Guellai, Bahia; Mersad, Karima; Streri, Arlette

    2015-02-01

    From birth, newborns show a preference for faces talking a native language compared to silent faces. The present study addresses two questions that remained unanswered by previous research: (a) Does familiarity with the language play a role in this process and (b) Are all the linguistic and paralinguistic cues necessary in this case? Experiment 1 extended newborns' preference for native speakers to non-native ones. Given that fetuses and newborns are sensitive to the prosodic characteristics of speech, Experiments 2 and 3 presented faces talking native and non-native languages with the speech stream being low-pass filtered. Results showed that newborns preferred looking at a person who talked to them even when only the prosodic cues were provided for both languages. Nonetheless, a familiarity preference for the previously talking face is observed in the "normal speech" condition (i.e., Experiment 1) and a novelty preference in the "filtered speech" condition (Experiments 2 and 3). This asymmetry reveals that newborns process these two types of stimuli differently and that they may already be sensitive to a mismatch between the articulatory movements of the face and the corresponding speech sounds.
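
    Low-pass filtering of the speech stream removes most segmental detail while sparing prosodic cues such as pitch and rhythm. The sketch below shows one generic way to do this with a Butterworth filter; the 400 Hz cutoff, filter order, and synthetic waveform are assumptions, not the study's parameters.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        def low_pass_speech(signal, fs, cutoff_hz=400.0, order=6):
            # Zero-phase low-pass filtering keeps mostly prosodic information;
            # cutoff_hz and order are illustrative choices.
            sos = butter(order, cutoff_hz, btype="low", fs=fs, output="sos")
            return sosfiltfilt(sos, signal)

        # Toy waveform standing in for a recorded utterance (1 s at 16 kHz):
        # a 150 Hz component (roughly the pitch range) plus a 2.5 kHz component
        # standing in for segmental detail.
        fs = 16000
        t = np.arange(fs) / fs
        speech = np.sin(2 * np.pi * 150 * t) + 0.5 * np.sin(2 * np.pi * 2500 * t)
        filtered = low_pass_speech(speech, fs)

        # After filtering, the high-frequency component is strongly attenuated.
        print(round(np.std(filtered) / np.std(speech), 2))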

  9. Left hand dominance affects supra-second time processing

    Directory of Open Access Journals (Sweden)

    Carmelo Mario Vicario

    2011-10-01

    Full Text Available Previous works exploring the brain functions of left-handed and right-handed people have shown differences in spatial and motor abilities that might be explained by consistent structural and functional differences. Given the role of both spatial and motor information in the processing of temporal intervals, we designed a study investigating timing abilities in left-handed subjects. For this purpose, left-handed and right-handed subjects were asked to perform a time reproduction task on sub-second vs. supra-second time intervals with their left and right hands. Our results show that left-handed participants underestimated the supra-second intervals, independently of the hand used to perform the task, while no differences were reported for the sub-second intervals. These results are discussed in light of recent advances in supra-second motor timing research, as well as emerging evidence suggesting a linear representation of time with a left-to-right orientation.

  10. Sensitivity analysis on parameters and processes affecting vapor intrusion risk.

    Science.gov (United States)

    Picone, Sara; Valstar, Johan; van Gaans, Pauline; Grotenhuis, Tim; Rijnaarts, Huub

    2012-05-01

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the vast majority of previous studies, this model accounts for vertical variation of soil water saturation and includes aerobic biodegradation. The attenuation factor (ratio between concentration in the crawl space and source concentration) and the characteristic time to approach maximum concentrations were calculated and compared for a variety of scenarios. These concepts allow an understanding of controlling mechanisms and aid in the identification of critical parameters to be collected for field situations. The relative distance of the source to the nearest gas-filled pores of the unsaturated zone is the most critical parameter because diffusive contaminant transport is significantly slower in water-filled pores than in gas-filled pores. Therefore, attenuation factors decrease and characteristic times increase with increasing relative distance of the contaminant dissolved source to the nearest gas diffusion front. Aerobic biodegradation may decrease the attenuation factor by up to three orders of magnitude. Moreover, the occurrence of water table oscillations is of importance. Dynamic processes leading to a retreating water table increase the attenuation factor by two orders of magnitude because of the enhanced gas phase diffusion.
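
    The central point that diffusion is far slower through water-filled than through gas-filled pores can be illustrated with the widely used Millington-Quirk effective-diffusivity relation. This is an illustrative sketch, not the authors' numerical model; the porosity, saturation, diffusion coefficients, and Henry's constant below are assumed, order-of-magnitude values.

        def effective_diffusivity(d_air, d_water, henry, porosity, water_sat):
            # Millington-Quirk effective diffusion coefficient for a volatile
            # compound in partially saturated soil, with concentrations expressed
            # in the gas phase via the dimensionless Henry's constant.
            theta_w = porosity * water_sat        # water-filled porosity
            theta_a = porosity - theta_w          # gas-filled porosity
            gas_term = d_air * theta_a ** (10.0 / 3.0) / porosity ** 2
            aqueous_term = (d_water / henry) * theta_w ** (10.0 / 3.0) / porosity ** 2
            return gas_term + aqueous_term

        d_air = 7e-6       # m^2/s, free-air diffusion coefficient (assumed)
        d_water = 9e-10    # m^2/s, free-water diffusion coefficient (assumed)
        henry = 0.4        # dimensionless Henry's constant (assumed)
        porosity = 0.35

        for water_sat in (0.3, 0.7, 0.99):
            d_eff = effective_diffusivity(d_air, d_water, henry, porosity, water_sat)
            print(f"water saturation {water_sat:.2f}: D_eff = {d_eff:.2e} m^2/s")

    The steep drop in effective diffusivity as saturation approaches one is consistent with the abstract's conclusion that the distance from the source to the nearest gas-filled pores, rather than the total source depth, is the most critical parameter.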

  11. Does Signal Degradation Affect Top-Down Processing of Speech?

    Science.gov (United States)

    Wagner, Anita; Pals, Carina; de Blecourt, Charlotte M; Sarampalis, Anastasios; Başkent, Deniz

    2016-01-01

    Speech perception is formed based on both the acoustic signal and listeners' knowledge of the world and semantic context. Access to semantic information can facilitate interpretation of degraded speech, such as speech in background noise or the speech signal transmitted via cochlear implants (CIs). This paper focuses on the latter, and investigates the time course of understanding words, and how sentential context reduces listeners' dependency on the acoustic signal for natural and degraded speech via an acoustic CI simulation. In an eye-tracking experiment we combined recordings of listeners' gaze fixations with pupillometry, to capture effects of semantic information on both the time course and effort of speech processing. Normal-hearing listeners were presented with sentences with or without a semantically constraining verb (e.g., crawl) preceding the target (baby), and their ocular responses were recorded to four pictures, including the target, a phonological competitor (bay), a semantic competitor (worm) and an unrelated distractor. The results show that in natural speech, listeners' gazes reflect their uptake of acoustic information, and integration of preceding semantic context. Degradation of the signal leads to a later disambiguation of phonologically similar words, and to a delay in integration of semantic information. Complementary to this, the pupil dilation data show that early semantic integration reduces the effort in disambiguating phonologically similar words. Processing degraded speech comes with increased effort due to the impoverished nature of the signal. Delayed integration of semantic information further constrains listeners' ability to compensate for inaudible signals.

  12. Simultanagnosia does not affect processes of auditory Gestalt perception.

    Science.gov (United States)

    Rennig, Johannes; Bleyer, Anna Lena; Karnath, Hans-Otto

    2017-05-01

    Simultanagnosia is a neuropsychological deficit of higher visual processes caused by temporo-parietal brain damage. It is characterized by a specific failure of recognition of a global visual Gestalt, like a visual scene or complex objects, consisting of local elements. In this study we investigated to what extent this deficit should be understood as a deficit specific to the visual domain or whether it should be seen as defective Gestalt processing per se. To examine whether simultanagnosia occurs across sensory domains, we designed several auditory experiments sharing typical characteristics of visual tasks that are known to be particularly demanding for patients suffering from simultanagnosia. We also included control tasks for auditory working memory deficits and for auditory extinction. We tested four simultanagnosia patients who suffered from severe symptoms in the visual domain. Two of them indeed showed significant impairments in recognition of simultaneously presented sounds. However, the same two patients also suffered from severe auditory working memory deficits and from symptoms comparable to auditory extinction, both sufficiently explaining the impairments in simultaneous auditory perception. We thus conclude that deficits in auditory Gestalt perception do not appear to be characteristic of simultanagnosia and that the human brain apparently uses independent mechanisms for visual and for auditory Gestalt perception. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Sensitivity analysis on parameters and processes affecting vapor intrusion risk

    KAUST Repository

    Picone, Sara

    2012-03-30

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the vast majority of previous studies, this model accounts for vertical variation of soil water saturation and includes aerobic biodegradation. The attenuation factor (ratio between concentration in the crawl space and source concentration) and the characteristic time to approach maximum concentrations were calculated and compared for a variety of scenarios. These concepts allow an understanding of controlling mechanisms and aid in the identification of critical parameters to be collected for field situations. The relative distance of the source to the nearest gas-filled pores of the unsaturated zone is the most critical parameter because diffusive contaminant transport is significantly slower in water-filled pores than in gas-filled pores. Therefore, attenuation factors decrease and characteristic times increase with increasing relative distance of the contaminant dissolved source to the nearest gas diffusion front. Aerobic biodegradation may decrease the attenuation factor by up to three orders of magnitude. Moreover, the occurrence of water table oscillations is of importance. Dynamic processes leading to a retreating water table increase the attenuation factor by two orders of magnitude because of the enhanced gas phase diffusion. © 2012 SETAC.

  14. From neurons to epidemics: How trophic coherence affects spreading processes

    Science.gov (United States)

    Klaise, Janis; Johnson, Samuel

    2016-06-01

    Trophic coherence, a measure of the extent to which the nodes of a directed network are organised in levels, has recently been shown to be closely related to many structural and dynamical aspects of complex systems, including graph eigenspectra, the prevalence or absence of feedback cycles, and linear stability. Furthermore, non-trivial trophic structures have been observed in networks of neurons, species, genes, metabolites, cellular signalling, concatenated words, P2P users, and world trade. Here, we consider two simple yet apparently quite different dynamical models—one a susceptible-infected-susceptible epidemic model adapted to include complex contagion and the other an Amari-Hopfield neural network—and show that in both cases the related spreading processes are modulated in similar ways by the trophic coherence of the underlying networks. To do this, we propose a network assembly model which can generate structures with tunable trophic coherence, limiting in either perfectly stratified networks or random graphs. We find that trophic coherence can exert a qualitative change in spreading behaviour, determining whether a pulse of activity will percolate through the entire network or remain confined to a subset of nodes, and whether such activity will quickly die out or endure indefinitely. These results could be important for our understanding of phenomena such as epidemics, rumours, shocks to ecosystems, neuronal avalanches, and many other spreading processes.
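
    Trophic coherence is commonly quantified by the incoherence parameter q, the standard deviation of the trophic-level differences along edges (q near 0 means a coherently layered network). The sketch below computes trophic levels and q for a small directed toy graph; the graph is an illustration, not one of the networks assembled in the paper.

        import numpy as np

        def trophic_levels(adj):
            # Trophic levels s for adjacency matrix adj (adj[i, j] = 1 for edge i -> j):
            # basal nodes (no incoming edges) get level 1; every other node sits one
            # level above the mean level of its in-neighbours.
            n = adj.shape[0]
            in_deg = adj.sum(axis=0)
            W = np.zeros((n, n))
            for j in range(n):
                if in_deg[j] > 0:
                    W[j, :] = adj[:, j] / in_deg[j]
            return np.linalg.solve(np.eye(n) - W, np.ones(n))

        def incoherence(adj):
            # q: standard deviation of s_j - s_i over all edges i -> j.
            s = trophic_levels(adj)
            sources, targets = np.nonzero(adj)
            return np.std(s[targets] - s[sources])

        # Toy directed network: node 0 is basal, most edges point "up" the levels,
        # and one edge (3 -> 1) skips back down, reducing coherence.
        adj = np.zeros((4, 4))
        for i, j in [(0, 1), (0, 2), (1, 3), (2, 3), (3, 1)]:
            adj[i, j] = 1.0
        print("trophic levels:", trophic_levels(adj).round(2))
        print("incoherence q :", round(incoherence(adj), 2))

    In an assembly model of the kind described above, tuning how strictly new edges respect this layering moves q between the perfectly stratified limit (q = 0) and the random-graph limit.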

  15. Misreading the facial signs: specific impairments and error patterns in recognition of facial emotions with negative valence in borderline personality disorder.

    Science.gov (United States)

    Unoka, Zsolt; Fogd, Dóra; Füzy, Melinda; Csukly, Gábor

    2011-10-30

    Patients with borderline personality disorder (BPD) exhibit impairment in labeling of facial emotional expressions. However, it is not clear whether these deficits affect the whole domain of basic emotions, are valence-specific, or are specific to individual emotions. Whether BPD patients' errors in a facial emotion recognition task create a specific pattern also remains to be elucidated. Our study tested two hypotheses: first, we hypothesized that the emotion perception impairment in borderline personality disorder is specific to the negative emotion domain. Second, we hypothesized that BPD patients would show error patterns in a facial emotion recognition task more commonly and more systematically than healthy comparison subjects. Participants comprised 33 inpatients with BPD and 32 matched healthy control subjects who performed a computerized version of the Ekman 60 Faces test. The indices of emotion recognition and the direction of errors were processed in separate analyses. Clinical symptoms and personality functioning were assessed using the Symptom Checklist-90-Revised and the Young Schema Questionnaire Long Form. Results showed that patients with BPD were less accurate than control participants in emotion recognition, in particular in the discrimination of negative emotions, while they were not impaired in the recognition of happy facial expressions. In addition, patients over-attributed disgust and surprise and under-attributed fear to the facial expressions relative to controls. These findings suggest the importance of carefully considering error patterns, besides measuring recognition accuracy, especially among emotions with negative affective valence, when assessing facial affect recognition in BPD. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Surgical treatment of facial paralysis.

    Science.gov (United States)

    Mehta, Ritvik P

    2009-03-01

    The management of facial paralysis is one of the most complex areas of reconstructive surgery. Given the wide variety of functional and cosmetic deficits in the facial paralysis patient, the reconstructive surgeon requires a thorough understanding of the surgical techniques available to treat this condition. This review article will focus on surgical management of facial paralysis and the treatment options available for acute facial paralysis (<3 weeks), facial paralysis of intermediate duration (3 weeks to 2 yr) and chronic facial paralysis (>2 yr). For acute facial paralysis, the main surgical therapies are facial nerve decompression and facial nerve repair. For facial paralysis of intermediate duration, nerve transfer procedures are appropriate. For chronic facial paralysis, treatment typically requires regional or free muscle transfer. Static techniques of facial reanimation can be used for acute, intermediate, or chronic facial paralysis as these techniques are often important adjuncts to the overall management strategy.

  17. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity--Evidence from Gazing Patterns.

    Directory of Open Access Journals (Sweden)

    Sanni Somppi

    Full Text Available Appropriate response to companions' emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs' gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs' gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly and this evaluation led to attentional bias, which was dependent on the depicted species: threatening conspecifics' faces evoked heightened attention but threatening human faces instead an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel

  18. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity--Evidence from Gazing Patterns.

    Science.gov (United States)

    Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V; Hänninen, Laura; Krause, Christina M; Vainio, Outi

    2016-01-01

    Appropriate response to companions' emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs' gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs' gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly and this evaluation led to attentional bias, which was dependent on the depicted species: threatening conspecifics' faces evoked heightened attention but threatening human faces instead an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel perspective on

  19. Processes affecting coastal wetland loss in the Louisiana deltaic plain

    Science.gov (United States)

    Williams, S. Jeffress; Penland, Shea; Roberts, Harry H.

    1993-01-01

    Nowhere are the problems of coastal wetland loss more serious and dramatic than in the Mississippi River deltaic plain region of south-central Louisiana. In that area, rates of shoreline erosion of 20 m/yr and loss of land area of up to 75 km²/yr result from a complex combination of natural (delta switching, subsidence, sea-level rise, storms) and human (flood control, navigation, oil and gas development, land reclamation) factors. The U.S. Geological Survey (USGS), as part of the National Coastal Geology Program, has undertaken joint field investigations with Federal, State, and university partners. The objective of these long-term studies is to gather and interpret baseline information in order to improve our scientific understanding of the critical processes and responses responsible for creation, maintenance, and deterioration of coastal wetlands.

  20. From neurons to epidemics: How trophic coherence affects spreading processes

    CERN Document Server

    Klaise, Janis

    2016-01-01

    Trophic coherence, a measure of the extent to which the nodes of a directed network are organised in levels, has recently been shown to be closely related to many structural and dynamical aspects of complex systems, including graph eigenspectra, the prevalence or absence of feed-back cycles, and linear stability. Furthermore, non-trivial trophic structures have been observed in networks of neurons, species, genes, metabolites, cellular signalling, concatenated words, P2P users, and world trade. Here we consider two simple yet apparently quite different dynamical models -- one a Susceptible-Infected-Susceptible (SIS) epidemic model adapted to include complex contagion, the other an Amari-Hopfield neural network -- and show that in both cases the related spreading processes are modulated in similar ways by the trophic coherence of the underlying networks. To do this, we propose a network assembly model which can generate structures with tunable trophic coherence, limiting in either perfectly stratified networks...

  1. Facial speech gestures: the relation between visual speech processing, phonological awareness, and developmental dyslexia in 10-year-olds.

    Science.gov (United States)

    Schaadt, Gesa; Männel, Claudia; van der Meer, Elke; Pannekamp, Ann; Friederici, Angela D

    2016-11-01

    Successful communication in everyday life crucially involves the processing of auditory and visual components of speech. Viewing our interlocutor and processing visual components of speech facilitates speech processing by triggering auditory processing. Auditory phoneme processing, analyzed by event-related brain potentials (ERP), has been shown to be associated with impairments in reading and spelling (i.e. developmental dyslexia), but visual aspects of phoneme processing have not been investigated in individuals with such deficits. The present study analyzed the passive visual Mismatch Response (vMMR) in school children with and without developmental dyslexia in response to video-recorded mouth movements pronouncing syllables silently. Our results reveal that both groups of children showed processing of visual speech stimuli, but with different scalp distribution. Children without developmental dyslexia showed a vMMR with typical posterior distribution. In contrast, children with developmental dyslexia showed a vMMR with anterior distribution, which was even more pronounced in children with severe phonological deficits and very low spelling abilities. As anterior scalp distributions are typically reported for auditory speech processing, the anterior vMMR of children with developmental dyslexia might suggest an attempt to anticipate potentially upcoming auditory speech information in order to support phonological processing, which has been shown to be deficient in children with developmental dyslexia. © 2015 John Wiley & Sons Ltd.

  2. Quantitative facial asymmetry: using three-dimensional photogrammetry to measure baseline facial surface symmetry.

    Science.gov (United States)

    Taylor, Helena O; Morrison, Clinton S; Linden, Olivia; Phillips, Benjamin; Chang, Johnny; Byrne, Margaret E; Sullivan, Stephen R; Forrest, Christopher R

    2014-01-01

    Facial asymmetry, although difficult to quantify subjectively, can be easily and reproducibly measured using three-dimensional photogrammetry. The RMSD for facial asymmetry of healthy volunteers clusters at approximately 0.80 ± 0.24 mm. Patients with facial asymmetry due to a pathologic process can be differentiated from normative facial asymmetry based on their RMSDs.
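
    A minimal sketch of an RMSD-style asymmetry score like the one reported above, assuming the facial surface is already registered so that the midsagittal plane lies at x = 0; the point cloud and shift are made up, and the study's registration and landmarking steps are not reproduced.

        import numpy as np
        from scipy.spatial import cKDTree

        def asymmetry_rmsd(points):
            """points: (N, 3) facial surface coordinates in mm, midsagittal plane at x = 0."""
            mirrored = points * np.array([-1.0, 1.0, 1.0])   # reflect across the x = 0 plane
            dists, _ = cKDTree(points).query(mirrored)       # closest original point per mirrored point
            return float(np.sqrt(np.mean(dists ** 2)))

        # Example with a synthetic, slightly asymmetric point cloud.
        rng = np.random.default_rng(0)
        face = rng.normal(size=(5000, 3)) * np.array([40.0, 60.0, 30.0])
        face[:, 0] += 0.5                                    # small lateral shift = asymmetry
        print(f"RMSD asymmetry: {asymmetry_rmsd(face):.2f} mm")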

  3. [Neurological disease and facial recognition].

    Science.gov (United States)

    Kawamura, Mitsuru; Sugimoto, Azusa; Kobayakawa, Mutsutaka; Tsuruya, Natsuko

    2012-07-01

    To discuss the neurological basis of facial recognition, we present our case reports of impaired recognition and a review of previous literature. First, we present a case of infarction and discuss prosopagnosia, which has had a large impact on face recognition research. From a study of patient symptoms, we assume that prosopagnosia may be caused by unilateral right occipitotemporal lesion and right cerebral dominance of facial recognition. Further, circumscribed lesion and degenerative disease may also cause progressive prosopagnosia. Apperceptive prosopagnosia is observed in patients with posterior cortical atrophy (PCA), pathologically considered as Alzheimer's disease, and associative prosopagnosia in frontotemporal lobar degeneration (FTLD). Second, we discuss face recognition as part of communication. Patients with Parkinson disease show social cognitive impairments, such as difficulty in facial expression recognition and deficits in theory of mind as detected by the Reading the Mind in the Eyes test. Pathological and functional imaging studies indicate that social cognitive impairment in Parkinson disease is possibly related to damage in the amygdalae and surrounding limbic system. The social cognitive deficits can be observed in the early stages of Parkinson disease, and even in the prodromal stage; for example, patients with rapid eye movement (REM) sleep behavior disorder (RBD) show impairment in facial expression recognition. Further, patients with myotonic dystrophy type 1 (DM 1), which is a multisystem disease that mainly affects the muscles, show social cognitive impairment similar to that of Parkinson disease. Our previous study showed that facial expression recognition impairment of DM 1 patients is associated with lesions in the amygdalae and insulae. Our study results indicate that behaviors and personality traits in DM 1 patients, which are revealed by social cognitive impairment, are attributable to dysfunction of the limbic system.

  4. Camouflaging Facial Emphysema: a new syndrome.

    Science.gov (United States)

    Martínez-Carpio, Pedro A; del Campillo, Ángel F Bedoya; Leal, María Jesús; Lleopart, Núria; Marrón, María T; Trelles, Mario A

    2012-10-10

    Camouflaging Facial Emphysema, as defined in this paper, is the result of a simple technique used by the patient to deform his face in order to prevent recognition at a police identity parade. The patient performs two punctures in the mucosa at the rear of the upper lip and, after several Valsalva manoeuvres, manages to deform his face in less than 15 min by inducing subcutaneous facial emphysema. The examination shows an accumulation of air in the face, with no laterocervical, mediastinal or thoracic involvement. The swelling is primarily observed in the eyelids and the orbital and zygomatic regions, whereas it is less prominent in other areas of the face. Patients therefore manage to avoid recognition in properly conducted police identity parades. Only isolated cases of self-induced facial emphysema have been reported to date among psychiatric patients and prison inmates. However, the facial emphysema herein described exhibits specific characteristics with significant medical, deontological, social, police-related, and legal implications.

  5. Facial Reconstruction and Rehabilitation.

    Science.gov (United States)

    Guntinas-Lichius, Orlando; Genther, Dane J; Byrne, Patrick J

    2016-01-01

    Extracranial infiltration of the facial nerve by salivary gland tumors is the most frequent cause of facial palsy secondary to malignancy. Nevertheless, facial palsy related to salivary gland cancer is uncommon. Therefore, reconstructive facial reanimation surgery is not a routine undertaking for most head and neck surgeons. The primary aims of facial reanimation are to restore tone, symmetry, and movement to the paralyzed face. Such restoration should improve the patient's objective motor function and subjective quality of life. The surgical procedures for facial reanimation rely heavily on long-established techniques, but many advances and improvements have been made in recent years. In the past, published experiences on strategies for optimizing functional outcomes in facial paralysis patients were primarily based on small case series and described a wide variety of surgical techniques. However, in the recent years, larger series have been published from high-volume centers with significant and specialized experience in surgical and nonsurgical reanimation of the paralyzed face that have informed modern treatment. This chapter reviews the most important diagnostic methods used for the evaluation of facial paralysis to optimize the planning of each individual's treatment and discusses surgical and nonsurgical techniques for facial rehabilitation based on the contemporary literature.

  6. [Facial tics and spasms].

    Science.gov (United States)

    Potgieser, Adriaan R E; van Dijk, J Marc C; Elting, Jan Willem J; de Koning-Tijssen, Marina A J

    2014-01-01

    Facial tics and spasms are socially incapacitating, but effective treatment is often available. The clinical picture is sufficient for distinguishing between the different diseases that cause this affliction. We describe three cases of patients with facial tics or spasms: one case of tics, which are familiar to many physicians; one case of blepharospasms; and one case of hemifacial spasms. We discuss the differential diagnosis and the treatment possibilities for facial tics and spasms. Early diagnosis and treatment is important because of the associated social incapacitation. Botulinum toxin should be considered as a treatment option for facial tics, and a curative neurosurgical intervention should be considered for hemifacial spasms.

  7. Altering sensorimotor feedback disrupts visual discrimination of facial expressions.

    Science.gov (United States)

    Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula

    2016-08-01

    Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual (and not just conceptual) processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.

  8. Bodily action penetrates affective perception

    Directory of Open Access Journals (Sweden)

    Carlo Fantoni

    2016-02-01

    Full Text Available Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observer’s internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that performing a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; i.e., we show that observer’s internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than uncomfortable reaches; while the detection threshold for anger was lower after fifty uncomfortable than comfortable reaches. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches impact sensitivity in an approximately symmetric way relative to a baseline inaction condition. All of these constitute compelling evidence of a genuine top

  9. History of facial pain diagnosis

    DEFF Research Database (Denmark)

    Zakrzewska, Joanna M; Jensen, Troels S

    2017-01-01

    Premise Facial pain refers to a heterogeneous group of clinically and etiologically different conditions with the common clinical feature of pain in the facial area. Among these conditions, trigeminal neuralgia (TN), persistent idiopathic facial pain, temporomandibular joint pain, and trigeminal...

  10. Electrocortical evidence for preferential processing of dynamic pain expressions compared to other emotional expressions.

    Science.gov (United States)

    Reicherts, Philipp; Wieser, Matthias J; Gerdes, Antje B M; Likowski, Katja U; Weyers, Peter; Mühlberger, Andreas; Pauli, Paul

    2012-09-01

    Decoding pain in others is of high individual and social benefit in terms of harm avoidance and demands for accurate care and protection. The processing of facial expressions includes both specific neural activation and automatic congruent facial muscle reactions. While a considerable number of studies investigated the processing of emotional faces, few studies specifically focused on facial expressions of pain. Analyses of brain activity and facial responses elicited by the perception of facial pain expressions in contrast to other emotional expressions may unravel the processing specificities of pain-related information in healthy individuals and may contribute to explaining attentional biases in chronic pain patients. In the present study, 23 participants viewed short video clips of neutral, emotional (joy, fear), and painful facial expressions while affective ratings, event-related brain responses, and facial electromyography (Musculus corrugator supercilii, M. orbicularis oculi, M. zygomaticus major, M. levator labii) were recorded. An emotion recognition task indicated that participants accurately decoded all presented facial expressions. Electromyography analysis suggests a distinct pattern of facial response detected in response to happy faces only. However, emotion-modulated late positive potentials revealed a differential processing of pain expressions compared to the other facial expressions, including fear. Moreover, pain faces were rated as most negative and highly arousing. Results suggest a general processing bias in favor of pain expressions. Findings are discussed in light of attentional demands of pain-related information and communicative aspects of pain expressions.

  11. The Turner Syndrome: Cognitive Deficits, Affective Discrimination, and Behavior Problems.

    Science.gov (United States)

    McCauley, Elizabeth; And Others

    1987-01-01

    The study attempted to link cognitive and social problems seen in girls with Turner syndrome by assessing the girls' ability to process affective cues. Seventeen 9- to 17-year-old girls diagnosed with Turner syndrome were compared to a matched control group on a task which required interpretation of affective intention from facial expression.…

  12. The face-specific N170 component is modulated by emotional facial expression

    Directory of Open Access Journals (Sweden)

    Tottenham Nim

    2007-01-01

    Full Text Available Abstract Background According to the traditional two-stage model of face processing, the face-specific N170 event-related potential (ERP) is linked to structural encoding of face stimuli, whereas later ERP components are thought to reflect processing of facial affect. This view has recently been challenged by reports of N170 modulations by emotional facial expression. This study examines the time-course and topography of the influence of emotional expression on the N170 response to faces. Methods Dense-array ERPs were recorded in response to a set (n = 16) of fear and neutral faces. Stimuli were normalized on dimensions of shape, size and luminance contrast distribution. To minimize task effects related to facial or emotional processing, facial stimuli were irrelevant to a primary task of learning associative pairings between a subsequently presented visual character and a spoken word. Results N170 to faces showed a strong modulation by emotional facial expression. A split-half analysis demonstrated that this effect was significant both early and late in the experiment and was therefore not associated with only the initial exposures of these stimuli, demonstrating a form of robustness against habituation. The effect of emotional modulation of the N170 to faces did not show a significant interaction with the gender of the face stimulus, or hemisphere of recording sites. Subtracting the fear versus neutral topography provided a topography that itself was highly similar to the face N170. Conclusion The face N170 response can be influenced by emotional expressions contained within facial stimuli. The topography of this effect is consistent with the notion that fear stimuli exaggerate the N170 response itself. This finding stands in contrast to previous models suggesting that N170 processes linked to structural analysis of faces precede analysis of emotional expression, and instead may reflect early top-down modulation from neural systems involved in

  13. Recognition of Facial Expressions of Different Emotional Intensities in Patients with Frontotemporal Lobar Degeneration

    Directory of Open Access Journals (Sweden)

    Roy P. C. Kessels

    2007-01-01

    Full Text Available Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). Also, FTLD patients show impairments in emotion processing. Specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which thus may have been a confounding factor in previous studies. Also, ceiling effects are often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at the recognition of the emotion anger. Also, the patients performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

  14. Facial expression and sarcasm.

    Science.gov (United States)

    Rockwell, P

    2001-08-01

    This study examined facial expression in the presentation of sarcasm. 60 responses (sarcastic responses = 30, nonsarcastic responses = 30) from 40 different speakers were coded by two trained coders. Expressions in three facial areas--eyebrow, eyes, and mouth--were evaluated. Only movement in the mouth area significantly differentiated ratings of sarcasm from nonsarcasm.

  15. Holistic facial expression classification

    Science.gov (United States)

    Ghent, John; McDonald, J.

    2005-06-01

    This paper details a procedure for classifying facial expressions. This is a growing and relatively new type of problem within computer vision. One of the fundamental problems when classifying facial expressions in previous approaches is the lack of a consistent method of measuring expression. This paper addresses this problem through the computation of the Facial Expression Shape Model (FESM). This statistical model of facial expression is based on an anatomical analysis of facial expression called the Facial Action Coding System (FACS). We use the term Action Unit (AU) to describe a movement of one or more muscles of the face, and all expressions can be described using the AUs defined by FACS. The shape model is calculated by marking the face with 122 landmark points. We use Principal Component Analysis (PCA) to analyse how the landmark points move with respect to each other and to lower the dimensionality of the problem. Using the FESM in conjunction with Support Vector Machines (SVM), we classify facial expressions. SVMs are a powerful machine learning technique based on optimisation theory. This project is largely concerned with statistical models, machine learning techniques and psychological tools used in the classification of facial expression. This holistic approach to expression classification provides a means for a level of interaction with a computer that is a significant step forward in human-computer interaction.
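
    A minimal sketch of the landmark-shape, PCA, and SVM pipeline outlined above, using scikit-learn on synthetic data (122 (x, y) landmarks per face); the FACS-based FESM itself and the real training data are not reproduced here.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(42)
        n_samples, n_landmarks = 300, 122
        X = rng.normal(size=(n_samples, n_landmarks * 2))    # flattened (x, y) landmark coordinates
        y = rng.integers(0, 6, size=n_samples)               # six prototypical expression labels

        clf = make_pipeline(
            StandardScaler(),
            PCA(n_components=30),                            # reduce the dimensionality of shape space
            SVC(kernel="rbf", C=1.0),
        )
        clf.fit(X[:250], y[:250])
        print("held-out accuracy (random data, so roughly chance):", clf.score(X[250:], y[250:]))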

  16. Facial talon cusps.

    LENUS (Irish Health Repository)

    McNamara, T

    1997-12-01

    This is a report of two patients with isolated facial talon cusps. One occurred on a permanent mandibular central incisor; the other on a permanent maxillary canine. The locations of these talon cusps suggest that the definition of a talon cusp should include teeth in addition to the incisor group and be extended to include the facial aspect of teeth.

  17. Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches

    Directory of Open Access Journals (Sweden)

    Mar Saneiro

    2014-01-01

    Full Text Available We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expressions and body movements that conform to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources such as qualitative, self-reported, physiological, and behavioral information. Altogether, these data are to be used to train data mining algorithms that serve to automatically identify changes in the learners’ affective states when dealing with cognitive tasks, which helps to provide emotional personalized support.

  18. Event-related alpha suppression in response to facial motion.

    Science.gov (United States)

    Girges, Christine; Wright, Michael J; Spencer, Janine V; O'Brien, Justin M D

    2014-01-01

    While biological motion refers to both face and body movements, little is known about the visual perception of facial motion. We therefore examined alpha wave suppression, as a reduction in power is thought to reflect visual activity, in addition to attentional reorienting and memory processes. Nineteen neurologically healthy adults were tested on their ability to discriminate between successive facial motion captures. These animations exhibited both rigid and non-rigid facial motion, as well as speech expressions. The structural and surface appearance of these facial animations did not differ; thus, participants' decisions were based solely on differences in facial movements. Upright, orientation-inverted and luminance-inverted facial stimuli were compared. At occipital and parieto-occipital regions, upright facial motion evoked a transient increase in alpha, which was then followed by a significant reduction. This finding is discussed in terms of neural efficiency, gating mechanisms and neural synchronization. Moreover, there was no difference in the amount of alpha suppression evoked by each facial stimulus at occipital regions, suggesting that early visual processing remains unaffected by manipulation paradigms. However, upright facial motion evoked greater suppression at parieto-occipital sites, and did so in the shortest latency. Increased activity within this region may reflect higher attentional reorienting to natural facial motion but also involvement of areas associated with the visual control of body effectors.
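
    A minimal sketch of how event-related alpha (8-12 Hz) suppression can be quantified for one channel: band power in a post-stimulus window expressed in dB relative to a pre-stimulus baseline. The sampling rate, window lengths and data are illustrative assumptions; epoching, artifact rejection and statistics are omitted.

        import numpy as np
        from scipy.signal import welch

        FS = 500  # sampling rate in Hz (assumed)

        def alpha_power(segment, fs=FS, band=(8.0, 12.0)):
            freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), fs))
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return np.trapz(psd[mask], freqs[mask])

        def alpha_suppression_db(epoch, fs=FS, baseline_s=0.5):
            """epoch: 1-D signal with stimulus onset baseline_s seconds after epoch start."""
            onset = int(baseline_s * fs)
            base = alpha_power(epoch[:onset], fs)
            post = alpha_power(epoch[onset:], fs)
            return 10 * np.log10(post / base)                # negative values indicate suppression

        rng = np.random.default_rng(1)
        epoch = rng.normal(size=int(1.5 * FS))               # 0.5 s baseline + 1.0 s post-stimulus
        print(f"alpha change: {alpha_suppression_db(epoch):.2f} dB")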

  19. Mothers' pupillary responses to infant facial expressions.

    Science.gov (United States)

    Yrttiaho, Santeri; Niehaus, Dana; Thomas, Eileen; Leppänen, Jukka M

    2017-02-06

    Human parental care relies heavily on the ability to monitor and respond to a child's affective states. The current study examined pupil diameter as a potential physiological index of mothers' affective response to infant facial expressions. Pupillary time-series were measured from 86 mothers of young infants in response to an array of photographic infant faces falling into four emotive categories based on valence (positive vs. negative) and arousal (mild vs. strong). Pupil dilation was highly sensitive to the valence of facial expressions, being larger for negative vs. positive facial expressions. A separate control experiment with luminance-matched non-face stimuli indicated that the valence effect was specific to facial expressions and cannot be explained by luminance confounds. Pupil response was not sensitive to the arousal level of facial expressions. The results show the feasibility of using pupil diameter as a marker of mothers' affective responses to ecologically valid infant stimuli and point to a particularly prompt maternal response to infant distress cues.
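
    A minimal sketch of the pupillometry contrast described above: baseline-correct each trial's pupil trace and compare mean dilation to negative versus positive faces. The sampling rate, window lengths and trial data are invented for illustration.

        import numpy as np

        FS = 60            # eye-tracker sampling rate in Hz (assumed)
        BASELINE_S = 0.5   # pre-stimulus baseline window in seconds

        def mean_dilation(trials):
            """trials: (n_trials, n_samples) pupil diameter in mm; stimulus onset after the baseline."""
            base_len = int(BASELINE_S * FS)
            baseline = trials[:, :base_len].mean(axis=1, keepdims=True)
            return (trials[:, base_len:] - baseline).mean()  # mean change from baseline

        # Synthetic trials: a slow dilation after onset, larger for 'negative' faces.
        rng = np.random.default_rng(7)
        t = np.arange(3 * FS) / FS

        def make_trials(peak, n=40):
            dilation = np.where(t > BASELINE_S, peak * (t - BASELINE_S) / t.max(), 0.0)
            return 3.0 + dilation + 0.05 * rng.normal(size=(n, t.size))

        negative, positive = make_trials(0.20), make_trials(0.10)
        print("negative - positive dilation (mm):",
              round(float(mean_dilation(negative) - mean_dilation(positive)), 3))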

  20. Time perception and dynamics of facial expressions of emotions.

    Directory of Open Access Journals (Sweden)

    Sophie L Fayolle

    Full Text Available Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.
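
    A minimal sketch of a temporal bisection analysis consistent with this paradigm: fit a logistic psychometric function to the proportion of "long" responses per probe duration and read off the bisection point (p = 0.5). Durations and response proportions are made up.

        import numpy as np
        from scipy.optimize import curve_fit

        # Made-up probe durations (ms) and proportions of "long" responses.
        durations = np.array([400, 500, 600, 700, 800, 900, 1000, 1100, 1200], dtype=float)
        p_long = np.array([0.02, 0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.96, 0.99])

        def logistic(d, bisection_point, slope):
            """Psychometric function: probability of responding 'long' to duration d."""
            return 1.0 / (1.0 + np.exp(-(d - bisection_point) / slope))

        (bp, slope), _ = curve_fit(logistic, durations, p_long, p0=[800.0, 100.0])
        print(f"bisection point ~ {bp:.0f} ms (a leftward shift means durations are judged longer)")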

  1. Spectrum of facial paralysis in chronic suppurative otitis media

    Directory of Open Access Journals (Sweden)

    Shyam S Kumar

    2012-01-01

    Full Text Available Surgical management of facial paralysis associated with chronic suppurative otitis media (CSOM) may vary depending on the duration and extent of paralysis and also the pathology affecting the nerve. Four illustrative cases are described. The literature is reviewed with regard to the management of the facial nerve in different situations.

  2. Evaluation and management of the patient with postoperative facial paralysis.

    Science.gov (United States)

    Hadlock, Tessa

    2012-05-01

    Postoperative facial paralysis comprises a spectrum of injuries ranging from mild, temporary weakness to severe, permanent paralysis, affecting as little as one muscle group to as much as the full hemiface. Herein is presented an introductory review of iatrogenic facial paralysis, from initial evaluation and decision making to the full range of conservative and operative management.

  3. Spontaneous Facial Mimicry in Response to Dynamic Facial Expressions

    Science.gov (United States)

    Sato, Wataru; Yoshikawa, Sakiko

    2007-01-01

    Based on previous neuroscientific evidence indicating activation of the mirror neuron system in response to dynamic facial actions, we hypothesized that facial mimicry would occur while subjects viewed dynamic facial expressions. To test this hypothesis, dynamic/static facial expressions of anger/happiness were presented using computer-morphing…

  4. Sound-induced facial synkinesis following facial nerve paralysis

    NARCIS (Netherlands)

    Ma, Ming-San; van der Hoeven, Johannes H.; Nicolai, Jean-Philippe A.; Meek, Marcel F.

    2009-01-01

    Facial synkinesis (or synkinesia) (FS) occurs frequently after paresis or paralysis of the facial nerve and is in most cases due to aberrant regeneration of (branches of) the facial nerve. Patients suffer from inappropriate and involuntary synchronous facial muscle contractions. Here we describe two

  5. Basic Abnormalities in Visual Processing Affect Face Processing at an Early Age in Autism Spectrum Disorder

    NARCIS (Netherlands)

    Vlamings, Petra Hendrika Johanna Maria; Jonkman, Lisa Marthe; van Daalen, Emma; van der Gaag, Rutger Jan; Kemner, Chantal

    2010-01-01

    Background: A detailed visual processing style has been noted in autism spectrum disorder (ASD); this contributes to problems in face processing and has been directly related to abnormal processing of spatial frequencies (SFs). Little is known about the early development of face processing in ASD an

  6. Positive and negative facial emotional expressions: the effect on infants' and children's facial identity recognition

    OpenAIRE

    Brenna,

    2013-01-01

    The aim of the present study was to investigate the origin and the development of the interdependence between identity recognition and facial emotional expression processing, suggested by recent models of face processing (Calder & Young, 2005) and supported by outcomes on adults (e.g. Baudouin, Gilibert, Sansone, & Tiberghien, 2000; Schweinberger & Soukup, 1998). In particular, the effect of facial emotional expressions on infants’ and children’s ability to recognize the identity of a face was explored...

  7. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2015-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…

  8. Does Parkinson's disease lead to alterations in the facial expression of pain?

    NARCIS (Netherlands)

    Priebe, Janosch A; Kunz, Miriam; Morcinek, Christian; Rieckmann, Peter; Lautenbacher, Stefan

    2015-01-01

    Hypomimia which refers to a reduced degree in facial expressiveness is a common sign in Parkinson's disease (PD). The objective of our study was to investigate how hypomimia affects PD patients' facial expression of pain. The facial expressions of 23 idiopathic PD patients in the Off-phase (without

  9. The effects of an action video game on visual and affective information processing.

    Science.gov (United States)

    Bailey, Kira; West, Robert

    2013-04-04

    Playing action video games can have beneficial effects on visuospatial cognition and negative effects on social information processing. However, these two effects have not been demonstrated in the same individuals in a single study. The current study used event-related brain potentials (ERPs) to examine the effects of playing an action or non-action video game on the processing of emotion in facial expression. The data revealed that 10h of playing an action or non-action video game had differential effects on the ERPs relative to a no-contact control group. Playing an action game resulted in two effects: one that reflected an increase in the amplitude of the ERPs following training over the right frontal and posterior regions that was similar for angry, happy, and neutral faces; and one that reflected a reduction in the allocation of attention to happy faces. In contrast, playing a non-action game resulted in changes in slow wave activity over the central-parietal and frontal regions that were greater for targets (i.e., angry and happy faces) than for non-targets (i.e., neutral faces). These data demonstrate that the contrasting effects of action video games on visuospatial and emotion processing occur in the same individuals following the same level of gaming experience. This observation leads to the suggestion that caution should be exercised when using action video games to modify visual processing, as this experience could also have unintended effects on emotion processing. Published by Elsevier B.V.

  10. PCA facial expression recognition

    Science.gov (United States)

    El-Hori, Inas H.; El-Momen, Zahraa K.; Ganoun, Ali

    2013-12-01

    This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. A comparative study of Facial Expression Recognition (FER) techniques, namely Principal Component Analysis (PCA) and PCA with Gabor filters (GF), is presented. The objective of this research is to show that PCA with Gabor filters is superior to the first technique in terms of recognition rate. To test and evaluate their performance, experiments are performed on a real database using both techniques. The five universally accepted principal emotions to be recognized are happy, sad, disgust and angry, along with neutral. The recognition rates are obtained on all the facial expressions.
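
    A minimal sketch of the plain-PCA ("eigenface"-style) branch of the comparison above, implemented with NumPy: project flattened images onto the top principal components and classify a probe by its nearest training face. Images and labels are random stand-ins, and the Gabor-filter variant is not shown.

        import numpy as np

        rng = np.random.default_rng(3)
        train = rng.random((200, 48 * 48))   # flattened grey-level training faces (stand-ins)
        labels = rng.integers(0, 5, 200)     # happy, sad, disgust, angry, neutral
        probe = rng.random(48 * 48)          # face to classify

        mean_face = train.mean(axis=0)
        centered = train - mean_face
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        components = vt[:40]                 # top 40 "eigenfaces"

        train_proj = centered @ components.T
        probe_proj = (probe - mean_face) @ components.T
        nearest = int(np.argmin(np.linalg.norm(train_proj - probe_proj, axis=1)))
        print("predicted expression label:", int(labels[nearest]))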

  11. Searching for Judy: How Small Mysteries Affect Narrative Processes and Memory

    Science.gov (United States)

    Love, Jessica; McKoon, Gail; Gerrig, Richard J.

    2010-01-01

    Current theories of text processing say little about how authors' narrative choices, including the introduction of small mysteries, can affect readers' narrative experiences. Gerrig, Love, and McKoon (2009) provided evidence that 1 type of small mystery--a character introduced without information linking him or her to the story--affects readers'…

  12. Treatment of impaired affective information processing and social cognition in neuropsychiatric patients: A systematic review

    NARCIS (Netherlands)

    Wingbermuhle, Ellen; Roelofs, R.L.; Kessels, R.P.C.; Egger, J.I.M.

    2014-01-01

    Objective: Impairments in affective information processing (AIP) and social cognition (SC) have been associated with psychiatric disorders, inadequate social interaction, and lowered self-esteem. Consequently, problems in AIP and SC impede daily functioning and affect quality of life. Promoting impr

  13. A comparison of affective information processing in Noonan and Turner syndromes: Evidence of alexithymia

    NARCIS (Netherlands)

    Roelofs, R.L.; Wingbermuhle, Ellen; Freriks, K.; Verhaak, C.M.; Kessels, R.P.C.; Egger, J.I.M.

    2014-01-01

    Objective: Noonan (NS) and Turner syndrome (TS) are associated with cognitive problems and difficulties in affective information processing. While both phenotypes share physical features, genetic etiology and neuropsychological phenotype differ significantly. The present study examines putative

  14. Study of individual and group affective processes in the crew of a simulated mission to Mars: Positive affectivity as a valuable indicator of changes in the crew affectivity

    Science.gov (United States)

    Poláčková Šolcová, Iva; Lačev, Alek; Šolcová, Iva

    2014-07-01

    The success of a long-duration space mission depends on various technical demands as well as on the psychological (cognitive, affective, and motivational) adaptation of crewmembers and the quality of interactions within the crew. We examined the ways crewmembers of a 520-day simulated spaceflight to Mars (held in the Institute for Biomedical Problems, in Moscow) experienced and regulated their moods and emotions. Results show that crewmembers experienced predominantly positive emotions throughout their 520-day isolation and the changes in mood of the crewmembers were asynchronous and balanced. The study suggests that during the simulation, crewmembers experienced and regulated their emotions differently than they usually do in their everyday life. In isolation, crewmembers preferred to suppress and neutralize their negative emotions and express overtly only emotions with positive valence. Although the affective processes were almost invariable throughout the simulation, two periods of time when the level of positive emotions declined were identified. Regarding the findings, the paper suggests that changes in positive affectivity could be a more valuable indicator of human experience in demanding but professional environments than changes in negative affectivity. Finally, the paper discusses the phenomenology of emotions during a real space mission.

  15. Surgical Treatment of Facial Paralysis

    OpenAIRE

    Mehta, Ritvik P.

    2009-01-01

    The management of facial paralysis is one of the most complex areas of reconstructive surgery. Given the wide variety of functional and cosmetic deficits in the facial paralysis patient, the reconstructive surgeon requires a thorough understanding of the surgical techniques available to treat this condition. This review article will focus on surgical management of facial paralysis and the treatment options available for acute facial paralysis (<2 yr). For acute facial paralysis, the main surgi...

  16. Improvement of chronic facial pain and facial dyskinesia with the help of botulinum toxin application

    Directory of Open Access Journals (Sweden)

    Ellies Maik

    2007-08-01

    Full Text Available Abstract Background Facial pain syndromes can be very heterogeneous and need individual diagnosis and treatment. This report describes an interesting case of facial pain associated with eczema and an isolated dyskinesia of the lower facial muscles following dental surgery. Different aspects of the pain, spasms and the eczema will be discussed. Case presentation In this patient, persistent intense pain arose in the lower part of her face following a dental operation. The patient also exhibited dyskinesia of her caudal mimic musculature that was triggered by specific movements. Several attempts at therapy had been unsuccessful. We performed local injections of botulinum toxin type A (BTX-A) into the affected region of the patient's face. Pain relief was immediate following each set of botulinum toxin injections. The follow-up time amounted to 62 weeks. Conclusion Botulinum toxin type A (BTX-A) can be a safe and effective therapy for certain forms of facial pain syndromes.

  17. Persistent facial pain conditions

    DEFF Research Database (Denmark)

    Forssell, Heli; Alstergren, Per; Bakke, Merete

    2016-01-01

    Persistent facial pains, especially temporomandibular disorders (TMD), are common conditions. As dentists are responsible for the treatment of most of these disorders, up-to-date knowledge on the latest advances in the field is essential for successful diagnosis and management. The review covers TMD, and different neuropathic or putative neuropathic facial pains such as persistent idiopathic facial pain and atypical odontalgia, trigeminal neuralgia and painful posttraumatic trigeminal neuropathy. The article presents an overview of TMD pain as a biopsychosocial condition, its prevalence, clinical features, consequences, central and peripheral mechanisms, diagnostic criteria (DC/TMD), and principles of management. For each of the neuropathic facial pain entities, the definitions, prevalence, clinical features, and diagnostics are described. The current understanding of the pathophysiology...

  20. Identification based on facial parts

    Directory of Open Access Journals (Sweden)

    Stevanov Zorica

    2007-01-01

    Full Text Available Two opposing views dominate the face identification literature, one suggesting that the face is processed as a whole and another suggesting analysis based on parts. Our research tried to establish which of these two is the dominant strategy, and our results fell in the direction of analysis based on parts. The faces were covered with a mask and the participants were uncovering different parts, one at a time, in an attempt to identify a person. Already at the level of a single facial feature, such as the mouth or the eye and top of the nose, some observers were capable of establishing the identity of a familiar face. Identification is exceptionally successful when a small assembly of facial parts is visible, such as eye, eyebrow and the top of the nose. Some facial parts are not very informative on their own but do enhance recognition when given as part of such an assembly. A novel finding here is the importance of the top of the nose for face identification. Additionally, observers have a preference toward the left side of the face. Typically, subjects view the elements in the following order: left eye, left eyebrow, right eye, lips, region between the eyes, right eyebrow, region between the eyebrows, left cheek, right cheek. When observers are not in a position to see the eyes, eyebrows or top of the nose, they go for the lips first and then the region between the eyebrows, region between the eyes, left cheek, right cheek and finally the chin.

  1. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    OpenAIRE

    Roberta eDaini; Chiara Maddalena Comparetti; Paola eRicciardelli

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently fro...

  2. Management of facial blushing

    DEFF Research Database (Denmark)

    Licht, Peter B; Pilegaard, Hans K

    2008-01-01

    people. Side effects are frequent, but most patients are satisfied with the operation. In the short term, the key to success in sympathetic surgery for facial blushing lies in a meticulous and critical patient selection and in ensuring that the patient is thoroughly informed about the high risk of side effects. In the long term, the key to success in sympathetic surgery for facial blushing lies in more quality research comparing surgical, pharmacologic, and psychotherapeutic treatments.

  3. Simultaneous facial feature tracking and facial expression recognition.

    Science.gov (United States)

    Li, Yongqiang; Wang, Shangfei; Zhao, Yongping; Ji, Qiang

    2013-07-01

    The tracking and recognition of facial activities from images or videos have attracted great attention in the computer vision field. Facial activities are characterized by three levels. First, in the bottom level, facial feature points around each facial component, i.e., eyebrow, mouth, etc., capture the detailed face shape information. Second, in the middle level, facial action units, defined in the facial action coding system, represent the contraction of a specific set of facial muscles, i.e., lid tightener, eyebrow raiser, etc. Finally, in the top level, six prototypical facial expressions represent the global facial muscle movement and are commonly used to describe the human emotion states. In contrast to the mainstream approaches, which usually only focus on one or two levels of facial activities, and track (or recognize) them separately, this paper introduces a unified probabilistic framework based on the dynamic Bayesian network to simultaneously and coherently represent the facial evolvement in different levels, their interactions and their observations. Advanced machine learning methods are introduced to learn the model based on both training data and subjective prior knowledge. Given the model and the measurements of facial motions, all three levels of facial activities are simultaneously recognized through a probabilistic inference. Extensive experiments are performed to illustrate the feasibility and effectiveness of the proposed model on all three levels of facial activities.

  4. Changes in attentional processing and affective reactivity in pregnancy and postpartum

    Directory of Open Access Journals (Sweden)

    Gollan JK

    2014-11-01

    Full Text Available Jackie K Gollan, Laina Rosebrock, Denada Hoxha, Katherine L Wisner Asher Center for the Study and Treatment of Depressive Disorders, Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA Abstract: The aim of this review is to provide an overview of the research in attentional processing and affective reactivity in pregnancy and postpartum to inform future research. Numerous changes occur in attentional processing and affective reactivity across the childbearing period. This review focuses on the definition and methods of measuring attentional processing and affective reactivity. We discuss research studies that have examined the changes in these two processes during the perinatal phases of pregnancy and postpartum, with and without depression and anxiety. We evaluate the importance of using multiple levels of measurement, including physiological and neuroimaging techniques, to study these processes via implicit and explicit tasks. Research that has identified regions of brain activation using functional magnetic resonance imaging as well as other physiological assessments is integrated into the discussion. The importance of using sophisticated methodological techniques in future studies, such as multiple mediation models, for the purpose of elucidating mechanisms of change during these processes in pregnancy and postpartum is emphasized. We conclude with a discussion of the effect of these processes on maternal psychological functioning and infant outcomes. These processes support a strategy for individualizing treatment for pregnant and postpartum women suffering from depression and anxiety. Keywords: attentional processing, emotion, affective reactivity, depression, pregnancy, postpartum

  5. Mutual influences of pain and emotional face processing

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2014-10-01

    Full Text Available The perception of unpleasant stimuli enhances whereas the perception of pleasant stimuli decreases pain perception. In contrast, the effects of pain on the processing of emotional stimuli are much less known. Especially given the recent interest in facial expressions of pain as a special category of emotional stimuli, a main topic in this research line is the mutual influence of pain and facial expression processing. Therefore, in this mini-review we selectively summarize research on the effects of emotional stimuli on pain, but more extensively turn to the opposite direction namely how pain influences concurrent processing of affective stimuli such as facial expressions. Based on the motivational priming theory one may hypothesize that the perception of pain enhances the processing of unpleasant stimuli and decreases the processing of pleasant stimuli. This review reveals that the literature is only partly consistent with this assumption: Pain reduces the processing of pleasant pictures and happy facial expressions, but does not - or only partly - affect processing of unpleasant pictures. However, it was demonstrated that pain selectively enhances the processing of facial expressions if these are pain-related (i.e. facial expressions of pain). Extending a mere affective modulation theory, the latter results suggest pain-specific effects which may be explained by the perception-action model of empathy. Together, these results underscore the important mutual influence of pain and emotional face processing.

  6. Mutual influences of pain and emotional face processing.

    Science.gov (United States)

    Wieser, Matthias J; Gerdes, Antje B M; Reicherts, Philipp; Pauli, Paul

    2014-01-01

    The perception of unpleasant stimuli enhances whereas the perception of pleasant stimuli decreases pain perception. In contrast, the effects of pain on the processing of emotional stimuli are much less known. Especially given the recent interest in facial expressions of pain as a special category of emotional stimuli, a main topic in this research line is the mutual influence of pain and facial expression processing. Therefore, in this mini-review we selectively summarize research on the effects of emotional stimuli on pain, but more extensively turn to the opposite direction namely how pain influences concurrent processing of affective stimuli such as facial expressions. Based on the motivational priming theory one may hypothesize that the perception of pain enhances the processing of unpleasant stimuli and decreases the processing of pleasant stimuli. This review reveals that the literature is only partly consistent with this assumption: pain reduces the processing of pleasant pictures and happy facial expressions, but does not - or only partly - affect processing of unpleasant pictures. However, it was demonstrated that pain selectively enhances the processing of facial expressions if these are pain-related (i.e., facial expressions of pain). Extending a mere affective modulation theory, the latter results suggest pain-specific effects which may be explained by the perception-action model of empathy. Together, these results underscore the important mutual influence of pain and emotional face processing.

  7. The Impact of Affect on Out-Group Judgments Depends on Dominant Information-Processing Styles: Evidence From Incidental and Integral Affect Paradigms.

    Science.gov (United States)

    Isbell, Linda M; Lair, Elicia C; Rovenpor, Daniel R

    2016-04-01

    Two studies tested the affect-as-cognitive-feedback model, in which positive and negative affective states are not uniquely associated with particular processing styles, but rather serve as feedback about currently accessible processing styles. The studies extend existing work by investigating (a) both incidental and integral affect, (b) out-group judgments, and (c) downstream consequences. We manipulated processing styles and either incidental (Study 1) or integral (Study 2) affect and measured perceptions of out-group homogeneity. Positive (relative to negative) affect increased out-group homogeneity judgments when global processing was primed, but under local priming, the effect reversed (Studies 1 and 2). A similar interactive effect emerged on attributions, which had downstream consequences for behavioral intentions (Study 2). These results demonstrate that both incidental and integral affect do not directly produce specific processing styles, but rather influence thinking by providing feedback about currently accessible processing styles.

  8. Sampling frequency affects the processing of Actigraph raw acceleration data to activity counts

    DEFF Research Database (Denmark)

    Brond, J. C.; Arvidsson, D.

    2016-01-01

    the amount of activity counts generated was less, indicating that raw data stored in the GT3X+ monitor is processed. Between 600 and 1,600 more counts per minute were generated with the sampling frequencies 40 and 100 Hz compared with 30 Hz during running. Sampling frequency affects the processing of Acti...

  9. PFAPA with facial swelling- a new association?

    Science.gov (United States)

    Khodaghalian, B; Tewary, K K; Narchi, H

    2013-05-01

    PFAPA (periodic fever, aphthous stomatitis, pharyngitis, cervical adenitis) is a rare condition of unknown cause affecting children. Although the exact etiology is unknown, inflammatory, immunological or genetic causes have been suggested. The diagnosis is made by exclusion of other causes of periodic fever. Although management is essentially symptomatic, a single corticosteroid dose, tonsillectomy and cimetidine have been shown to be associated with resolution of symptoms. Although abdominal pain and genital ulcers have been reported in association with PFAPA, unilateral transient facial swelling has not been previously reported. The authors present a hitherto unreported association of PFAPA with recurrent episodes of unilateral facial swelling.

  10. Peripheral facial nerve palsy after therapeutic endoscopy.

    Science.gov (United States)

    Kim, Eun Jeong; Lee, Jun; Lee, Ji Woon; Lee, Jun Hyung; Park, Chol Jin; Kim, Young Dae; Lee, Hyun Jin

    2015-03-01

    Peripheral facial nerve palsy (FNP) is a mononeuropathy that affects the peripheral part of the facial nerve. Primary causes of peripheral FNP remain largely unknown, but detectable causes include systemic infections (viral and others), trauma, ischemia, tumor, and extrinsic compression. Peripheral FNP in relation to extrinsic compression has rarely been described in case reports. Here, we report a case of a 71-year-old man who was diagnosed with peripheral FNP following endoscopic submucosal dissection. This case is the first report of the development of peripheral FNP in a patient undergoing therapeutic endoscopy. We emphasize the fact that physicians should be attentive to the development of peripheral FNP following therapeutic endoscopy.

  11. Emotional communication in the context of joint attention for food stimuli: effects on attentional and affective processing.

    Science.gov (United States)

    Soussignan, Robert; Schaal, Benoist; Boulanger, Véronique; Garcia, Samuel; Jiang, Tao

    2015-01-01

    Guided by distinct theoretical frameworks (the embodiment theories, shared-signal hypothesis, and appraisal theories), we examined the effects of gaze direction and emotional expressions (joy, disgust, and neutral) of virtual characters on attention orienting and affective reactivity of participants while they were engaged in joint attention for food stimuli contrasted by preference (disliked, moderately liked, and liked). The participants were exposed to videos of avatars looking at food and displaying facial expressions with their gaze directed either toward the food only or toward the food and participants consecutively. We recorded eye-tracking responses, heart rate, facial electromyography (zygomatic, corrugator, and levator labii regions), and food wanting/liking. The avatars' joy faces increased the participants' zygomatic reactions and food liking, with mutual eye contact boosting attentional responses. Eye contact also fostered disgust reactions to disliked food, regardless of the avatars' expressions. The findings show that joint attention for food accompanied by face-to-face emotional communication elicits differential attentional and affective responses. The findings appear consistent with the appraisal theories of emotion.

  12. Effects of the condylar process fracture on facial symmetry in rats submitted to protein undernutrition

    Directory of Open Access Journals (Sweden)

    Lucimar Rodrigues

    2011-04-01

    PURPOSE: To investigate, by means of cephalometric measurements, the facial symmetry of rats submitted to experimental mandibular condyle fracture under protein undernutrition (8% protein diet). METHODS: Forty-five adult Wistar rats were distributed into three groups: a fracture group, submitted to condylar fracture with no change in diet; an undernourished fracture group, submitted to a hypoproteic diet and condylar fracture; and an undernourished group, kept until the end of the experiment without condylar fracture. Displaced fractures of the right condyle were induced under general anesthesia. The specimens were submitted to axial radiographic examination, and cephalometric measurements were made using a computer system. The values obtained were compared statistically among the groups and between the sides within each group. RESULTS: Serum protein and albumin values decreased significantly in the undernourished fracture group. There was deviation of the mandibular midline relative to the maxillary midline, significant in the undernourished fracture group, as well as asymmetry of the maxilla and mandible, especially at the end of the experiment. CONCLUSION: Mandibular condyle fracture in rats with protein undernutrition induced mandibular asymmetry, with consequences for the maxilla as well.

  13. Judgment of facial expressions and depression persistence

    NARCIS (Netherlands)

    Hale, WW

    1998-01-01

    Research has demonstrated that cognitive and interpersonal processes play significant roles in the development and persistence of depression. The judgment of emotions displayed in facial expressions by depressed patients allows for a better understanding of these processes. In this study, 48 maj

  14. Psychocentricity and participant profiles: Implications for lexical processing among multilinguals

    Directory of Open Access Journals (Sweden)

    Gary eLibben

    2014-06-01

    Lexical processing among bilinguals is often affected by complex patterns of individual experience. In this paper we discuss the psychocentric perspective on language representation and processing, which highlights the centrality of individual experience in psycholinguistic experimentation. We discuss applications to the investigation of lexical processing among multilinguals and explore the advantages of using high-density experiments, which are designed to co-index measures of lexical perception and production as well as participant profiles. We discuss the challenges associated with the characterization of participant profiles and present a new data visualization technique that we term Facial Profiles. The technique builds on Chernoff faces, developed over forty years ago, and seeks to overcome some of the challenges associated with their use while maintaining the core insight that recoding multivariate data as facial features can engage the human face recognition system and thus enhance our ability to detect and interpret patterns within multivariate datasets. We demonstrate that Facial Profiles can encode participant characteristics in lexical processing studies by mapping variables such as reading, speaking, and listening ability onto iconically related relative sizes of the eyes, mouth, and ears, respectively. The balance of abilities in bilinguals can be captured by creating composite, or Janus, Facial Profiles. We demonstrate the use of Facial Profiles and Janus Facial Profiles in the characterization of participant effects in the study of lexical perception and production.
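
    A minimal illustrative sketch of the idea behind Facial Profiles, not the authors' implementation: participant variables such as reading, speaking, and listening ability (assumed here to be scaled to the 0-1 range) are recoded as the relative sizes of the eyes, mouth, and ears on a schematic matplotlib face.

    # Draw schematic "facial profile" glyphs from participant ability scores.
    import matplotlib.pyplot as plt
    from matplotlib.patches import Circle, Ellipse

    def facial_profile(ax, reading, speaking, listening):
        """Draw one schematic face; inputs are abilities scaled to 0-1."""
        ax.add_patch(Circle((0, 0), 1.0, fill=False, lw=2))            # head outline
        eye_r = 0.05 + 0.15 * reading                                  # reading -> eye size
        ax.add_patch(Circle((-0.35, 0.3), eye_r, color="black"))
        ax.add_patch(Circle((0.35, 0.3), eye_r, color="black"))
        mouth_w = 0.2 + 0.6 * speaking                                 # speaking -> mouth width
        ax.add_patch(Ellipse((0, -0.4), mouth_w, 0.15, fill=False, lw=2))
        ear_h = 0.2 + 0.4 * listening                                  # listening -> ear height
        ax.add_patch(Ellipse((-1.05, 0), 0.15, ear_h, fill=False, lw=2))
        ax.add_patch(Ellipse((1.05, 0), 0.15, ear_h, fill=False, lw=2))
        ax.set_xlim(-1.5, 1.5)
        ax.set_ylim(-1.5, 1.5)
        ax.set_aspect("equal")
        ax.axis("off")

    # Two hypothetical participants with different ability balances.
    fig, axes = plt.subplots(1, 2, figsize=(6, 3))
    facial_profile(axes[0], reading=0.9, speaking=0.3, listening=0.6)
    facial_profile(axes[1], reading=0.4, speaking=0.8, listening=0.9)
    plt.show()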

  15. Altered cortical activation from the hand after facial Botulinum Toxin treatment

    OpenAIRE

    Haenzi, Sara; Stefanics, Gabor; Lanaras, Tatjana; Calcagni, Maurizio; Ghosh, Arko

    2014-01-01

    Plastic interactions between face and hand cortical tactile circuits occur after severe injuries that affect the hand such as in amputation or spinal cord injury. However, whether loss of facial movements alters the cortical circuits involved in processing tactile inputs from the hand remains unknown. In this prospective observational study we used electroencephalography (EEG) to measure cortical activity evoked by tactile stimulation of the hands before and after botulinum toxin-A-induced fa...

  16. Invasive facial fungal infections: Orofacial soft-tissue infiltration in immunocompromised patients

    OpenAIRE

    Jun, Peter; Russell, Matthew; El-Sayed, Ivan; Dillon, William; Glastonbury, Christine

    2015-01-01

    Invasive facial fungal infections affect the orofacial soft tissues in immunocompromised patients and can cause significant morbidity and mortality. Primary infection occurs from direct inoculation of the skin surface, while secondary infection occurs from extension from an adjacent sinonasal process. The imaging features of secondary infection are similar to acute fulminant invasive fungal sinusitis with infiltration of the orofacial soft tissues in combination with sinonasal disease. Howeve...

  17. The mechanism of valence-space metaphors: ERP evidence for affective word processing.

    Science.gov (United States)

    Xie, Jiushu; Wang, Ruiming; Chang, Song

    2014-01-01

    Embodied cognition contends that the representation and processing of concepts involve perceptual, somatosensory, motoric, and other bodily information. In this view, affective concepts are also grounded in physical experience: people say "feeling down" or "cheer up" in daily life, using spatial information to understand affective concepts. This mapping is referred to as the valence-space metaphor, the use of spatial information (lower/higher space) to elaborate affective concepts (negative/positive concepts). Previous studies have demonstrated that processing affective words affects performance on spatial detection tasks, but the mechanism behind this effect remains unclear. In the current study, we hypothesized that processing affective words activates spatial information, which in turn affects a subsequent spatial cue detection/discrimination task. In Experiment 1, participants were asked to remember an affective word and then completed a spatial cue detection task while event-related potentials were recorded. Top cues elicited enhanced P200 amplitudes when participants held positive rather than negative words in mind, whereas bottom cues elicited enhanced P200 amplitudes when participants held negative rather than positive words in mind. In Experiment 2, a behavioral experiment with a paradigm similar to Experiment 1 used arrows instead of dots to test the attentional nature of the valence-space metaphor, and a similar facilitation effect emerged: positive words facilitated the discrimination of upper arrows, whereas negative words facilitated the discrimination of lower arrows. In summary, affective words appear to activate spatial information and cause participants to allocate their attention to corresponding locations
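
    A minimal sketch, in plain NumPy rather than the authors' ERP pipeline, of the kind of measure reported in Experiment 1: the mean amplitude in an assumed P200 window (150-250 ms post-cue) averaged over trials for one electrode and condition. Array shapes, sampling rate, baseline, and window are assumptions.

    # Compute mean amplitude in a P200 window from epoched single-electrode data.
    import numpy as np

    def p200_mean_amplitude(epochs: np.ndarray, sfreq: float,
                            tmin: float = -0.1, win=(0.150, 0.250)) -> float:
        """epochs: (n_trials, n_samples) at one electrode; epoch starts at tmin s."""
        start = int((win[0] - tmin) * sfreq)
        stop = int((win[1] - tmin) * sfreq)
        return float(epochs[:, start:stop].mean())

    # Synthetic example: 40 trials per condition, epochs from -100 ms to +500 ms.
    rng = np.random.default_rng(1)
    sfreq = 500.0
    n_samples = int(0.6 * sfreq)
    pos_word_top_cue = rng.normal(2.0, 1.0, (40, n_samples))   # microvolts (synthetic)
    neg_word_top_cue = rng.normal(1.0, 1.0, (40, n_samples))
    print(p200_mean_amplitude(pos_word_top_cue, sfreq),
          p200_mean_amplitude(neg_word_top_cue, sfreq))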

  18. Facial dysostoses: Etiology, pathogenesis and management.

    Science.gov (United States)

    Trainor, Paul A; Andrews, Brian T

    2013-11-01

    Approximately 1% of all live births exhibit a minor or major congenital anomaly. Of these, approximately one-third display craniofacial abnormalities, which are a significant cause of infant mortality and dramatically affect national health care budgets. To date, more than 700 distinct craniofacial syndromes have been described, and in this review we discuss the etiology, pathogenesis and management of facial dysostoses with a particular emphasis on Treacher Collins, Nager and Miller syndromes. As we continue to develop and improve medical and surgical care for the management of individual conditions, it is essential at the same time to better characterize their etiology and pathogenesis. Here we describe recent advances in our understanding of the development of facial dysostosis with a view towards early in utero identification and intervention, which could minimize the manifestation of anomalies prior to birth. The ultimate management for any craniofacial anomaly, however, would be prevention, and we discuss this possibility in relation to facial dysostosis.

  19. Facial Bell’s palsy affects default mode network connectivity

    Institute of Scientific and Technical Information of China (English)

    Abdalla Z Mohamed; Chuanfu Li; Jeungchan Lee; Seulgi Eun; Yifang Zhu; Yuanyuan Wu; Jun Yang; Kyungmo Park

    2014-01-01

    Objective: Bell's palsy (BP) is a common idiopathic peripheral neuropathy of the facial nerve (CN VII) that causes loss of control of the facial muscles on the affected side. We investigated the effects of BP on resting-state default mode network (DMN) connectivity, reflecting neuroplasticity in the brain. Materials and Methods: Resting-state fMRI data were acquired on a 1.5 T MR scanner from 35 healthy volunteers and 52 BP patient sessions (some patients participated more than once) at different pathological stages, defined by disease duration and House-Brackmann score. A dual-regression independent component analysis (ICA) approach was used for the functional connectivity analysis. Results: DMN connectivity changed differently across the stages of BP. In the early group, DMN connectivity was increased with the right primary sensory cortex, right primary motor cortex, and right dorsolateral prefrontal cortex; in the late group, it was increased with the bilateral middle cingulate cortex, bilateral precuneus, right ventromedial prefrontal cortex, and right posterior cingulate cortex; in the recovered group, it was increased with the left lingual gyrus and left cerebellum. Conclusion: DMN connectivity differs markedly across the pathological stages of BP, involving brain regions related to sensorimotor function, motor mapping, affect, and cognition.
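
    A minimal NumPy sketch of the dual-regression step named above, not the FSL implementation the authors presumably used: stage 1 regresses group ICA spatial maps against one subject's fMRI data to obtain subject-specific time courses, and stage 2 regresses those time courses against the data to obtain subject-specific spatial maps (one of which would correspond to the DMN). Shapes are assumptions, and demeaning/variance normalization is omitted for brevity.

    # Dual regression: group ICA maps + subject data -> subject time courses and maps.
    import numpy as np

    def dual_regression(group_maps: np.ndarray, data: np.ndarray):
        """group_maps: (n_components, n_voxels); data: (n_timepoints, n_voxels)."""
        # Stage 1: spatial regression -> time courses (n_timepoints, n_components).
        timecourses, *_ = np.linalg.lstsq(group_maps.T, data.T, rcond=None)
        timecourses = timecourses.T
        # Stage 2: temporal regression -> subject maps (n_components, n_voxels).
        subject_maps, *_ = np.linalg.lstsq(timecourses, data, rcond=None)
        return timecourses, subject_maps

    # Synthetic example: 10 components, 5000 voxels, 180 volumes.
    rng = np.random.default_rng(2)
    group_maps = rng.normal(size=(10, 5000))
    data = rng.normal(size=(180, 5000))
    tcs, maps = dual_regression(group_maps, data)
    print(tcs.shape, maps.shape)   # (180, 10) (10, 5000)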

  20. Perception of health from facial cues

    Science.gov (United States)

    Henderson, Audrey J.; Holzleitner, Iris J.; Talamas, Sean N.

    2016-01-01

    Impressions of health are integral to social interactions, yet poorly understood. A review of the literature reveals multiple facial characteristics that potentially act as cues to health judgements. The cues vary in their stability across time: structural shape cues, including symmetry and sexual dimorphism, alter slowly across the lifespan and have been found to have weak links to actual health, but show inconsistent effects on perceived health. Facial adiposity changes over a medium time course and is associated with both perceived and actual health. Skin colour alters over a short time and has strong effects on perceived health, yet its links to health outcomes have barely been evaluated. The review also suggested an additional influence of demeanour as a perceptual cue to health. We therefore investigated the association of health judgements with multiple facial cues measured objectively from two-dimensional and three-dimensional facial images. We found evidence for independent contributions of face shape and skin colour cues to perceived health. Our empirical findings: (i) reinforce the role of skin yellowness; (ii) demonstrate the utility of global face shape measures of adiposity; and (iii) emphasize the role of affect in facial images with nominally neutral expression in impressions of health. PMID:27069057
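
    A minimal sketch, under assumed variable names, of how independent contributions of a face shape cue (adiposity) and a skin colour cue (yellowness) to perceived health could be estimated with ordinary least squares; this is an illustration of the analysis idea, not the pipeline reported above.

    # Estimate independent contributions of two facial cues to perceived health.
    import numpy as np

    def ols_coefficients(X: np.ndarray, y: np.ndarray) -> np.ndarray:
        """Return [intercept, slopes...] for y ~ X via least squares."""
        X1 = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        return beta

    # Synthetic data: 100 faces with an adiposity measure and a yellowness measure.
    rng = np.random.default_rng(3)
    n = 100
    adiposity = rng.normal(size=n)
    yellowness = rng.normal(size=n)
    perceived_health = -0.3 * adiposity + 0.5 * yellowness + rng.normal(scale=0.5, size=n)
    X = np.column_stack([adiposity, yellowness])
    print(ols_coefficients(X, perceived_health))  # [intercept, shape slope, colour slope]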