WorldWideScience

Sample records for facial affect processing

  1. Facial affect processing and depression susceptibility: cognitive biases and cognitive neuroscience.

    Science.gov (United States)

    Bistricky, Steven L; Ingram, Rick E; Atchley, Ruth Ann

    2011-11-01

    Facial affect processing is essential to social development and functioning and is particularly relevant to models of depression. Although cognitive and interpersonal theories have long described different pathways to depression, cognitive-interpersonal and evolutionary social risk models of depression focus on the interrelation of interpersonal experience, cognition, and social behavior. We therefore review the burgeoning depressive facial affect processing literature and examine its potential for integrating disciplines, theories, and research. In particular, we evaluate studies in which information processing or cognitive neuroscience paradigms were used to assess facial affect processing in depressed and depression-susceptible populations. Most studies have assessed and supported cognitive models. This research suggests that depressed and depression-vulnerable groups show abnormal facial affect interpretation, attention, and memory, although findings vary based on depression severity, comorbid anxiety, and the length of time faces are viewed. Facial affect processing biases appear to correspond with distinct neural activity patterns and with increased depressive emotion and thought. Biases typically emerge in depressed moods but are occasionally found in their absence. Indirect evidence suggests that childhood neglect might cultivate abnormal facial affect processing, which can impede social functioning in ways consistent with cognitive-interpersonal and interpersonal models. However, the reviewed studies provide mixed support for the social risk model's prediction that depressive states prompt cognitive hypervigilance to social threat information. We recommend prospective interdisciplinary research examining whether facial affect processing abnormalities promote, or are promoted by, depressogenic attachment experiences, negative thinking, and social dysfunction.

  2. Relation between facial affect recognition and configural face processing in antipsychotic-free schizophrenia.

    Science.gov (United States)

    Fakra, Eric; Jouve, Elisabeth; Guillaume, Fabrice; Azorin, Jean-Michel; Blin, Olivier

    2015-03-01

    Deficit in facial affect recognition is a well-documented impairment in schizophrenia, closely connected to social outcome. This deficit could be related to psychopathology, but also to a broader dysfunction in processing facial information. In addition, patients with schizophrenia inadequately use configural information, a type of processing that relies on spatial relationships between facial features. To date, no study has specifically examined the link between symptoms and misuse of configural information in the deficit in facial affect recognition. Unmedicated schizophrenia patients (n = 30) and matched healthy controls (n = 30) performed a facial affect recognition task and a face inversion task, which tests the aptitude to rely on configural information. In patients, regressions were carried out between facial affect recognition, symptom dimensions, and the inversion effect. Patients, compared with controls, showed a deficit in facial affect recognition and a lower inversion effect. Negative symptoms and a lower inversion effect accounted for 41.2% of the variance in facial affect recognition. This study confirms the presence of a deficit in facial affect recognition, and also of dysfunctional use of configural information, in antipsychotic-free patients. Negative symptoms and poor processing of configural information explained a substantial part of the deficient recognition of facial affect. We speculate that this deficit may be driven by several independent factors, among them psychopathology and a failure to use configural information correctly. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  3. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia.

    Science.gov (United States)

    Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue

    2009-06-15

    Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.

  4. Exploring the nature of facial affect processing deficits in schizophrenia

    NARCIS (Netherlands)

    Wout, Mascha van 't; Aleman, Andre; Kessels, Roy P. C.; Cahn, Wiepke; Haan, Edward H. F. de; Kahn, Rene S.

    2007-01-01

    Schizophrenia has been associated with deficits in facial affect processing, especially negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as…

  6. Automatic processing of facial affects in patients with borderline personality disorder: associations with symptomatology and comorbid disorders.

    Science.gov (United States)

    Donges, Uta-Susan; Dukalski, Bibiana; Kersting, Anette; Suslow, Thomas

    2015-01-01

    Instability of affect and interpersonal relations are important features of borderline personality disorder (BPD). Interpersonal problems of individuals suffering from BPD might develop based on abnormalities in the processing of facial affects and high sensitivity to negative affective expressions. The aims of the present study were to examine automatic evaluative shifts and latencies as a function of masked facial affects in patients with BPD compared to healthy individuals. As BPD comorbidity rates for mental and personality disorders are high, we also investigated the relationships of affective processing characteristics with specific borderline symptoms and comorbidity. Twenty-nine women with BPD and 38 healthy women participated in the study. The majority of patients suffered from additional Axis I disorders and/or additional personality disorders. In the priming experiment, an angry, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Evaluative decisions and response latencies were registered. Borderline-typical symptomatology was assessed with the Borderline Symptom List. In the total sample, valence-congruent evaluative shifts and delays of evaluative decision due to facial affect were observed. No between-group differences were obtained for evaluative decisions and latencies. The presence of comorbid anxiety disorders was positively correlated with evaluative shifting owing to masked happy primes, regardless of baseline condition (neutral or no facial expression). The presence of comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression were significantly correlated with response delay due to masked angry faces, regardless of baseline. In the present affective priming study, no abnormalities in the automatic recognition and processing of facial affects were observed in BPD patients compared to healthy individuals.

  7. Neural bases of different cognitive strategies for facial affect processing in schizophrenia.

    Science.gov (United States)

    Fakra, Eric; Salgado-Pineda, Pilar; Delaveau, Pauline; Hariri, Ahmad R; Blin, Olivier

    2008-03-01

    To examine the neural basis and dynamics of facial affect processing in schizophrenia patients as compared to healthy controls, fourteen schizophrenia patients and fourteen matched controls performed a facial affect identification task during fMRI acquisition. The emotional task included an intuitive emotional condition (matching emotional faces) and a more cognitively demanding condition (labeling emotional faces). Individual analyses for each emotional condition, and second-level t-tests examining both within- and between-group differences, were carried out using a random effects approach. Psychophysiological interactions (PPI) were tested for variations in functional connectivity between the amygdala and other brain regions as a function of changes in experimental condition (labeling versus matching). During the labeling condition, both groups engaged similar networks. During the matching condition, patients with schizophrenia failed to activate regions of the limbic system implicated in the automatic processing of emotions. PPI revealed an inverse functional connectivity between prefrontal regions and the left amygdala in healthy volunteers, but no such change in patients. Furthermore, during the matching condition, and compared to controls, patients showed decreased activation of regions involved in holistic face processing (fusiform gyrus) and increased activation of regions associated with feature analysis (inferior parietal cortex, left middle temporal lobe, right precuneus). Our findings suggest that schizophrenia patients invariably adopt a cognitive approach when identifying facial affect. The distributed neocortical network observed during the intuitive condition indicates that patients may resort to feature-based, rather than configuration-based, processing, which may constitute a compensatory strategy for limbic dysfunction.

  8. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    Science.gov (United States)

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ- and age-matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity, or by facial identity…

  9. Neurobiological mechanisms associated with facial affect recognition deficits after traumatic brain injury.

    Science.gov (United States)

    Neumann, Dawn; McDonald, Brenna C; West, John; Keiski, Michelle A; Wang, Yang

    2016-06-01

    The neurobiological mechanisms that underlie facial affect recognition deficits after traumatic brain injury (TBI) have not yet been identified. Using functional magnetic resonance imaging (fMRI), the study aims were to 1) determine whether there are differences in brain activation during facial affect processing in people with TBI who have facial affect recognition impairments (TBI-I) relative to people with TBI and healthy controls who do not have facial affect recognition impairments (TBI-N and HC, respectively); and 2) identify relationships between neural activity and facial affect recognition performance. A facial affect recognition screening task performed outside the scanner was used to determine group classification; TBI patients who performed more than one standard deviation below normal performance scores were classified as TBI-I, while TBI patients with normal scores were classified as TBI-N. An fMRI facial recognition paradigm was then performed within the 3T environment. Results from 35 participants are reported (TBI-I = 11, TBI-N = 12, and HC = 12). For the fMRI task, the TBI-I and TBI-N groups scored significantly lower than the HC group. Blood oxygenation level-dependent (BOLD) signals for facial affect recognition, compared to a baseline condition of viewing a scrambled face, revealed lower neural activation in the right fusiform gyrus (FG) in the TBI-I group than in the HC group. Right fusiform gyrus activity correlated with accuracy on the facial affect recognition tasks (both within and outside the scanner). Decreased FG activity suggests facial affect recognition deficits after TBI may be the result of impaired holistic face processing. Future directions and clinical implications are discussed.

  10. Representing affective facial expressions for robots and embodied conversational agents by facial landmarks

    NARCIS (Netherlands)

    Liu, C.; Ham, J.R.C.; Postma, E.O.; Midden, C.J.H.; Joosten, B.; Goudbeek, M.

    2013-01-01

    Affective robots and embodied conversational agents require convincing facial expressions to make them socially acceptable. To be able to virtually generate facial expressions, we need to investigate the relationship between technology and human perception of affective and social signals. Facial…

  11. Neural mechanism for judging the appropriateness of facial affect.

    Science.gov (United States)

    Kim, Ji-Woong; Kim, Jae-Jin; Jeong, Bum Seok; Ki, Seon Wan; Im, Dong-Mi; Lee, Soo Jung; Lee, Hong Shick

    2005-12-01

    Questions regarding the appropriateness of facial expressions in particular situations arise ubiquitously in everyday social interactions. To determine the appropriateness of facial affect, we must first represent our own or the other person's emotional state as induced by the social situation, and then, based on these representations, infer the other person's likely affective response. In this study, we identified the brain mechanism mediating a special type of social evaluative judgment of facial affect in which the internal reference is related to theory of mind (ToM) processing. Many previous ToM studies have used non-emotional stimuli; however, because so much valuable social information is conveyed through nonverbal emotional channels, this investigation used emotionally salient visual materials to tap ToM. Fourteen right-handed healthy subjects volunteered for our study. We used functional magnetic resonance imaging to examine brain activation during a judgment task on the appropriateness of facial affect, as opposed to a gender-matching task. We identified activation of a brain network that includes the medial frontal cortex, left temporal pole, left inferior frontal gyrus, and left thalamus during the judgment task compared to the gender-matching task. The results of this study suggest that the brain system involved in ToM plays a key role in judging the appropriateness of facial affect in an emotionally laden situation. In addition, our results support the idea that common neural substrates are involved in performing diverse kinds of ToM tasks, irrespective of perceptual modality and the emotional salience of the test materials.

  12. Stability of Facial Affective Expressions in Schizophrenia

    Directory of Open Access Journals (Sweden)

    H. Fatouros-Bergman

    2012-01-01

    Thirty-two video-recorded interviews were conducted by two interviewers with eight patients diagnosed with schizophrenia. Each patient was interviewed four times: three weekly interviews by the first interviewer and one additional interview by the second interviewer. Sixty-four selected sequences in which the patients were speaking about psychotic experiences were scored for facial affective behaviour with the Emotion Facial Action Coding System (EMFACS). In accordance with previous research, the results show that patients diagnosed with schizophrenia express negative facial affectivity. Facial affective behaviour seems not to depend on temporality, since a within-subjects ANOVA revealed no substantial changes in the amount of affect displayed across the weekly interview occasions. Whereas previous research found contempt to be the most frequent affect in patients, in the present material disgust was as common, but depended on the interviewer. The results suggest that facial affectivity in these patients is dominated primarily by the negative emotion of disgust and, to a lesser extent, contempt, and that this appears to be a fairly stable feature.

  13. Changing facial affect recognition in schizophrenia: Effects of training on brain dynamics

    Directory of Open Access Journals (Sweden)

    Petia Popova

    2014-01-01

    Deficits in social cognition, including facial affect recognition, and their detrimental effects on functional outcome are well established in schizophrenia. Structured training can have substantial effects on social cognitive measures, including facial affect recognition. Elucidating training effects on cortical mechanisms involved in facial affect recognition may identify causes of dysfunctional facial affect recognition in schizophrenia and foster remediation strategies. In the present study, 57 schizophrenia patients were randomly assigned to (a) computer-based facial affect training that focused on affect discrimination and working memory in 20 daily 1-hour sessions, (b) similarly intense, targeted cognitive training on auditory-verbal discrimination and working memory, or (c) treatment as usual. Neuromagnetic activity was measured before and after training during a dynamic facial affect recognition task (5 s videos showing human faces gradually changing from neutral to fearful or happy expressions). Effects on 10–13 Hz (alpha) power during the transition from neutral to emotional expressions were assessed via MEG, based on previous findings that the alpha power increase is related to facial affect recognition and is smaller in schizophrenia than in healthy subjects. Targeted affect training improved overt performance on the training tasks. Moreover, the alpha power increase during the dynamic facial affect recognition task was larger after affect training than after treatment as usual, though similar to that after targeted perceptual-cognitive training, indicating somewhat nonspecific benefits. Alpha power modulation was unrelated to general neuropsychological test performance, which improved in all groups. The results suggest that specific neural processes supporting facial affect recognition, evident in oscillatory phenomena, are modifiable. This should be considered when developing remediation strategies targeting social cognition in schizophrenia.

  14. Serotonin transporter gene-linked polymorphism affects detection of facial expressions.

    Directory of Open Access Journals (Sweden)

    Ai Koizumi

    Previous studies have demonstrated that the serotonin transporter gene-linked polymorphic region (5-HTTLPR) affects the recognition of facial expressions and attention to them. However, the relationship between 5-HTTLPR and the perceptual detection of others' facial expressions, the process which takes place prior to emotional labeling (i.e., recognition), is not clear. To examine whether the perceptual detection of emotional facial expressions is influenced by the allelic variation (short/long) of 5-HTTLPR, happy and sad facial expressions were presented at weak and mid intensities (25% and 50%). Ninety-eight participants, genotyped for 5-HTTLPR, judged whether emotion was present in images of faces. Participants with short alleles showed higher sensitivity (d') to happy than to sad expressions, while participants with long allele(s) showed no such positivity advantage. This effect of 5-HTTLPR was found at different facial expression intensities in males and in females. The results suggest that, at the perceptual stage, a short allele enhances the processing of positive facial expressions rather than that of negative facial expressions.
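    The d' (d-prime) sensitivity index reported above is standardly computed as the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch of that standard formula, with invented rates that are not data from the study:

```python
# Sketch of the signal-detection sensitivity index d' = z(H) - z(FA),
# where z is the inverse of the standard normal CDF.
# The example rates below are made up for illustration only.
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# e.g. 80% hits vs. 20% false alarms on "is an emotion present?" judgments
print(round(d_prime(0.80, 0.20), 3))  # ≈ 1.683
```

Higher d' means better discrimination of emotional from neutral faces; d' = 0 means hits and false alarms are equally likely (no sensitivity).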

  15. Modulation of α power and functional connectivity during facial affect recognition.

    Science.gov (United States)

    Popov, Tzvetan; Miller, Gregory A; Rockstroh, Brigitte; Weisz, Nathan

    2013-04-03

    Research has linked oscillatory activity in the α frequency range, particularly in sensorimotor cortex, to the processing of social actions. Results further suggest involvement of sensorimotor α in the processing of facial expressions, including affect. The sensorimotor face area may be critical for perception of emotional face expression, but the role it plays is unclear. The present study sought to clarify how oscillatory brain activity contributes to or reflects processing of facial affect during changes in facial expression. Neuromagnetic oscillatory brain activity was monitored while 30 volunteers viewed videos of human faces whose expression changed from neutral to fearful, neutral, or happy expressions. Induced changes in α power during the different morphs, source analysis, and graph-theoretic metrics served to identify the role of α power modulation and cross-regional coupling, by means of phase synchrony, during facial affect recognition. Changes from neutral to emotional faces were associated with a 10-15 Hz power increase localized in bilateral sensorimotor areas, together with an occipital power decrease, preceding reported emotional expression recognition. Graph-theoretic analysis revealed that, in the course of a trial, the balance between sensorimotor power increase and decrease was associated with decreased and increased transregional connectedness, as measured by node degree. The results suggest that modulations in α power facilitate early registration, with sensorimotor cortex, including the sensorimotor face area, largely functionally decoupled and thereby protected from additional, disruptive input, and that the subsequent α power decrease, together with increased connectedness of sensorimotor areas, facilitates successful facial affect recognition.

  16. Event-related theta synchronization predicts deficit in facial affect recognition in schizophrenia.

    Science.gov (United States)

    Csukly, Gábor; Stefanics, Gábor; Komlósi, Sarolta; Czigler, István; Czobor, Pál

    2014-02-01

    Growing evidence suggests that abnormalities in the synchronized oscillatory activity of neurons in schizophrenia may lead to impaired neural activation and temporal coding, and thus to neurocognitive dysfunctions such as deficits in facial affect recognition. To gain insight into the neurobiological processes linked to facial affect recognition, we investigated both induced and evoked oscillatory activity by calculating the Event-Related Spectral Perturbation (ERSP) and the Inter-Trial Coherence (ITC) during facial affect recognition. Fearful and neutral faces as well as nonface patches were presented to 24 patients with schizophrenia and 24 matched healthy controls while EEG was recorded. The participants' task was to recognize facial expressions. Because previous findings with healthy controls showed that facial feature decoding was associated primarily with oscillatory activity in the theta band, we analyzed ERSP and ITC in this frequency band in the time interval of 140-200 ms, which corresponds to the N170 component. Event-related theta activity and phase-locking to facial expressions, but not to nonface patches, predicted emotion recognition performance in both controls and patients. Event-related changes in theta amplitude and phase-locking were significantly weaker in patients than in healthy controls, which is in line with previous investigations showing decreased neural synchronization in the low frequency bands in patients with schizophrenia. Neural synchrony is thought to underlie distributed information processing. Our results indicate less effective processing of facial features during recognition, which may contribute to impaired social cognition in schizophrenia.
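    The Inter-Trial Coherence (ITC) measure mentioned above is commonly defined as the magnitude of the across-trials mean of unit-length phase vectors: 1 indicates perfect phase-locking across trials, values near 0 indicate random phases. A minimal sketch of that common definition, using synthetic phase values rather than study data:

```python
# Sketch of Inter-Trial Coherence: ITC = |mean over trials of exp(i*phase)|.
# Phases are in radians; the values below are synthetic examples.
import cmath
import math

def itc(phases):
    """Magnitude of the mean unit phase vector across trials (0..1)."""
    return abs(sum(cmath.exp(1j * p) for p in phases) / len(phases))

locked = [0.10, 0.12, 0.09, 0.11]                            # tightly clustered
random_like = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]   # evenly spread

print(round(itc(locked), 2), round(itc(random_like), 2))  # 1.0 0.0
```

In practice the phases come from a time-frequency decomposition (e.g., the theta band at 140-200 ms post-stimulus), computed per trial at each electrode.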

  17. [Measuring impairment of facial affect recognition in schizophrenia: preliminary study of the facial emotions recognition task (TREF)].

    Science.gov (United States)

    Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N

    2015-06-01

    …without psychiatric diagnosis. The study also allowed the identification of cut-off scores; results below 2 standard deviations of the healthy control average (61.57%) pointed to a facial affect recognition deficit. The TREF appears to be a useful tool for identifying facial affect recognition impairment in schizophrenia. Neuropsychologists who have tried this task have given positive feedback. The TREF is easy to use (duration of about 15 minutes), easy to administer to subjects with attentional difficulties, and tests facial affect recognition at ecological intensity levels. These results have to be confirmed in the future with larger sample sizes and in comparison with other tasks evaluating facial affect recognition processes. Copyright © 2014 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
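    The cut-off logic described above (a score more than 2 standard deviations below the healthy-control mean indicates a deficit) can be sketched as follows. The control scores here are invented for illustration and are not the TREF norms:

```python
# Sketch of a "mean - 2 SD" deficit cut-off, as used in normative screening.
# The control accuracy scores (in %) below are invented, not TREF data.
from statistics import mean, stdev

control_scores = [78.0, 82.5, 75.0, 88.0, 80.5, 79.0, 84.0, 77.5]  # invented
cutoff = mean(control_scores) - 2 * stdev(control_scores)

print(f"deficit if accuracy < {cutoff:.2f}%")  # deficit if accuracy < 72.27%
```

The same rule underlies the TBI study in record 9, which used a 1-SD criterion (mean - 1 * stdev) to split impaired from unimpaired patients.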

  18. Perceptual and affective mechanisms in facial expression recognition: An integrative review.

    Science.gov (United States)

    Calvo, Manuel G; Nummenmaa, Lauri

    2016-09-01

    Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.

  19. Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.

    Science.gov (United States)

    van Dijke, Annemiek; van 't Wout, Mascha; Ford, Julian D; Aleman, André

    2016-01-01

    Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group, and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD patients did not differ significantly from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.

  1. Categorical Perception of Affective and Linguistic Facial Expressions

    Science.gov (United States)

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…

  2. Facial Expression at Retrieval Affects Recognition of Facial Identity

    Directory of Open Access Journals (Sweden)

    Wenfeng eChen

    2015-06-01

    It is well known that memory can be modulated by emotional stimuli at the time of encoding and consolidation. For example, faces learned with happy expressions yield better identity recognition than faces with certain other expressions. However, the influence of facial expression at the time of retrieval has remained unknown. To separate the potential influence of expression at retrieval from its effects at earlier stages, we had participants learn neutral faces but manipulated facial expression at the time of memory retrieval in a standard old/new recognition task. The results showed a clear effect of facial expression: happy test faces were identified more successfully than angry test faces. This effect is unlikely to be due to greater image similarity between the neutral learning face and the happy test face, because image analysis showed that the happy test faces were in fact less similar to the neutral learning faces than the angry test faces were. In the second experiment, we investigated whether this emotional effect is influenced by the expression at the time of learning. We employed angry or happy faces as learning stimuli, and angry, happy, and neutral faces as test stimuli. The results showed that the emotional effect at retrieval is robust across encoding conditions with happy or angry expressions. These findings indicate that emotional expressions affect the retrieval process in identity recognition, and that identity recognition does not rely on an emotional association between learning and test faces.

  3. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    Directory of Open Access Journals (Sweden)

    Uta-Susan Donges

    There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expressions. In our priming experiment, a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by a neutral face that had to be evaluated. Eighty-one young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expressions elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming from happy faces than men did. Women thus seem to have a greater ability than men to perceive and respond to positive facial emotion at an automatic processing level. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

  4. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    Science.gov (United States)

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expressions. In our priming experiment, a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by a neutral face that had to be evaluated. Eighty-one young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expressions elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming from happy faces than men did. Women thus seem to have a greater ability than men to perceive and respond to positive facial emotion at an automatic processing level. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.
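A practical detail in masked-priming designs like this one is that nominal prime durations must be realized as whole display frames: 33 ms corresponds to two frames on a 60 Hz monitor. A small helper illustrating the conversion (a sketch for illustration, not code from the study):

```python
def frames_for_duration(duration_ms: float, refresh_hz: float) -> int:
    """Whole display frames closest to a requested stimulus duration."""
    frame_ms = 1000.0 / refresh_hz
    return max(1, round(duration_ms / frame_ms))

def actual_duration_ms(duration_ms: float, refresh_hz: float) -> float:
    """Duration actually presented once rounded to whole frames."""
    return frames_for_duration(duration_ms, refresh_hz) * 1000.0 / refresh_hz

print(frames_for_duration(33, 60))           # 2 frames
print(round(actual_duration_ms(33, 60), 1))  # 33.3 ms
```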

  5. On Assisting a Visual-Facial Affect Recognition System with Keyboard-Stroke Pattern Information

    Science.gov (United States)

    Stathopoulou, I.-O.; Alepis, E.; Tsihrintzis, G. A.; Virvou, M.

    Towards realizing a multimodal affect recognition system, we consider the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy of affective user models. Specifically, we present and discuss the development and evaluation of two corresponding affect recognition subsystems, with emphasis on the recognition of six states: happiness, sadness, surprise, anger, and disgust, plus the emotionless state, which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information, and that the combination of the two modalities can lead to better results towards building a multimodal affect recognition system.

  6. Facial Affect Recognition and Social Anxiety in Preschool Children

    Science.gov (United States)

    Ale, Chelsea M.; Chorney, Daniel B.; Brice, Chad S.; Morris, Tracy L.

    2010-01-01

    Research relating anxiety and facial affect recognition has focused mostly on school-aged children and adults and has yielded mixed results. The current study sought to demonstrate an association among behavioural inhibition and parent-reported social anxiety, shyness, social withdrawal and facial affect recognition performance in 30 children,…

  7. Cognitive Processing about Classroom-Relevant Contexts: Teachers' Attention to and Utilization of Girls' Body Size, Ethnicity, Attractiveness, and Facial Affect

    Science.gov (United States)

    Wang, Shirley S.; Treat, Teresa A.; Brownell, Kelly D.

    2008-01-01

    This study examines 2 aspects of cognitive processing in person perception--attention and decision making--in classroom-relevant contexts. Teachers completed 2 implicit, performance-based tasks that characterized attention to and utilization of 4 student characteristics of interest: ethnicity, facial affect, body size, and attractiveness. Stimuli…

  8. Distinct facial processing in schizophrenia and schizoaffective disorders

    Science.gov (United States)

    Chen, Yue; Cataldo, Andrea; Norton, Daniel J; Ongur, Dost

    2011-01-01

    Although schizophrenia and schizoaffective disorders have both similar and differing clinical features, it is not well understood whether similar or differing pathophysiological processes mediate patients’ cognitive functions. Using psychophysical methods, this study compared the performances of schizophrenia (SZ) patients, patients with schizoaffective disorder (SA), and a healthy control group in two face-related cognitive tasks: emotion discrimination, which tested perception of facial affect, and identity discrimination, which tested perception of non-affective facial features. Compared to healthy controls, SZ patients, but not SA patients, exhibited deficient performance in both fear and happiness discrimination, as well as identity discrimination. SZ patients, but not SA patients, also showed impaired performance in a theory-of-mind task for which emotional expressions are identified based upon the eye regions of face images. This pattern of results suggests distinct processing of face information in schizophrenia and schizoaffective disorders. PMID:21868199

  9. Schizophrenia and processing of facial emotions : Sex matters

    NARCIS (Netherlands)

    Scholten, MRM; Aleman, A; Montagne, B; Kahn, RS

    2005-01-01

    The aim of this study was to examine sex differences in emotion processing in patients with schizophrenia and control subjects. To this end, 53 patients with schizophrenia (28 men and 25 women), and 42 controls (21 men and 21 women) were assessed with the use of a facial affect recognition morphing

  10. Sex Differences in Affective Facial Reactions Are Present in Childhood

    Directory of Open Access Journals (Sweden)

    Luigi Cattaneo

    2018-05-01

    Adults exposed to affective facial displays produce specific rapid facial reactions (RFRs), which are of lower intensity in males than in females. We investigated this sex difference in a population of 60 primary-school children (30 F, 30 M) aged 7–10 years. We recorded the surface electromyographic (EMG) signal from the corrugator supercilii and zygomaticus muscles while children watched affective facial displays. Results showed the expected smiling RFR to smiling faces and the expected frowning RFR to sad faces. A systematic difference between male and female participants was observed, with boys showing less ample EMG responses than age-matched girls. We demonstrate that sex differences in the somatic component of affective motor patterns are already present in childhood.
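RFRs of the kind reported here are commonly quantified as baseline-corrected EMG amplitude per muscle site: activation above baseline over the zygomaticus region indexes smiling, and over the corrugator region indexes frowning. A minimal sketch under that assumption (the data, window boundaries, and function names are hypothetical):

```python
def mean(xs):
    return sum(xs) / len(xs)

def rfr_amplitude(emg, baseline_end, response_start, response_end):
    """Baseline-corrected mean rectified EMG in the response window.

    emg: list of rectified EMG samples for one muscle site (e.g. corrugator).
    The indices delimit the pre-stimulus baseline and the post-stimulus
    response window within the trial.
    """
    baseline = mean(emg[:baseline_end])
    response = mean(emg[response_start:response_end])
    return response - baseline  # > 0 means activation above baseline

# Hypothetical corrugator trace: a flat baseline, then a frowning response.
trace = [1.0] * 10 + [1.0, 2.0, 3.0, 3.0, 2.0]
print(rfr_amplitude(trace, baseline_end=10, response_start=10, response_end=15))
```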

  11. Processing of unattended facial emotions: a visual mismatch negativity study.

    Science.gov (United States)

    Stefanics, Gábor; Csukly, Gábor; Komlósi, Sarolta; Czobor, Pál; Czigler, István

    2012-02-01

    Facial emotions express our internal states and are fundamental in social interactions. Here we explore whether the repetition of unattended facial emotions builds up a predictive representation of frequently encountered emotions in the visual system. Participants (n=24) were presented peripherally with facial stimuli expressing emotions while they performed a visual detection task presented at the center of the visual field. Facial stimuli consisted of four faces of different identity that expressed the same emotion (happy or fearful). Facial stimuli were presented in blocks in an oddball sequence (standard emotion: p=0.9; deviant emotion: p=0.1). Event-related potentials (ERPs) to the same emotions were compared when the emotions were deviant and standard, respectively. We found visual mismatch negativity (vMMN) responses to unattended deviant emotions in the 170-360 ms post-stimulus range over bilateral occipito-temporal sites. Our results demonstrate that information about the emotional content of unattended faces presented at the periphery of the visual field is rapidly processed and stored in a predictive memory representation by the visual system. We also found evidence that differential processing of deviant fearful faces begins as early as 70-120 ms after stimulus onset, indicating a 'negativity bias' under unattended conditions. Differential processing of fearful deviants was more pronounced in the right hemisphere in the 195-275 ms and 360-390 ms intervals, whereas processing of happy deviants evoked a larger differential response in the left hemisphere in the 360-390 ms range, indicating differential hemispheric specialization for automatic processing of positive and negative affect. Copyright © 2011 Elsevier Inc. All rights reserved.
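Oddball blocks of the kind described above (standard p = 0.9, deviant p = 0.1) can be sketched as a constrained random sequence. One common additional constraint, assumed here purely for illustration, is that two deviants never occur back to back:

```python
import random

def oddball_sequence(n_trials, p_deviant=0.1, seed=0):
    """Return a list of 'standard'/'deviant' trials with no adjacent deviants."""
    rng = random.Random(seed)
    seq = []
    for _ in range(n_trials):
        if seq and seq[-1] == "deviant":
            seq.append("standard")  # enforce the no-repeat constraint
        else:
            seq.append("deviant" if rng.random() < p_deviant else "standard")
    return seq

seq = oddball_sequence(500)
print(seq.count("deviant") / len(seq))  # roughly 0.1
```

Because the no-repeat constraint occasionally forces a standard trial, the realized deviant rate falls slightly below the nominal p.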

  12. Affective theory of mind inferences contextually influence the recognition of emotional facial expressions.

    Science.gov (United States)

    Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J

    2018-03-14

    The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence to the recognition of emotional facial expressions for both same and different valences.

  13. Monkeys preferentially process body information while viewing affective displays.

    Science.gov (United States)

    Bliss-Moreau, Eliza; Moadab, Gilda; Machado, Christopher J

    2017-08-01

    Despite evolutionary claims about the function of facial behaviors across phylogeny, rarely are those hypotheses tested in a comparative context, that is, by evaluating how nonhuman animals process such behaviors. Further, while increasing evidence indicates that humans make meaning of faces by integrating contextual information, including that from the body, the extent to which nonhuman animals process contextual information during affective displays is unknown. In the present study, we evaluated the extent to which rhesus macaques (Macaca mulatta) process dynamic affective displays of conspecifics that included both facial and body behaviors. Contrary to hypotheses that they would preferentially attend to faces during affective displays, monkeys looked longest, most frequently, and first at conspecifics' bodies rather than their heads. These findings indicate that macaques, like humans, attend to available contextual information during the processing of affective displays, and that the body may also provide unique information about affective states. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Facial identity and facial expression are initially integrated at visual perceptual stages of face processing.

    Science.gov (United States)

    Fisher, Katie; Towler, John; Eimer, Martin

    2016-01-08

    It is frequently assumed that facial identity and facial expression are analysed in functionally and anatomically distinct streams within the core visual face processing system. To investigate whether expression and identity interact during the visual processing of faces, we employed a sequential matching procedure where participants compared either the identity or the expression of two successively presented faces, and ignored the other irrelevant dimension. Repetitions versus changes of facial identity and expression were varied independently across trials, and event-related potentials (ERPs) were recorded during task performance. Irrelevant facial identity and irrelevant expression both interfered with performance in the expression and identity matching tasks. These symmetrical interference effects show that neither identity nor expression can be selectively ignored during face matching, and suggest that they are not processed independently. N250r components to identity repetitions that reflect identity matching mechanisms in face-selective visual cortex were delayed and attenuated when there was an expression change, demonstrating that facial expression interferes with visual identity matching. These findings provide new evidence for interactions between facial identity and expression within the core visual processing system, and question the hypothesis that these two attributes are processed independently. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Focal Length Affects Depicted Shape and Perception of Facial Images.

    Directory of Open Access Journals (Sweden)

    Vít Třebický

    Static photographs are currently the most frequently employed stimuli in research on social perception. The method of photograph acquisition can affect the depicted subject's facial appearance and thus the impression made by such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce different levels of image distortion. Here we tested whether different focal lengths (50, 85, and 105 mm) affect the depicted shape and perception of female and male faces. We collected three portrait photographs of 45 participants (22 females, 23 males) under standardized conditions and camera settings, varying only the focal length. Subsequently, the three photographs from each individual were shown on screen in randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph, and overall facial shape was analysed using geometric morphometric methods (GMM). Our results showed that photographs taken with a 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant than images taken with longer focal lengths. Further, shorter focal lengths produced faces with smaller fWHR. Subsequent GMM revealed that focal length significantly affected the overall facial shape of the photographed subjects. Thus the methodology of photograph acquisition, focal length in this case, can significantly affect the results of studies using photographic stimuli, perhaps due to different levels of perspective distortion that influence the shapes and proportions of morphological traits.
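fWHR, as measured in this study, is conventionally computed as bizygomatic width divided by upper-face height (mid-brow to upper lip). A sketch of the computation from 2-D landmark coordinates (landmark names and pixel values are illustrative only, not from the study's data):

```python
def fwhr(left_zygion, right_zygion, brow_midpoint, upper_lip):
    """Facial width-to-height ratio from four (x, y) landmarks.

    Width:  horizontal distance between the left and right zygion landmarks.
    Height: vertical distance from the mid-brow point to the upper lip.
    """
    width = abs(right_zygion[0] - left_zygion[0])
    height = abs(brow_midpoint[1] - upper_lip[1])
    return width / height

# Hypothetical landmarks in pixel coordinates (x, y).
print(fwhr((40, 120), (180, 120), (110, 90), (110, 165)))  # 140 / 75 ≈ 1.87
```

Perspective distortion from a short focal length compresses apparent facial width more than height, which is consistent with the smaller fWHR values the study reports for 50 mm photographs.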

  16. Identification and intensity of disgust: Distinguishing visual, linguistic and facial expressions processing in Parkinson disease.

    Science.gov (United States)

    Sedda, Anna; Petito, Sara; Guarino, Maria; Stracciari, Andrea

    2017-07-14

    Most studies to date report impaired recognition of facial displays of disgust in Parkinson disease. A general impairment in disgust processing in patients with Parkinson disease might adversely affect their social interactions, given the relevance of this emotion for human relations. However, despite the importance of faces, disgust is also expressed through other formats of visual stimuli, such as sentences and visual images. The aim of our study was to explore disgust processing in a sample of patients affected by Parkinson disease by means of various tests tackling not only facial recognition but also other formats of visual stimuli through which disgust can be recognized. Our results confirm that patients are impaired in recognizing facial displays of disgust. Further analyses show that patients are also impaired and slower for other facial expressions, with the sole exception of happiness. Notably, however, patients with Parkinson disease processed visual images and sentences as controls did. Our findings show a dissociation between different formats of visual stimuli of disgust, suggesting that Parkinson disease is not characterized by a generalized impairment of disgust processing, as often suggested. The involvement of the basal ganglia-frontal cortex system might spare some cognitive components of emotional processing, related to memory and culture, at least for disgust. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  18. Interference among the Processing of Facial Emotion, Face Race, and Face Gender.

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender).

  19. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    OpenAIRE

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People are able to simultaneously process multiple dimensions of facial properties. Facial processing models are based on the processing of facial properties. This paper examined the processing of facial emotion, face race and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interfered with face race in all the tasks. The interaction of face race and face gend...

  20. Design of a Virtual Reality System for Affect Analysis in Facial Expressions (VR-SAAFE); Application to Schizophrenia.

    Science.gov (United States)

    Bekele, E; Bian, D; Peterman, J; Park, S; Sarkar, N

    2017-06-01

    Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier that prevents successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding the facial emotion processing impairments that may lead to poor social outcome in schizophrenia. We henceforth call it a VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control the facial expressions of an avatar, with or without accompanying verbal interaction, with an eye-tracker to quantitatively measure a participant's real-time gaze and a set of physiological sensors to infer his/her affective states, allowing in-depth understanding of the emotion recognition mechanism of patients with schizophrenia based on quantitative metrics. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine the processing of emotional faces. Preliminary results indicated that there were significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. These preliminary results underscore the utility of such a VR-based system for precise and quantitative assessment of social skill deficits in patients with schizophrenia.

  1. Predicting the Accuracy of Facial Affect Recognition: The Interaction of Child Maltreatment and Intellectual Functioning

    Science.gov (United States)

    Shenk, Chad E.; Putnam, Frank W.; Noll, Jennie G.

    2013-01-01

    Previous research demonstrates that both child maltreatment and intellectual performance contribute uniquely to the accurate identification of facial affect by children and adolescents. The purpose of this study was to extend this research by examining whether child maltreatment affects the accuracy of facial recognition differently at varying…

  2. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure.

    Directory of Open Access Journals (Sweden)

    Yanling Liu

    Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure continues to be controversial. The aim of the present study was to investigate attentional bias in facial recognition after short term exposure to violent video games and to characterize the neural correlates of this effect. In order to accomplish this, participants were exposed to either neutral or violent video games for 25 min and then event-related potentials (ERPs) were recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face from a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face from a crowd of emotional faces. Our results found a significant presence of the ERP component, N2pc, during the facilitation task; however, no differences were observed between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, N2pc responses were not observed when participants viewed happy faces following violent video game exposure; however, a weak N2pc response was observed after neutral video game exposure. These results provided only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces.

  3. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure.

    Science.gov (United States)

    Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong

    2017-01-01

    Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure continues to be controversial. The aim of the present study was to investigate attentional bias in facial recognition after short term exposure to violent video games and to characterize the neural correlates of this effect. In order to accomplish this, participants were exposed to either neutral or violent video games for 25 min and then event-related potentials (ERPs) were recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face from a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face from a crowd of emotional faces. Our results found a significant presence of the ERP component, N2pc, during the facilitation task; however, no differences were observed between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, N2pc responses were not observed when participants viewed happy faces following violent video game exposure; however, a weak N2pc response was observed after neutral video game exposure. These results provided only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces.

  4. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure

    Science.gov (United States)

    Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong

    2017-01-01

    Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure continues to be controversial. The aim of the present study was to investigate attentional bias in facial recognition after short term exposure to violent video games and to characterize the neural correlates of this effect. In order to accomplish this, participants were exposed to either neutral or violent video games for 25 min and then event-related potentials (ERPs) were recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face from a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face from a crowd of emotional faces. Our results found a significant presence of the ERP component, N2pc, during the facilitation task; however, no differences were observed between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, N2pc responses were not observed when participants viewed happy faces following violent video game exposure; however, a weak N2pc response was observed after neutral video game exposure. These results provided only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces. PMID:28249033

  5. Facial emotion processing in pediatric social anxiety disorder: Relevance of situational context.

    Science.gov (United States)

    Schwab, Daniela; Schienle, Anne

    2017-08-01

    Social anxiety disorder (SAD) typically begins in childhood. Previous research has demonstrated that adult patients respond with elevated late positivity (LP) to negative facial expressions. In the present study on pediatric SAD, we investigated responses to negative facial expressions and the role of social context information. Fifteen children with SAD and 15 non-anxious controls were first presented with images of negative facial expressions with masked backgrounds. Following this, the complete images, which included context information, were shown. The negative expressions were the result of either an emotion-relevant (e.g., social exclusion) or an emotion-irrelevant elicitor (e.g., weight lifting). Relative to controls, the clinical group showed elevated parietal LP during face processing with and without context information. The two groups differed in their frontal LP depending on the type of context: in SAD patients, frontal LP was lower in emotion-relevant than in emotion-irrelevant contexts. We conclude that SAD patients direct more automatic attention towards negative facial expressions (parietal effect) and are less capable of integrating affective context information (frontal effect). Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults.

    Science.gov (United States)

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development: the Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing six emotional facial expressions (angry, fearful, sad, happy, surprised, and disgusted) and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  7. Appraisals Generate Specific Configurations of Facial Muscle Movements in a Gambling Task: Evidence for the Component Process Model of Emotion.

    Science.gov (United States)

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R

    2015-01-01

    Scherer's Component Process Model provides a theoretical framework for research on the production mechanism of emotion and facial emotional expression. The model predicts that appraisal results drive facial expressions, which unfold sequentially and cumulatively over time. In two experiments, we examined changes in facial muscle activity (via facial electromyography recordings over the corrugator, cheek, and frontalis regions) in response to events in a gambling task. These events were experimentally manipulated feedback stimuli that presented simultaneous information directly affecting goal conduciveness (gambling outcome: win, loss, or break-even) and power appraisals (Experiments 1 and 2), as well as control appraisal (Experiment 2). We repeatedly found main effects of goal conduciveness (starting ~600 ms) and power appraisals (starting ~800 ms after feedback onset). Control appraisal main effects were inconclusive. Interaction effects of goal conduciveness and power appraisals were obtained in both experiments (Experiment 1: over the corrugator and cheek regions; Experiment 2: over the frontalis region), suggesting amplified goal conduciveness effects when power was high, in contrast to invariant goal conduciveness effects when power was low. An interaction of goal conduciveness and control appraisals was also found over the cheek region, showing differential goal conduciveness effects when control was high and invariant effects when control was low. These interaction effects suggest that the appraisal of having sufficient control or power affects facial responses towards gambling outcomes. The result pattern suggests that the corrugator and frontalis regions are primarily related to cognitive operations that process motivational pertinence, whereas the cheek region is more influenced by coping implications. Our results provide first evidence demonstrating that cognitive-evaluative mechanisms related to goal conduciveness, control, and power appraisals affect

  8. The Child Affective Facial Expression (CAFE) Set: Validity and Reliability from Untrained Adults

    Directory of Open Access Journals (Sweden)

    Vanessa eLoBue

    2015-01-01

    Full Text Available Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development: the Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing six emotional facial expressions (angry, fearful, sad, happy, surprised, and disgusted) and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  9. Sex, Sexual Orientation, and Identification of Positive and Negative Facial Affect

    Science.gov (United States)

    Rahman, Qazi; Wilson, Glenn D.; Abrahams, Sharon

    2004-01-01

    Sex and sexual orientation related differences in processing of happy and sad facial emotions were examined using an experimental facial emotion recognition paradigm with a large sample (N=240). Analysis of covariance (controlling for age and IQ) revealed that women (irrespective of sexual orientation) had faster reaction times than men for…

  10. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    Science.gov (United States)

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies of facial expression recognition have focused on moderate-intensity emotions; to date, few have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, peak-intensity expressions of athletes at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds while their eye movements were recorded. The results revealed that isolated bodies and face-body congruent images were recognized better than isolated faces and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate unconscious perception of peak facial expressions. The results showed that winning face primes facilitated reactions to winning body targets, whereas losing face primes inhibited reactions to winning body targets, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. The results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, which indicated the

  11. Affective priming using facial expressions modulates liking for abstract art.

    Science.gov (United States)

    Flexas, Albert; Rosselló, Jaume; Christensen, Julia F; Nadal, Marcos; Olivera La Rosa, Antonio; Munar, Enric

    2013-01-01

    We examined the influence of affective priming on the appreciation of abstract artworks using an evaluative priming task. Facial primes (showing happiness, disgust or no emotion) were presented under brief (Stimulus Onset Asynchrony, SOA = 20 ms) and extended (SOA = 300 ms) conditions. Differences in aesthetic liking for abstract paintings depending on the emotion expressed in the preceding primes provided a measure of the priming effect. The results showed that, for the extended SOA, artworks were liked more when preceded by happiness primes and less when preceded by disgust primes. Facial expressions of happiness, though not of disgust, exerted similar effects in the brief SOA condition. Subjective measures and a forced-choice task revealed no evidence of prime awareness in the suboptimal condition. Our results are congruent with findings showing that the affective transfer elicited by priming biases evaluative judgments, extending previous research to the domain of aesthetic appreciation.
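    The priming-effect measure described in this record reduces to a difference in condition means: liking ratings on trials preceded by happiness primes versus trials preceded by disgust primes. A minimal sketch with invented ratings; the values and the 1-7 scale are illustrative assumptions, not data from the study:

```python
def mean(xs):
    """Arithmetic mean of a list of ratings."""
    return sum(xs) / len(xs)

# Hypothetical liking ratings (1-7 scale) grouped by prime condition
liking_happy_primed = [6, 5, 7, 6]
liking_disgust_primed = [4, 3, 5, 4]

# Positive values indicate higher liking after happiness primes
priming_effect = mean(liking_happy_primed) - mean(liking_disgust_primed)
print(priming_effect)  # 2.0
```

In the study's design this difference would be computed separately for each SOA condition (20 ms vs. 300 ms) before comparison.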

  12. Affective priming using facial expressions modulates liking for abstract art.

    Directory of Open Access Journals (Sweden)

    Albert Flexas

    Full Text Available We examined the influence of affective priming on the appreciation of abstract artworks using an evaluative priming task. Facial primes (showing happiness, disgust, or no emotion) were presented under brief (Stimulus Onset Asynchrony, SOA = 20 ms) and extended (SOA = 300 ms) conditions. Differences in aesthetic liking for abstract paintings depending on the emotion expressed in the preceding primes provided a measure of the priming effect. The results showed that, for the extended SOA, artworks were liked more when preceded by happiness primes and less when preceded by disgust primes. Facial expressions of happiness, though not of disgust, exerted similar effects in the brief SOA condition. Subjective measures and a forced-choice task revealed no evidence of prime awareness in the suboptimal condition. Our results are congruent with findings showing that the affective transfer elicited by priming biases evaluative judgments, extending previous research to the domain of aesthetic appreciation.

  13. Facial Expression of Affect in Children with Cornelia de Lange Syndrome

    Science.gov (United States)

    Collis, L.; Moss, J.; Jutley, J.; Cornish, K.; Oliver, C.

    2008-01-01

    Background: Individuals with Cornelia de Lange syndrome (CdLS) have been reported to show comparatively high levels of flat and negative affect but there have been no empirical evaluations. In this study, we use an objective measure of facial expression to compare affect in CdLS with that seen in Cri du Chat syndrome (CDC) and a group of…

  14. Approach-avoidance of facial affect is moderated by the presence of an observer-irrelevant trigger

    NARCIS (Netherlands)

    Renard, S.B.; de Jong, P.J.; Pijnenborg, G.H.M.

    This study examined whether approach-avoidance related behaviour elicited by facial affect is moderated by the presence of an observer-irrelevant trigger that may influence the observer's attributions of the actor's emotion. Participants were shown happy, disgusted, and neutral facial expressions.

  15. Facial Affect Displays during Tutoring Sessions

    NARCIS (Netherlands)

    Ghijsen, M.; Heylen, Dirk K.J.; Nijholt, Antinus; op den Akker, Hendrikus J.A.

    2005-01-01

    An emotionally intelligent tutoring system should be able to provide feedback to students, taking into account relevant aspects of the mental state of the student. Facial expressions, put in context, might provide some cues with respect to this state. We discuss the analysis of the facial expression

  16. Processing of emotional facial expressions in Korsakoff's syndrome.

    NARCIS (Netherlands)

    Montagne, B.; Kessels, R.P.C.; Wester, A.J.; Haan, E.H.F. de

    2006-01-01

    Interpersonal contacts depend to a large extent on understanding emotional facial expressions of others. Several neurological conditions may affect proficiency in emotional expression recognition. It has been shown that chronic alcoholics are impaired in labelling emotional expressions. More

  17. Poor Facial Affect Recognition among Boys with Duchenne Muscular Dystrophy

    Science.gov (United States)

    Hinton, V. J.; Fee, R. J.; De Vivo, D. C.; Goldstein, E.

    2007-01-01

    Children with Duchenne or Becker muscular dystrophy (MD) have delayed language and poor social skills and some meet criteria for Pervasive Developmental Disorder, yet they are identified by molecular, rather than behavioral, characteristics. To determine whether comprehension of facial affect is compromised in boys with MD, children were given a…

  18. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Patrick eJohnston

    2011-02-01

    Full Text Available Most developmental studies of emotional face processing to date have focussed on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. Three matching tasks (facial emotion matching, facial identity matching, and butterfly wing matching) were developed to include stimuli of a similar level of discriminability and were equated for task difficulty in earlier samples of young adults. Ninety-two children aged 5 to 15 years and a new group of 24 young adults completed the three matching tasks. Young children were highly adept at the butterfly wing task relative to their performance on both face-related tasks. More importantly, in older children, the development of facial emotion discrimination ability lagged behind that of facial identity discrimination.

  19. Uncertainty Flow Facilitates Zero-Shot Multi-Label Learning in Affective Facial Analysis

    Directory of Open Access Journals (Sweden)

    Wenjun Bai

    2018-02-01

    Full Text Available Featured Application: The proposed Uncertainty Flow framework may benefit facial analysis with its promised improvement in discriminability in multi-label affective classification tasks. Moreover, the framework allows efficient model training and between-task knowledge transfer. Applications that rely heavily on continuous prediction of emotional valence, e.g., monitoring prisoners' emotional stability in jail, can benefit directly from our framework. Abstract: To lower the single-label dependency of affective facial analysis, multi-label affective learning is needed. The impediment to practical implementation of existing multi-label algorithms is the scarcity of scalable multi-label training datasets. To resolve this, an inductive transfer learning based framework, i.e., Uncertainty Flow, is put forward in this research to allow knowledge transfer from a single-labelled emotion recognition task to a multi-label affective recognition task. That is, the model uncertainty, which can be quantified in Uncertainty Flow, is distilled from a single-label learning task. The distilled model uncertainty enables subsequent efficient zero-shot multi-label affective learning. From a theoretical perspective, within the proposed Uncertainty Flow framework, the feasibility of applying weakly informative priors, e.g., uniform and Cauchy priors, is fully explored in this research. More importantly, based on the derived weight uncertainty, three sets of prediction-related uncertainty indexes, i.e., soft-max uncertainty, pure uncertainty, and uncertainty plus, are proposed to produce reliable and accurate multi-label predictions. Validated on our manually annotated evaluation dataset, i.e., the multi-label annotated FER2013, our proposed Uncertainty Flow in multi-label facial expression analysis exhibited superiority to conventional multi-label learning algorithms and multi-label compatible neural networks. The success of our
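    The record's three uncertainty indexes are not fully specified in the abstract, but one standard softmax-based uncertainty score is the entropy of the mean predictive distribution over stochastic forward passes (as in Monte Carlo dropout). A minimal sketch under that assumption; the function names and the three-class logits are invented for illustration:

```python
import math

def softmax(logits):
    """Numerically stable softmax over one set of class logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predictive_entropy(mc_logits):
    """Average the softmax over stochastic forward passes, then score
    uncertainty as the entropy (in nats) of the mean distribution."""
    probs = [softmax(row) for row in mc_logits]
    n_classes = len(probs[0])
    mean_p = [sum(p[k] for p in probs) / len(probs) for k in range(n_classes)]
    return -sum(p * math.log(p) for p in mean_p if p > 0)

# Two hypothetical stochastic passes over a 3-class emotion output
h = predictive_entropy([[2.0, 0.5, 0.2], [1.8, 0.7, 0.1]])
```

Entropy near its maximum (ln of the class count) flags an ambiguous face; in a multi-label setting, per-class scores could be thresholded instead of taking a single argmax.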

  20. Processing of facial affect in social drinkers: a dose-response study of alcohol using dynamic emotion expressions.

    Science.gov (United States)

    Kamboj, Sunjeev K; Joye, Alyssa; Bisby, James A; Das, Ravi K; Platt, Bradley; Curran, H Valerie

    2013-05-01

    Studies of affect recognition can inform our understanding of the interpersonal effects of alcohol and help develop a more complete neuropsychological profile of this drug. The objective of the study was to examine affect recognition in social drinkers using a novel dynamic affect-recognition task, sampling performance across a range of evolutionarily significant target emotions and neutral expressions. Participants received 0, 0.4, or 0.8 g/kg alcohol in a double-blind, independent-groups design. Relatively naturalistic changes in facial expression, from neutral (mouth open) to increasing intensities of the target emotions, as well as neutral (mouth closed), were simulated using computer-generated dynamic morphs. Accuracy and reaction time were measured, and a two-high-threshold model was applied to hit and false-alarm data to determine sensitivity and response bias. While there was no effect on the principal emotion expressions (happiness, sadness, fear, anger, and disgust), participants administered 0.4 g/kg of alcohol tended to show an enhanced response bias towards neutral expressions compared with those receiving 0.8 g/kg of alcohol or placebo. Exploration of this effect suggested an accompanying tendency to misattribute neutrality to sad expressions following the 0.4-g/kg dose. Thus 0.4 g/kg of alcohol, but not 0.8 g/kg, produced a limited and specific modification of affect recognition, evidenced by a neutral response bias and possibly an accompanying tendency to misclassify sad expressions as neutral. In light of previous findings on involuntary negative memory following the 0.4-g/kg dose, we suggest that moderate, but not high, doses of alcohol have a special relevance to emotional processing in social drinkers.
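    The two-high-threshold model applied in this record yields closed-form sensitivity and bias indices from hit and false-alarm rates, commonly operationalized as Snodgrass and Corwin's Pr and Br; that these exact formulas were used here is an assumption. A minimal sketch with invented trial counts:

```python
def two_high_threshold(hits, misses, false_alarms, correct_rejections):
    """Two-high-threshold indices from raw trial counts:
    sensitivity Pr = H - FA, response bias Br = FA / (1 - Pr)."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    pr = hit_rate - fa_rate                               # discrimination sensitivity
    br = fa_rate / (1 - pr) if pr < 1 else float("nan")   # response (guessing) bias
    return pr, br

# Hypothetical counts for one participant and one expression category
pr, br = two_high_threshold(hits=40, misses=10,
                            false_alarms=10, correct_rejections=40)
```

On this reading, the "enhanced response bias to neutral expressions" after 0.4 g/kg corresponds to a higher Br for the neutral category at unchanged Pr.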

  1. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions

    Science.gov (United States)

    Kujala, Miiamaaria V.; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people’s perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects’ personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans, but not dogs, higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy in particular affects both the speed and intensity of rating dogs’ emotional facial expressions. PMID:28114335

  2. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions.

    Directory of Open Access Journals (Sweden)

    Miiamaaria V Kujala

    Full Text Available Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans, but not dogs, higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy in particular affects both the speed and intensity of rating dogs' emotional facial expressions.

  3. Magnetoencephalographic study on facial movements

    Directory of Open Access Journals (Sweden)

    Kensaku eMiki

    2014-07-01

    Full Text Available In this review, we introduce three of our studies that focused on facial movements. In the first study, we examined the temporal characteristics of neural responses elicited by viewing mouth movements, and assessed differences between the responses to mouth opening and closing movements and an eye-averting condition. Our results showed that the occipitotemporal area, the human MT/V5 homologue, was active in the perception of both mouth and eye motion. Viewing mouth and eye movements did not elicit significantly different activity in the occipitotemporal area, which indicates that movements of different facial parts may be processed in the same manner, distinct from motion in general. In the second study, we investigated whether early activity in the occipitotemporal region evoked by eye movements was influenced by a face contour and/or features such as the mouth. Our results revealed specific information processing for eye movements in the occipitotemporal region, and this activity was significantly influenced by whether the movements appeared with the facial contour and/or features; in other words, by whether the eyes moved, even if the movement itself was the same. In the third study, we examined the effects of inverting the facial contour (hair and chin) and features (eyes, nose, and mouth) on processing for static and dynamic face perception. Our results showed the following: (1) in static face perception, activity in the right fusiform area was affected more by the inversion of features, while activity in the left fusiform area was affected more by a disruption of the spatial relationship between the contour and features; and (2) in dynamic face perception, activity in the right occipitotemporal area was affected by the inversion of the facial contour.

  4. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder

    Directory of Open Access Journals (Sweden)

    Xiaozhe Peng

    2017-06-01

    Full Text Available Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad–neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in the ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy–neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy–neutral expressions context, as well as sad and neutral expressions in the sad–neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy–neutral expressions context. Highlights: • The present study investigated whether the unconscious processing of facial expressions is influenced by

  5. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder.

    Science.gov (United States)

    Peng, Xiaozhe; Cui, Fang; Wang, Ting; Jiao, Can

    2017-01-01

    Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy-neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy-neutral expressions context, as well as sad and neutral expressions in the sad-neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy-neutral expressions context. • The present study investigated whether the unconscious processing of facial expressions is influenced by excessive online gaming. 

  6. Processing of Facial Expressions of Emotions by Adults with Down Syndrome and Moderate Intellectual Disability

    Science.gov (United States)

    Carvajal, Fernando; Fernandez-Alcaraz, Camino; Rueda, Maria; Sarrion, Louise

    2012-01-01

    The processing of facial expressions of emotions by 23 adults with Down syndrome and moderate intellectual disability was compared with that of adults with intellectual disability of other etiologies (24 matched in cognitive level and 26 with mild intellectual disability). Each participant performed 4 tasks of the Florida Affect Battery and an…

  7. Unconscious processing of facial attractiveness: invisible attractive faces orient visual attention.

    Science.gov (United States)

    Hung, Shao-Min; Nieh, Chih-Hsuan; Hsieh, Po-Jang

    2016-11-16

    Past research has demonstrated humans' extraordinary ability to extract information from a face in the blink of an eye, including its emotion, gaze direction, and attractiveness. However, it remains unclear whether facial attractiveness can be processed, and can influence our behavior, in the complete absence of conscious awareness. Here we demonstrate unconscious processing of facial attractiveness with three distinct approaches. In Experiment 1, the time taken for faces to break interocular suppression was measured. The results showed that attractive faces broke suppression and reached consciousness earlier. In Experiment 2, we further showed that attractive faces had lower visibility thresholds, again suggesting that facial attractiveness could be processed more easily to reach consciousness. Crucially, in Experiment 3, a significant decrease in accuracy on an orientation discrimination task following an invisible attractive face showed that attractive faces, albeit suppressed and invisible, still exerted an effect by orienting attention. Taken together, for the first time, we show that facial attractiveness can be processed in the complete absence of consciousness, and that an unconscious attractive face is still capable of directing our attention.

  8. Assessing the Utility of a Virtual Environment for Enhancing Facial Affect Recognition in Adolescents with Autism

    Science.gov (United States)

    Bekele, Esubalew; Crittendon, Julie; Zheng, Zhi; Swanson, Amy; Weitlauf, Amy; Warren, Zachary; Sarkar, Nilanjan

    2014-01-01

    Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer generated avatar. The system assessed performance (i.e.,…

  9. Facial affect interpretation in boys with attention deficit/hyperactivity disorder.

    Science.gov (United States)

    Boakes, Jolee; Chapman, Elaine; Houghton, Stephen; West, John

    2008-01-01

    Recent studies have produced mixed evidence of impairments in facial affect interpretation for children with attention deficit/hyperactivity disorder (ADHD). This study investigated the presence and nature of such impairments across different stimulus formats. Twenty-four boys with ADHD and 24 age-matched comparison boys completed a 72-trial task that included facial expressions of happiness, sadness, fear, anger, surprise, and disgust. Three versions of each expression were used: a static version, a dynamic version, and a dynamic version presented within a relevant situational context. Expressions were also presented in one of two portrayal modes (cartoon versus real-life). Results indicated significant impairments for boys with ADHD on two of the six emotions (fear and disgust), which were consistent across stimulus formats. Directions for further research to identify mediating factors in the expression of such impairments in children with ADHD are discussed.

  10. The influence of a working memory task on affective perception of facial expressions.

    Directory of Open Access Journals (Sweden)

    Seung-Lark Lim

    Full Text Available In a dual-task paradigm, participants performed a spatial location working memory task and a forced two-choice perceptual decision task (neutral vs. fearful) with gradually morphed emotional faces (neutral ∼ fearful). Task-irrelevant word distractors (negative, neutral, and control) were experimentally manipulated during spatial working memory encoding. We hypothesized that, if affective perception is influenced by concurrent cognitive load from a working memory task, task-irrelevant emotional distractors would bias subsequent perceptual decision-making on ambiguous facial expressions. We found that when either neutral or negative emotional words were presented as task-irrelevant working-memory distractors, participants more frequently reported fearful face perception - but only at the higher emotional intensity levels of the morphed faces. In addition, the affective perception bias due to negative emotional distractors correlated with a decrease in working memory performance. Taken together, our findings suggest that concurrent working memory load from task-irrelevant distractors has an impact on affective perception of facial expressions.
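Morph continua like the neutral-to-fearful faces described above are, at their simplest, linear interpolations between two aligned expression images (published studies typically use landmark-based warping plus cross-fading; the pixel-blend below is a minimal sketch, and the placeholder images are assumptions, not the study's stimuli):

```python
import numpy as np

def morph_faces(neutral, fearful, level):
    """Blend two aligned face images; level=0 gives the neutral face,
    level=1 the fearful face, intermediate values a graded morph."""
    if neutral.shape != fearful.shape:
        raise ValueError("images must be aligned and equally sized")
    blend = (1.0 - level) * neutral.astype(float) + level * fearful.astype(float)
    return blend.clip(0, 255).astype(np.uint8)

# Build a 7-step neutral-to-fearful continuum from placeholder images.
neutral = np.full((256, 256), 128, dtype=np.uint8)  # stand-in neutral face
fearful = np.full((256, 256), 64, dtype=np.uint8)   # stand-in fearful face
continuum = [morph_faces(neutral, fearful, a) for a in np.linspace(0, 1, 7)]
```

The intermediate levels of such a continuum are where perceptual decisions become ambiguous, which is exactly where the distractor-induced bias reported above emerged.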

  11. Holistic face processing can inhibit recognition of forensic facial composites.

    Science.gov (United States)

    McIntyre, Alex H; Hancock, Peter J B; Frowd, Charlie D; Langton, Stephen R H

    2016-04-01

    Facial composite systems help eyewitnesses to show the appearance of criminals. However, likenesses created by unfamiliar witnesses will not be completely accurate, and people familiar with the target can find them difficult to identify. Faces are processed holistically; we explore whether this impairs identification of inaccurate composite images and whether recognition can be improved. In Experiment 1 (n = 64), an imaging technique was used to make composites of celebrity faces more accurate, and identification was contrasted with the original composite images. Corrected composites were better recognized, confirming that errors in production of the likenesses impair identification. The influence of holistic face processing was explored by misaligning the top and bottom parts of the composites (cf. Young, Hellawell, & Hay, 1987). Misalignment impaired recognition of corrected composites, but identification of the original, inaccurate composites significantly improved. This effect was replicated with facial composites of noncelebrities in Experiment 2 (n = 57). We conclude that, like real faces, facial composites are processed holistically: recognition is impaired because, unlike real faces, composites contain inaccuracies, and holistic face processing makes it difficult to perceive identifiable features. This effect was consistent across composites of celebrities and composites of people who are personally familiar. Our findings suggest that identification of forensic facial composites can be enhanced by presenting composites in a misaligned format. (c) 2016 APA, all rights reserved.

  12. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels show different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the number of central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two… Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions, to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were…

  13. Judging emotional congruency: Explicit attention to situational context modulates processing of facial expressions of emotion.

    Science.gov (United States)

    Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio

    2015-12-01

    The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.

  14. Facial decoding in schizophrenia is underpinned by basic visual processing impairments.

    Science.gov (United States)

    Belge, Jan-Baptist; Maurage, Pierre; Mangelinckx, Camille; Leleux, Dominique; Delatte, Benoît; Constant, Eric

    2017-09-01

    Schizophrenia is associated with a strong deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific for emotions or due to a more general impairment for any type of facial processing. This study was designed to clarify this issue. Thirty patients suffering from schizophrenia and 30 matched healthy controls performed several tasks evaluating the recognition of both changeable (i.e. eyes orientation and emotions) and stable (i.e. gender, age) facial characteristics. Accuracy and reaction times were recorded. Schizophrenic patients presented a performance deficit (accuracy and reaction times) in the perception of both changeable and stable aspects of faces, without any specific deficit for emotional decoding. Our results demonstrate a generalized face recognition deficit in schizophrenic patients, probably caused by a perceptual deficit in basic visual processing. It seems that the deficit in the decoding of emotional facial expression (EFE) is not a specific deficit of emotion processing, but is at least partly related to a generalized perceptual deficit in lower-level perceptual processing, occurring before the stage of emotion processing, and underlying more complex cognitive dysfunctions. These findings should encourage future investigations to explore the neurophysiologic background of these generalized perceptual deficits, and stimulate a clinical approach focusing on more basic visual processing. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  15. Gender differences in the motivational processing of facial beauty

    Science.gov (United States)

    Levy, Boaz; Ariely, Dan; Mazar, Nina; Chi, Won; Lukas, Scott; Elman, Igor

    2013-01-01

    Gender may be involved in the motivational processing of facial beauty. This study applied a behavioral probe, known to activate brain motivational regions, to healthy heterosexual subjects. Matched samples of men and women were administered two tasks: (a) key pressing to change the viewing time of average or beautiful female or male facial images, and (b) rating the attractiveness of these images. Men expended more effort (via the key-press task) to extend the viewing time of the beautiful female faces. Women displayed similarly increased effort for beautiful male and female images, but the magnitude of this effort was substantially lower than that of men for beautiful females. Heterosexual facial attractiveness ratings were comparable in both groups. These findings demonstrate heterosexual specificity of facial motivational targets for men, but not for women. Moreover, heightened drive for the pursuit of heterosexual beauty in the face of regular valuational assessments, displayed by men, suggests a gender-specific incentive sensitization phenomenon. PMID:24282336

  16. Dissociating Face Identity and Facial Expression Processing Via Visual Adaptation

    Directory of Open Access Journals (Sweden)

    Hong Xu

    2012-10-01

    Full Text Available Face identity and facial expression are processed in two distinct neural pathways. However, most of the existing face adaptation literature studies them separately, despite the fact that they are two aspects of the same face. The current study conducted a systematic comparison between these two aspects using face adaptation, investigating how top- and bottom-half face parts contribute to the processing of face identity and facial expression. A real face (sad, “Adam”) and its two size-equivalent face parts (top- and bottom-half) were used as the adaptor in separate conditions. For face identity adaptation, the test stimuli were generated by morphing Adam's sad face with another person's sad face (“Sam”). For facial expression adaptation, the test stimuli were created by morphing Adam's sad face with his neutral face and morphing the neutral face with his happy face. In each trial, after exposure to the adaptor, observers indicated the perceived face identity or facial expression of the following test face via a key press. They were also tested in a baseline condition without adaptation. Results show that the top- and bottom-half face each generated a significant face identity aftereffect. However, the aftereffect from top-half face adaptation was much larger than that from the bottom-half face. In contrast, only the bottom-half face generated a significant facial expression aftereffect. This dissociation of top- and bottom-half face adaptation suggests that face parts play different roles in face identity and facial expression processing. It thus provides further evidence for distributed systems of face perception.

  17. Impaired Emotional Mirroring in Parkinson’s Disease—A Study on Brain Activation during Processing of Facial Expressions

    Directory of Open Access Journals (Sweden)

    Anna Pohl

    2017-12-01

    Full Text Available Background: Affective dysfunctions are common in patients with Parkinson’s disease, but the underlying neurobiological deviations have rarely been examined. Parkinson’s disease is characterized by a loss of dopamine neurons in the substantia nigra, resulting in impairment of motor and non-motor basal ganglia-cortical loops. Concerning emotional deficits, some studies provide evidence for altered brain processing in limbic and lateral-orbitofrontal gating loops. In a second line of evidence, human premotor and inferior parietal homologs of mirror neuron areas were involved in processing and understanding of emotional facial expressions. We examined deviations in brain activation during processing of facial expressions in patients and related these to emotion recognition accuracy. Methods: 13 patients and 13 healthy controls underwent an emotion recognition task and a functional magnetic resonance imaging (fMRI) measurement. In the Emotion Hexagon test, participants were presented with blends of two emotions and had to indicate which emotion best described the presented picture. Blended pictures with three levels of difficulty were included. During fMRI scanning, participants observed video clips depicting emotional, non-emotional, and neutral facial expressions or were asked to produce these facial expressions themselves. Results: Patients performed slightly worse in the emotion recognition task, but only when judging the most ambiguous facial expressions. Both groups activated inferior frontal and anterior inferior parietal homologs of mirror neuron areas during observation and execution of the emotional facial expressions. During observation, responses in the pars opercularis of the right inferior frontal gyrus, in the bilateral inferior parietal lobule, and in the bilateral supplementary motor cortex were decreased in patients. Furthermore, in patients, activation of the right anterior inferior parietal lobule was positively related to accuracy in…

  18. Role of temporal processing stages by inferior temporal neurons in facial recognition

    Directory of Open Access Journals (Sweden)

    Yasuko Sugase-Miyamoto

    2011-06-01

    Full Text Available In this review, we focus on the role of temporal stages of encoded facial information in the visual system, which might enable the efficient determination of species, identity, and expression. Facial recognition is an important function of our brain and is known to be processed in the ventral visual pathway, where visual signals are processed through areas V1, V2, V4, and the inferior temporal (IT) cortex. In the IT cortex, neurons show selective responses to complex visual images such as faces, and at each stage along the pathway the stimulus selectivity of the neural responses becomes sharper, particularly in the later portion of the responses. In the IT cortex of the monkey, facial information is represented by different temporal stages of neural responses, as shown in our previous study: the initial transient response of face-responsive neurons represents information about global categories, i.e., human vs. monkey vs. simple shapes, whilst the later portion of these responses represents information about detailed facial categories, i.e., expression and/or identity. This suggests that the temporal stages of the neuronal firing pattern play an important role in the coding of visual stimuli, including faces. This type of coding may be a plausible mechanism underlying the temporal dynamics of recognition, including the process of detection/categorization followed by the identification of objects. Recent single-unit studies in monkeys have also provided evidence consistent with the important role of the temporal stages of encoded facial information. For example, view-invariant facial identity information is represented in the response at a later period within a region of face-selective neurons. Consistent with these findings, temporally modulated neural activity has also been observed in human studies. These results suggest a close correlation between the temporal processing stages of facial information by IT neurons and the temporal dynamics of…

  19. Visual and associated affective processing of face information in schizophrenia: A selective review.

    Science.gov (United States)

    Chen, Yue; Ekstrom, Tor

    Perception of facial features is crucial in social life. In past decades, extensive research showed that the ability to perceive facial emotion expression was compromised in schizophrenia patients. Given that face perception involves visual/cognitive and affective processing, the roles of these two processing domains in the compromised face perception in schizophrenia were studied and discussed, but not clearly defined. One particular issue was whether face-specific processing is implicated in this psychiatric disorder. Recent investigations have probed into the components of face perception processes such as visual detection, identity recognition, emotion expression discrimination and working memory conveyed from faces. Recent investigations have further assessed the associations between face processing and basic visual processing and between face processing and social cognitive processing such as Theory of Mind. In this selective review, we discuss the investigative findings relevant to the issues of cognitive and affective association and face-specific processing. We highlight the implications of multiple processing domains and face-specific processes as potential mechanisms underlying compromised face perception in schizophrenia. These findings suggest a need for a domain-specific therapeutic approach to the improvement of face perception in schizophrenia.

  20. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    Science.gov (United States)

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

  1. Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology.

    Science.gov (United States)

    Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis

    2018-05-06

    Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussed athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of the P1 amplitude recorded over the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of the N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across the sexes. (JINS, 2018, 24, 1-11).
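ERP component amplitudes such as the P1 and N1 discussed above are commonly quantified as the mean voltage within a fixed post-stimulus time window of the trial-averaged waveform. A minimal sketch of that measurement on a single electrode; the window bounds, sampling rate, and simulated data are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def mean_component_amplitude(epochs, srate, window_ms):
    """epochs: (n_trials, n_samples) array of single-trial EEG segments,
    time-locked to stimulus onset at sample 0. Returns the mean voltage
    of the trial-averaged waveform within window_ms (start, stop)."""
    start = int(window_ms[0] / 1000 * srate)  # ms -> sample index
    stop = int(window_ms[1] / 1000 * srate)
    erp = epochs.mean(axis=0)                 # average across trials
    return erp[start:stop].mean()             # mean amplitude in window

# Example: a P1-like window at 110-130 ms, 500 Hz sampling rate.
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 1.0, size=(40, 300))  # 40 trials of 600 ms
p1 = mean_component_amplitude(epochs, srate=500, window_ms=(110, 130))
```

Group differences like the P1 suppression reported here would then be tested on these per-participant amplitude values.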

  2. Reaction Time of Facial Affect Recognition in Asperger's Disorder for Cartoon and Real, Static and Moving Faces

    Science.gov (United States)

    Miyahara, Motohide; Bray, Anne; Tsujii, Masatsugu; Fujita, Chikako; Sugiyama, Toshiro

    2007-01-01

    This study used a choice reaction-time paradigm to test the perceived impairment of facial affect recognition in Asperger's disorder. Twenty teenagers with Asperger's disorder and 20 controls were compared with respect to the latency and accuracy of response to happy or disgusted facial expressions, presented in cartoon or real images and in…

  3. Facial Affect Recognition Training Through Telepractice: Two Case Studies of Individuals with Chronic Traumatic Brain Injury

    Directory of Open Access Journals (Sweden)

    John Williamson

    2015-07-01

    Full Text Available The use of a modified Facial Affect Recognition (FAR) training to identify emotions was investigated in two case studies of adults with moderate to severe chronic (> five years) traumatic brain injury (TBI). The modified FAR training was administered via telepractice to target social communication skills. Therapy consisted of identifying emotions through static facial expressions, personally reflecting on those emotions, and identifying sarcasm and emotions within social stories and role-play. Pre- and post-therapy measures included static facial photos for identifying emotion and the Prutting and Kirchner Pragmatic Protocol for social communication. Both participants with chronic TBI showed gains in identifying facial emotions in the static photos.

  4. Facial biometrics of Yorubas of Nigeria using Akinlolu-Raji image-processing algorithm

    Directory of Open Access Journals (Sweden)

    Adelaja Abdulazeez Akinlolu

    2016-01-01

    Full Text Available Background: Forensic anthropology deals with the establishment of human identity using genetics, biometrics, and face recognition technology. This study aims to compute facial biometrics of Yorubas of Osun State of Nigeria using a novel Akinlolu-Raji image-processing algorithm. Materials and Methods: Three hundred Yorubas of Osun State (150 males and 150 females, aged 15–33 years) were selected as subjects for the study, with informed consent and after being established as Yorubas through parents and grandparents. Height, body weight, and facial biometrics (evaluated on three-dimensional [3D] facial photographs) were measured in all subjects. The novel Akinlolu-Raji image-processing algorithm for forensic face recognition was developed using the modified row method of computer programming. Facial width, total face height, short forehead height, long forehead height, upper face height, nasal bridge length, nose height, morphological face height, and lower face height computed from readings of the Akinlolu-Raji image-processing algorithm were analyzed using the z-test (P ≤ 0.05) in Microsoft Excel 2010 statistical software. Results: Statistical analyses of facial measurements showed nonsignificantly higher mean values (P > 0.05) in Yoruba males compared to females. Yoruba males and females have the leptoprosopic face type based on classifications of face types from facial indices. Conclusions: The Akinlolu-Raji image-processing algorithm can be employed for computing anthropometric, forensic, diagnostic, or other measurements on 2D and 3D images, and data computed from its readings can be converted to actual or life sizes as obtained in 1D measurements. Furthermore, Yoruba males and females have the leptoprosopic face type.
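The leptoprosopic classification mentioned above is conventionally derived from the facial (prosopic) index: morphological face height divided by facial width, times 100. The sketch below illustrates that computation; the category cut-offs follow commonly cited anthropometric bands and are assumptions on my part, not values taken from this paper:

```python
def facial_index(morph_face_height_mm, facial_width_mm):
    """Prosopic index = (morphological face height / facial width) x 100."""
    return morph_face_height_mm / facial_width_mm * 100.0

def face_type(index):
    """Classify face type from the prosopic index (approximate,
    commonly cited anthropometric bands; assumed, not from the paper)."""
    if index < 80.0:
        return "hypereuryprosopic"
    if index < 85.0:
        return "euryprosopic"
    if index < 90.0:
        return "mesoprosopic"
    if index < 95.0:
        return "leptoprosopic"
    return "hyperleptoprosopic"

# Illustrative measurements in millimetres (hypothetical subject).
idx = facial_index(120.0, 128.0)
kind = face_type(idx)
```

A population whose mean index falls in the 90-95 band would thus be described as leptoprosopic (long, narrow faces), matching the study's conclusion for its sample.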

  5. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks - such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed, and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  6. Positive facial affect - an fMRI study on the involvement of insula and amygdala.

    Directory of Open Access Journals (Sweden)

    Anna Pohl

    Full Text Available Imitation of facial expressions engages the putative human mirror neuron system as well as the insula and the amygdala as part of the limbic system. The specific function of the latter two regions during emotional actions is still under debate. The current study investigated brain responses during imitation of positive in comparison to non-emotional facial expressions. Differences in brain activation of the amygdala and insula were additionally examined during observation and execution of facial expressions. Participants imitated, executed and observed happy and non-emotional facial expressions, as well as neutral faces. During imitation, higher right-hemispheric activation emerged in the happy compared to the non-emotional condition in the right anterior insula and the right amygdala, in addition to the pre-supplementary motor area, middle temporal gyrus and the inferior frontal gyrus. Region-of-interest analyses revealed that the right insula was more strongly recruited by (i) imitation and execution than by observation of facial expressions, that (ii) the insula was significantly more strongly activated by happy than by non-emotional facial expressions during observation and imitation, and that (iii) the activation differences in the right amygdala between happy and non-emotional facial expressions were increased during imitation and execution, in comparison to sole observation. We suggest that the insula and the amygdala contribute specifically to the happy emotional connotation of the facial expressions, depending on the task. The pattern of insula activity might reflect increased bodily awareness during active execution compared to passive observation and during visual processing of the happy compared to non-emotional facial expressions. The amygdala activation specific to happy facial expressions during motor tasks, but not in the observation condition, might reflect increased autonomic activity or feedback from facial muscles to the…

  7. Relationship between individual differences in functional connectivity and facial-emotion recognition abilities in adults with traumatic brain injury.

    Science.gov (United States)

    Rigon, A; Voss, M W; Turkstra, L S; Mutlu, B; Duff, M C

    2017-01-01

    Although several studies have demonstrated that facial-affect recognition impairment is common following moderate-severe traumatic brain injury (TBI), and that there are diffuse alterations in large-scale functional brain networks in TBI populations, little is known about the relationship between the two. Here, in a sample of 26 participants with TBI and 20 healthy comparison participants (HC) we measured facial-affect recognition abilities and resting-state functional connectivity (rs-FC) using fMRI. We then used network-based statistics to examine (A) the presence of rs-FC differences between individuals with TBI and HC within the facial-affect processing network, and (B) the association between inter-individual differences in emotion recognition skills and rs-FC within the facial-affect processing network. We found that participants with TBI showed significantly lower rs-FC in a component comprising homotopic and within-hemisphere, anterior-posterior connections within the facial-affect processing network. In addition, within the TBI group, participants with higher emotion-labeling skills showed stronger rs-FC within a network comprised of intra- and inter-hemispheric bilateral connections. Findings indicate that the ability to successfully recognize facial-affect after TBI is related to rs-FC within components of facial-affective networks, and provide new evidence that further our understanding of the mechanisms underlying emotion recognition impairment in TBI.
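Resting-state functional connectivity (rs-FC) of the kind analyzed above is typically estimated as the Pearson correlation between the mean BOLD time series of each pair of brain regions, producing a symmetric region-by-region matrix on which methods such as network-based statistics then operate. A minimal sketch with simulated data (the region count and scan length are arbitrary, not the study's):

```python
import numpy as np

def connectivity_matrix(timeseries):
    """timeseries: (n_regions, n_timepoints) array of mean BOLD signals
    per region. Returns the n_regions x n_regions Pearson correlation
    matrix; entry [i, j] is the rs-FC between regions i and j."""
    return np.corrcoef(timeseries)

# Simulate 10 regions x 200 timepoints and compute the rs-FC matrix.
rng = np.random.default_rng(0)
ts = rng.normal(size=(10, 200))
fc = connectivity_matrix(ts)
edge = fc[2, 5]  # connectivity (edge strength) between regions 2 and 5
```

Group comparisons like the TBI-vs-control contrast reported here are then run edge-wise over these matrices, with network-based statistics controlling for the large number of edges tested.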

  8. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

    Full Text Available Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident at both the behavioral and neural levels. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  9. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    Science.gov (United States)

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  10. Are facial expressions of emotion produced by categorical affect programs or dynamically driven by appraisal?

    Science.gov (United States)

    Scherer, Klaus R; Ellgring, Heiner

    2007-02-01

    The different assumptions made by discrete and componential emotion theories about the nature of the facial expression of emotion and the underlying mechanisms are reviewed. Explicit and implicit predictions are derived from each model. It is argued that experimental expression-production paradigms rather than recognition studies are required to critically test these differential predictions. Data from a large-scale actor portrayal study are reported to demonstrate the utility of this approach. The frequencies with which 12 professional actors use major facial muscle actions individually and in combination to express 14 major emotions show little evidence for emotion-specific prototypical affect programs. Rather, the results encourage empirical investigation of componential emotion model predictions of dynamic configurations of appraisal-driven adaptive facial actions. (c) 2007 APA, all rights reserved.

  11. Dynamics of processing invisible faces in the brain: automatic neural encoding of facial expression information.

    Science.gov (United States)

    Jiang, Yi; Shannon, Robert W; Vizueta, Nathalie; Bernat, Edward M; Patrick, Christopher J; He, Sheng

    2009-02-01

    The fusiform face area (FFA) and the superior temporal sulcus (STS) are suggested to process facial identity and facial expression information respectively. We recently demonstrated a functional dissociation between the FFA and the STS as well as correlated sensitivity of the STS and the amygdala to facial expressions using an interocular suppression paradigm [Jiang, Y., He, S., 2006. Cortical responses to invisible faces: dissociating subsystems for facial-information processing. Curr. Biol. 16, 2023-2029.]. In the current event-related brain potential (ERP) study, we investigated the temporal dynamics of facial information processing. Observers viewed neutral, fearful, and scrambled face stimuli, either visibly or rendered invisible through interocular suppression. Relative to scrambled face stimuli, intact visible faces elicited larger positive P1 (110-130 ms) and larger negative N1 or N170 (160-180 ms) potentials at posterior occipital and bilateral occipito-temporal regions respectively, with the N170 amplitude significantly greater for fearful than neutral faces. Invisible intact faces generated a stronger signal than scrambled faces at 140-200 ms over posterior occipital areas whereas invisible fearful faces (compared to neutral and scrambled faces) elicited a significantly larger negative deflection starting at 220 ms along the STS. These results provide further evidence for cortical processing of facial information without awareness and elucidate the temporal sequence of automatic facial expression information extraction.
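ERP effects like the P1 (110-130 ms) and N170 (160-180 ms) differences reported here are conventionally quantified as the mean voltage within a latency window of each epoch. A generic sketch with synthetic single-channel data (real pipelines would typically use a dedicated package such as MNE-Python; the function and values below are illustrative, not taken from the study):

```python
import numpy as np

def mean_amplitude(epochs, times, window):
    """Mean voltage per trial in a latency window.

    epochs: (n_trials, n_samples) array of single-channel ERP epochs (microvolts)
    times:  (n_samples,) array of sample times in seconds
    window: (start, end) latency window in seconds, inclusive
    """
    mask = (times >= window[0]) & (times <= window[1])
    return epochs[:, mask].mean(axis=1)

# Synthetic epochs sampled at 500 Hz from -100 ms to 400 ms
times = np.arange(-0.1, 0.4, 1 / 500)
rng = np.random.default_rng(1)
epochs = rng.normal(0.0, 1.0, size=(40, times.size))
# Inject a negative deflection around 170 ms to mimic an N170
epochs[:, (times >= 0.16) & (times <= 0.18)] -= 4.0

n170 = mean_amplitude(epochs, times, (0.16, 0.18))
print(f"mean N170 amplitude: {n170.mean():.2f} uV")
```

Comparing such per-trial means between conditions (e.g., fearful versus neutral faces) with a statistical test is the standard next step.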

  12. Faces in context: A review and systematization of contextual influences on affective face processing

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2012-11-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant basic emotion approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, decontextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  13. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    Science.gov (United States)

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Spontaneous Facial Mimicry Is Enhanced by the Goal of Inferring Emotional States: Evidence for Moderation of "Automatic" Mimicry by Higher Cognitive Processes.

    Science.gov (United States)

    Murata, Aiko; Saito, Hisamichi; Schug, Joanna; Ogawa, Kenji; Kameda, Tatsuya

    2016-01-01

    A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like "automatic" response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target's facial expressions depends on whether participants are motivated to infer the target's emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target's emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target's emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target's emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.
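Facial-mimicry effects of the kind measured in Study 1 are usually indexed as baseline-corrected EMG activity in the muscle congruent with the observed expression (zygomaticus major for happy faces, corrugator supercilii for angry ones). A toy difference-score computation with synthetic values (not the study's data or exact procedure):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical rectified zygomaticus EMG (microvolts) per trial:
# a 1 s pre-stimulus baseline versus a 1 s stimulus-viewing period
baseline = rng.normal(5.0, 1.0, size=30)          # muscle activity at rest
happy_face = baseline + rng.normal(1.5, 0.5, 30)  # congruent muscle rises
angry_face = baseline + rng.normal(0.1, 0.5, 30)  # incongruent muscle: no rise

# Baseline-corrected change scores; mimicry = larger congruent response
happy_change = happy_face - baseline
angry_change = angry_face - baseline
print(f"zygomaticus change: happy {happy_change.mean():.2f} uV, "
      f"angry {angry_change.mean():.2f} uV")
```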

  15. Neurocognition and symptoms identify links between facial recognition and emotion processing in schizophrenia: meta-analytic findings.

    Science.gov (United States)

    Ventura, Joseph; Wood, Rachel C; Jimenez, Amy M; Hellemann, Gerhard S

    2013-12-01

    In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But, how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? A meta-analysis of 102 studies (combined n=4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r=.51). In addition, the relationship between FR and EP through voice prosody (r=.58) is as strong as the relationship between FR and EP based on facial stimuli (r=.53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality - facial stimuli and voice prosody. The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits. © 2013 Elsevier B.V. All rights reserved.
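Pooled coefficients such as the r = .51 between FR and EP are conventionally obtained by averaging Fisher z-transformed correlations weighted by n - 3 (the inverse variance of z). A minimal fixed-effect sketch with hypothetical study values, not the meta-analysis's actual data:

```python
import math

def pooled_correlation(studies):
    """Fixed-effect pooling of Pearson r values via Fisher's z transform.

    studies: list of (r, n) tuples; each study is weighted by n - 3.
    """
    num = 0.0
    den = 0.0
    for r, n in studies:
        z = math.atanh(r)          # Fisher z transform of r
        w = n - 3                  # inverse-variance weight
        num += w * z
        den += w
    return math.tanh(num / den)    # back-transform the pooled z to r

# Hypothetical example: three studies of the FR-EP correlation
print(round(pooled_correlation([(0.45, 100), (0.55, 60), (0.50, 80)]), 3))
```

Random-effects models add a between-study variance term to each weight, but the z-transform machinery is the same.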

  16. Positive Facial Affect – An fMRI Study on the Involvement of Insula and Amygdala

    Science.gov (United States)

    Pohl, Anna; Anders, Silke; Schulte-Rüther, Martin; Mathiak, Klaus; Kircher, Tilo

    2013-01-01

    Imitation of facial expressions engages the putative human mirror neuron system as well as the insula and the amygdala as part of the limbic system. The specific function of the latter two regions during emotional actions is still under debate. The current study investigated brain responses during imitation of positive in comparison to non-emotional facial expressions. Differences in brain activation of the amygdala and insula were additionally examined during observation and execution of facial expressions. Participants imitated, executed and observed happy and non-emotional facial expressions, as well as neutral faces. During imitation, higher right hemispheric activation emerged in the happy compared to the non-emotional condition in the right anterior insula and the right amygdala, in addition to the pre-supplementary motor area, middle temporal gyrus and the inferior frontal gyrus. Region-of-interest analyses revealed that the right insula was more strongly recruited by (i) imitation and execution than by observation of facial expressions, that (ii) the insula was significantly stronger activated by happy than by non-emotional facial expressions during observation and imitation and that (iii) the activation differences in the right amygdala between happy and non-emotional facial expressions were increased during imitation and execution, in comparison to sole observation. We suggest that the insula and the amygdala contribute specifically to the happy emotional connotation of the facial expressions depending on the task. The pattern of the insula activity might reflect increased bodily awareness during active execution compared to passive observation and during visual processing of the happy compared to non-emotional facial expressions. The activation specific for the happy facial expression of the amygdala during motor tasks, but not in the observation condition, might reflect increased autonomic activity or feedback from facial muscles to the amygdala. 

  17. Facial appearance affects science communication.

    Science.gov (United States)

    Gheorghiu, Ana I; Callan, Mitchell J; Skylark, William J

    2017-06-06

    First impressions based on facial appearance predict many important social outcomes. We investigated whether such impressions also influence the communication of scientific findings to lay audiences, a process that shapes public beliefs, opinion, and policy. First, we investigated the traits that engender interest in a scientist's work, and those that create the impression of a "good scientist" who does high-quality research. Apparent competence and morality were positively related to both interest and quality judgments, whereas attractiveness boosted interest but decreased perceived quality. Next, we had members of the public choose real science news stories to read or watch and found that people were more likely to choose items that were paired with "interesting-looking" scientists, especially when selecting video-based communications. Finally, we had people read real science news items and found that the research was judged to be of higher quality when paired with researchers who look like "good scientists." Our findings offer insights into the social psychology of science, and indicate a source of bias in the dissemination of scientific findings to broader society.

  18. Processing of Facial Emotion in Bipolar Depression and Euthymia.

    Science.gov (United States)

    Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter

    2015-10-01

    Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigate facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities; the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions; the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). There were no significant group differences on any measures of emotion perception/labeling, compared to controls. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations-including mood state, sample size, and the cognitive demands of the tasks-may contribute significantly to the variability in findings between studies.
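The below-the-10th-percentile criterion used to classify impaired patients can be computed directly from the control distribution. An illustrative sketch with simulated accuracy scores (sample sizes borrowed from Study 2; all values are synthetic, and the study's exact percentile convention may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical total emotion-recognition accuracy (proportion correct)
controls = rng.normal(0.85, 0.05, size=47)   # 47 controls, as in Study 2
patients = rng.normal(0.83, 0.06, size=53)   # 53 depressed BD patients

cutoff = np.percentile(controls, 10)          # 10th percentile of controls
impaired = patients < cutoff                  # flag patients below the cutoff

print(f"cutoff = {cutoff:.3f}, impaired = {impaired.mean():.1%} of patients")
```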

  19. Sex Hormones and Processing of Facial Expressions of Emotion: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Flávia L. Osório

    2018-04-01

    Background: We systematically reviewed the literature to determine the influence of sex hormones on facial emotion processing (FEP) in healthy women at different phases of life. Methods: Searches were performed in PubMed, Web of Science, PsycINFO, LILACS, and SciELO. Twenty-seven articles were included in the review and allocated into five different categories according to their objectives and sample characteristics (menstrual cycle, oral contraceptives, pregnancy/postpartum, testosterone, and progesterone). Results: Despite the limited number of studies in some categories and the existence of inconsistencies in the results of interest, the findings of the review suggest that FEP may be enhanced during the follicular phase. Studies with women taking oral contraceptives showed reduced recognition accuracy and decreased responsiveness of different brain structures during FEP tasks. Studies with pregnant women and women in the postpartum showed that hormonal changes are associated with alterations in FEP and in brain functioning that could indicate the existence of a hypervigilant state in new and future mothers. Exogenous administration of testosterone enhanced the recognition of threatening facial expressions and the activation of brain structures involved in the processing of emotional stimuli. Conclusions: We conclude that sex hormones affect FEP in women, which may have an impact on adaptive processes of the species and on the onset of mood symptoms associated with premenstrual syndrome.

  20. Spatiotemporal neural network dynamics for the processing of dynamic facial expressions

    Science.gov (United States)

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota

    2015-01-01

    The dynamic facial expressions of emotion automatically elicit multifaceted psychological activities; however, the temporal profiles and dynamic interaction patterns of brain activities remain unknown. We investigated these issues using magnetoencephalography. Participants passively observed dynamic facial expressions of fear and happiness, or dynamic mosaics. Source-reconstruction analyses utilizing functional magnetic-resonance imaging data revealed higher activation in broad regions of the bilateral occipital and temporal cortices in response to dynamic facial expressions than in response to dynamic mosaics at 150–200 ms and some later time points. The right inferior frontal gyrus exhibited higher activity for dynamic faces versus mosaics at 300–350 ms. Dynamic causal-modeling analyses revealed that dynamic faces activated the dual visual routes and visual–motor route. Superior influences of feedforward and feedback connections were identified before and after 200 ms, respectively. These results indicate that hierarchical, bidirectional neural network dynamics within a few hundred milliseconds implement the processing of dynamic facial expressions. PMID:26206708

  2. The MPI facial expression database--a validated database of emotional and conversational facial expressions.

    Directory of Open Access Journals (Sweden)

    Kathrin Kaulard

    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

  3. Differences in holistic processing do not explain cultural differences in the recognition of facial expression.

    Science.gov (United States)

    Yan, Xiaoqian; Young, Andrew W; Andrews, Timothy J

    2017-12-01

    The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants' perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly in anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing interference on recognizing the parts of the expressions created by holistic perception of the aligned composite images. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.

  4. Neural Temporal Dynamics of Facial Emotion Processing: Age Effects and Relationship to Cognitive Function

    Directory of Open Access Journals (Sweden)

    Xiaoyan Liao

    2017-06-01

    This study used event-related potentials (ERPs) to investigate the effects of age on neural temporal dynamics of processing task-relevant facial expressions and their relationship to cognitive functions. Negative (sad, afraid, angry, and disgusted), positive (happy), and neutral faces were presented to 30 older and 31 young participants who performed a facial emotion categorization task. Behavioral and ERP indices of facial emotion processing were analyzed. An enhanced N170 for negative faces, in addition to intact right-hemispheric N170 for positive faces, was observed in older adults relative to their younger counterparts. Moreover, older adults demonstrated an attenuated within-group N170 laterality effect for neutral faces, while younger adults showed the opposite pattern. Furthermore, older adults exhibited sustained temporo-occipital negativity deflection over the time range of 200–500 ms post-stimulus, while young adults showed posterior positivity and subsequent emotion-specific frontal negativity deflections. In older adults, decreased accuracy for labeling negative faces was positively correlated with Montreal Cognitive Assessment Scores, and accuracy for labeling neutral faces was negatively correlated with age. These findings suggest that older people may exert more effort in structural encoding for negative faces and there are different response patterns for the categorization of different facial emotions. Cognitive functioning may be related to facial emotion categorization deficits observed in older adults. This may not be attributable to positivity effects: it may represent a selective deficit for the processing of negative facial expressions in older adults.

  5. Affective responses to ambivalence are context-dependent : A facial EMG study on the role of inconsistency and evaluative context in shaping affective responses to ambivalence

    NARCIS (Netherlands)

    Nohlen, H.U.; van Harreveld, F.; Rotteveel, M.; Barends, A.J.; Larsen, J.T.

    It has long been debated whether attitudinal ambivalence elicits negative affect and evidence for such a link is inconclusive. Using facial EMG, we tested the idea that affective responses to ambivalence are dependent on the inconsistency of evaluations in the current situation. In a person

  6. Direction of Amygdala-Neocortex Interaction During Dynamic Facial Expression Processing.

    Science.gov (United States)

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota; Yoshikawa, Sakiko; Toichi, Motomi

    2017-03-01

    Dynamic facial expressions of emotion strongly elicit multifaceted emotional, perceptual, cognitive, and motor responses. Neuroimaging studies revealed that some subcortical (e.g., amygdala) and neocortical (e.g., superior temporal sulcus and inferior frontal gyrus) brain regions and their functional interaction were involved in processing dynamic facial expressions. However, the direction of the functional interaction between the amygdala and the neocortex remains unknown. To investigate this issue, we re-analyzed functional magnetic resonance imaging (fMRI) data from 2 studies and magnetoencephalography (MEG) data from 1 study. First, a psychophysiological interaction analysis of the fMRI data confirmed the functional interaction between the amygdala and neocortical regions. Then, dynamic causal modeling analysis was used to compare models with forward, backward, or bidirectional effective connectivity between the amygdala and neocortical networks in the fMRI and MEG data. The results consistently supported the model of effective connectivity from the amygdala to the neocortex. Further increasing time-window analysis of the MEG demonstrated that this model was valid after 200 ms from the stimulus onset. These data suggest that emotional processing in the amygdala rapidly modulates some neocortical processing, such as perception, recognition, and motor mimicry, when observing dynamic facial expressions of emotion. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. The role of the cannabinoid receptor in adolescents' processing of facial expressions.

    Science.gov (United States)

    Ewald, Anais; Becker, Susanne; Heinrich, Angela; Banaschewski, Tobias; Poustka, Luise; Bokde, Arun; Büchel, Christian; Bromberg, Uli; Cattrell, Anna; Conrod, Patricia; Desrivières, Sylvane; Frouin, Vincent; Papadopoulos-Orfanos, Dimitri; Gallinat, Jürgen; Garavan, Hugh; Heinz, Andreas; Walter, Henrik; Ittermann, Bernd; Gowland, Penny; Paus, Tomáš; Martinot, Jean-Luc; Paillère Martinot, Marie-Laure; Smolka, Michael N; Vetter, Nora; Whelan, Rob; Schumann, Gunter; Flor, Herta; Nees, Frauke

    2016-01-01

    The processing of emotional faces is an important prerequisite for adequate social interactions in daily life, and might thus specifically be altered in adolescence, a period marked by significant changes in social emotional processing. Previous research has shown that the cannabinoid receptor CB1R is associated with longer gaze duration and increased brain responses in the striatum to happy faces in adults, yet, for adolescents, it is not clear whether an association between CB1R and face processing exists. In the present study we investigated genetic effects of the two CB1R polymorphisms, rs1049353 and rs806377, on the processing of emotional faces in healthy adolescents. They participated in functional magnetic resonance imaging during a Faces Task, watching blocks of video clips with angry and neutral facial expressions, and completed a Morphed Faces Task in the laboratory where they looked at different facial expressions that switched from anger to fear or sadness or from happiness to fear or sadness, and labelled them according to these four emotional expressions. A-allele carriers versus GG-carriers of rs1049353 displayed earlier recognition of facial expressions changing from anger to sadness or fear, but not of expressions changing from happiness to sadness or fear, and higher brain responses to angry, but not neutral, faces in the amygdala and insula. For rs806377 no significant effects emerged. This suggests that rs1049353 is involved in the processing of negative facial expressions with relation to anger in adolescence. These findings add to our understanding of social emotion-related mechanisms in this life period. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  9. Motor, affective and cognitive empathy in adolescence : Interrelations between facial electromyography and self-reported trait and state measures

    NARCIS (Netherlands)

    Van der Graaff, Jolien; Meeus, Wim; de Wied, Minet; van Boxtel, Anton; van Lier, Pol A C; Koot, Hans M.; Branje, Susan J. T.

    2016-01-01

    This study examined interrelations of trait and state empathy in an adolescent sample. Self-reported affective trait empathy and cognitive trait empathy were assessed during a home visit. During a test session at the university, motor empathy (facial electromyography), and self-reported affective

  10. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and persistent deficits in many areas of interpersonal relations, such as communication, social interaction, and emotional responsiveness. Patients with autism show deficits in face recognition, eye contact, and recognition of emotional expressions. Both face recognition and recognition of facial emotion rely on face processing. Structural and functional impairments in the fusiform gyrus, amygdala, superior temporal sulcus, and other brain regions lead to deficits in the recognition of faces and facial emotion. Studies therefore suggest that face processing deficits underlie the problems in social interaction and emotion observed in autism. Studies have revealed that children with autism have problems recognizing facial expressions and rely on the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotions. To date, deficits have been identified at various stages of face processing in autism, including gaze detection, face identification, and recognition of emotional expressions. Social interaction impairments in autism spectrum disorders originate from face processing deficits during infancy, childhood, and adolescence. Face recognition and recognition of facial emotion may be shaped either automatically, by orienting towards faces after birth, or by "learning" processes during development, such as identity and emotion processing. This article reviews the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  11. Scattered Data Processing Approach Based on Optical Facial Motion Capture

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2013-01-01

    In recent years, animation reconstruction of facial expressions has become a popular research field in computer science, and motion capture-based facial expression reconstruction is now emerging in this field. Based on facial motion data obtained using a passive optical motion capture system, we propose a scattered data processing approach that aims to solve the common problems of missing data and noise. To recover missing data, given the nonlinear relationships between a missing marker and its neighbors, we propose an improved version of a previous method in which we use the motion of three muscles rather than one to recover the missing data. To reduce noise, we first apply preprocessing to eliminate impulsive noise, and then use our proposed third-order quasi-uniform B-spline-based fitting method to reduce the remaining noise. Our experiments showed that the principles underlying this method are simple and straightforward, and that it delivers acceptable precision during reconstruction.
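
    The spline-based noise-reduction step described above can be sketched as follows. This is a hypothetical illustration, not the authors' code: it smooths a single noisy 1-D marker trajectory with a least-squares fit of a clamped uniform cubic B-spline, and the control-point count `n_ctrl` and the toy sine-wave trajectory are invented for demonstration.

```python
import math
import random

def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree k."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    val = 0.0
    if knots[i + k] != knots[i]:
        val += (t - knots[i]) / (knots[i + k] - knots[i]) * bspline_basis(i, k - 1, t, knots)
    if knots[i + k + 1] != knots[i + 1]:
        val += (knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1]) * bspline_basis(i + 1, k - 1, t, knots)
    return val

def solve(A, b):
    """Gaussian elimination with partial pivoting (A is small and well-conditioned here)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def bspline_smooth(samples, n_ctrl=8, degree=3):
    """Least-squares fit of a clamped uniform cubic B-spline to a noisy 1-D trajectory."""
    m = len(samples)
    inner = n_ctrl - degree - 1                    # number of interior knots
    knots = ([0.0] * (degree + 1)
             + [(j + 1) / (inner + 1) for j in range(inner)]
             + [1.0] * (degree + 1))
    ts = [j / (m - 1) * (1.0 - 1e-9) for j in range(m)]  # keep t < 1 for the degree-0 basis
    B = [[bspline_basis(i, degree, t, knots) for i in range(n_ctrl)] for t in ts]
    # normal equations (B^T B) c = B^T y for the control points c
    BtB = [[sum(B[r][i] * B[r][j] for r in range(m)) for j in range(n_ctrl)] for i in range(n_ctrl)]
    Bty = [sum(B[r][i] * samples[r] for r in range(m)) for i in range(n_ctrl)]
    ctrl = solve(BtB, Bty)
    return [sum(B[r][i] * ctrl[i] for i in range(n_ctrl)) for r in range(m)]

# toy marker trajectory: a sine wave corrupted with Gaussian noise
random.seed(0)
clean = [math.sin(2 * math.pi * j / 99) for j in range(100)]
noisy = [v + random.gauss(0.0, 0.05) for v in clean]
fitted = bspline_smooth(noisy)
err_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean))
err_fit = sum((a - b) ** 2 for a, b in zip(fitted, clean))
```

    With far fewer control points than samples, the spline cannot follow high-frequency jitter, so the least-squares fit acts as a low-pass filter over the marker trajectory.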

  12. How does context affect assessments of facial emotion? The role of culture and age.

    Science.gov (United States)

    Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara

    2011-03-01

    People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing shape the interpretation of facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face-emotion ratings. These findings suggest that cultural differences in the reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across cultures. (c) 2011 APA, all rights reserved.

  13. Facial anatomy.

    Science.gov (United States)

    Marur, Tania; Tuna, Yakup; Demirci, Selman

    2014-01-01

    Dermatologic problems of the face affect both function and aesthetics, which are based on complex anatomical features. Treating dermatologic problems while preserving the aesthetics and functions of the face requires knowledge of normal anatomy. To perform invasive facial procedures successfully, it is essential to understand the underlying topographic anatomy. This chapter presents the anatomy of the facial musculature and neurovascular structures in a systematic way, with some clinically important aspects. We describe the attachments of the mimetic and masticatory muscles and emphasize their functions and nerve supply. We highlight clinically relevant facial topographic anatomy by explaining the course and location of the sensory and motor nerves of the face and the facial vasculature with their relations. Additionally, this chapter reviews the recent nomenclature for the branching pattern of the facial artery. © 2013 Elsevier Inc. All rights reserved.

  14. Gender Differences in the Motivational Processing of Facial Beauty

    Science.gov (United States)

    Levy, Boaz; Ariely, Dan; Mazar, Nina; Chi, Won; Lukas, Scott; Elman, Igor

    2008-01-01

    Gender may be involved in the motivational processing of facial beauty. This study applied a behavioral probe, known to activate brain motivational regions, to healthy heterosexual subjects. Matched samples of men and women were administered two tasks: (a) key pressing to change the viewing time of average or beautiful female or male facial…

  15. The Impact of Sex Differences on Odor Identification and Facial Affect Recognition in Patients with Schizophrenia Spectrum Disorders.

    Science.gov (United States)

    Mossaheb, Nilufar; Kaufmann, Rainer M; Schlögelhofer, Monika; Aninilkumparambil, Thushara; Himmelbauer, Claudia; Gold, Anna; Zehetmayer, Sonja; Hoffmann, Holger; Traue, Harald C; Aschauer, Harald

    2018-01-01

    Social interactive functions such as facial emotion recognition and smell identification have been shown to differ between women and men. However, little is known about how these differences are mirrored in patients with schizophrenia and how these abilities interact with each other and with other clinical variables in patients vs. healthy controls. Standardized instruments were used to assess facial emotion recognition [Facially Expressed Emotion Labelling (FEEL)] and smell identification [University of Pennsylvania Smell Identification Test (UPSIT)] in 51 patients with schizophrenia spectrum disorders and 79 healthy controls; furthermore, working memory functions and clinical variables were assessed. In both the univariate and the multivariate results, illness showed a significant influence on UPSIT and FEEL. The inclusion of age and working memory in the MANOVA resulted in a differential effect with sex and working memory as remaining significant factors. Duration of illness was correlated with both emotion recognition and smell identification in men only, whereas immediate general psychopathology and negative symptoms were associated with emotion recognition only in women. Being affected by schizophrenia spectrum disorder impacts one's ability to correctly recognize facial affects and identify odors. Converging evidence suggests a link between the investigated basic and social cognitive abilities in patients with schizophrenia spectrum disorders with a strong contribution of working memory and differential effects of modulators in women vs. men.

  17. Facial Expression Recognition

    NARCIS (Netherlands)

    Pantic, Maja; Li, S.; Jain, A.

    2009-01-01

    Facial expression recognition is a process performed by humans or computers, which consists of: 1. Locating faces in the scene (e.g., in an image; this step is also referred to as face detection), 2. Extracting facial features from the detected face region (e.g., detecting the shape of facial

  18. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Shih

    2010-01-01

    This paper presents a novel and effective method for facial expression recognition covering happiness, disgust, fear, anger, sadness, surprise, and the neutral state. The proposed method uses a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize facial expressions. An entropy criterion is applied to select effective Gabor features, i.e., a subset of informative and nonredundant Gabor features. The RDAB algorithm uses RDA as the learner in the boosting algorithm. RDA combines the strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA); through a regularization technique, it solves the small-sample-size and ill-posed problems that QDA and LDA suffer from. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate the optimal parameters of RDA. Experimental results demonstrate that our approach can accurately and robustly recognize facial expressions.
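
    The core of the RDA learner described above, the regularization that interpolates between QDA's class-specific and LDA's pooled covariance estimates and then shrinks toward a scaled identity, can be sketched as follows. This is a hypothetical toy illustration, not the paper's implementation: the 2-D "feature vectors", class labels, and parameter values `lam` and `gam` are invented for demonstration (the real method selects Gabor features by an entropy criterion and tunes the regularization parameters with PSO).

```python
import math

def mean(rows):
    """Column means of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def cov(rows, mu):
    """Unbiased sample covariance matrix."""
    n, d = len(rows), len(mu)
    C = [[0.0] * d for _ in range(d)]
    for r in rows:
        for i in range(d):
            for j in range(d):
                C[i][j] += (r[i] - mu[i]) * (r[j] - mu[j])
    return [[C[i][j] / (n - 1) for j in range(d)] for i in range(d)]

def rda_cov(class_cov, pooled_cov, lam, gam):
    """RDA regularization: blend class and pooled covariance (lam),
    then shrink toward a scaled identity (gam) to keep the estimate invertible."""
    d = len(class_cov)
    S = [[(1 - lam) * class_cov[i][j] + lam * pooled_cov[i][j] for j in range(d)]
         for i in range(d)]
    t = sum(S[i][i] for i in range(d)) / d          # average variance
    return [[(1 - gam) * S[i][j] + (gam * t if i == j else 0.0) for j in range(d)]
            for i in range(d)]

def qda_score(x, mu, S):
    """Quadratic discriminant score for a 2-D Gaussian class model."""
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    inv = [[S[1][1] / det, -S[0][1] / det], [-S[1][0] / det, S[0][0] / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    q = sum(dx[i] * inv[i][j] * dx[j] for i in range(2) for j in range(2))
    return -0.5 * math.log(det) - 0.5 * q

# toy 2-D feature vectors for two expression classes (illustrative only)
happy = [[1.0, 0.9], [1.2, 1.1], [0.9, 1.0]]
angry = [[-1.0, -0.8], [-1.1, -1.2], [-0.9, -1.0]]
mu_h, mu_a = mean(happy), mean(angry)
C_h, C_a = cov(happy, mu_h), cov(angry, mu_a)
n_h, n_a = len(happy), len(angry)
pooled = [[(C_h[i][j] * (n_h - 1) + C_a[i][j] * (n_a - 1)) / (n_h + n_a - 2)
           for j in range(2)] for i in range(2)]
S_h = rda_cov(C_h, pooled, lam=0.5, gam=0.1)
S_a = rda_cov(C_a, pooled, lam=0.5, gam=0.1)
probe = [1.0, 1.0]                      # should look like the "happy" class
score_h = qda_score(probe, mu_h, S_h)
score_a = qda_score(probe, mu_a, S_a)
```

    Setting `lam = 1` recovers LDA (one pooled covariance for all classes), `lam = 0` recovers QDA, and `gam > 0` guards against singular covariance matrices when samples per class are few.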

  19. Revisiting the Relationship between the Processing of Gaze Direction and the Processing of Facial Expression

    Science.gov (United States)

    Ganel, Tzvi

    2011-01-01

    There is mixed evidence on the nature of the relationship between the perception of gaze direction and the perception of facial expressions. Major support for shared processing of gaze and expression comes from behavioral studies that showed that observers cannot process expression or gaze and ignore irrelevant variations in the other dimension.…

  20. Early Adverse Caregiving Experiences and Preschoolers' Current Attachment Affect Brain Responses during Facial Familiarity Processing: An ERP Study.

    Science.gov (United States)

    Kungl, Melanie T; Bovenschen, Ina; Spangler, Gottfried

    2017-01-01

    When placed into more benign environments such as foster care, children from adverse rearing backgrounds are capable of forming attachment relationships to new caregivers within the first year of placement, while certain problematic social behaviors appear to be more persistent. Assuming that early adverse experiences shape the neural circuits underlying social behavior, neurophysiological studies of individual differences in early social-information processing have great informative value. More precisely, ERP studies have repeatedly shown face processing to be sensitive to experience, especially regarding the caregiving background. However, studies of the effects of early adverse caregiving experiences have been restricted to children with a history of institutionalization, and no study has investigated the effects of attachment security as a marker of the quality of the caregiver-child relationship. The current study therefore asks how adverse caregiving experiences and attachment security to (new) caregivers affect early- and mid-latency ERPs sensitive to facial familiarity processing. Preschool-aged foster children in their second year within the foster home were compared to an age-matched control group. Attachment was assessed using the AQS, and neurophysiological data were collected during a passive viewing task presenting (foster) mother and stranger faces. Foster children were comparable to the control group with regard to attachment security. On a neurophysiological level, however, the foster group showed dampened N170 amplitudes for both face types. In both foster and control children, dampened N170 amplitudes were also found for stranger as compared to (foster) mother faces, and for insecurely attached as compared to securely attached children. This neural pattern may be viewed as a result of poorer social interactions earlier in life. Still, there was no effect on P1 amplitudes. Indicating heightened attentional processing, Nc amplitude responses

  3. Context Effects on Facial Affect Recognition in Schizophrenia and Autism: Behavioral and Eye-Tracking Evidence.

    Science.gov (United States)

    Sasson, Noah J; Pinkham, Amy E; Weittenhiller, Lauren P; Faso, Daniel J; Simpson, Claire

    2016-05-01

    Although Schizophrenia (SCZ) and Autism Spectrum Disorder (ASD) share impairments in emotion recognition, the mechanisms underlying these impairments may differ. The current study used the novel "Emotions in Context" task to examine how the interpretation and visual inspection of facial affect are modulated by congruent and incongruent emotional contexts in SCZ and ASD. Both adults with SCZ (n = 44) and those with ASD (n = 21) exhibited reduced affect recognition relative to typically developing (TD) controls (n = 39) when faces were integrated within broader emotional scenes but not when they were presented in isolation, underscoring the importance of using stimuli that better approximate real-world contexts. Additionally, viewing faces within congruent emotional scenes improved accuracy and visual attention to the face for controls more so than for the clinical groups, suggesting that individuals with SCZ and ASD may not benefit from the presence of complementary emotional information as readily as controls. Despite these similarities, important distinctions between SCZ and ASD were found. In every condition, IQ was related to emotion-recognition accuracy for the SCZ group but not for the ASD or TD groups. Further, only the ASD group failed to increase their visual attention to faces in incongruent emotional scenes, suggesting a lower reliance on facial information within ambiguous emotional contexts relative to congruent ones. Collectively, these findings highlight both shared and distinct social cognitive processes in SCZ and ASD that may contribute to their characteristic social disabilities. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. Mothers' pupillary responses to infant facial expressions.

    Science.gov (United States)

    Yrttiaho, Santeri; Niehaus, Dana; Thomas, Eileen; Leppänen, Jukka M

    2017-02-06

    Human parental care relies heavily on the ability to monitor and respond to a child's affective states. The current study examined pupil diameter as a potential physiological index of mothers' affective response to infant facial expressions. Pupillary time-series were measured from 86 mothers of young infants in response to an array of photographic infant faces falling into four emotive categories based on valence (positive vs. negative) and arousal (mild vs. strong). Pupil dilation was highly sensitive to the valence of facial expressions, being larger for negative vs. positive facial expressions. A separate control experiment with luminance-matched non-face stimuli indicated that the valence effect was specific to facial expressions and cannot be explained by luminance confounds. Pupil response was not sensitive to the arousal level of facial expressions. The results show the feasibility of using pupil diameter as a marker of mothers' affective responses to ecologically valid infant stimuli and point to a particularly prompt maternal response to infant distress cues.

  5. Facial and extrafacial eosinophilic pustular folliculitis: a clinical and histopathological comparative study.

    Science.gov (United States)

    Lee, W J; Won, K H; Won, C H; Chang, S E; Choi, J H; Moon, K C; Lee, M W

    2014-05-01

    Although more than 300 cases of eosinophilic pustular folliculitis (EPF) have been reported to date, differences in clinicohistopathological findings among affected sites have not yet been evaluated. To evaluate differences in the clinical and histopathological features of facial and extrafacial EPF. Forty-six patients diagnosed with EPF were classified into those with facial and extrafacial disease according to the affected site. Clinical and histopathological characteristics were retrospectively compared, using all data available in the patient medical records. There were no significant between-group differences in subject ages at presentation, but a male predominance was observed in the extrafacial group. In addition, immunosuppression-associated type EPF was more common in the extrafacial group. Eruptions of plaques with an annular appearance were more common in the facial group. Histologically, perifollicular infiltration of eosinophils occurred more frequently in the facial group, whereas perivascular patterns occurred more frequently in the extrafacial group. Follicular mucinosis and exocytosis of inflammatory cells in the hair follicles were strongly associated with facial EPF. The clinical and histopathological characteristics of patients with facial and extrafacial EPF differ, suggesting the involvement of different pathogenic processes in the development of EPF at different sites. © 2013 British Association of Dermatologists.

  6. A facial marker in facial wasting rehabilitation.

    Science.gov (United States)

    Rauso, Raffaele; Tartaro, Gianpaolo; Freda, Nicola; Rusciani, Antonio; Curinga, Giuseppe

    2012-02-01

    Facial lipoatrophy is one of the most distressing manifestations for HIV patients. It can be stigmatizing, severely affecting quality of life and self-esteem, and may result in reduced antiretroviral adherence. Several filling techniques have been proposed for facial wasting restoration, with different outcomes. The aim of this study is to present a triangular area that is useful to fill in facial wasting rehabilitation. Twenty-eight HIV patients rehabilitated for facial wasting were enrolled in this study: sixteen were rehabilitated with a non-resorbable filler and twelve with structural fat grafts harvested from lipohypertrophied areas. A photographic pre-operative and post-operative evaluation was performed by the patients and by two plastic surgeons who were blinded. In both groups, the filled area was a triangular area of depression bounded by the nasolabial fold, the malar arch, and the line connecting these two anatomical landmarks. The cosmetic result was evaluated three months after the last filling procedure in the non-resorbable filler group and three months post-surgery in the structural fat graft group. The mean patient satisfaction score, assessed with a visual analogue scale, was 8.7; the mean score from the blinded evaluators was 7.6. In this study the authors describe a triangular area of the face, between the nasolabial fold, the malar arch, and the line connecting these two landmarks, where good aesthetic facial restoration may be achieved in HIV patients with facial wasting regardless of the filling technique used.

  7. Agency and facial emotion judgment in context.

    Science.gov (United States)

    Ito, Kenichi; Masuda, Takahiko; Li, Liman Man Wai

    2013-06-01

    Past research has shown that East Asians' belief in holism is expressed as a tendency to include background facial emotions in the evaluation of target faces more than North Americans do. However, this pattern could instead reflect North Americans' tendency to downplay background facial emotions, owing to their conceptualization of facial emotion as a volitional expression of internal states. To examine this alternative explanation, we investigated whether different types of contextual information produce varying degrees of effect on face evaluation across cultures. In three studies, European Canadians and East Asians rated the intensity of target facial emotions surrounded by either affectively salient landscape scenery or background facial emotions. The results showed that, although affectively salient landscapes influenced the judgments of both cultural groups, only European Canadians downplayed the background facial emotions. The role of agency, as conceptualized differently across cultures, and multilayered systems of cultural meaning are discussed.

  8. Face processing regions are sensitive to distinct aspects of temporal sequence in facial dynamics.

    Science.gov (United States)

    Reinl, Maren; Bartels, Andreas

    2014-11-15

    Facial movement conveys important information for social interactions, yet its neural processing is poorly understood. Computational models propose that shape-sensitive and temporal-sequence-sensitive mechanisms interact in processing dynamic faces. While face processing regions are known to respond to facial movement, their sensitivity to particular temporal sequences has barely been studied. Here we used fMRI to examine the sensitivity of human face-processing regions to two aspects of directionality in facial movement trajectories. We presented genuine movie recordings of increasing and decreasing fear expressions, each played in natural or reversed frame order. This two-by-two factorial design matched low-level visual properties, static content, and motion energy within each factor: emotion direction (increasing or decreasing emotion) and timeline (natural versus artificial). The results showed sensitivity to emotion direction in the FFA, which was timeline-dependent as it occurred only within the natural frame order, and sensitivity to timeline in the STS, which was emotion-direction-dependent as it occurred only for decreasing fear. The occipital face area (OFA) was sensitive to the factor timeline. These findings reveal interacting temporal-sequence-sensitive mechanisms that respond both to ecological meaning and to the prototypical unfolding of facial dynamics. These mechanisms are temporally directional, provide socially relevant information regarding emotional state or naturalness of behavior, and agree with predictions from modeling and predictive coding theory. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  9. The Relationship of the Facial Nerve to the Condylar Process: A Cadaveric Study with Implications for Open Reduction Internal Fixation

    Directory of Open Access Journals (Sweden)

    H. P. Barham

    2015-01-01

    Introduction. The mandibular condyle is the most common site of mandibular fracture. Surgical treatment of condylar fractures by open reduction and internal fixation (ORIF) demands direct visualization of the fracture. This project aimed to investigate the anatomic relationship of the tragus to the facial nerve and condylar process. Materials and Methods. Twelve fresh hemicadaver heads were used. An extended retromandibular/preauricular approach was utilized, with the incision based parallel to the posterior edge of the ramus. Measurements were obtained from the tragus to the facial nerve and condylar process. Results. The temporozygomatic division of the facial nerve was encountered during each approach, crossing the mandible at the condylar neck. The mean tissue depth separating the facial nerve from the condylar neck was 5.5 mm (range: 3.5 mm–7 mm, SD 1.2 mm). The upper division of the facial nerve crossed the posterior border of the condylar process on average 2.31 cm (SD 0.10 cm) anterior to the tragus. Conclusions. This study suggests that the temporozygomatic division of the facial nerve will be encountered in most approaches to the condylar process. As visualization of the relationship of the facial nerve to the condyle is often limited, recognition that, on average, 5.5 mm of tissue separates the condylar process from the nerve should help reduce the incidence of facial nerve injury during this procedure.

  10. Facial infiltrative lipomatosis

    International Nuclear Information System (INIS)

    Haloi, A.K.; Ditchfield, M.; Pennington, A.; Philips, R.

    2006-01-01

    Although there are multiple case reports and small series concerning facial infiltrative lipomatosis, there is no composite radiological description of the condition. Our aim was the radiological evaluation of facial infiltrative lipomatosis using plain film, sonography, CT and MRI. We radiologically evaluated four patients with facial infiltrative lipomatosis. Initial plain radiographs of the face were acquired in all patients. Three children had an initial sonographic examination to evaluate the condition, followed by MRI; one child had CT and then MRI. One child had abnormalities on plain radiographs. Sonographically, the lesions appeared as ill-defined, heterogeneously hypoechoic areas with indistinct margins. On CT images, the lesions did not have homogeneous fat density but showed some relatively denser areas in their deeper parts. MRI provided better delineation of the exact extent of the process and characterization of facial infiltrative lipomatosis. Facial infiltrative lipomatosis should be considered in the differential diagnosis of vascular or lymphatic malformation when a child presents with unilateral facial swelling. MRI is the most useful single imaging modality for evaluating the condition, as it provides the best delineation of the exact extent of the process. (orig.)

  11. Speed and accuracy of facial expression classification in avoidant personality disorder: a preliminary study.

    Science.gov (United States)

    Rosenthal, M Zachary; Kim, Kwanguk; Herr, Nathaniel R; Smoski, Moria J; Cheavens, Jennifer S; Lynch, Thomas R; Kosson, David S

    2011-10-01

    The aim of this preliminary study was to examine whether individuals with avoidant personality disorder (APD) could be characterized by deficits in the classification of dynamically presented facial emotional expressions. Using a community sample of adults with APD (n = 17) and non-APD controls (n = 16), speed and accuracy of facial emotional expression recognition were investigated in a task that morphs facial expressions from neutral to prototypical expressions (Multi-Morph Facial Affect Recognition Task; Blair, Colledge, Murray, & Mitchell, 2001). Results indicated that individuals with APD were significantly more likely than controls to make errors when classifying fully expressed fear. However, no differences were found between groups in the speed of correctly classifying facial emotional expressions. The findings are among the first on facial emotion processing in a sample of individuals with APD and point to an underlying deficit in processing social cues that may be involved in the maintenance of APD.
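
    The morphing manipulation behind tasks of this kind can be illustrated with a simple pixel-wise linear blend between a neutral face and a full-intensity expression. This is only a sketch of the general idea: the published Multi-Morph task uses professionally produced morph sequences, and the function name and pure-pixel interpolation here are illustrative assumptions.

```python
import numpy as np

def morph_sequence(neutral, full_expression, n_steps=20):
    """Blend from a neutral face (0% intensity) to the prototypical
    expression (100%) in equal pixel-interpolation steps."""
    alphas = np.linspace(0.0, 1.0, n_steps)
    return [(1.0 - a) * neutral + a * full_expression for a in alphas]
```

    Each successive frame carries more expression information, so the frame index at which a participant first classifies the emotion correctly yields a sensitivity measure alongside overall accuracy.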

  12. [Prosopagnosia and facial expression recognition].

    Science.gov (United States)

    Koyama, Shinichi

    2014-04-01

    This paper reviews clinical neuropsychological studies indicating that recognition of a person's identity and recognition of facial expressions are processed by different cortical and subcortical areas of the brain. The fusiform gyrus, especially the right fusiform gyrus, plays an important role in identity recognition. The superior temporal sulcus, amygdala, and medial frontal cortex play important roles in facial-expression recognition. Both are complex cognitive processes that involve several regions of the brain.

  13. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research

    Directory of Open Access Journals (Sweden)

    Xu Chen

    2017-04-01

    Full Text Available Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age and divided them into two groups: attachment anxious and attachment secure. All participants were primed under two conditions: secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants’ reaction times to the facial stimuli were also measured. Attachment security priming can facilitate an individual’s processing of positive emotional faces; for instance, the presentation of the partner’s name was associated with stronger activity in a wide range of brain regions and faster reaction times for positive facial expressions. The current finding of higher activity in left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of secure attachment priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and the early-stage information processing system (attention), given the increased attention toward the effects of the secure base schema on the processing of emotion- and attachment-related information among insecure populations. Thus, the following study has…

  14. Operant conditioning of facial displays of pain.

    Science.gov (United States)

    Kunz, Miriam; Rainville, Pierre; Lautenbacher, Stefan

    2011-06-01

    The operant model of chronic pain posits that nonverbal pain behavior, such as facial expressions, is sensitive to reinforcement, but experimental evidence supporting this assumption is sparse. The aim of the present study was to investigate in a healthy population a) whether facial pain behavior can indeed be operantly conditioned using a discriminative reinforcement schedule to increase and decrease facial pain behavior and b) to what extent these changes affect pain experience indexed by self-ratings. In the experimental group (n = 29), the participants were reinforced every time they showed pain-indicative facial behavior (up-conditioning) or a neutral expression (down-conditioning) in response to painful heat stimulation. Once facial pain behavior was successfully up- or down-conditioned, respectively (which occurred in 72% of participants), facial pain displays and self-report ratings were assessed. In addition, a control group (n = 11) was used that was yoked to the reinforcement plans of the experimental group. During the conditioning phases, reinforcement led to significant changes in facial pain behavior in the majority of the experimental group (p .136). Fine-grained analyses of facial muscle movements revealed a similar picture. Furthermore, the decline in facial pain displays (as observed during down-conditioning) strongly predicted changes in pain ratings (R² = 0.329). These results suggest a) that facial pain displays are sensitive to reinforcement and b) that changes in facial pain displays can affect self-report ratings.
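
    The reported predictive strength (R² = 0.329) is the share of variance in pain-rating change explained by the decline in facial displays. Assuming a simple linear regression (the exact model used is not stated in the abstract), the statistic can be computed as:

```python
import numpy as np

def r_squared(x, y):
    """R^2 of a simple linear regression of y on x: the proportion
    of variance in y explained by the least-squares fit."""
    slope, intercept = np.polyfit(x, y, 1)
    residual = y - (slope * x + intercept)
    return 1.0 - np.sum(residual**2) / np.sum((y - np.mean(y))**2)
```

    An R² of 0.329 would mean roughly a third of the variability in rating change tracks the conditioned change in facial behavior.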

  15. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    Science.gov (United States)

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.

  16. Basic abnormalities in visual processing affect face processing at an early age in autism spectrum disorder.

    Science.gov (United States)

    Vlamings, Petra Hendrika Johanna Maria; Jonkman, Lisa Marthe; van Daalen, Emma; van der Gaag, Rutger Jan; Kemner, Chantal

    2010-12-15

    A detailed visual processing style has been noted in autism spectrum disorder (ASD); this contributes to problems in face processing and has been directly related to abnormal processing of spatial frequencies (SFs). Little is known about the early development of face processing in ASD and the relation with abnormal SF processing. We investigated whether young ASD children show abnormalities in low spatial frequency (LSF, global) and high spatial frequency (HSF, detailed) processing and explored whether these are crucially involved in the early development of face processing. Three- to 4-year-old children with ASD (n = 22) were compared with developmentally delayed children without ASD (n = 17). Spatial frequency processing was studied by recording visual evoked potentials from visual brain areas while children passively viewed gratings (HSF/LSF). In addition, children watched face stimuli with different expressions, filtered to include only HSF or LSF. Enhanced activity in visual brain areas was found in response to HSF versus LSF information in children with ASD, in contrast to control subjects. Furthermore, facial-expression processing was also primarily driven by detail in ASD. Enhanced visual processing of detailed (HSF) information is present early in ASD and occurs for neutral (gratings), as well as for socially relevant stimuli (facial expressions). These data indicate that there is a general abnormality in visual SF processing in early ASD and are in agreement with suggestions that a fast LSF subcortical face processing route might be affected in ASD. This could suggest that abnormal visual processing is causative in the development of social problems in ASD. Copyright © 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  17. Mutual regulation between infant facial affect and maternal touch in depressed and nondepressed dyads

    DEFF Research Database (Denmark)

    Egmose, Ida; Cordes, Katharina; Smith-Nielsen, Johanne

    2017-01-01

    Research suggests that touch is an important means through which parents regulate their infants’ affects. Also, previous research has shown that post-partum depressed (PPD) mothers and nonclinical mothers differ in their touching behaviors when interacting with their infants. We examined the affect-regulating function of affectionate, caregiving and playful maternal touch in 24 PPD and 47 nonclinical mother-infant dyads when infants were four months old. In order to investigate the direction of effects and to account for repeated observations, the data were analysed using time-window sequential analysis. Only in the PPD dyads were the mothers more likely to initiate affectionate touch when their infants were displaying negative facial affect. Our results also showed that mothers use specific touch types to regulate infants’ negative and positive affects; infants are more likely to initiate positive…

  18. Facial trauma among victims of terrestrial transport accidents.

    Science.gov (United States)

    d'Avila, Sérgio; Barbosa, Kevan Guilherme Nóbrega; Bernardino, Ítalo de Macedo; da Nóbrega, Lorena Marques; Bento, Patrícia Meira; E Ferreira, Efigênia Ferreira

    2016-01-01

    In developing countries, terrestrial transport accidents (TTA), especially those involving automobiles and motorcycles, are a major cause of facial trauma, surpassing urban violence. This cross-sectional census study sought to determine the occurrence of facial trauma of terrestrial transport accident etiology, involving cars, motorcycles, or pedestrians, in the northeastern region of Brazil, and to examine victims' socio-demographic characteristics. Morbidity data from forensic service reports of victims who sought care from January to December 2012 were analyzed. Altogether, 2379 reports were evaluated, of which 673 were related to terrestrial transport accidents and 103 involved facial trauma. Three previously trained and calibrated researchers collected data using a specific form. The facial trauma occurrence rate was 15.3% (n=103). The most affected age group was 20-29 years (48.3%), and more men than women were affected (2.81:1). Motorcycles were involved in the majority of accidents resulting in facial trauma (66.3%). Facial trauma in terrestrial transport accident victims tends to affect mostly young male subjects, and the most prevalent accidents involve motorcycles. Copyright © 2015 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  19. Facial emotion linked cooperation in patients with paranoid schizophrenia: a test on the Interpersonal Communication Model.

    Science.gov (United States)

    Tse, Wai S; Yan Lu; Bond, Alyson J; Chan, Raymond Ck; Tam, Danny W H

    2011-09-01

    Patients with schizophrenia consistently show deficits in facial affect perception and social behaviours, but it remains unproven that the perceptual deficits cause the poor social behaviours. The present research aimed to study how facial affects influence the ingratiation, cooperation and punishment behaviours of these patients. Forty outpatients with paranoid schizophrenia, 26 matched depressed patients and 46 healthy volunteers were recruited. After measurement of clinical symptoms and depression, facial emotion recognition, neurocognitive functioning and facial-affect-dependent cooperative behaviour were measured, the latter using a modified version of the Mixed-Motive Game. The depressed control group showed demographic characteristics, depression levels and neurocognitive functioning similar to the schizophrenic group. Patients with schizophrenia committed significantly more errors in neutral face identification than the other two groups. They were significantly more punitive on the Mixed-Motive Game in the neutral face condition. Neutral face misidentification was a unique emotion-processing deficit in the schizophrenic group. Their increase in punitive behaviours in the neutral face condition might confuse their family members and trigger more expressed emotion from them, thus increasing the risk of relapse. Family members might display more happy faces to promote positive relationships with patients.

  20. Looking with different eyes: The psychological meaning of categorisation goals moderates facial reactivity to facial expressions

    NARCIS (Netherlands)

    van Dillen, L.F.; Harris, L.T.; van Dijk, W.W.; Rotteveel, M.

    2015-01-01

    In the present research we examined whether the psychological meaning of people's categorisation goals affects facial muscle activity in response to facial expressions of emotion. We had participants associate eye colour (blue, brown) with either a personality trait (extraversion) or a physical…

  1. Considering sex differences clarifies the effects of depression on facial emotion processing during fMRI.

    Science.gov (United States)

    Jenkins, L M; Kendall, A D; Kassel, M T; Patrón, V G; Gowins, J R; Dion, C; Shankman, S A; Weisenbach, S L; Maki, P; Langenecker, S A

    2018-01-01

    Sex differences in emotion processing may play a role in women's increased risk for Major Depressive Disorder (MDD). However, studies of sex differences in brain mechanisms involved in emotion processing in MDD (or of interactions of sex and diagnosis) are sparse. We conducted an event-related fMRI study examining the interactive and distinct effects of sex and MDD on neural activity during a facial emotion perception task. To minimize effects of current affective state and cumulative disease burden, we studied participants with remitted MDD (rMDD) who were early in the course of the illness. In total, 88 individuals aged 18-23 participated, including 48 with rMDD (32 female) and 40 healthy controls (HC; 25 female). fMRI revealed an interaction between sex and diagnosis for sad and neutral facial expressions in the superior frontal gyrus and left middle temporal gyrus. Results also revealed an interaction of sex with diagnosis in the amygdala. Data were collected at two sites, which might increase variability, but this also increases power to examine sex-by-diagnosis interactions. This study demonstrates the importance of taking sex differences into account when examining potential trait (or scar) mechanisms that could be useful in identifying individuals at risk for MDD as well as for evaluating potential therapeutic innovations. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Enhancing facial features by using clear facial features

    Science.gov (United States)

    Rofoo, Fanar Fareed Hanna

    2017-09-01

    The similarity of features between individuals of the same ethnicity motivated this project. The idea is to extract features from a clear facial image and impose them on a blurred facial image of the same ethnic origin as an approach to enhancing the blurred image. A database of clear images was assembled containing 30 individuals equally divided into five ethnicities: Arab, African, Chinese, European and Indian. Software was built to pre-process the images in order to align the features of the clear and blurred images. Features were extracted from a clear facial image, or from a template built from several clear facial images, using the wavelet transform, and were imposed on the blurred image using the inverse wavelet transform. The results of this approach were poor because the features did not align well; in most cases the eyes were aligned but the nose or mouth were not. In the next approach we therefore dealt with each feature separately, but in some cases a blocky effect appeared on features owing to the lack of closely matching features. In general, the small database hindered the goal because of the limited number of available individuals. Color information and feature similarity could be investigated further, and the enhancement could be improved with a larger database providing closer matches within each ethnicity.
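
    The extract-and-impose step can be sketched with a one-level 2-D Haar transform in plain NumPy: keep the blurred image's coarse (approximation) band, substitute the clear image's detail bands, and invert. This is a minimal sketch of the general idea under that assumption, not the authors' implementation, which used a full wavelet toolchain on aligned face images.

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar decomposition into approximation + detail bands."""
    lo = (img[0::2, :] + img[1::2, :]) / 2.0   # row-wise averages
    hi = (img[0::2, :] - img[1::2, :]) / 2.0   # row-wise differences
    ll = (lo[:, 0::2] + lo[:, 1::2]) / 2.0     # approximation (coarse structure)
    lh = (lo[:, 0::2] - lo[:, 1::2]) / 2.0     # horizontal detail
    hl = (hi[:, 0::2] + hi[:, 1::2]) / 2.0     # vertical detail
    hh = (hi[:, 0::2] - hi[:, 1::2]) / 2.0     # diagonal detail
    return ll, (lh, hl, hh)

def ihaar2d(ll, bands):
    """Invert haar2d exactly (a = mean, d = half-difference => x0 = a+d, x1 = a-d)."""
    lh, hl, hh = bands
    lo = np.empty((ll.shape[0], ll.shape[1] * 2))
    hi = np.empty_like(lo)
    lo[:, 0::2], lo[:, 1::2] = ll + lh, ll - lh
    hi[:, 0::2], hi[:, 1::2] = hl + hh, hl - hh
    img = np.empty((lo.shape[0] * 2, lo.shape[1]))
    img[0::2, :], img[1::2, :] = lo + hi, lo - hi
    return img

def impose_detail(blurred, clear):
    """Keep the blurred image's coarse band, borrow the clear image's detail bands."""
    ll_blurred, _ = haar2d(blurred)
    _, detail_clear = haar2d(clear)
    return ihaar2d(ll_blurred, detail_clear)
```

    With identical inputs the round trip is exact; with a clear/blurred pair the output inherits the blurred image's low frequencies and the clear image's edges, which is exactly where the misalignment and blocky artifacts described above arise.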

  3. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    Science.gov (United States)

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.
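
    A passive oddball stream of the kind used here interleaves frequent standards with rare deviants (e.g., sad faces among happy ones); a common design constraint is that deviants never occur back to back, so each deviant breaks an established regularity. A minimal sketch, where the deviant probability and labels are illustrative rather than the study's actual parameters:

```python
import random

def oddball_sequence(n_trials, p_deviant=0.1, standard="happy", deviant="sad", seed=1):
    """Build a standard/deviant trial list with no consecutive deviants,
    so every deviant interrupts a run of standards (eliciting the vMMN)."""
    rng = random.Random(seed)
    trials, previous_was_deviant = [], False
    for _ in range(n_trials):
        is_deviant = (not previous_was_deviant) and rng.random() < p_deviant
        trials.append(deviant if is_deviant else standard)
        previous_was_deviant = is_deviant
    return trials
```

    Swapping the roles of the two expressions across blocks, as oddball studies typically do, separates responses to the emotion itself from responses to its rarity.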

  4. MR findings of facial nerve on oblique sagittal MRI using TMJ surface coil: normal vs peripheral facial nerve palsy

    International Nuclear Information System (INIS)

    Park, Yong Ok; Lee, Myeong Jun; Lee, Chang Joon; Yoo, Jeong Hyun

    2000-01-01

    To evaluate the findings of the normal facial nerve, as seen on oblique sagittal MRI using a TMJ (temporomandibular joint) surface coil, and to evaluate abnormal findings in peripheral facial nerve palsy. We retrospectively reviewed the MR findings of 20 patients with peripheral facial palsy and 50 normal facial nerves of 36 patients without facial palsy. All underwent oblique sagittal MRI using a TMJ surface coil. We analyzed the course, signal intensity, thickness, location, and degree of enhancement of the facial nerve. According to the angle made by the proximal parotid segment on the axis of the mastoid segment, the course was classified as anterior angulation (obtuse, acute, or buckling), straight, or posterior angulation. Among the 50 normal facial nerves, 24 (48%) were straight and 23 (46%) demonstrated anterior angulation; 34 (68%) showed iso signal intensity on T1WI. In the patient group, the course on the affected side was either straight (40%) or anteriorly angulated (55%), and signal intensity in 80% of cases was isointense. These findings were similar to those in the normal group, but in patients with post-traumatic or post-operative facial palsy, buckling of the course appeared. In 12 of 18 facial palsy cases (66.6%) in which contrast material was administered, the normal facial nerve of the opposite facial canal showed mild enhancement of more than one segment, but on the affected side the facial nerve showed diffuse enhancement in all 14 patients with acute facial palsy. Eleven of these (79%) showed fair or marked enhancement of more than one segment, and 12 (86%) showed mild enhancement of the proximal parotid segment. Four of six chronic facial palsy cases (66.6%) showed atrophy of the facial nerve. When oblique sagittal MR images are obtained using a TMJ surface coil, enhancement of the proximal parotid segment of the facial nerve and fair or marked enhancement of at least one segment within the facial canal always suggests pathology of…

  5. How do we update faces? Effects of gaze direction and facial expressions on working memory updating

    Directory of Open Access Journals (Sweden)

    Caterina eArtuso

    2012-09-01

    Full Text Available The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  6. How do we update faces? Effects of gaze direction and facial expressions on working memory updating.

    Science.gov (United States)

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  7. Development of the Korean Facial Emotion Stimuli: Korea University Facial Expression Collection 2nd Edition

    Directory of Open Access Journals (Sweden)

    Sun-Min Kim

    2017-05-01

    Full Text Available Background: Developing valid emotional facial stimuli for specific ethnicities creates ample opportunities to investigate both the nature of emotional facial information processing in general and clinical populations as well as the underlying mechanisms of facial emotion processing within and across cultures. Given that most entries in emotional facial stimuli databases were developed with western samples, and given that very few of the eastern emotional facial stimuli sets were based strictly on Ekman’s Facial Action Coding System, developing valid emotional facial stimuli of eastern samples remains a high priority. Aims: To develop and examine the psychometric properties of six basic emotional facial stimuli recruiting professional Korean actors and actresses based on Ekman’s Facial Action Coding System for the Korea University Facial Expression Collection-Second Edition (KUFEC-II). Materials and Methods: Stimulus selection was done in two phases. First, researchers evaluated the clarity and intensity of each stimulus developed based on the Facial Action Coding System. Second, researchers selected a total of 399 stimuli from a total of 57 actors and actresses, which were then rated on accuracy, intensity, valence, and arousal by 75 independent raters. Conclusion: The hit rates between the targeted and rated expressions of the KUFEC-II were all above 80%, except for fear (50%) and disgust (63%). The KUFEC-II appears to be a valid emotional facial stimuli database, providing the largest set of emotional facial stimuli. The mean intensity score was 5.63 (out of 7), suggesting that the stimuli delivered the targeted emotions with great intensity. All positive expressions were rated as having a high positive valence, whereas all negative expressions were rated as having a high negative valence. The KUFEC-II is expected to be widely used in various psychological studies on emotional facial expression. KUFEC-II stimuli can be obtained through…
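
    The hit rate reported for each expression is simply the share of raters whose chosen label matches the intended one. A sketch of the computation (the function name and example data are illustrative, not taken from the KUFEC-II validation):

```python
def hit_rate(target, ratings):
    """Fraction of raters whose chosen emotion label matches the target."""
    return sum(label == target for label in ratings) / len(ratings)
```

    For example, if 5 of 10 raters label a fear stimulus "fear", the hit rate is 0.5, the kind of value reported for fear above.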

  8. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  9. Subliminal and supraliminal processing of facial expression of emotions: brain oscillation in the left/right frontal area.

    Science.gov (United States)

    Balconi, Michela; Ferrari, Chiara

    2012-03-26

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, yet it remains questionable whether it is possible to assign a specific function to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotions (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate the alpha power. Moreover, there was increased right frontal activity for negative emotions vs. increased left-sided responses for positive emotions. The significance of facial expressions was adduced to elucidate the differing cortical responses to emotional types.

  10. Subliminal and Supraliminal Processing of Facial Expression of Emotions: Brain Oscillation in the Left/Right Frontal Area

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-03-01

    Full Text Available The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, yet it remains questionable whether it is possible to assign a specific function to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotions (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate the alpha power. Moreover, there was increased right frontal activity for negative emotions vs. increased left-sided responses for positive emotions. The significance of facial expressions was adduced to elucidate the differing cortical responses to emotional types.

  11. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    Science.gov (United States)

    Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo

    2013-01-01

    Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426

  12. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    Directory of Open Access Journals (Sweden)

    Yoshi-Taka eMatsuda

    2013-09-01

    Full Text Available Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging (fMRI) to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner.

  13. Genetics Home Reference: oral-facial-digital syndrome

    Science.gov (United States)

    ... related conditions that affect the development of the oral cavity (the mouth and teeth), facial features, and digits ( ... this disorder involve problems with development of the oral cavity , facial features, and digits. Most forms are also ...

  14. Discrimination of gender using facial image with expression change

    Science.gov (United States)

    Kuniyada, Jun; Fukuda, Takahiro; Terada, Kenji

    2005-12-01

    By carrying out marketing research, the managers of large department stores and small convenience stores obtain information such as the ratio of male to female visitors and their age groups, and use it to improve their management plans. However, this work is carried out manually, which is a heavy burden on small stores. In this paper, the authors propose a method of discriminating between men and women by extracting differences in facial expression change from color facial images. Many methods for the automatic recognition of individuals from moving or still facial images already exist in the field of image processing. However, it is very difficult to discriminate gender under the influence of hairstyle, clothing, and the like. We therefore propose a method that is not affected by individual characteristics, such as the size and position of facial parts, by paying attention to changes of expression. The method requires two facial images: one with an expression and one expressionless. First, the facial surface region and the regions of facial parts such as the eyes, nose, and mouth are extracted from the facial image using hue and saturation information in the HSV color system and emphasized edge information. Next, features are extracted by calculating the rate of change of each facial part caused by the expression change. In the last step, these feature values are compared between the input data and a database, and the gender is discriminated. Experiments were conducted on laughing and smiling expressions, and good results were obtained for gender discrimination.
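The last two steps of the pipeline described above (change-rate features, then database comparison) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the part measurements (which their method obtains via HSV skin segmentation and edge detection) are assumed to be available, and all names and numbers are made up.

```python
# Sketch: gender discrimination from expression-change rates of facial parts,
# compared by nearest neighbour against a labelled database. All measurements
# and database entries below are illustrative stand-ins.
import numpy as np

def change_rate_features(neutral, expressive):
    """Each input: dict of part name -> (width, height) in pixels.
    Change rates are scale-free, so face size and position cancel out."""
    feats = []
    for part in sorted(neutral):
        w0, h0 = neutral[part]
        w1, h1 = expressive[part]
        feats += [(w1 - w0) / w0, (h1 - h0) / h0]
    return np.array(feats)

def classify_gender(features, database):
    """database: list of (feature_vector, 'M' or 'F') entries."""
    dists = [np.linalg.norm(features - f) for f, _ in database]
    return database[int(np.argmin(dists))][1]

neutral = {"eye": (30, 12), "mouth": (50, 10)}
smiling = {"eye": (30, 9), "mouth": (62, 14)}
db = [(np.array([0.0, -0.3, 0.20, 0.45]), "F"),
      (np.array([0.0, -0.1, 0.10, 0.15]), "M")]
print(classify_gender(change_rate_features(neutral, smiling), db))  # -> F
```

Using ratios rather than absolute pixel sizes is what makes the features insensitive to the size and position of facial parts, as the abstract emphasizes.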

  15. Perception of global facial geometry is modulated through experience

    Directory of Open Access Journals (Sweden)

    Meike Ramon

    2015-03-01

    Full Text Available Identification of personally familiar faces is highly efficient across various viewing conditions. While the presence of robust facial representations stored in memory is considered to aid this process, the mechanisms underlying invariant identification remain unclear. Two experiments tested the hypothesis that facial representations stored in memory are associated with differential perceptual processing of the overall facial geometry. Subjects who were personally familiar or unfamiliar with the identities presented discriminated between stimuli whose overall facial geometry had been manipulated to maintain or alter the original facial configuration (see Barton, Zhao & Keenan, 2003). The results demonstrate that familiarity gives rise to more efficient processing of global facial geometry, and are interpreted in terms of increased holistic processing of facial information that is maintained across viewing distances.

  16. An fMRI study of facial emotion processing in patients with schizophrenia.

    Science.gov (United States)

    Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C

    2002-12-01

    Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.

  17. Endogenous testosterone levels are associated with neural activity in men with schizophrenia during facial emotion processing.

    Science.gov (United States)

    Ji, Ellen; Weickert, Cynthia Shannon; Lenroot, Rhoshel; Catts, Stanley V; Vercammen, Ans; White, Christopher; Gur, Raquel E; Weickert, Thomas W

    2015-06-01

    Growing evidence suggests that testosterone may play a role in the pathophysiology of schizophrenia given that testosterone has been linked to cognition and negative symptoms in schizophrenia. Here, we determine the extent to which serum testosterone levels are related to neural activity in affective processing circuitry in men with schizophrenia. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as 32 healthy controls and 26 people with schizophrenia performed a facial emotion identification task. Whole brain analyses were performed to determine regions of differential activity between groups during processing of angry versus non-threatening faces. A follow-up ROI analysis using a regression model in a subset of 16 healthy men and 16 men with schizophrenia was used to determine the extent to which serum testosterone levels were related to neural activity. Healthy controls displayed significantly greater activation than people with schizophrenia in the left inferior frontal gyrus (IFG). There was no significant difference in circulating testosterone levels between healthy men and men with schizophrenia. Regression analyses between activation in the IFG and circulating testosterone levels revealed a significant positive correlation in men with schizophrenia (r=.63, p=.01) and no significant relationship in healthy men. This study provides the first evidence that circulating serum testosterone levels are related to IFG activation during emotion face processing in men with schizophrenia but not in healthy men, which suggests that testosterone levels modulate neural processes relevant to facial emotion processing that may interfere with social functioning in men with schizophrenia. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  18. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-04-01

    Full Text Available Neuropsychological studies have underlined the presence of distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear to be specific to emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among others, recent studies have examined the contribution of the two hemispheres to face comprehension as a function of emotion type (mainly the positive vs. negative distinction) and of task (comprehending vs. producing facial expressions). Specifically, an overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how it can become a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces when conscious vs. unconscious processing is activated.

  19. Corrugator Activity Confirms Immediate Negative Affect in Surprise

    Directory of Open Access Journals (Sweden)

    Sascha eTopolinski

    2015-02-01

    Full Text Available The emotion of surprise entails a complex of immediate responses, such as cognitive interruption, attention allocation to, and more systematic processing of the surprising stimulus. All these processes serve the ultimate function of increasing processing depth and thus cognitively mastering the surprising stimulus. The present account introduces phasic negative affect as the underlying mechanism responsible for these consequences. Surprising stimuli are schema-discrepant and thus entail cognitive disfluency, which elicits immediate negative affect. This affect in turn works like a phasic cognitive tuning, switching the current processing mode from more automatic and heuristic to more systematic and reflective processing. Directly testing the initial elicitation of negative affect by surprising events, the present experiment presented high- and low-surprise neutral trivia statements to N = 28 participants while assessing their spontaneous facial expressions via facial electromyography. High-surprise trivia elicited higher corrugator activity than low-surprise trivia, indicative of negative affect and mental effort, while leaving zygomaticus (positive affect) and frontalis (cultural surprise expression) activity unaffected. Future research shall investigate the mediating role of negative affect in eliciting surprise-related outcomes.

  20. The influence of context on distinct facial expressions of disgust.

    Science.gov (United States)

    Reschke, Peter J; Walle, Eric A; Knothe, Jennifer M; Lopez, Lukas D

    2018-06-11

    Face perception is susceptible to contextual influence and perceived physical similarities between emotion cues. However, studies often use structurally homogeneous facial expressions, making it difficult to explore how within-emotion variability in facial configuration affects emotion perception. This study examined the influence of context on the emotional perception of categorically identical, yet physically distinct, facial expressions of disgust. Participants categorized two perceptually distinct disgust facial expressions, "closed" (i.e., scrunched nose, closed mouth) and "open" (i.e., scrunched nose, open mouth, protruding tongue), that were embedded in contexts comprising emotion postures and scenes. Results demonstrated that the effect of nonfacial elements was significantly stronger for "open" disgust facial expressions than "closed" disgust facial expressions. These findings provide support that physical similarity within discrete categories of facial expressions is mutable and plays an important role in affective face perception. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  1. Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.

    Science.gov (United States)

    Nummenmaa, Lauri; Calvo, Manuel G

    2015-04-01

    Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than happy faces. We used meta-analytic techniques for resolving this categorization versus detection advantage discrepancy for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random effects meta-analysis was conducted to estimate effect sizes at population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. Robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing. (c) 2015 APA, all rights reserved.
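The pooling step this abstract describes (r-based effect sizes combined by random-effects meta-analysis) can be sketched as follows. The DerSimonian-Laird estimator of between-study variance used here is a common choice, but an assumption: the abstract does not name the exact estimator.

```python
# Sketch of random-effects meta-analysis of correlation effect sizes:
# Fisher r-to-z transform, DerSimonian-Laird between-study variance,
# inverse-variance pooling, back-transform to r.
import math

def fisher_z(r):
    """Fisher's r-to-z transform; its variance depends only on sample size."""
    return 0.5 * math.log((1 + r) / (1 - r))

def random_effects(rs, ns):
    zs = [fisher_z(r) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]               # variance of Fisher z
    w = [1.0 / v for v in vs]                      # fixed-effect weights
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)       # between-study variance
    w_star = [1.0 / (v + tau2) for v in vs]        # random-effects weights
    z_pooled = sum(wi * zi for wi, zi in zip(w_star, zs)) / sum(w_star)
    return math.tanh(z_pooled)                     # back-transform to r

print(random_effects([0.3, 0.5, 0.4], [100, 80, 120]))
```

With identical studies the pooled estimate reduces to the common r, which is a quick sanity check on the transform and weights.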

  2. Relations between emotions, display rules, social motives, and facial behaviour.

    Science.gov (United States)

    Zaalberg, Ruud; Manstead, Antony; Fischer, Agneta

    2004-02-01

    We report research on the relations between emotions, display rules, social motives, and facial behaviour. In Study 1 we used a questionnaire methodology to examine how respondents would react to a funny or a not funny joke told to them by a close friend or a stranger. We assessed display rules and motivations for smiling and/or laughing. Display rules and social motives (partly) mediated the relationship between the experimental manipulations and self-reported facial behaviour. Study 2 was a laboratory experiment in which funny or not funny jokes were told to participants by a male or female stranger. Consistent with hypotheses, hearing a funny joke evoked a stronger motivation to share positive affect by showing longer Duchenne smiling. Contrary to hypotheses, a not funny joke did not elicit greater prosocial motivation by showing longer "polite" smiling, although such a smiling pattern did occur. Rated funniness of the joke and the motivation to share positive affect mediated the relationship between the joke manipulation and facial behaviour. Path analysis was used to explore this mediating process in greater detail.

  3. Asians' Facial Responsiveness to Basic Tastes by Automated Facial Expression Analysis System.

    Science.gov (United States)

    Zhi, Ruicong; Cao, Lianyu; Cao, Gang

    2017-03-01

    Growing evidence shows that consumer choices in real life are mostly driven by unconscious rather than conscious mechanisms, and these unconscious processes can be captured by behavioral measurements. This study aims to apply automatic facial expression analysis to represent consumers' emotions, and to explore the relationships between sensory perception and facial responses. Basic taste solutions (sourness, sweetness, bitterness, umami, and saltiness) at six levels plus water were used, covering most of the tastes found in food and drink. A further contribution of this study is the analysis of the characteristics of facial expressions, and of the correlation between facial expressions and perceived hedonic liking, for Asian consumers: facial expression applications have so far been reported only for Western consumers, and few studies have investigated facial responses during food consumption in Asian consumers. Experimental results indicated that facial expressions could identify different stimuli with various concentrations and different hedonic levels. Perceived liking increased at lower concentrations and decreased at higher concentrations, and samples with medium concentrations were perceived as the most pleasant, except for sweetness and bitterness. High correlations were found between the perceived intensities of bitterness, umami, and saltiness and the facial reactions of disgust and fear. The facial expressions disgust and anger characterized the emotion "dislike," happiness characterized "like," and neutral represented "neither like nor dislike." The identified facial expressions agree with the perceived sensory emotions elicited by the basic taste solutions. The correlation between hedonic levels and facial expression intensities obtained in this study is in accordance with that reported for Western consumers. © 2017 Institute of Food Technologists®.

  4. Affective Priming in Major Depressive Disorder

    Directory of Open Access Journals (Sweden)

    Joelle eLeMoult

    2012-10-01

    Full Text Available Research on cognitive biases in depression has provided considerable evidence for the impact of emotion on cognition. Individuals with depression tend to preferentially process mood-congruent material and to show deficits in the processing of positive material leading to biases in attention, memory, and judgments. More research is needed, however, to fully understand which cognitive processes are affected. The current study further examines the impact of emotion on cognition using a priming design with facial expressions of emotion. Specifically, this study tested whether the presentation of facial expressions of emotion affects subsequent processing of affective material in participants with major depressive disorder (MDD) and healthy controls (CTL). Facial expressions displaying happy, sad, angry, disgusted, or neutral expressions were presented as primes for 500 ms, and participants’ speed to identify a subsequent target’s emotional expression was assessed. All participants displayed greater interference from emotional versus neutral primes, marked by slower response times to judge the emotion of the target face when it was preceded by an emotional prime. Importantly, the CTL group showed the strongest interference when happy emotional expressions served as primes whereas the MDD group failed to show this bias. These results add to a growing literature that shows that depression is associated with difficulties in the processing of positive material.

  5. Effects of task demands on the early neural processing of fearful and happy facial expressions.

    Science.gov (United States)

    Itier, Roxane J; Neath-Tavares, Karly N

    2017-05-15

    Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy, and neutral facial expressions presented during gender discrimination, explicit emotion discrimination, and oddball detection tasks, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced with a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting on the N170, from 150 to 350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Speech Signal and Facial Image Processing for Obstructive Sleep Apnea Assessment

    Directory of Open Access Journals (Sweden)

    Fernando Espinoza-Cuadros

    2015-01-01

    Full Text Available Obstructive sleep apnea (OSA) is a common sleep disorder characterized by recurring breathing pauses during sleep caused by a blockage of the upper airway (UA). OSA is generally diagnosed through a costly procedure requiring an overnight stay of the patient at the hospital. This has led to proposing less costly procedures based on the analysis of patients’ facial images and voice recordings to help in OSA detection and severity assessment. In this paper we investigate the use of both image and speech processing to estimate the apnea-hypopnea index, AHI (which describes the severity of the condition), over a population of 285 male Spanish subjects suspected to suffer from OSA and referred to a Sleep Disorders Unit. Photographs and voice recordings were collected in a supervised but not highly controlled way, trying to test a scenario close to an OSA assessment application running on a mobile device (i.e., smartphones or tablets). Spectral information in speech utterances is modeled by a state-of-the-art low-dimensional acoustic representation called an i-vector. A set of local craniofacial features related to OSA are extracted from images after detecting facial landmarks using Active Appearance Models (AAMs). Support vector regression (SVR) is applied to the facial features and i-vectors to estimate the AHI.
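The final regression stage described above might look like the following scikit-learn sketch. Feature extraction (i-vectors, AAM landmarks) is assumed to have been done upstream; the feature dimensions, SVR hyperparameters, and data below are synthetic stand-ins, not the authors' configuration.

```python
# Sketch: predicting the apnea-hypopnea index (AHI) from concatenated speech
# i-vectors and craniofacial features with support vector regression (SVR).
# X_speech, X_face, and ahi are synthetic stand-ins for illustration only.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 285                                    # subjects in the study
X_speech = rng.normal(size=(n, 100))       # i-vectors (dimension is an assumption)
X_face = rng.normal(size=(n, 12))          # local craniofacial measurements
X = np.hstack([X_speech, X_face])
ahi = rng.uniform(0, 60, size=n)           # AHI labels (events per hour)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
pred = cross_val_predict(model, X, ahi, cv=5)   # held-out predictions
print("MAE:", float(np.mean(np.abs(pred - ahi))))
```

Cross-validated prediction is the natural way to report AHI estimation error on a single cohort of this size; on random features, as here, the error simply reflects the label spread.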

  7. The effects of gender and COMT Val158Met polymorphism on fearful facial affect recognition: a fMRI study.

    Science.gov (United States)

    Kempton, Matthew J; Haldane, Morgan; Jogia, Jigar; Christodoulou, Tessa; Powell, John; Collier, David; Williams, Steven C R; Frangou, Sophia

    2009-04-01

    The functional catechol-O-methyltransferase (COMT Val108/158Met) polymorphism has been shown to have an impact on tasks of executive function, memory and attention and recently, tasks with an affective component. As oestrogen reduces COMT activity, we focused on the interaction between gender and COMT genotype on brain activations during an affective processing task. We used functional MRI (fMRI) to record brain activations from 74 healthy subjects who engaged in a facial affect recognition task; subjects viewed and identified fearful compared to neutral faces. There was no main effect of the COMT polymorphism, gender or genotype × gender interaction on task performance. We found a significant effect of gender on brain activations in the left amygdala and right temporal pole, where females demonstrated increased activations over males. Within these regions, Val/Val carriers showed greater signal magnitude compared to Met/Met carriers, particularly in females. The COMT Val108/158Met polymorphism impacts on gender-related patterns of activation in limbic and paralimbic regions but the functional significance of any oestrogen-related COMT inhibition appears modest.

  8. Effects of Orientation on Recognition of Facial Affect

    Science.gov (United States)

    Cohen, M. M.; Mealey, J. B.; Hargens, Alan R. (Technical Monitor)

    1997-01-01

    The ability to discriminate facial features is often degraded when the orientation of the face and/or the observer is altered. Previous studies have shown that gross distortions of facial features can go unrecognized when the image of the face is inverted, as exemplified by the 'Margaret Thatcher' effect. This study examines how quickly erect and supine observers can distinguish between smiling and frowning faces that are presented at various orientations. The effects of orientation are of particular interest in space, where astronauts frequently view one another in orientations other than the upright. Sixteen observers viewed individual facial images of six people on a computer screen; on a given trial, the image was either smiling or frowning. Each image was viewed when it was erect and when it was rotated (rolled) by 45 degrees, 90 degrees, 135 degrees, 180 degrees, 225 degrees and 270 degrees about the line of sight. The observers were required to respond as rapidly and accurately as possible to identify if the face presented was smiling or frowning. Measures of reaction time were obtained when the observers were both upright and supine. Analyses of variance revealed that mean reaction time, which increased with stimulus rotation (F=18.54, df 7/15, p < 0.001), was 22% longer when the faces were inverted than when they were erect, but that the orientation of the observer had no significant effect on reaction time (F=1.07, df 1/15, p > .30). These data strongly suggest that the orientation of the image of a face on the observer's retina, but not its orientation with respect to gravity, is important in identifying the expression on the face.

  9. Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.

    Science.gov (United States)

    Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus

    2013-12-01

    Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity to examine perceptual and cognitive processing of facial expressions. Higher order processing of emotional faces was addressed by varying the task and virtual face models systematically. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while viewing and evaluating either emotion or gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding for the motion-based intensity of facial expressions. The comparison of emotion with gender discrimination task revealed increased activation of inferior parietal lobule, which highlights the involvement of parietal areas in processing of high level features of faces. Dynamic emotional stimuli may help to emphasize functions of the hypothesized 'extended' over the 'core' system for face processing.

  10. Reconstruction of facial nerve injuries in children.

    Science.gov (United States)

    Fattah, Adel; Borschel, Gregory H; Zuker, Ron M

    2011-05-01

    Facial nerve trauma is uncommon in children, and many spontaneously recover some function; nonetheless, loss of facial nerve activity leads to functional impairment of ocular and oral sphincters and nasal orifice. In many cases, the impediment posed by facial asymmetry and reduced mimetic function more significantly affects the child's psychosocial interactions. As such, reconstruction of the facial nerve affords great benefits in quality of life. The therapeutic strategy is dependent on numerous factors, including the cause of facial nerve injury, the deficit, the prognosis for recovery, and the time elapsed since the injury. The options for treatment include a diverse range of surgical techniques including static lifts and slings, nerve repairs, nerve grafts and nerve transfers, regional, and microvascular free muscle transfer. We review our strategies for addressing facial nerve injuries in children.

  11. Facial nerve problems and Bell's palsy

    OpenAIRE

    Sala, DV; Venter, C; Valenas, O

    2015-01-01

    Bell's palsy is paralysis or weakness of the muscles on one side of the face, a form of temporary facial paralysis, probably caused by a viral infection of, or trauma to, one of the two facial nerves. Damage to the facial nerve innervating the muscles on one side of the face results in a flabby appearance in which the affected hemiface droops. Nerve damage can also affect the sense of taste and salivary and lacrimal secretion. This condition begins suddenly, often overnight, and usually gets better on its own within a few w...

  12. Cradling Side Preference Is Associated with Lateralized Processing of Baby Facial Expressions in Females

    Science.gov (United States)

    Huggenberger, Harriet J.; Suter, Susanne E.; Reijnen, Ester; Schachinger, Hartmut

    2009-01-01

    Women's cradling side preference has been related to contralateral hemispheric specialization of processing emotional signals; but not of processing baby's facial expression. Therefore, 46 nulliparous female volunteers were characterized as left or non-left holders (HG) during a doll holding task. During a signal detection task they were then…

  13. Quantitative facial asymmetry: using three-dimensional photogrammetry to measure baseline facial surface symmetry.

    Science.gov (United States)

    Taylor, Helena O; Morrison, Clinton S; Linden, Olivia; Phillips, Benjamin; Chang, Johnny; Byrne, Margaret E; Sullivan, Stephen R; Forrest, Christopher R

    2014-01-01

    subjectively, can be easily and reproducibly measured using three-dimensional photogrammetry. The RMSD for facial asymmetry of healthy volunteers clusters at approximately 0.80 ± 0.24 mm. Patients with facial asymmetry due to a pathologic process can be differentiated from normative facial asymmetry based on their RMSDs.
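The RMSD asymmetry measure in this record can be illustrated numerically: reflect the facial surface across the midsagittal plane and take the root-mean-square of the distances between the reflection and the original surface. The closest-point pairing below is an assumption about the comparison step, since the excerpt does not describe the registration details of the photogrammetry pipeline; the point sets are toy stand-ins.

```python
# Sketch: root-mean-square deviation (RMSD) between a 3D facial point set and
# its mirror image, as a facial asymmetry score. The nearest-neighbour pairing
# is an assumption; points are assumed centered on the midsagittal plane x = 0.
import numpy as np

def rmsd_asymmetry(points):
    """points: (n, 3) samples of the facial surface, x = lateral axis."""
    mirrored = points * np.array([-1.0, 1.0, 1.0])    # reflect across x = 0
    # Closest-point distance from each mirrored sample to the original surface.
    d2 = ((mirrored[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    nearest = np.sqrt(d2.min(axis=1))
    return float(np.sqrt(np.mean(nearest ** 2)))

sym = np.array([[1.0, 0, 0], [-1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]])
asym = sym + np.array([[0.5, 0, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0]])
print(rmsd_asymmetry(sym))    # 0.0 for a perfectly symmetric set
print(rmsd_asymmetry(asym))   # > 0 once one side deviates
```

On this scale, the healthy-volunteer baseline of roughly 0.80 ± 0.24 mm reported above would correspond to sub-millimetre closest-point deviations over the whole face.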

  14. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    Science.gov (United States)

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
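The benchmark against "a model assuming independent processing" in this record is commonly formalized as Miller's (1982) race-model inequality: under independent parallel processing, P(RT ≤ t | redundant target) ≤ P(RT ≤ t | target A) + P(RT ≤ t | target B) at every t, so violations indicate coactivation (integration) of the two cues. The sketch below tests the empirical inequality on simulated reaction times; the distributions are illustrative, not the study's data.

```python
# Sketch: testing Miller's (1982) race-model inequality on reaction times.
# A violation at any tested quantile suggests coactivation rather than an
# independent race between the two target dimensions.
import numpy as np

def race_model_violated(rt_red, rt_a, rt_b, probs=np.arange(0.05, 1.0, 0.05)):
    ts = np.quantile(rt_red, probs)                   # test points (ms)
    cdf = lambda rts, t: float(np.mean(rts <= t))
    return any(cdf(rt_red, t) > cdf(rt_a, t) + cdf(rt_b, t) for t in ts)

rng = np.random.default_rng(7)
rt_a = rng.normal(550, 60, 2000)                      # single-target RTs (ms)
rt_b = rng.normal(560, 60, 2000)
coactive = rng.normal(430, 40, 2000)                  # strongly superadditive

print(race_model_violated(coactive, rt_a, rt_b))      # expected True
print(race_model_violated(rt_a, rt_a, rt_b))          # expected False
```

The second call is a built-in sanity check: when the "redundant" distribution equals one of the single-target distributions, the inequality can never be violated.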

  15. Computer Aided Facial Prosthetics Manufacturing System

    Directory of Open Access Journals (Sweden)

    Peng H.K.

    2016-01-01

    Full Text Available Facial deformities can impose a considerable burden on patients. There are many solutions for facial deformities, such as plastic surgery and facial prosthetics. However, the current fabrication method for facial prosthetics is costly and time-consuming. This study aimed to identify a new method for constructing customized facial prosthetics. A 3D scanner, computer software and a 3D printer were used in this study. Results showed that the newly developed method can produce a customized facial prosthesis. The advantages of the developed method over the conventional process are lower cost and reduced material waste and pollution, in keeping with green manufacturing principles.

  16. Psychopathic traits affect the visual exploration of facial expressions.

    Science.gov (United States)

    Boll, Sabrina; Gamer, Matthias

    2016-05-01

    Deficits in emotional reactivity and recognition have been reported in psychopathy. Impaired attention to the eyes along with amygdala malfunctions may underlie these problems. Here, we investigated how different facets of psychopathy modulate the visual exploration of facial expressions by assessing personality traits in a sample of healthy young adults using an eye-tracking based face perception task. Fearless Dominance (the interpersonal-emotional facet of psychopathy) and Coldheartedness scores predicted reduced face exploration consistent with findings on lowered emotional reactivity in psychopathy. Moreover, participants high on the social deviance facet of psychopathy ('Self-Centered Impulsivity') showed a reduced bias to shift attention towards the eyes. Our data suggest that facets of psychopathy modulate face processing in healthy individuals and reveal possible attentional mechanisms which might be responsible for the severe impairments of social perception and behavior observed in psychopathy. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Towards emotion detection in educational scenarios from facial expressions and body movements through multimodal approaches.

    Science.gov (United States)

    Saneiro, Mar; Santos, Olga C; Salmeron-Majadas, Sergio; Boticario, Jesus G

    2014-01-01

    We report current findings on the use of video recordings of facial expressions and body movements to provide affective personalized support in an educational context, based on an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology for tagging facial expressions and body movements that correspond to changes in the affective states of learners while they deal with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources, such as qualitative, self-reported, physiological, and behavioral information. Together, these data are used to train data mining algorithms that automatically identify changes in the learners' affective states when dealing with cognitive tasks, which in turn helps to provide emotional personalized support.
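
    As a toy illustration of the final step the abstract describes (training a model on fused multimodal features to identify affective-state changes), the sketch below uses a simple nearest-centroid rule. The feature layout, labels, and data here are invented for the example and are not the authors' pipeline or dataset.

    ```python
    from math import dist

    def train_centroids(samples, labels):
        """Compute one centroid (component-wise mean) per label."""
        groups = {}
        for vec, lab in zip(samples, labels):
            groups.setdefault(lab, []).append(vec)
        return {lab: [sum(col) / len(col) for col in zip(*vecs)]
                for lab, vecs in groups.items()}

    def predict(centroids, vec):
        """Assign the label of the nearest centroid."""
        return min(centroids, key=lambda lab: dist(centroids[lab], vec))

    # Hypothetical fused features: [smile intensity, posture shift, arousal]
    X = [[0.9, 0.1, 0.2], [0.8, 0.2, 0.3],   # labelled "no_change"
         [0.1, 0.9, 0.8], [0.2, 0.8, 0.9]]   # labelled "change"
    y = ["no_change", "no_change", "change", "change"]

    model = train_centroids(X, y)
    ```

    Any classifier could stand in for the nearest-centroid rule; the point is only that annotations supply the labels and the fused multimodal measurements supply the feature vectors.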

  18. Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches

    Directory of Open Access Journals (Sweden)

    Mar Saneiro

    2014-01-01

    Full Text Available We report current findings on the use of video recordings of facial expressions and body movements to provide affective personalized support in an educational context, based on an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology for tagging facial expressions and body movements that correspond to changes in the affective states of learners while they deal with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources, such as qualitative, self-reported, physiological, and behavioral information. Together, these data are used to train data mining algorithms that automatically identify changes in the learners' affective states when dealing with cognitive tasks, which in turn helps to provide emotional personalized support.

  19. Gender effects in alcohol dependence: an fMRI pilot study examining affective processing.

    Science.gov (United States)

    Padula, Claudia B; Anthenelli, Robert M; Eliassen, James C; Nelson, Erik; Lisdahl, Krista M

    2015-02-01

    Alcohol dependence (AD) has global effects on brain structure and function, including frontolimbic regions regulating affective processing. Preliminary evidence suggests alcohol blunts limbic response to negative affective stimuli and increases activation to positive affective stimuli. Subtle gender differences are also evident during affective processing. Fourteen abstinent AD individuals (8 F, 6 M) and 14 healthy controls (9 F, 5 M), ages 23 to 60, were included in this facial affective processing functional magnetic resonance imaging pilot study. Whole-brain linear regression analyses were performed, and follow-up analyses examined whether AD status significantly predicted depressive symptoms and/or coping. In the fearful condition, the AD group demonstrated reduced activation in the right medial frontal gyrus compared with controls, and gender moderated the effects of AD in the bilateral inferior frontal gyri. In the happy condition, AD individuals had increased activation in the right thalamus, and gender moderated the effects of AD in the left caudate, right middle frontal gyrus, left paracentral lobule, and right lingual gyrus. Interactive AD and gender effects for fearful and happy faces were such that AD men activated more than control men, but AD women activated less than control women. Enhanced coping was associated with greater activation in the right medial frontal gyrus during the fearful condition in AD individuals. Abnormal affective processing in AD may be a marker of alcoholism risk or a consequence of chronic alcoholism. Subtle gender differences were observed, and gender moderated the effects of AD on neural substrates of affective processing. AD individuals with enhanced coping had brain activation patterns more similar to controls. Results help elucidate the effects of alcohol, gender, and their interaction on affective processing. Copyright © 2015 by the Research Society on Alcoholism.

  20. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    Science.gov (United States)

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.
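
    The resolution manipulation described for experiment 1 can be mimicked with simple block averaging, reducing an image toward a coarse grid such as 48 × 64 pixels. The sketch below is illustrative only (toy grayscale values, integer downsampling factor) and is not the stimulus-generation code used in the study.

    ```python
    def downsample(image, factor):
        """Reduce a grayscale image (list of equal-length rows) by
        averaging non-overlapping factor x factor pixel blocks."""
        height, width = len(image), len(image[0])
        return [[sum(image[r + i][c + j]
                     for i in range(factor) for j in range(factor)) / factor ** 2
                 for c in range(0, width, factor)]
                for r in range(0, height, factor)]

    # A 4x4 test pattern reduced to 2x2
    img = [[0, 0, 100, 100],
           [0, 0, 100, 100],
           [50, 50, 200, 200],
           [50, 50, 200, 200]]
    small = downsample(img, 2)
    ```

    Each output pixel is the mean of one block, which discards high-spatial-frequency detail in the same spirit as the resolution reduction the abstract describes.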

  1. Contributions of feature shapes and surface cues to the recognition of facial expressions.

    Science.gov (United States)

    Sormaz, Mladen; Young, Andrew W; Andrews, Timothy J

    2016-10-01

    Theoretical accounts of face processing often emphasise feature shapes as the primary visual cue to the recognition of facial expressions. However, changes in facial expression also affect the surface properties of the face. In this study, we investigated whether this surface information can also be used in the recognition of facial expression. First, participants identified facial expressions (fear, anger, disgust, sadness, happiness) from images that were manipulated such that they varied mainly in shape or mainly in surface properties. We found that the categorization of facial expression is possible in either type of image, but that different expressions are relatively dependent on surface or shape properties. Next, we investigated the relative contributions of shape and surface information to the categorization of facial expressions. This employed a complementary method that involved combining the surface properties of one expression with the shape properties from a different expression. Our results showed that the categorization of facial expressions in these hybrid images was equally dependent on the surface and shape properties of the image. Together, these findings provide a direct demonstration that both feature shape and surface information make significant contributions to the recognition of facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Cognitive behavioural therapy attenuates the enhanced early facial stimuli processing in social anxiety disorders: an ERP investigation.

    Science.gov (United States)

    Cao, Jianqin; Liu, Quanying; Li, Yang; Yang, Jun; Gu, Ruolei; Liang, Jin; Qi, Yanyan; Wu, Haiyan; Liu, Xun

    2017-07-28

    Previous studies of patients with social anxiety have demonstrated abnormal early processing of facial stimuli in social contexts. In other words, patients with social anxiety disorder (SAD) tend to exhibit enhanced early facial processing when compared to healthy controls. Few studies have examined the temporal electrophysiological event-related potential (ERP)-indexed profiles when individuals with SAD compare faces to objects. Systematic comparisons of ERPs to facial/object stimuli before and after therapy are also lacking. We used a passive visual detection paradigm with upright and inverted faces/objects, which are known to elicit early P1 and N170 components, to study abnormal early face processing and subsequent improvements in this measure in patients with SAD. Seventeen patients with SAD and 17 matched control participants performed a passive visual detection paradigm task while undergoing EEG. The healthy controls were compared to patients with SAD pre-therapy to test the hypothesis that patients with SAD have early hypervigilance to facial cues. We compared patients with SAD before and after therapy to test the hypothesis that the early hypervigilance to facial cues in patients with SAD can be alleviated. Compared to healthy control (HC) participants, patients with SAD had a more robust P1-N170 slope but no amplitude effects in response to both upright and inverted faces and objects. Notably, we found that patients with SAD had reduced P1 responses to all objects and faces after therapy, but selectively reduced N170 responses to faces, and especially inverted faces. Moreover, the slope from P1 to N170 in patients with SAD was flatter post-therapy than pre-therapy. Furthermore, the amplitude of the N170 evoked by the facial stimuli was correlated with scores on the interaction anxiousness scale (IAS) after therapy. Our results did not provide electrophysiological support for the early hypervigilance hypothesis in SAD to faces, but

  3. Novel Noninvasive Brain Disease Detection System Using a Facial Image Sensor

    Directory of Open Access Journals (Sweden)

    Ting Shu

    2017-12-01

    Full Text Available Brain disease, including any condition or disability that affects the brain, is fast becoming a leading cause of death. Traditional diagnostic methods for brain disease are time-consuming, inconvenient and not patient-friendly. As more and more individuals undergo examinations to determine whether they suffer from any form of brain disease, developing noninvasive, efficient, and patient-friendly detection systems will be beneficial. Therefore, in this paper, we propose a novel noninvasive brain disease detection system based on the analysis of facial colors. The system consists of four components. A facial image is first captured through a specialized sensor, and four facial key blocks are then located automatically in the various facial regions. Color features are extracted from each block to form a feature vector for classification via the Probabilistic Collaborative based Classifier. To thoroughly test the system and its performance, seven facial key block combinations were evaluated. The best result was achieved using the second facial key block, indicating that the Probabilistic Collaborative based Classifier is the most suitable. The overall system achieves an accuracy of 95%, a sensitivity of 94.33%, a specificity of 95.67%, and an average processing time (for one sample) of less than 1 min for brain disease detection.
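
    The color-feature step of such a pipeline can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the image representation, block coordinates, and feature choice (per-channel mean and standard deviation) are all assumptions, and the downstream classifier is left out.

    ```python
    from statistics import mean, pstdev

    def block_color_features(image, top, left, height, width):
        """image: 2-D list of (R, G, B) tuples; returns a 6-element
        feature vector of per-channel means and standard deviations
        for one facial key block."""
        pixels = [image[r][c]
                  for r in range(top, top + height)
                  for c in range(left, left + width)]
        features = []
        for channel in range(3):  # R, G, B
            values = [p[channel] for p in pixels]
            features.append(mean(values))
            features.append(pstdev(values))
        return features

    # Toy 4x4 "image": a uniform skin-tone patch
    img = [[(200, 150, 120)] * 4 for _ in range(4)]
    vec = block_color_features(img, 0, 0, 2, 2)  # a 2x2 key block
    ```

    One such vector per key block, concatenated across blocks, would then be passed to whatever classifier the system uses.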

  4. Satisfaction with facial appearance and its determinants in adults with severe congenital facial disfigurement: a case-referent study.

    Science.gov (United States)

    Versnel, S L; Duivenvoorden, H J; Passchier, J; Mathijssen, I M J

    2010-10-01

    Patients with severe congenital facial disfigurement have a long track record of operations and hospital visits by the time they are 18 years old. The fact that their facial deformity is congenital may have an impact on how satisfied these patients are with their appearance. This study evaluated the level of satisfaction with facial appearance of congenital and of acquired facially disfigured adults, and explored demographic, physical and psychological determinants of this satisfaction. Differences compared with non-disfigured adults were examined. Fifty-nine adults with a rare facial cleft, 59 adults with a facial deformity traumatically acquired in adulthood, and a reference group of 201 non-disfigured adults completed standardised demographic, physical and psychological questionnaires. The congenital and acquired groups did not differ significantly in the level of satisfaction with facial appearance, but both were significantly less satisfied than the reference group. In facially disfigured adults, level of education, number of affected facial parts and facial function were determinants of the level of satisfaction. High fear of negative appearance evaluation by others (FNAE) and low self-esteem (SE) were strong psychological determinants. Although FNAE was higher in both patient groups, SE was similar in all three groups. Satisfaction with facial appearance of individuals with a congenital or acquired facial deformity is similar and will seldom reach the level of satisfaction of non-disfigured persons. A combination of surgical correction (with attention for facial profile and restoring facial functions) and psychological help (to increase SE and lower FNAE) may improve patient satisfaction. Copyright 2009 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  5. Coherence explored between emotion components: evidence from event-related potentials and facial electromyography.

    Science.gov (United States)

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R

    2014-04-01

    Componential theories assume that emotion episodes consist of emergent and dynamic response changes to relevant events in different components, such as appraisal, physiology, motivation, expression, and subjective feeling. In particular, Scherer's Component Process Model hypothesizes that subjective feeling emerges when the synchronization (or coherence) of appraisal-driven changes between emotion components has reached a critical threshold. We examined the prerequisite of this synchronization hypothesis for appraisal-driven response changes in facial expression. The appraisal process was manipulated by using feedback stimuli, presented in a gambling task. Participants' responses to the feedback were investigated in concurrently recorded brain activity related to appraisal (event-related potentials, ERP) and facial muscle activity (electromyography, EMG). Using principal component analysis, the prediction of appraisal-driven response changes in facial EMG was examined. Results support this prediction: early cognitive processes (related to the feedback-related negativity) seem to primarily affect the upper face, whereas processes that modulate P300 amplitudes tend to predominantly drive cheek region responses. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Altering sensorimotor feedback disrupts visual discrimination of facial expressions.

    Science.gov (United States)

    Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula

    2016-08-01

    Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual (and not just conceptual) processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.

  7. [Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].

    Science.gov (United States)

    Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel

    2016-07-01

    Face recognition of basic emotions is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the process of facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion in Alzheimer's disease. Using three tests of recognition of basic facial emotions, we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences were obtained between the groups on the tests of recognition of basic facial emotions, and between each type of emotion. Since the amygdala, one of the brain structures responsible for emotional reaction, is affected in the early stages of this disease, our findings are relevant to understanding how this alteration of the process of emotional recognition contributes to the difficulties these patients have with both interpersonal relations and behavioral disorders.

  8. Gender differences in the motivational processing of babies are determined by their facial attractiveness.

    Directory of Open Access Journals (Sweden)

    Rinah Yamamoto

    2009-06-01

    Full Text Available This study sought to determine how the esthetic appearance of babies may affect their motivational processing by adults. Healthy men and women were administered two laboratory-based tasks: a) key pressing to change the viewing time of normal-looking babies and of those with abnormal facial features (e.g., cleft palate, strabismus, skin disorders, Down's syndrome and fetal alcohol syndrome) and b) attractiveness ratings of these images. Exposure to the babies' images produced two different response patterns: for normal babies, there was a similar effort by the two groups to extend the visual processing, with lower attractiveness ratings by men; for abnormal babies, women exerted greater effort to shorten the viewing time despite attractiveness ratings comparable to the men's. These results indicate that gender differences in the motivational processing of babies include excessive (relative to the esthetic valuation) motivation to extend the viewing time of normal babies by men vs. shortening of the exposure to the abnormal babies by women. Such a gender-specific incentive sensitization phenomenon may reflect an evolutionarily derived need for diversion of limited resources to the nurturance of healthy offspring.

  9. Gender Differences in the Motivational Processing of Babies Are Determined by Their Facial Attractiveness

    Science.gov (United States)

    Yamamoto, Rinah; Ariely, Dan; Chi, Won; Langleben, Daniel D.; Elman, Igor

    2009-01-01

    Background This study sought to determine how esthetic appearance of babies may affect their motivational processing by the adults. Methodology and Principal Findings Healthy men and women were administered two laboratory-based tasks: a) key pressing to change the viewing time of normal-looking babies and of those with abnormal facial features (e.g., cleft palate, strabismus, skin disorders, Down's syndrome and fetal alcohol syndrome) and b) attractiveness ratings of these images. Exposure to the babies' images produced two different response patterns: for normal babies, there was a similar effort by the two groups to extend the visual processing with lower attractiveness ratings by men; for abnormal babies, women exerted greater effort to shorten the viewing time despite attractiveness ratings comparable to the men. Conclusions These results indicate that gender differences in the motivational processing of babies include excessive (relative to the esthetic valuation) motivation to extend the viewing time of normal babies by men vs. shortening the exposure to the abnormal babies by women. Such gender-specific incentive sensitization phenomenon may reflect an evolutionary-derived need for diversion of limited resources to the nurturance of healthy offspring. PMID:19554100

  10. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: a fixation-to-feature approach

    Science.gov (United States)

    Neath-Tavares, Karly N.; Itier, Roxane J.

    2017-01-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100–120 ms occipitally, while responses to fearful expressions started around 150 ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350 ms. PMID:27430934

  11. Subliminal and Supraliminal Processing of Facial Expression of Emotions: Brain Oscillation in the Left/Right Frontal Area

    OpenAIRE

    Balconi, Michela; Ferrari, Chiara

    2012-01-01

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, although it remains questionable whether a specific function can be assigned to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored within the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects...

  12. People with chronic facial pain perform worse than controls at a facial emotion recognition task, but it is not all about the emotion.

    Science.gov (United States)

    von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L

    2015-04-01

    Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both the Facially Expressed Emotion Labelling (FEEL) emotion recognition task and the left/right facial expression task, and performance covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition and that further research that tests this proposal is warranted. © 2014 John Wiley & Sons Ltd.

  13. The Mask of Sanity: Facial Expressive, Self-Reported, and Physiological Consequences of Emotion Regulation in Psychopathic Offenders.

    Science.gov (United States)

    Nentjes, Lieke; Bernstein, David P; Meijer, Ewout; Arntz, Arnoud; Wiers, Reinout W

    2016-12-01

    This study investigated the physiological, self-reported, and facial correlates of emotion regulation in psychopathy. Specifically, we compared psychopathic offenders (n = 42), nonpsychopathic offenders (n = 42), and nonoffender controls (n = 26) in their ability to inhibit and express emotion while watching affective films (fear, happy, and sad). Results showed that all participants were capable of drastically diminishing facial emotions under inhibition instructions. Contrary to expectation, psychopaths were not superior in adopting such a "poker face." Further, the inhibition of emotion was associated with cardiovascular changes, an effect that was also not dependent on psychopathy (or its factors), suggesting emotion inhibition to be an effortful process in psychopaths as well. Interestingly, psychopathic offenders did not differ from nonpsychopaths in the capacity to show content-appropriate facial emotions during the expression condition. Taken together, these data challenge the view that psychopathy is associated with either superior emotional inhibitory capacities or a generalized impairment in showing facial affect.

  14. Restoring facial shape in face lifting: the role of skeletal support in facial analysis and midface soft-tissue repositioning.

    Science.gov (United States)

    Stuzin, James M

    2007-01-01

    Aesthetic analysis in facial rejuvenation has traditionally been subordinate to technical solutions. While concerns regarding correction of facial laxity, a reduction in the depth of the nasolabial fold, and improvement of both the jowl and the jawline are worthy goals in rhytidectomy, the aesthetic concept of restoring facial shape to a more youthful appearance is equally important. Restoring facial shape in face lifting requires an understanding of how the face ages and then the formulation of a treatment plan that is individualized for the patient. Re-establishment of facial contour is significantly influenced by the re-elevation of descended facial fat through superficial musculoaponeurotic system manipulation; it can be approached through a variety of technical solutions. Underlying skeletal support affects not only the appearance of the face in youth but also how the face ages and influences the operative plan in terms of the requirements for fat repositioning. Formulating a treatment plan that is patient specific and based on the artistic goals as influenced by skeletal support is the key element for consistency in restoring facial shape in face lifting.

  15. Regional Brain Responses Are Biased Toward Infant Facial Expressions Compared to Adult Facial Expressions in Nulliparous Women.

    Science.gov (United States)

    Li, Bingbing; Cheng, Gang; Zhang, Dajun; Wei, Dongtao; Qiao, Lei; Wang, Xiangpeng; Che, Xianwei

    2016-01-01

    Recent neuroimaging studies suggest that neutral infant faces compared to neutral adult faces elicit greater activity in brain areas associated with face processing, attention, empathic response, reward, and movement. However, whether infant facial expressions evoke larger brain responses than adult facial expressions remains unclear. Here, we performed event-related functional magnetic resonance imaging in nulliparous women while they were presented with images of matched unfamiliar infant and adult facial expressions (happy, neutral, and uncomfortable/sad) in a pseudo-randomized order. We found that the bilateral fusiform and right lingual gyrus were overall more activated during the presentation of infant facial expressions compared to adult facial expressions. Uncomfortable infant faces compared to sad adult faces evoked greater activation in the bilateral fusiform gyrus, precentral gyrus, postcentral gyrus, posterior cingulate cortex-thalamus, and precuneus. Neutral infant faces activated larger brain responses in the left fusiform gyrus compared to neutral adult faces. Happy infant faces compared to happy adult faces elicited larger responses in areas of the brain associated with emotion and reward processing when a more liberal statistical threshold was used. These results suggest that regional brain responses are biased toward infant facial expressions compared to adult facial expressions among nulliparous women, and this bias may be modulated by individual differences in Interest-In-Infants and perspective-taking ability.

  16. Remnants and changes in facial emotion processing in women with remitted borderline personality disorder: an EEG study.

    Science.gov (United States)

    Schneider, Isabella; Bertsch, Katja; Izurieta Hidalgo, Natalie A; Müller, Laura E; Schmahl, Christian; Herpertz, Sabine C

    2017-09-27

    According to longitudinal studies, most individuals with borderline personality disorder (BPD) achieve remission. Since BPD is characterized by disturbed emotion recognition, this study investigated behavioral and electrophysiological correlates of facial emotion classification and processing in remitted BPD. Thirty-two women with remitted BPD (rBPD), 32 women with current BPD (cBPD), and 28 healthy women (HC) participated in an emotion classification paradigm comprising blends of angry and happy faces while behavioral and electroencephalographic (event-related potentials) data were recorded. rBPD demonstrated a convergence in behavior towards HC in terms of responses and reaction times. They evaluated maximally ambiguous faces more positively and exhibited faster reaction times when classifying predominantly happy faces compared to cBPD. Group × facial emotion interaction effects were found in early electrophysiological processes, with post hoc tests indicating differences between rBPD and cBPD but not between rBPD and HC. However, BPD-like impairments were still found in rBPD in later processing (P300). Our results suggest a reduction in negativity bias in rBPD on the behavioral level and a normalization of earlier stages of facial processing on the neural level, while alterations in later, more cognitive processing do not remit. Early processing may be more state-like, while later impairments may be more trait-like. Further research may need to focus on these stable components.

  17. Correlation between hedonic liking and facial expression measurement using dynamic affective response representation.

    Science.gov (United States)

    Zhi, Ruicong; Wan, Jingwei; Zhang, Dezheng; Li, Weiping

    2018-06-01

    Emotional reactions towards products play an essential role in consumers' decision making and are more important than rational evaluation of sensory attributes. It is crucial to understand consumers' emotions and the relationship between sensory properties, human liking, and choice. There are many inconsistencies between Asian and Western consumers in the use of hedonic scales, as well as in the intensity of facial reactions, due to different cultures and consumption habits. However, very few studies have examined the facial response characteristics of Asian consumers during food consumption. In this paper, an explicit liking measurement (hedonic scale) and an implicit emotional measurement (facial expressions) were evaluated to judge the emotions elicited in consumers by five types of juice. The contributions of this study are: (1) We constructed a relationship model between hedonic liking and facial expressions analyzed by face-reading technology. The negative emotions "sadness", "anger", and "disgust" showed a strong negative correlation with hedonic scores, while the "liking" hedonic scores could be characterized by the positive emotion "happiness". (2) Several emotional-intensity-based parameters, especially a dynamic parameter, were extracted to describe facial characteristics during the sensory evaluation procedure. The dynamic parameters incorporate both amplitude and frequency information so as to retain more information from the emotional response signals. From a comparison of four types of emotional descriptive parameters, the maximum parameter and the dynamic parameter are recommended for representing emotional states and intensities. Copyright © 2018 Elsevier Ltd. All rights reserved.
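    The correlation analysis and descriptive parameters described above can be sketched as follows. All values (hedonic scores, emotion intensities, the intensity trace) are hypothetical stand-ins for illustration, not the study's data.

```python
# Sketch: correlating hedonic liking scores with facial-expression
# intensities, and extracting maximum / dynamic descriptive parameters
# from an intensity trace. All data below are hypothetical.

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# One hedonic score (9-point scale) and one mean "disgust" intensity
# (0..1, from a face reader) per juice sample.
hedonic = [7.8, 6.5, 5.1, 3.9, 2.2]
disgust = [0.05, 0.10, 0.22, 0.35, 0.48]
print(round(pearson_r(hedonic, disgust), 3))  # strongly negative

# Descriptive parameters of one intensity trace sampled over time:
# the "maximum parameter" keeps peak amplitude, while a simple
# "dynamic parameter" also reflects how often the signal rises and
# falls (a crude stand-in for frequency information).
trace = [0.0, 0.1, 0.4, 0.3, 0.5, 0.2, 0.1]
maximum = max(trace)
direction_changes = sum(
    1 for a, b, c in zip(trace, trace[1:], trace[2:])
    if (b - a) * (c - b) < 0
)
print(maximum, direction_changes)
```

    A negative coefficient for "disgust" against liking mirrors the correlation tendency the abstract reports.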

  18. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity--Evidence from Gazing Patterns.

    Directory of Open Access Journals (Sweden)

    Sanni Somppi

    Full Text Available Appropriate response to companions' emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs' gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs' gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by the eyes, midface and mouth. Dogs evaluated social threat rapidly, and this evaluation led to an attentional bias which was dependent on the depicted species: threatening conspecifics' faces evoked heightened attention, but threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs.
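    The fixation-distribution measure used above can be sketched as a simple aggregation over areas of interest. The fixation records (area, duration in ms) are hypothetical, not the study's eye-tracking data.

```python
# Sketch: summarising eye-tracking fixations over facial areas of
# interest (eyes, midface, mouth), as in the dog gaze study above.
# The (area, duration-in-ms) records below are hypothetical.
from collections import defaultdict

fixations = [
    ("eyes", 310), ("eyes", 280), ("midface", 150),
    ("mouth", 90), ("eyes", 260), ("midface", 120),
]

totals = defaultdict(int)
for area, duration_ms in fixations:
    totals[area] += duration_ms

grand = sum(totals.values())
share = {area: totals[area] / grand for area in totals}
print(max(share, key=share.get))  # eyes gather the longest looking time
```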

  19. Electrophysiology of Cranial Nerve Testing: Trigeminal and Facial Nerves.

    Science.gov (United States)

    Muzyka, Iryna M; Estephan, Bachir

    2018-01-01

    The clinical examination of the trigeminal and facial nerves provides significant diagnostic value, especially in the localization of lesions in disorders affecting the central and/or peripheral nervous system. The electrodiagnostic evaluation of these nerves and their pathways adds further accuracy and reliability to the diagnostic investigation and the localization process, especially when different testing methods are combined based on the clinical presentation and the electrophysiological findings. The diagnostic uniqueness of the trigeminal and facial nerves is their connectivity and their coparticipation in reflexes commonly used in clinical practice, namely the blink and corneal reflexes. The other reflexes used in the diagnostic process and lesion localization are very nerve specific and add more diagnostic yield to the workup of certain disorders of the nervous system. This article provides a review of commonly used electrodiagnostic studies and techniques in the evaluation and lesion localization of cranial nerves V and VII.

  20. Transcranial Electrical Stimulation over Dorsolateral Prefrontal Cortex Modulates Processing of Social Cognitive and Affective Information.

    Directory of Open Access Journals (Sweden)

    Massimiliano Conson

    Full Text Available Recent neurofunctional studies suggested that lateral prefrontal cortex is a domain-general cognitive control area modulating computation of social information. Neuropsychological evidence has reported dissociations between cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on explicit recognition of emotional facial expressions (affective task), and on a cognitive task assessing the ability to adopt another person's visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over DLPFC. Results showed that only in male participants was explicit recognition of fearful facial expressions significantly faster after anodal right/cathodal left stimulation with respect to anodal left/cathodal right and sham stimulations. In the visual perspective taking task, instead, anodal right/cathodal left stimulation negatively affected both male and female participants' tendency to adopt another's point of view. These findings demonstrate that concurrent facilitation of right and inhibition of left lateral prefrontal cortex can speed up males' responses to threatening faces, whereas it interferes with the ability to adopt another's viewpoint independently of gender. Thus, stimulation of cognitive control areas can lead to different effects on social cognitive skills depending on the affective vs. cognitive nature of the task, and on gender-related differences in the neural organization of emotion processing.

  1. Toward a universal, automated facial measurement tool in facial reanimation.

    Science.gov (United States)

    Hadlock, Tessa A; Urban, Luke S

    2012-01-01

    To describe a highly quantitative facial function-measuring tool that yields accurate, objective measures of facial position in significantly less time than existing methods. Facial Assessment by Computer Evaluation (FACE) software was designed for facial analysis. Outputs report the static facial landmark positions and dynamic facial movements relevant in facial reanimation. Fifty individuals underwent facial movement analysis using Photoshop-based measurements and the new software; comparisons of agreement and efficiency were made. Comparisons were made between individuals with normal facial animation and patients with paralysis to gauge sensitivity to abnormal movements. Facial measurements were matched using FACE software and Photoshop-based measures at rest and during expressions. The automated assessments required significantly less time than Photoshop-based assessments. FACE measurements easily revealed differences between individuals with normal facial animation and patients with facial paralysis. FACE software produces accurate measurements of facial landmarks and facial movements and is sensitive to paralysis. Given its efficiency, it serves as a useful tool in the clinical setting for zonal facial movement analysis in comprehensive facial nerve rehabilitation programs.
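    The FACE software itself is not public; the kind of landmark-based zonal measurement it reports can nonetheless be sketched. The landmark coordinates below are hypothetical pixel positions, and the excursion/symmetry measures are illustrative choices, not the tool's actual outputs.

```python
# Sketch: landmark-based facial movement measurement of the kind FACE
# reports: displacement of a landmark between rest and smile, plus a
# left/right symmetry ratio. Coordinates are hypothetical pixels.
import math

def displacement(p, q):
    return math.dist(p, q)  # Euclidean distance in pixels

# (x, y) positions of the two oral commissures at rest and full smile.
rest = {"left": (120.0, 200.0), "right": (180.0, 200.0)}
smile = {"left": (110.0, 188.0), "right": (192.0, 186.0)}

excursion = {side: displacement(rest[side], smile[side]) for side in rest}
symmetry = min(excursion.values()) / max(excursion.values())

print({k: round(v, 1) for k, v in excursion.items()})
print(round(symmetry, 2))  # 1.0 would mean perfectly symmetric movement
```

    A markedly asymmetric excursion (symmetry well below 1) is the sort of signal that distinguishes paralysis from normal animation.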

  2. A neuroendocrine account of facial mimicry and its dynamic modulation

    NARCIS (Netherlands)

    Kraaijenvanger, Eline J.; Hofman, Dennis; Bos, Peter A.

    2017-01-01

    Facial expressions are considered central in conveying information about one's emotional state. During social encounters, facial expressions of another individual are often automatically imitated by the observer, a process referred to as ‘facial mimicry’. This process is assumed to facilitate

  3. Social Use of Facial Expressions in Hylobatids

    Science.gov (United States)

    Scheider, Linda; Waller, Bridget M.; Oña, Leonardo; Burrows, Anne M.; Liebal, Katja

    2016-01-01

    Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social) contexts the duration of facial expressions was significantly longer when gibbons were facing another individual compared to non-facing situations. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than in non-social contexts, where facial expressions were produced regardless of the attentional state of the partner. Also, facial expressions were more likely ‘responded to’ by the partner’s facial expressions when facing another individual than when non-facing. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with conspecifics. PMID:26978660

  4. Enhanced subliminal emotional responses to dynamic facial expressions

    Directory of Open Access Journals (Sweden)

    Wataru Sato

    2014-09-01

    Full Text Available Emotional processing without conscious awareness plays an important role in human social interaction. Several behavioral studies have reported that subliminal presentation of photographs of emotional facial expressions induces unconscious emotional processing. However, it was difficult to elicit strong and robust effects using this method. We hypothesized that dynamic presentations of facial expressions would enhance subliminal emotional effects and tested this hypothesis with two experiments. Fearful or happy facial expressions were presented dynamically or statically in either the left or the right visual field for 20 (Experiment 1) or 30 (Experiment 2) ms. Nonsense target ideographs were then presented, and participants reported their preference for them. The results consistently showed that dynamic presentations of emotional facial expressions induced more evident emotional biases toward subsequent targets than did static ones. These results indicate that dynamic presentations of emotional facial expressions induce more evident unconscious emotional processing.

  5. Alveolar ridge atrophy related to facial morphology in edentulous patients

    Directory of Open Access Journals (Sweden)

    Kuć J

    2017-09-01

    Full Text Available Joanna Kuć,1 Teresa Sierpińska,2 Maria Gołębiewska1 1Department of Prosthodontics, 2Department of Dental Technology, Medical University of Bialystok, Bialystok, Poland Objectives: The morphology of the alveolar process determines the retention and stability of prosthetic restorations, thereby determining the result of the therapy. Considering that the edentulous jaws may be affected by the atrophy process, it was hypothesized that the morphology of the alveolar process of the maxilla may be dependent on the anterior facial height and anatomy of the mandible. Subjects and methods: Twenty-five healthy edentulous Caucasian individuals were randomly chosen. Each subject underwent a lateral cephalogram before and after prosthetic rehabilitation. During exposure, the newly made prostheses were placed in the patient’s mouth. Teeth remained in maximal intercuspidation. Morphological parameters were evaluated according to the Ricketts, McNamara, and Tallgren methods. Results: An inversely proportional association was observed between patient age and the distal part of the maxilla. A statistically significant connection was noted between the vertical dimension of the alveolar ridge and the anterior total and lower facial height conditioned by prosthetic rehabilitation. Conclusion: The height of the lateral part of the alveolar ridge of the maxilla is related to the anterior total and lower facial height obtained in the course of prosthetic rehabilitation. The vertical dimension of the alveolar ridge of the maxilla appears to be closely related to the morphology of the lower jaw. Keywords: anterior facial height, cephalometric analysis, complete dentures, vertical occlusal dimension

  6. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task.

    Science.gov (United States)

    Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann

    2017-01-01

    Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions.
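    The psychometric approach can be illustrated with a minimal sketch: estimate the point of subjective equality (the morph level classified as fearful half the time) by linear interpolation, then compare it across conditions. The response proportions below are hypothetical, chosen only to show the direction of the reported bias.

```python
# Sketch: point of subjective equality (PSE) on a fear-happiness morph
# continuum, found by linear interpolation of the 50% crossing.
# The proportions of "fearful" responses per condition are hypothetical.

def pse(levels, p_fearful, criterion=0.5):
    """Morph level at which p(fearful) crosses the criterion."""
    for i in range(len(levels) - 1):
        x0, x1 = levels[i], levels[i + 1]
        y0, y1 = p_fearful[i], p_fearful[i + 1]
        if (y0 - criterion) * (y1 - criterion) <= 0 and y0 != y1:
            return x0 + (criterion - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("criterion never crossed")

levels = [0, 20, 40, 60, 80, 100]           # % happiness in the morph
after_neutral = [0.95, 0.85, 0.60, 0.30, 0.10, 0.05]
after_negative = [0.97, 0.90, 0.75, 0.45, 0.20, 0.08]

# A rightward PSE shift after negative clips means more ambiguous
# morphs were classified as fearful (the bias the abstract reports).
print(round(pse(levels, after_neutral), 1))
print(round(pse(levels, after_negative), 1))
```

    In practice a full psychometric function (e.g. a logistic) would be fitted rather than interpolated, but the PSE comparison works the same way.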

  9. Characterization of p75+ ectomesenchymal stem cells from rat embryonic facial process tissue

    International Nuclear Information System (INIS)

    Wen, Xiujie; Liu, Luchuan; Deng, Manjing; Zhang, Li; Liu, Rui; Xing, Yongjun; Zhou, Xia; Nie, Xin

    2012-01-01

    Highlights: ► Ectomesenchymal stem cells (EMSCs) were found to migrate to rat facial processes at E11.5. ► We successfully sorted p75NTR positive EMSCs (p75 + EMSCs). ► p75 + EMSCs up to nine passages showed relative stable proliferative activity. ► We examined the in vitro multilineage potential of p75 + EMSCs. ► p75 + EMSCs provide an in vitro model for tooth morphogenesis. -- Abstract: Several populations of stem cells, including those from the dental pulp and periodontal ligament, have been isolated from different parts of the tooth and periodontium. The characteristics of such stem cells have been reported as well. However, as a common progenitor of these cells, ectomesenchymal stem cells (EMSCs), derived from the cranial neural crest have yet to be fully characterized. The aim of this study was to better understand the characteristics of EMSCs isolated from rat embryonic facial processes. Immunohistochemical staining showed that EMSCs had migrated to rat facial processes at E11.5, while the absence of epithelial invagination or tooth-like epithelium suggested that any epithelial–mesenchymal interactions were limited at this stage. The p75 neurotrophin receptor (p75NTR), a typical neural crest marker, was used to select p75NTR-positive EMSCs (p75 + EMSCs), which were found to show a homogeneous fibroblast-like morphology and little change in the growth curve, proliferation capacity, and cell phenotype during cell passage. They also displayed the capacity to differentiate into diverse cell types under chemically defined conditions in vitro. p75 + EMSCs proved to be homogeneous, stable in vitro and potentially capable of multiple lineages, suggesting their potential for application in dental or orofacial tissue engineering.

  10. Pattern of facial palsy in a typical Nigerian specialist hospital.

    Science.gov (United States)

    Lamina, S; Hanif, S

    2012-12-01

    Data on the incidence of facial palsy are generally lacking in Nigeria. To assess six years' incidence of facial palsy in Murtala Muhammed Specialist Hospital (MMSH), Kano, Nigeria. The records of patients diagnosed with facial problems between January 2000 and December 2005 were scrutinized. Data on diagnosis, age, sex, side affected, occupation and causes were obtained. A total number of 698 patients with facial problems were recorded. Five hundred and ninety-four (85%) were diagnosed as facial palsy. Of the diagnosed facial palsies, males (56.2%) had a higher incidence than females; the 20-34 years age group (40.3%) had the greatest prevalence; the commonest cause of facial palsy was found to be idiopathic (39.1%), and palsy was most common among businessmen (31.6%). Right-sided facial palsy (52.2%) was predominant. Incidence of facial palsy was highest in 2003 (25.3%) and decreased from 2004. It was concluded that the incidence of facial palsy was high and that Bell's palsy remains the most common cause of facial (nerve) paralysis.

  11. Emotional facial expressions evoke faster orienting responses, but weaker emotional responses at neural and behavioural levels compared to scenes: A simultaneous EEG and facial EMG study.

    Science.gov (United States)

    Mavratzakis, Aimee; Herbert, Cornelia; Walla, Peter

    2016-01-01

    In the current study, electroencephalography (EEG) was recorded simultaneously with facial electromyography (fEMG) to determine whether emotional faces and emotional scenes are processed differently at the neural level. In addition, it was investigated whether these differences can be observed at the behavioural level via spontaneous facial muscle activity. Emotional content of the stimuli did not affect early P1 activity. Emotional faces elicited enhanced amplitudes of the face-sensitive N170 component, while its counterpart, the scene-related N100, was not sensitive to the emotional content of scenes. At 220-280 ms, the early posterior negativity (EPN) was enhanced only slightly for fearful as compared to neutral or happy faces. However, its amplitudes were significantly enhanced during processing of scenes with positive content, particularly over the right hemisphere. Scenes of positive content also elicited enhanced spontaneous zygomatic activity from 500-750 ms onwards, while happy faces elicited no such changes. Contrastingly, both fearful faces and negative scenes elicited enhanced spontaneous corrugator activity at 500-750 ms after stimulus onset. However, relative to baseline, EMG changes occurred earlier for faces (250 ms) than for scenes (500 ms), whereas for scenes the activity changes were more pronounced over the whole viewing period. Taking into account all effects, the data suggest that emotional facial expressions evoke faster attentional orienting, but weaker affective neural activity and emotional behavioural responses compared to emotional scenes. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
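    The baseline-relative EMG analysis described above can be sketched as baseline correction plus averaging within post-stimulus time windows. The sampled trace, sampling rate, and window bounds below are hypothetical, not the study's recordings.

```python
# Sketch: baseline correction and windowed averaging of a rectified
# facial-EMG trace, as in the corrugator analysis described above.
# The trace values (in microvolts), sampling rate, and windows are
# hypothetical.

def window_mean(trace, sr_hz, t0_ms, t1_ms):
    """Mean amplitude between t0_ms and t1_ms after stimulus onset."""
    i0 = int(t0_ms * sr_hz / 1000)
    i1 = int(t1_ms * sr_hz / 1000)
    seg = trace[i0:i1]
    return sum(seg) / len(seg)

sr = 1000  # samples per second; stimulus onset at index 0
corrugator = [1.0] * 250 + [1.4] * 250 + [1.8] * 500

baseline = 1.0  # mean amplitude over a pre-stimulus interval
early = window_mean(corrugator, sr, 250, 500) - baseline
late = window_mean(corrugator, sr, 500, 750) - baseline
print(early, late)  # change present by 250 ms, larger by 500 ms
```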

  12. Sound-induced facial synkinesis following facial nerve paralysis

    NARCIS (Netherlands)

    Ma, Ming-San; van der Hoeven, Johannes H.; Nicolai, Jean-Philippe A.; Meek, Marcel F.

    Facial synkinesis (or synkinesia) (FS) occurs frequently after paresis or paralysis of the facial nerve and is in most cases due to aberrant regeneration of (branches of) the facial nerve. Patients suffer from inappropriate and involuntary synchronous facial muscle contractions. Here we describe two cases of sound-induced facial synkinesis (SFS) after facial nerve injury.

  13. Embodied simulation as part of affective evaluation processes: task dependence of valence concordant EMG activity.

    Science.gov (United States)

    Weinreich, André; Funcke, Jakob Maria

    2014-01-01

    Drawing on recent findings, this study examines whether valence concordant electromyography (EMG) responses can be explained as an unconditional effect of mere stimulus processing or as somatosensory simulation driven by task-dependent processing strategies. While facial EMG over the Corrugator supercilii and the Zygomaticus major was measured, each participant performed two tasks with pictures of album covers. One task was an affective evaluation task and the other was to attribute the album covers to one of five decades. The Embodied Emotion Account predicts that valence concordant EMG is more likely to occur if the task necessitates a somatosensory simulation of the evaluative meaning of stimuli. Results support this prediction with regard to Corrugator supercilii in that valence concordant EMG activity was only present in the affective evaluation task but not in the non-evaluative task. Results for the Zygomaticus major were ambiguous. Our findings are in line with the view that EMG activity is an embodied part of the evaluation process and not a mere physical outcome.

  14. Impaired holistic coding of facial expression and facial identity in congenital prosopagnosia.

    Science.gov (United States)

    Palermo, Romina; Willis, Megan L; Rivolta, Davide; McKone, Elinor; Wilson, C Ellie; Calder, Andrew J

    2011-04-01

    We test 12 individuals with congenital prosopagnosia (CP), who replicate a common pattern of showing severe difficulty in recognising facial identity in conjunction with normal recognition of facial expressions (both basic and 'social'). Strength of holistic processing was examined using standard expression composite and identity composite tasks. Compared to age- and sex-matched controls, group analyses demonstrated that CPs showed weaker holistic processing, for both expression and identity information. Implications are (a) normal expression recognition in CP can derive from compensatory strategies (e.g., over-reliance on non-holistic cues to expression); (b) the split between processing of expression and identity information may take place after a common stage of holistic processing; and (c) contrary to a recent claim, holistic processing of identity is functionally involved in face identification ability. Copyright © 2011 Elsevier Ltd. All rights reserved.
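    The composite-task measure of holistic processing used above can be sketched as a simple difference score between misaligned and aligned trials. The trial outcomes below are hypothetical, not the study's data.

```python
# Sketch: quantifying holistic processing in a composite task as the
# accuracy difference between misaligned and aligned trials.
# Trial outcomes (1 = correct same/different judgement) are hypothetical.

def accuracy(trials):
    return sum(trials) / len(trials)

# Judgements on the top half of composite faces: the aligned bottom
# half interferes holistically; misalignment releases the interference.
aligned = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
misaligned = [1, 1, 1, 1, 0, 1, 1, 1, 1, 1]

composite_effect = accuracy(misaligned) - accuracy(aligned)
print(round(composite_effect, 2))  # larger = stronger holistic processing
```

    A weaker composite effect in the CP group than in controls is the pattern the abstract summarises as "weaker holistic processing".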

  15. [Facial palsy].

    Science.gov (United States)

    Cavoy, R

    2013-09-01

    Facial palsy is a daily challenge for clinicians. Determining whether facial nerve palsy is peripheral or central is a key step in the diagnosis. Central nervous system lesions can cause facial palsy that is usually easily differentiated from peripheral palsy. The next question is whether the peripheral facial paralysis is idiopathic or symptomatic. A good knowledge of the anatomy of the facial nerve is helpful. A structured approach is given to identify additional features that distinguish symptomatic facial palsy from idiopathic palsy. The main cause of peripheral facial palsy is the idiopathic form, or Bell's palsy, which remains a diagnosis of exclusion. The most common cause of symptomatic peripheral facial palsy is Ramsay-Hunt syndrome. Early identification of symptomatic facial palsy is important because of its often worse outcome and different management. The prognosis of Bell's palsy is on the whole favorable and is improved with a prompt tapering course of prednisone. In Ramsay-Hunt syndrome, antiviral therapy is added along with prednisone. We also discuss current treatment recommendations and review the short- and long-term complications of peripheral facial palsy.

  16. Facial Identification in Observers with Colour-Grapheme Synaesthesia

    DEFF Research Database (Denmark)

    Sørensen, Thomas Alrik

    2013-01-01

    Synaesthesia between colours and graphemes is often reported as one of the most common forms of cross-modal perception [Colizolo et al, 2012, PLoS ONE, 7(6), e39799]. In this particular synaesthetic sub-type, the perception of a letterform is followed by an additional experience of a colour quality. … Both colour [McKeefry and Zeki, 1997, Brain, 120(12), 2229–2242] and visual word forms [McCandliss et al, 2003, Trends in Cognitive Sciences, 7(7), 293–299] have previously been linked to the fusiform gyrus. As neighbouring functions, speculations of cross-wiring between the areas have been … of Neuroscience, 17(11), 4302–4311], increased colour-word form representations in observers with colour-grapheme synaesthesia may affect facial identification in people with synaesthesia. This study investigates the ability to process facial features for identification in observers with colour-grapheme synaesthesia.

  17. Sound-induced facial synkinesis following facial nerve paralysis.

    Science.gov (United States)

    Ma, Ming-San; van der Hoeven, Johannes H; Nicolai, Jean-Philippe A; Meek, Marcel F

    2009-08-01

    Facial synkinesis (or synkinesia) (FS) occurs frequently after paresis or paralysis of the facial nerve and is in most cases due to aberrant regeneration of (branches of) the facial nerve. Patients suffer from inappropriate and involuntary synchronous facial muscle contractions. Here we describe two cases of sound-induced facial synkinesis (SFS) after facial nerve injury. As far as we know, this phenomenon has not been described in the English literature before. Patient A presented with right hemifacial palsy after lesion of the facial nerve due to skull base fracture. He reported involuntary muscle activity at the right corner of the mouth, specifically on hearing ringing keys. Patient B suffered from left hemifacial palsy following otitis media and developed involuntary muscle contraction in the facial musculature specifically on hearing clapping hands or a trumpet sound. Both patients were evaluated by means of video, audio and EMG analysis. Possible mechanisms in the pathophysiology of SFS are postulated and therapeutic options are discussed.

  18. Dynamic facial expressions evoke distinct activation in the face perception network: a connectivity analysis study.

    Science.gov (United States)

    Foley, Elaine; Rippon, Gina; Thai, Ngoc Jade; Longe, Olivia; Senior, Carl

    2012-02-01

    Very little is known about the neural structures involved in the perception of realistic dynamic facial expressions. In the present study, a unique set of naturalistic dynamic facial emotional expressions was created. Through fMRI and connectivity analysis, a dynamic face perception network was identified, which is demonstrated to extend Haxby et al.'s [Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. The distributed human neural system for face perception. Trends in Cognitive Science, 4, 223-233, 2000] distributed neural system for face perception. This network includes early visual regions, such as the inferior occipital gyrus, which is identified as insensitive to motion or affect but sensitive to the visual stimulus, the STS, identified as specifically sensitive to motion, and the amygdala, recruited to process affect. Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyrus and the STS, along with coupling between the STS and the amygdala, as well as the inferior frontal gyrus. These findings support the presence of a distributed network of cortical regions that mediate the perception of different dynamic facial expressions.

  19. Neural circuitry of emotional and cognitive conflict revealed through facial expressions.

    Science.gov (United States)

    Chiew, Kimberly S; Braver, Todd S

    2011-03-09

    Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. A novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, was examined that permits close comparison of emotional and cognitive conflict conditions, through the use of affectively-valenced facial expressions as the response modality. Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis, by requiring contextually pre-cued facial expressions in response to emotional probe stimuli (IAPS images) that were either affectively compatible (low-conflict) or incompatible (high-conflict). The emotion condition was contrasted against a matched cognitive condition that was identical in all respects, except that probe stimuli were emotionally neutral. Components of the brain cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity during emotion conditions. In contrast, emotion conflict effects were not found in regions associated with affective processing, such as rostral ACC. These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, greatest activity in these brain regions occurred when both emotional and cognitive influences additively combined to produce increased interference.

  20. Unsupervised learning of facial emotion decoding skills.

    Science.gov (United States)

    Huelle, Jan O; Sack, Benjamin; Broer, Katja; Komlewa, Irina; Anders, Silke

    2014-01-01

    Research on the mechanisms underlying human facial emotion recognition has long focussed on genetically determined neural algorithms and often neglected the question of how these algorithms might be tuned by social learning. Here we show that facial emotion decoding skills can be significantly and sustainably improved by practice without an external teaching signal. Participants saw video clips of dynamic facial expressions of five different women and were asked to decide which of four possible emotions (anger, disgust, fear, and sadness) was shown in each clip. Although no external information about the correctness of the participant's response or the sender's true affective state was provided, participants showed a significant increase of facial emotion recognition accuracy both within and across two training sessions two days to several weeks apart. We discuss several similarities and differences between the unsupervised improvement of facial decoding skills observed in the current study, unsupervised perceptual learning of simple stimuli described in previous studies and practice effects often observed in cognitive tasks.

  1. Response inhibition is modulated by functional cerebral asymmetries for facial expression perception

    Directory of Open Access Journals (Sweden)

    Sebastian Ocklenburg

    2013-11-01

    Full Text Available The efficacy of executive functions is critically modulated by information processing in earlier cognitive stages. For example, initial processing of verbal stimuli in the language-dominant left hemisphere leads to more efficient response inhibition than initial processing of verbal stimuli in the non-dominant right hemisphere. However, it is unclear whether this organizational principle is specific for the language system, or a general principle that also applies to other types of lateralized cognition. To answer this question, we investigated the neurophysiological correlates of early attentional processes, facial expression perception and response inhibition during tachistoscopic presentation of facial ‘Go’ and ‘Nogo’ stimuli in the left and the right visual field. Participants committed fewer false alarms after Nogo-stimulus presentation in the left compared to the right visual field. This right-hemispheric asymmetry on the behavioral level was also reflected in the neurophysiological correlates of face perception, specifically in a right-sided asymmetry in the N170 amplitude. Moreover, the right-hemispheric dominance for facial expression processing also affected event-related potentials typically related to response inhibition, namely the Nogo-N2 and Nogo-P3. These findings show that an effect of hemispheric asymmetries in early information processing on the efficacy of higher cognitive functions is not limited to left-hemispheric language functions, but can be generalized to predominantly right-hemispheric functions.

  2. Short-term visual deprivation reduces interference effects of task-irrelevant facial expressions on affective prosody judgments

    Directory of Open Access Journals (Sweden)

    Ineke Fengler

    2015-04-01

    Full Text Available Several studies have suggested that neuroplasticity can be triggered by short-term visual deprivation in healthy adults. Specifically, these studies have provided evidence that visual deprivation reversibly affects basic perceptual abilities. The present study investigated the long-lasting effects of short-term visual deprivation on emotion perception. To this aim, we visually deprived a group of young healthy adults, age-matched with a group of non-deprived controls, for 3 hours and tested them before and after visual deprivation (i.e., after 8 h on average) and at 4-week follow-up on an audio-visual (i.e., faces and voices) emotion discrimination task. To observe changes at the level of basic perceptual skills, we additionally employed a simple audio-visual (i.e., tone bursts and light flashes) discrimination task and two unimodal (one auditory and one visual) perceptual threshold measures. During the 3 h period, both groups performed a series of auditory tasks. To exclude the possibility that changes in emotion discrimination may emerge as a consequence of the exposure to auditory stimulation during the 3 h stay in the dark, we visually deprived an additional group of age-matched participants who concurrently performed tasks unrelated (i.e., tactile) to the later tested abilities. The two visually deprived groups showed enhanced affective prosodic discrimination abilities in the context of incongruent facial expressions following the period of visual deprivation; this effect was partially maintained until follow-up. By contrast, no changes were observed in affective facial expression discrimination or in the basic perception tasks in any group. These findings suggest that short-term visual deprivation per se triggers a reweighting of visual and auditory emotional cues, which may prevail for longer durations.

  3. Facial dynamics and emotional expressions in facial aging treatments.

    Science.gov (United States)

    Michaud, Thierry; Gassia, Véronique; Belhaouari, Lakhdar

    2015-03-01

    Facial expressions convey emotions that form the foundation of interpersonal relationships, and many of these emotions promote and regulate our social linkages. Hence, the symptomatological analysis of facial aging and the treatment plan must of necessity include knowledge of the facial dynamics and the emotional expressions of the face. This approach aims to more closely meet patients' expectations of natural-looking results, by correcting age-related negative expressions while observing the emotional language of the face. This article will successively describe patients' expectations, the role of facial expressions in relational dynamics, the relationship between facial structures and facial expressions, and the way facial aging mimics negative expressions. Eventually, therapeutic implications for facial aging treatment will be addressed. © 2015 Wiley Periodicals, Inc.

  4. The facial nerve: anatomy and associated disorders for oral health professionals.

    Science.gov (United States)

    Takezawa, Kojiro; Townsend, Grant; Ghabriel, Mounir

    2018-04-01

    The facial nerve, the seventh cranial nerve, is of great clinical significance to oral health professionals. Most published literature either addresses the central connections of the nerve or its peripheral distribution but few integrate both of these components and also highlight the main disorders affecting the nerve that have clinical implications in dentistry. The aim of the current study is to provide a comprehensive description of the facial nerve. Multiple aspects of the facial nerve are discussed and integrated, including its neuroanatomy, functional anatomy, gross anatomy, clinical problems that may involve the nerve, and the use of detailed anatomical knowledge in the diagnosis of the site of facial nerve lesion in clinical neurology. Examples are provided of disorders that can affect the facial nerve during its intra-cranial, intra-temporal and extra-cranial pathways, and key aspects of clinical management are discussed. The current study is complemented by original detailed dissections and sketches that highlight key anatomical features and emphasise the extent and nature of anatomical variations displayed by the facial nerve.

  5. Multiracial Facial Golden Ratio and Evaluation of Facial Appearance.

    Directory of Open Access Journals (Sweden)

    Mohammad Khursheed Alam

    Full Text Available This study aimed to investigate the association of facial proportion and its relation to the golden ratio with the evaluation of facial appearance among the Malaysian population. This was a cross-sectional study with 286 subjects randomly selected from Universiti Sains Malaysia (USM) Health Campus students (150 females and 136 males; 100 Malaysian Chinese, 100 Malaysian Malay and 86 Malaysian Indian), with a mean age of 21.54 ± 1.56 (age range, 18-25). Facial indices obtained from direct facial measurements were used for the classification of facial shape into short, ideal and long. A validated structured questionnaire was used to assess subjects' evaluation of their own facial appearance. The mean facial indices of Malaysian Indian (MI), Malaysian Chinese (MC) and Malaysian Malay (MM) subjects were 1.59 ± 0.19, 1.57 ± 0.25 and 1.54 ± 0.23 respectively. Only MC showed significant sexual dimorphism in facial index (P = 0.047; P<0.05), but no significant difference was found between races. Out of the 286 subjects, 49 (17.1%) were of ideal facial shape, 156 (54.5%) short and 81 (28.3%) long. The facial evaluation questionnaire showed that MC had the lowest satisfaction, with mean scores of 2.18 ± 0.97 for overall impression and 2.15 ± 1.04 for facial parts, compared to MM and MI, with mean scores of 1.80 ± 0.97 and 1.64 ± 0.74 respectively for overall impression, and 1.75 ± 0.95 and 1.70 ± 0.83 respectively for facial parts. 1) Only 17.1% of Malaysian facial proportions conformed to the golden ratio, with the majority of the population having a short face (54.5%); 2) Facial index did not depend significantly on race; 3) Significant sexual dimorphism was shown among Malaysian Chinese; 4) All three races are generally satisfied with their own facial appearance; 5) No significant association was found between golden ratio and facial evaluation score among the Malaysian population.
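    The classification of a facial index as short, ideal, or long relative to the golden ratio can be sketched as follows. This is a minimal illustration: the ±0.05 tolerance band and the measurement names are assumptions for the example, as the record does not report the study's exact classification cutoffs.

    ```python
    GOLDEN_RATIO = 1.618  # phi, the "ideal" facial proportion in the study

    def classify_face(face_height_cm: float, face_width_cm: float,
                      tolerance: float = 0.05) -> tuple[float, str]:
        """Return the facial index (height/width) and a short/ideal/long label.

        The +/- tolerance band around phi is a hypothetical cutoff chosen
        for illustration only.
        """
        index = face_height_cm / face_width_cm
        if index < GOLDEN_RATIO - tolerance:
            label = "short"   # most subjects in the study fell here (54.5%)
        elif index > GOLDEN_RATIO + tolerance:
            label = "long"
        else:
            label = "ideal"
        return round(index, 2), label
    ```

    For example, `classify_face(16.18, 10.0)` yields an index of 1.62, within the illustrative band around phi, whereas the reported group means (1.54-1.59) would fall below it.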

  6. Bodily action penetrates affective perception

    Science.gov (United States)

    Rigutti, Sara; Gerbino, Walter

    2016-01-01

    Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observer’s internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that performing a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; i.e., we show that observer’s internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than uncomfortable reaches, while the detection threshold for anger was lower after fifty uncomfortable than comfortable reaches. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches impact sensitivity in an approximately symmetric way relative to a baseline inaction condition. All of these constitute compelling evidence of a genuine top-down effect on
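    The distinction this record draws between sensitivity changes and response bias shifts is conventionally quantified with signal detection theory. A minimal sketch under that assumption, using the standard d′ and criterion formulas with a log-linear correction (this is a generic illustration, not the authors' exact analysis pipeline):

    ```python
    from statistics import NormalDist

    def dprime_and_criterion(hits: int, misses: int,
                             false_alarms: int, correct_rejections: int):
        """Compute sensitivity (d') and response criterion (c) from raw counts.

        Uses the log-linear (add 0.5 to each cell) correction so that hit or
        false-alarm rates of 0 or 1 do not produce infinite z-scores.
        """
        z = NormalDist().inv_cdf
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        d_prime = z(hit_rate) - z(fa_rate)          # sensitivity
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
        return d_prime, criterion
    ```

    On this scheme, a mood-congruent effect like the one reported would appear as a higher d′ after comfortable reaches for happy targets, with the criterion c essentially unchanged.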

  7. Bodily action penetrates affective perception

    Directory of Open Access Journals (Sweden)

    Carlo Fantoni

    2016-02-01

    Full Text Available Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observer’s internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that performing a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; i.e., we show that observer’s internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than uncomfortable reaches; while the detection threshold for anger was lower after fifty uncomfortable than comfortable reaches. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches impact sensitivity in an approximately symmetric way relative to a baseline inaction condition. All of these constitute compelling evidence of a genuine top

  8. Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults

    OpenAIRE

    Cooper, Nicholas R.; Simpson, Andrew; Till, Amy; Simmons, Kelly; Puzzo, Ignazio

    2013-01-01

    The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and if so, would this differ according to the individual level...

  9. Magnetic resonance imaging of facial muscles

    Energy Technology Data Exchange (ETDEWEB)

    Farrugia, M.E. [Department of Clinical Neurology, University of Oxford, Radcliffe Infirmary, Oxford (United Kingdom)], E-mail: m.e.farrugia@doctors.org.uk; Bydder, G.M. [Department of Radiology, University of California, San Diego, CA 92103-8226 (United States); Francis, J.M.; Robson, M.D. [OCMR, Department of Cardiovascular Medicine, University of Oxford, John Radcliffe Hospital, Oxford (United Kingdom)

    2007-11-15

    Facial and tongue muscles are commonly involved in patients with neuromuscular disorders. However, these muscles are not as easily accessible for biopsy and pathological examination as limb muscles. We have previously investigated myasthenia gravis patients with MuSK antibodies for facial and tongue muscle atrophy using different magnetic resonance imaging sequences, including ultrashort echo time techniques and image analysis tools that allowed us to obtain quantitative assessments of facial muscles. This imaging study had shown that facial muscle measurement is possible and that useful information can be obtained using a quantitative approach. In this paper we aim to review in detail the methods that we applied to our study, to enable clinicians to study these muscles within the domain of neuromuscular disease, oncological or head and neck specialties. Quantitative assessment of the facial musculature may be of value in improving the understanding of pathological processes occurring within facial muscles in certain neuromuscular disorders.

  10. Magnetic resonance imaging of facial muscles

    International Nuclear Information System (INIS)

    Farrugia, M.E.; Bydder, G.M.; Francis, J.M.; Robson, M.D.

    2007-01-01

    Facial and tongue muscles are commonly involved in patients with neuromuscular disorders. However, these muscles are not as easily accessible for biopsy and pathological examination as limb muscles. We have previously investigated myasthenia gravis patients with MuSK antibodies for facial and tongue muscle atrophy using different magnetic resonance imaging sequences, including ultrashort echo time techniques and image analysis tools that allowed us to obtain quantitative assessments of facial muscles. This imaging study had shown that facial muscle measurement is possible and that useful information can be obtained using a quantitative approach. In this paper we aim to review in detail the methods that we applied to our study, to enable clinicians to study these muscles within the domain of neuromuscular disease, oncological or head and neck specialties. Quantitative assessment of the facial musculature may be of value in improving the understanding of pathological processes occurring within facial muscles in certain neuromuscular disorders

  11. Multiracial Facial Golden Ratio and Evaluation of Facial Appearance.

    Science.gov (United States)

    Alam, Mohammad Khursheed; Mohd Noor, Nor Farid; Basri, Rehana; Yew, Tan Fo; Wen, Tay Hui

    2015-01-01

    This study aimed to investigate the association of facial proportion and its relation to the golden ratio with the evaluation of facial appearance among the Malaysian population. This was a cross-sectional study with 286 subjects randomly selected from Universiti Sains Malaysia (USM) Health Campus students (150 females and 136 males; 100 Malaysian Chinese, 100 Malaysian Malay and 86 Malaysian Indian), with a mean age of 21.54 ± 1.56 (age range, 18-25). Facial indices obtained from direct facial measurements were used for the classification of facial shape into short, ideal and long. A validated structured questionnaire was used to assess subjects' evaluation of their own facial appearance. The mean facial indices of Malaysian Indian (MI), Malaysian Chinese (MC) and Malaysian Malay (MM) subjects were 1.59 ± 0.19, 1.57 ± 0.25 and 1.54 ± 0.23 respectively. Only MC showed significant sexual dimorphism in facial index (P = 0.047; P<0.05), but no significant difference was found between races. Out of the 286 subjects, 49 (17.1%) were of ideal facial shape, 156 (54.5%) short and 81 (28.3%) long. The facial evaluation questionnaire showed that MC had the lowest satisfaction, with mean scores of 2.18 ± 0.97 for overall impression and 2.15 ± 1.04 for facial parts, compared to MM and MI, with mean scores of 1.80 ± 0.97 and 1.64 ± 0.74 respectively for overall impression, and 1.75 ± 0.95 and 1.70 ± 0.83 respectively for facial parts. 1) Only 17.1% of Malaysian facial proportions conformed to the golden ratio, with the majority of the population having a short face (54.5%); 2) Facial index did not depend significantly on race; 3) Significant sexual dimorphism was shown among Malaysian Chinese; 4) All three races are generally satisfied with their own facial appearance; 5) No significant association was found between golden ratio and facial evaluation score among the Malaysian population.

  12. Implicit attentional bias for facial emotion in dissociative seizures: Additional evidence.

    Science.gov (United States)

    Pick, Susannah; Mellers, John D C; Goldstein, Laura H

    2018-03-01

    This study sought to extend knowledge about the previously reported preconscious attentional bias (AB) for facial emotion in patients with dissociative seizures (DS) by exploring whether the finding could be replicated, while controlling for concurrent anxiety, depression, and potentially relevant cognitive impairments. Patients diagnosed with DS (n=38) were compared with healthy controls (n=43) on a pictorial emotional Stroop test, in which backwardly masked emotional faces (angry, happy, neutral) were processed implicitly. The group with DS displayed a significantly greater AB to facial emotion relative to controls; however, the bias was not specific to negative or positive emotions. The group effect could not be explained by performance on standardized cognitive tests or self-reported depression/anxiety. The study provides additional evidence of a disproportionate and automatic allocation of attention to facial affect in patients with DS, including both positive and negative facial expressions. Such a tendency could act as a predisposing factor for developing DS initially, or may contribute to triggering individuals' seizures on an ongoing basis. Psychological interventions such as Cognitive Behavioral Therapy (CBT) or AB modification might be suitable approaches to target this bias in clinical practice. Copyright © 2018 Elsevier Inc. All rights reserved.
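    An attentional bias score in a masked emotional Stroop paradigm of this kind is conventionally computed as the reaction-time cost on emotional relative to neutral trials. A minimal sketch under that assumption (the study's exact scoring and trial-exclusion rules are not reproduced here):

    ```python
    from statistics import mean

    def attentional_bias_index(emotional_rts_ms, neutral_rts_ms):
        """Mean colour-naming RT slowing on masked emotional faces relative
        to neutral faces, in milliseconds.

        A positive value indicates an attentional bias toward facial emotion,
        as reported for the dissociative seizures group.
        """
        return mean(emotional_rts_ms) - mean(neutral_rts_ms)
    ```

    For example, `attentional_bias_index([650, 700, 690], [620, 640, 630])` yields a 50 ms interference effect.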

  13. Beauty hinders attention switch in change detection: the role of facial attractiveness and distinctiveness.

    Directory of Open Access Journals (Sweden)

    Wenfeng Chen

    Full Text Available BACKGROUND: Recent research has shown that the presence of a task-irrelevant attractive face can induce a transient diversion of attention from a perceptual task that requires covert deployment of attention to one of the two locations. However, it is not known whether this spontaneous appraisal for facial beauty also modulates attention in change detection among multiple locations, where a slower, and more controlled search process is simultaneously affected by the magnitude of a change and the facial distinctiveness. Using the flicker paradigm, this study examines how spontaneous appraisal for facial beauty affects the detection of identity change among multiple faces. METHODOLOGY/PRINCIPAL FINDINGS: Participants viewed a display consisting of two alternating frames of four faces separated by a blank frame. In half of the trials, one of the faces (target face) changed to a different person. The task of the participant was to indicate whether a change of face identity had occurred. The results showed that (1) observers were less efficient at detecting identity change among multiple attractive faces relative to unattractive faces when the target and distractor faces were not highly distinctive from one another; and (2) it is difficult to detect a change if the new face is similar to the old. CONCLUSIONS/SIGNIFICANCE: The findings suggest that attractive faces may interfere with the attention-switch process in change detection. The results also show that attention in change detection was strongly modulated by physical similarity between the alternating faces. Although facial beauty is a powerful stimulus that has well-demonstrated priority, its influence on change detection is easily superseded by low-level image similarity. The visual system appears to take a different approach to facial beauty when a task requires resource-demanding feature comparisons.

  14. Beauty hinders attention switch in change detection: the role of facial attractiveness and distinctiveness.

    Science.gov (United States)

    Chen, Wenfeng; Liu, Chang Hong; Nakabayashi, Kazuyo

    2012-01-01

    Recent research has shown that the presence of a task-irrelevant attractive face can induce a transient diversion of attention from a perceptual task that requires covert deployment of attention to one of the two locations. However, it is not known whether this spontaneous appraisal for facial beauty also modulates attention in change detection among multiple locations, where a slower, and more controlled search process is simultaneously affected by the magnitude of a change and the facial distinctiveness. Using the flicker paradigm, this study examines how spontaneous appraisal for facial beauty affects the detection of identity change among multiple faces. Participants viewed a display consisting of two alternating frames of four faces separated by a blank frame. In half of the trials, one of the faces (target face) changed to a different person. The task of the participant was to indicate whether a change of face identity had occurred. The results showed that (1) observers were less efficient at detecting identity change among multiple attractive faces relative to unattractive faces when the target and distractor faces were not highly distinctive from one another; and (2) it is difficult to detect a change if the new face is similar to the old. The findings suggest that attractive faces may interfere with the attention-switch process in change detection. The results also show that attention in change detection was strongly modulated by physical similarity between the alternating faces. Although facial beauty is a powerful stimulus that has well-demonstrated priority, its influence on change detection is easily superseded by low-level image similarity. The visual system appears to take a different approach to facial beauty when a task requires resource-demanding feature comparisons.

  15. Enhanced MRI in patients with facial palsy

    International Nuclear Information System (INIS)

    Yanagida, Masahiro; Kato, Tsutomu; Ushiro, Koichi; Kitajiri, Masanori; Yamashita, Toshio; Kumazawa, Tadami; Tanaka, Yoshimasa

    1991-01-01

    We performed Gd-DTPA-enhanced magnetic resonance imaging (MRI) examinations at several stages in 40 patients with peripheral facial nerve palsy (Bell's palsy and Ramsay-Hunt syndrome). In 38 of the 40 patients, one or more enhanced regions could be seen in certain portions of the facial nerve in the temporal bone on the affected side, whereas no enhanced regions were seen on the intact side. Correlations between the timing of the MRI examination and the location of the enhanced regions were analysed. In all 6 patients examined by MRI within 5 days after the onset of facial nerve palsy, enhanced regions were present in the meatal portion. In 3 of the 8 patients (38%) examined by MRI 6 to 10 days after the onset of facial palsy, enhanced areas were seen in both the meatal and labyrinthine portions. In 8 of the 9 patients (89%) tested 11 to 20 days after the onset of palsy, the vertical portion was enhanced. In the 12 patients examined by MRI 21 to 40 days after the onset of facial nerve palsy, the meatal portion was not enhanced while the labyrinthine portion, the horizontal portion and the vertical portion were enhanced in 5 (42%), 8 (67%) and 11 (92%), respectively. Enhancement in the vertical portion was observed in all 5 patients examined more than 41 days after the onset of facial palsy. These results suggest that the central portion of the facial nerve in the temporal bone tends to be enhanced in the early stage of facial nerve palsy, while the peripheral portion is enhanced in the late stage. These changes of Gd-DTPA enhanced regions in the facial nerve may suggest dromic degeneration of the facial nerve in peripheral facial nerve palsy. (author)

  16. Traumatic facial nerve neuroma with facial palsy presenting in infancy.

    Science.gov (United States)

    Clark, James H; Burger, Peter C; Boahene, Derek Kofi; Niparko, John K

    2010-07-01

    To describe the management of traumatic neuroma of the facial nerve in a child and to review the literature. Sixteen-month-old male subject. Radiological imaging and surgery. Facial nerve function. The patient presented at 16 months with a right facial palsy and was found to have a right facial nerve traumatic neuroma. A transmastoid, middle fossa resection of the right facial nerve lesion was undertaken with a successful facial nerve-to-hypoglossal nerve anastomosis. The facial palsy improved postoperatively. A traumatic neuroma should be considered in an infant who presents with facial palsy, even in the absence of an obvious history of trauma. The treatment of such a lesion is complex in any age group but especially in young children. Symptoms, age, lesion size, growth rate, and facial nerve function determine the appropriate management.

  17. Perceived functional impact of abnormal facial appearance.

    Science.gov (United States)

    Rankin, Marlene; Borah, Gregory L

    2003-06-01

    Functional facial deformities are usually described as those that impair respiration, eating, hearing, or speech. Yet facial scars and cutaneous deformities have a significant negative effect on social functionality that has been poorly documented in the scientific literature. Insurance companies are declining payments for reconstructive surgical procedures for facial deformities caused by congenital disabilities and after cancer or trauma operations that do not affect mechanical facial activity. The purpose of this study was to establish a large, sample-based evaluation of the perceived social functioning, interpersonal characteristics, and employability indices for a range of facial appearances (normal and abnormal). Adult volunteer evaluators (n = 210) provided their subjective perceptions based on facial physical appearance, and an analysis of the consequences of facial deformity on parameters of preferential treatment was performed. A two-group comparative research design rated the differences among 10 examples of digitally altered facial photographs of actual patients among various age and ethnic groups with "normal" and "abnormal" congenital deformities or posttrauma scars. Photographs of adult patients with observable congenital and posttraumatic deformities (abnormal) were digitally retouched to eliminate the stigmatic defects (normal). The normal and abnormal photographs of identical patients were evaluated by the large sample study group on nine parameters of social functioning, such as honesty, employability, attractiveness, and effectiveness, using a visual analogue rating scale. Patients with abnormal facial characteristics were rated as significantly less honest (p = 0.007), less employable (p = 0.001), less trustworthy (p = 0.01), less optimistic (p = 0.001), less effective (p = 0.02), less capable (p = 0.002), less intelligent (p = 0.03), less popular (p = 0.001), and less attractive (p = 0.001) than were the same patients with normal facial

  18. Facial averageness and genetic quality: Testing heritability, genetic correlation with attractiveness, and the paternal age effect.

    Science.gov (United States)

    Lee, Anthony J; Mitchem, Dorian G; Wright, Margaret J; Martin, Nicholas G; Keller, Matthew C; Zietsch, Brendan P

    2016-01-01

    Popular theory suggests that facial averageness is preferred in a partner for genetic benefits to offspring. However, whether facial averageness is associated with genetic quality is yet to be established. Here, we computed an objective measure of facial averageness for a large sample (N = 1,823) of identical and nonidentical twins and their siblings to test two predictions from the theory that facial averageness reflects genetic quality. First, we use biometrical modelling to estimate the heritability of facial averageness, which is necessary if it reflects genetic quality. We also test for a genetic association between facial averageness and facial attractiveness. Second, we assess whether paternal age at conception (a proxy of mutation load) is associated with facial averageness and facial attractiveness. Our findings are mixed with respect to our hypotheses. While we found that facial averageness does have a genetic component, and a significant phenotypic correlation exists between facial averageness and attractiveness, we did not find a genetic correlation between facial averageness and attractiveness (therefore, we cannot say that the genes that affect facial averageness also affect facial attractiveness), and paternal age at conception was not negatively associated with facial averageness. These findings support some of the previously untested assumptions of the 'genetic benefits' account of facial averageness, but cast doubt on others.

  19. Spontaneous and posed facial expression in Parkinson's disease.

    Science.gov (United States)

    Smith, M C; Smith, M K; Ellgring, H

    1996-09-01

    Spontaneous and posed emotional facial expressions in individuals with Parkinson's disease (PD, n = 12) were compared with those of healthy age-matched controls (n = 12). The intensity and amount of facial expression in PD patients were expected to be reduced for spontaneous but not posed expressions. Emotional stimuli were video clips selected from films, 2-5 min in duration, designed to elicit feelings of happiness, sadness, fear, disgust, or anger. Facial movements were coded using Ekman and Friesen's (1978) Facial Action Coding System (FACS). In addition, participants rated their emotional experience on 9-point Likert scales. The PD group showed significantly less overall facial reactivity than did controls when viewing the films. The predicted Group X Condition (spontaneous vs. posed) interaction effect on smile intensity was found when PD participants with more severe disease were compared with those with milder disease and with controls. In contrast, ratings of emotional experience were similar for both groups. Depression was positively associated with emotion rating but not with measures of facial activity. Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact.

  20. The impact of high trait social anxiety on neural processing of facial emotion expressions in females.

    Science.gov (United States)

    Felmingham, Kim L; Stewart, Laura F; Kemp, Andrew H; Carr, Andrea R

    2016-05-01

    A cognitive model of social anxiety predicts that an early attentional bias leads to greater cognitive processing of social threat signals, whereas the vigilance-avoidance model predicts there will be a subsequent reduction in cognitive processing. This study tests these models by examining neural responses to social threat stimuli using event-related potentials (ERPs). Nineteen women with high trait social anxiety (HSA) and 19 women with low trait social anxiety (LSA) viewed emotional expressions (angry, disgusted, happy and neutral) in a passive viewing task whilst ERP responses were recorded. The HSA group revealed greater automatic attention, or hypervigilance, to all facial expressions, as indexed by greater N1 amplitude compared to the LSA group. They also showed greater sustained attention and elaborative processing of all facial expressions, indexed by significantly increased P2 and P3 amplitudes compared to the LSA group. These results support cognitive models of social anxiety, but are not consistent with predictions of the vigilance-avoidance model. Copyright © 2016. Published by Elsevier B.V.
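
    As an aside on how component amplitudes such as N1, P2, and P3 are typically quantified from epoched EEG, here is a minimal sketch in Python/NumPy. The sampling rate, latency windows, and synthetic data are illustrative assumptions, not values taken from the study.

    ```python
    import numpy as np

    # Assumed values for illustration only (not from the study).
    FS = 500  # samples per second; epoch starts at stimulus onset
    WINDOWS_MS = {"N1": (80, 120), "P2": (150, 250), "P3": (300, 500)}

    def mean_amplitude(epoch_uv, window_ms, fs=FS):
        """Mean voltage (microvolts) of a 1-D waveform within a latency window."""
        start = int(window_ms[0] * fs / 1000)
        stop = int(window_ms[1] * fs / 1000)
        return float(np.mean(epoch_uv[start:stop]))

    def component_amplitudes(epochs_uv, fs=FS):
        """Average a participant's epochs, then score each component window.

        epochs_uv: array (n_trials, n_samples) for one participant/electrode.
        Returns a dict of component name -> mean amplitude of the averaged ERP.
        """
        erp = epochs_uv.mean(axis=0)  # trial-averaged waveform
        return {name: mean_amplitude(erp, win, fs) for name, win in WINDOWS_MS.items()}

    # Tiny synthetic check: a signal that is +5 uV only inside the P3 window.
    rng = np.random.default_rng(0)
    epochs = rng.normal(0.0, 0.1, size=(40, FS))  # 1-second epochs, 40 trials
    epochs[:, int(0.300 * FS):int(0.500 * FS)] += 5.0
    scores = component_amplitudes(epochs)
    assert scores["P3"] > scores["N1"] and scores["P3"] > scores["P2"]
    ```

    Group comparisons like HSA versus LSA would then be run on these per-participant component scores.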

  1. Characterization of p75+ ectomesenchymal stem cells from rat embryonic facial process tissue

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Xiujie; Liu, Luchuan; Deng, Manjing; Zhang, Li; Liu, Rui; Xing, Yongjun; Zhou, Xia [Department of Stomatology, Daping Hospital and Research Institute of Surgery, Third Military Medical University, Chongqing 400042 (China); Nie, Xin, E-mail: dr.xinnie@gmail.com [Department of Stomatology, Daping Hospital and Research Institute of Surgery, Third Military Medical University, Chongqing 400042 (China)

    2012-10-12

    Highlights: • Ectomesenchymal stem cells (EMSCs) were found to migrate to rat facial processes at E11.5. • We successfully sorted p75NTR-positive EMSCs (p75+ EMSCs). • p75+ EMSCs up to nine passages showed relatively stable proliferative activity. • We examined the in vitro multilineage potential of p75+ EMSCs. • p75+ EMSCs provide an in vitro model for tooth morphogenesis. -- Abstract: Several populations of stem cells, including those from the dental pulp and periodontal ligament, have been isolated from different parts of the tooth and periodontium. The characteristics of such stem cells have been reported as well. However, as a common progenitor of these cells, ectomesenchymal stem cells (EMSCs), derived from the cranial neural crest, have yet to be fully characterized. The aim of this study was to better understand the characteristics of EMSCs isolated from rat embryonic facial processes. Immunohistochemical staining showed that EMSCs had migrated to rat facial processes at E11.5, while the absence of epithelial invagination or tooth-like epithelium suggested that any epithelial-mesenchymal interactions were limited at this stage. The p75 neurotrophin receptor (p75NTR), a typical neural crest marker, was used to select p75NTR-positive EMSCs (p75+ EMSCs), which were found to show a homogeneous fibroblast-like morphology and little change in the growth curve, proliferation capacity, and cell phenotype during cell passage. They also displayed the capacity to differentiate into diverse cell types under chemically defined conditions in vitro. p75+ EMSCs proved to be homogeneous, stable in vitro and potentially capable of multiple lineages, suggesting their potential for application in dental or orofacial tissue engineering.

  2. Facial attractiveness, symmetry and cues of good genes.

    Science.gov (United States)

    Scheib, J E; Gangestad, S W; Thornhill, R

    1999-09-22

    Cues of phenotypic condition should be among those used by women in their choice of mates. One marker of better phenotypic condition is thought to be symmetrical bilateral body and facial features. However, it is not clear whether women use symmetry as the primary cue in assessing the phenotypic quality of potential mates or whether symmetry is correlated with other facial markers affecting physical attractiveness. Using photographs of men's faces, for which facial symmetry had been measured, we found a relationship between women's attractiveness ratings of these faces and symmetry, but the subjects could not rate facial symmetry accurately. Moreover, the relationship between facial attractiveness and symmetry was still observed, even when symmetry cues were removed by presenting only the left or right half of faces. These results suggest that attractive features other than symmetry can be used to assess phenotypic condition. We identified one such cue, facial masculinity (cheek-bone prominence and a relatively longer lower face), which was related to both symmetry and full- and half-face attractiveness.

  3. The reconstruction of male hair-bearing facial regions.

    Science.gov (United States)

    Ridgway, Emily B; Pribaz, Julian J

    2011-01-01

    Loss of hair-bearing regions of the face caused by trauma, tumor resection, or burn presents a difficult reconstructive task for plastic surgeons. The ideal tissue substitute should have the same characteristics as the facial area affected, consisting of thin, pliable tissue with a similar color match and hair-bearing quality. This is a retrospective study of 34 male patients who underwent reconstruction of hair-bearing facial regions performed by the senior author (J.J.P.). Local and pedicled flaps were used primarily to reconstruct defects after tumor extirpation, trauma, infections, and burns. Two patients had irradiation before reconstruction. Two patients had prior facial reconstruction with free flaps. The authors found that certain techniques of reconstructing defects in hair-bearing facial regions were more successful than others in particular facial regions and in different sizes of defects. The authors were able to develop a simple algorithm for management of facial defects involving the hair-bearing regions of the eyebrow, sideburn, beard, and mustache that may prospectively aid the planning of reconstructive strategy in these cases.

  4. Cholesteatoma causing facial paralysis (Colesteatoma causando paralisia facial)

    Directory of Open Access Journals (Sweden)

    José Ricardo Gurgel Testa

    2003-10-01

    Facial paralysis caused by cholesteatoma is uncommon. The nerve segments most frequently involved are the tympanic portion and the region of the second genu. In cases where the cholesteatomatous lesion spreads to the anterior epitympanum, the geniculate ganglion is the segment of the facial nerve most subject to injury. The pathogenesis may involve compression of the nerve by the cholesteatoma followed by reduction of its vascular supply, as well as the possible action of neurotoxic substances produced by the tumor matrix or by the bacteria contained in it. AIM: To evaluate the incidence, clinical characteristics, and treatment of facial paralysis due to cholesteatomatous lesions. STUDY DESIGN: Retrospective clinical study. MATERIALS AND METHODS: A retrospective study of ten cases of facial paralysis due to cholesteatoma, selected from a survey of 206 facial nerve decompressions of various etiologies performed at UNIFESP-EPM over the last ten years. RESULTS: The incidence of facial paralysis due to cholesteatoma in this study was 4.85%, with a female predominance (60%). The mean patient age was 39 years. The duration and initial grade of the paralysis, together with the extension of the lesion, were important for the functional recovery of the facial nerve. CONCLUSION: Early surgical treatment is essential for an adequate functional result. In cases of rupture or intense fibrosis of the nerve tissue, nerve grafting (great auricular/sural) and/or hypoglossal-facial anastomosis may be suggested.

  5. Aggressive osteoblastoma in mastoid process of temporal bone with facial palsy

    Directory of Open Access Journals (Sweden)

    Manoj Jain

    2013-01-01

    Osteoblastoma is an uncommon primary bone tumor with a predilection for the posterior elements of the spine. Its occurrence in the temporal bone and middle ear is extremely rare. Clinical symptoms are non-specific and cranial nerve involvement is uncommon. The cytomorphological features of osteoblastoma are not well defined and experience is limited to only a few reports. We report an interesting and rare case of aggressive osteoblastoma, with progressive hearing loss and facial palsy, involving the mastoid process of the temporal bone and middle ear, along with a description of its cytomorphological features.

  6. Unsupervised learning of facial emotion decoding skills

    Directory of Open Access Journals (Sweden)

    Jan Oliver Huelle

    2014-02-01

    Research on the mechanisms underlying human facial emotion recognition has long focussed on genetically determined neural algorithms and often neglected the question of how these algorithms might be tuned by social learning. Here we show that facial emotion decoding skills can be significantly and sustainably improved by practice without an external teaching signal. Participants saw video clips of dynamic facial expressions of five different women and were asked to decide which of four possible emotions (anger, disgust, fear and sadness) was shown in each clip. Although no external information about the correctness of the participant's response or the sender's true affective state was provided, participants showed a significant increase in facial emotion recognition accuracy both within and across two training sessions two days to several weeks apart. We discuss several similarities and differences between the unsupervised improvement of facial decoding skills observed in the current study, unsupervised perceptual learning of simple stimuli described in previous studies, and practice effects often observed in cognitive tasks.

  7. The asymmetric facial skin perfusion distribution of Bell's palsy discovered by laser speckle imaging technology.

    Science.gov (United States)

    Cui, Han; Chen, Yi; Zhong, Weizheng; Yu, Haibo; Li, Zhifeng; He, Yuhai; Yu, Wenlong; Jin, Lei

    2016-01-01

    Bell's palsy is a peripheral nerve disease that causes abrupt onset of unilateral facial weakness. Pathologic studies have shown that ischemia of the facial nerve exists on the affected side of the face in Bell's palsy patients. Since the direction of facial nerve blood flow is primarily proximal to distal, facial skin microcirculation would also be affected after the onset of Bell's palsy. Therefore, monitoring the full area of facial skin microcirculation would help to identify the condition of Bell's palsy patients. In this study, a non-invasive, real-time, full-field imaging technology, laser speckle imaging (LSI), was applied to measure the facial skin blood perfusion distribution of Bell's palsy patients. 85 participants at different stages of Bell's palsy were included. Results showed that the patients' facial skin perfusion on the affected side was lower than that on the normal side in the eyelid region, and that the asymmetric distribution of facial skin perfusion between the two sides of the eyelid was positively related to the stage of the disease. We discovered that facial skin blood perfusion could reflect the stage of Bell's palsy, which suggests that microcirculation should be investigated in patients with this neurological deficit. LSI was also suggested as a potential diagnostic tool for Bell's palsy.
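
    A minimal sketch of how the side-to-side perfusion asymmetry described above could be quantified from an LSI perfusion map (my own illustration; the ROI coordinates and the normalized-difference formula are assumptions, not the authors' method):

    ```python
    import numpy as np

    def roi_mean(perfusion_map, rows, cols):
        """Mean perfusion inside a rectangular region of interest."""
        r0, r1 = rows
        c0, c1 = cols
        return float(perfusion_map[r0:r1, c0:c1].mean())

    def asymmetry_index(perfusion_map, roi_affected, roi_normal):
        """Normalized difference in [-1, 1]; 0 means symmetric perfusion."""
        a = roi_mean(perfusion_map, *roi_affected)
        n = roi_mean(perfusion_map, *roi_normal)
        return (n - a) / (n + a)

    # Synthetic example: the normal-side eyelid ROI perfuses 25% higher
    # than the affected-side eyelid ROI.
    face = np.full((100, 100), 80.0)
    face[20:40, 10:40] = 80.0   # affected-side eyelid ROI
    face[20:40, 60:90] = 100.0  # normal-side eyelid ROI
    ai = asymmetry_index(face, ((20, 40), (10, 40)), ((20, 40), (60, 90)))
    print(round(ai, 3))  # → 0.111
    ```

    A positive index indicates lower perfusion on the affected side; per the study's finding, larger eyelid asymmetry would correspond to a more severe stage of palsy.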

  8. [Treatment of idiopathic peripheral facial nerve paralysis (Bell's palsy)].

    Science.gov (United States)

    Meyer, Martin Willy; Hahn, Christoffer Holst

    2013-01-28

    Bell's palsy is defined as an idiopathic peripheral facial nerve paralysis of sudden onset. It affects 11-40 persons per 100,000 per annum. Many patients recover without intervention; however, up to 30% have poor recovery of facial muscle control and experience facial disfigurement. The aim of this study was to provide an overview of the pharmacological treatments that have been used to improve outcomes. The available evidence from randomized controlled trials shows significant benefit from treating Bell's palsy with corticosteroids but shows no benefit from antivirals.

  10. Disconnection mechanism and regional cortical atrophy contribute to impaired processing of facial expressions and theory of mind in multiple sclerosis: a structural MRI study.

    Science.gov (United States)

    Mike, Andrea; Strammer, Erzsebet; Aradi, Mihaly; Orsi, Gergely; Perlaki, Gabor; Hajnal, Andras; Sandor, Janos; Banati, Miklos; Illes, Eniko; Zaitsev, Alexander; Herold, Robert; Guttmann, Charles R G; Illes, Zsolt

    2013-01-01

    Successful socialization requires the ability to understand others' mental states. This ability, called mentalization (Theory of Mind), may become deficient and contribute to everyday life difficulties in multiple sclerosis. We aimed to explore the impact of brain pathology on mentalization performance in multiple sclerosis. Mentalization performance of 49 patients with multiple sclerosis was compared to 24 age- and gender-matched healthy controls. T1- and T2-weighted three-dimensional brain MRI images were acquired at 3 Tesla from patients with multiple sclerosis and 18 gender- and age-matched healthy controls. We assessed overall brain cortical thickness in patients with multiple sclerosis and the scanned healthy controls, and measured the total and regional T1 and T2 white matter lesion volumes in patients with multiple sclerosis. Performances in tests of recognition of mental states and emotions from facial expressions and eye gazes correlated with both total T1-lesion load and regional T1-lesion load of association fiber tracts interconnecting cortical regions related to visual and emotion processing (genu and splenium of corpus callosum, right inferior longitudinal fasciculus, right inferior fronto-occipital fasciculus, uncinate fasciculus). Both of these tests showed correlations with specific cortical areas involved in emotion recognition from facial expressions (right and left fusiform face area, frontal eye field), processing of emotions (right entorhinal cortex) and socially relevant information (left temporal pole). Thus, both a disconnection mechanism due to white matter lesions and cortical thinning of specific brain areas may result in cognitive deficits in multiple sclerosis, affecting emotion and mental state processing from facial expressions and contributing to the everyday and social life difficulties of these patients.

  11. Time perception and dynamics of facial expressions of emotions.

    Directory of Open Access Journals (Sweden)

    Sophie L Fayolle

    Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.

  12. A specific association between facial disgust recognition and estradiol levels in naturally cycling women.

    Directory of Open Access Journals (Sweden)

    Sunjeev K Kamboj

    Subtle changes in social cognition are associated with naturalistic fluctuations in estrogens and progesterone over the course of the menstrual cycle. Using a dynamic emotion recognition task we aimed to provide a comprehensive description of the association between ovarian hormone levels and emotion recognition performance using a variety of performance metrics. Naturally cycling, psychiatrically healthy women attended a single experimental session during a follicular (days 7-13; n = 16), early luteal (days 15-19; n = 14) or late luteal phase (days 22-27; n = 14) of their menstrual cycle. Correct responses and reaction times to dynamic facial expressions were recorded and a two-high threshold analysis was used to assess discrimination and response bias. Salivary progesterone and estradiol were assayed and subjective measures of premenstrual symptoms, anxiety and positive and negative affect assessed. There was no interaction between cycle phase (follicular, early luteal, late luteal) and facial expression (sad, happy, fearful, angry, neutral and disgusted) on any of the recognition performance metrics. However, across the sample as a whole, progesterone levels were positively correlated with reaction times to a variety of facial expressions (anger, happiness, sadness and neutral expressions). In contrast, estradiol levels were specifically correlated with disgust processing on three performance indices (correct responses, response bias and discrimination). Premenstrual symptoms, anxiety and positive and negative affect were not associated with emotion recognition indices or hormone levels. The study highlights the role of naturalistic variations in ovarian hormone levels in modulating emotion recognition. In particular, progesterone seems to have a general slowing effect on facial expression processing. Our findings also provide the first behavioural evidence of a specific role for estrogens in the processing of disgust in humans.
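
    The two-high threshold analysis mentioned above can be sketched as follows. This assumes the common Snodgrass-and-Corwin formulation (discrimination Pr = H − FA, response bias Br = FA / (1 − Pr)) with a 0.5 count correction; whether the study used this exact variant is an assumption.

    ```python
    def two_high_threshold(hits, misses, false_alarms, correct_rejections):
        """Return (discrimination Pr, response bias Br) from raw response counts.

        Adding 0.5 to each count keeps the rates away from exactly 0 and 1.
        """
        h = (hits + 0.5) / (hits + misses + 1.0)                        # hit rate
        fa = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        pr = h - fa            # discrimination: above-chance recognition
        br = fa / (1.0 - pr)   # bias: tendency to guess "target" when unsure
        return pr, br

    # Hypothetical counts for one expression category.
    pr, br = two_high_threshold(hits=40, misses=10,
                                false_alarms=5, correct_rejections=45)
    print(round(pr, 3), round(br, 3))  # → 0.686 0.344
    ```

    Computing Pr and Br per expression category is what allows discrimination and response bias to be correlated with hormone levels separately, as done for the disgust indices in the study.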

  13. Shared Gaussian Process Latent Variable Model for Multi-view Facial Expression Recognition

    NARCIS (Netherlands)

    Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja

    Facial-expression data often appear in multiple views either due to head-movements or the camera position. Existing methods for multi-view facial expression recognition perform classification of the target expressions either by using classifiers learned separately for each view or by using a single

  14. Integrative processing of touch and affect in social perception: an fMRI study

    Directory of Open Access Journals (Sweden)

    Sjoerd eEbisch

    2016-05-01

    Social perception commonly employs multiple sources of information. The present study aimed at investigating the integrative processing of affective social signals. Task-related and task-free functional magnetic resonance imaging was performed in 26 healthy adult participants during a social perception task concerning dynamic visual stimuli simultaneously depicting facial expressions of emotion and tactile sensations that could be either congruent or incongruent. Confounding effects due to affective valence, inhibitory top-down influences, cross-modal integration, and conflict processing were minimized. The results showed that the perception of congruent, compared to incongruent stimuli, elicited enhanced neural activity in a set of brain regions including left amygdala, bilateral posterior cingulate cortex (PCC), and left superior parietal cortex. These congruency effects did not differ as a function of emotion or sensation. A complementary task-related functional interaction analysis preliminarily suggested that amygdala activity depended on previous processing stages in fusiform gyrus and PCC. The findings provide support for the integrative processing of social information about others' feelings from manifold bodily sources (sensory-affective information) in amygdala and PCC. Given that the congruent stimuli were also judged as being more self-related and more familiar in terms of personal experience in an independent sample of participants, we speculate that such integrative processing might be mediated by the linking of external stimuli with self-experience. Finally, the prediction of task-related responses in amygdala by intrinsic functional connectivity between amygdala and PCC during a task-free state implies a neuro-functional basis for an individual predisposition for the integrative processing of social stimulus content.

  15. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2015-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…

  16. Marker optimization for facial motion acquisition and deformation.

    Science.gov (United States)

    Le, Binh H; Zhu, Mingyang; Deng, Zhigang

    2013-11-01

    A long-standing problem in marker-based facial motion capture is determining the optimal facial mocap marker layout. Despite its wide range of potential applications, this problem has not yet been systematically explored to date. This paper describes an approach to compute optimized marker layouts for facial motion acquisition as optimization of characteristic control points from a set of high-resolution, ground-truth facial mesh sequences. Specifically, the thin-shell linear deformation model is imposed onto the example pose reconstruction process via optional hard constraints such as symmetry and multiresolution constraints. Through our experiments and comparisons, we validate the effectiveness, robustness, and accuracy of our approach. Besides guiding minimal yet effective placement of facial mocap markers, we also describe and demonstrate two selected applications: marker-based facial mesh skinning and multiresolution facial performance capture.
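
    To give a flavor of choosing control points from example mesh sequences by reconstruction error, here is a heavily simplified stand-in for the paper's method (a greedy least-squares sketch of my own, not the authors' thin-shell algorithm):

    ```python
    import numpy as np

    def greedy_marker_selection(trajectories, k):
        """Greedily choose k marker vertices minimizing linear reconstruction error.

        trajectories: array (n_frames, n_vertices) of per-vertex displacements
        from the example sequences. Returns the chosen vertex indices.
        """
        n_frames, n_vertices = trajectories.shape
        chosen = []
        for _ in range(k):
            best_idx, best_err = None, np.inf
            for v in range(n_vertices):
                if v in chosen:
                    continue
                m = trajectories[:, chosen + [v]]  # candidate marker signals
                # Least-squares map from marker signals to all vertex signals.
                coeffs, *_ = np.linalg.lstsq(m, trajectories, rcond=None)
                err = np.linalg.norm(m @ coeffs - trajectories)
                if err < best_err:
                    best_idx, best_err = v, err
            chosen.append(best_idx)
        return chosen

    # Synthetic check: 6 vertices driven by 2 latent motions, so 2 markers
    # suffice to reconstruct the whole "mesh" almost exactly.
    rng = np.random.default_rng(1)
    data = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 6))
    markers = greedy_marker_selection(data, k=2)
    m = data[:, markers]
    coeffs, *_ = np.linalg.lstsq(m, data, rcond=None)
    print(np.linalg.norm(m @ coeffs - data) < 1e-8)  # → True
    ```

    The actual paper additionally regularizes the reconstruction with a thin-shell deformation model and optional symmetry/multiresolution constraints, which this sketch omits.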

  17. Concordant preferences for actual height and facial cues to height

    OpenAIRE

    Re, Daniel Edward; Perrett, David I.

    2012-01-01

    Physical height has a well-documented effect on human mate preferences. In general, both sexes prefer opposite-sex romantic relationships in which the man is taller than the woman, while individual preferences for height are affected by a person's own height. Research in human mate choice has demonstrated that attraction to facial characteristics, such as facial adiposity, may reflect preferences for body characteristics. Here, we tested preferences for facial cues to height. In general, incre...

  18. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    Science.gov (United States)

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  19. The role of great auricular-facial nerve neurorrhaphy in facial nerve damage.

    Science.gov (United States)

    Sun, Yan; Liu, Limei; Han, Yuechen; Xu, Lei; Zhang, Daogong; Wang, Haibo

    2015-01-01

    The facial nerve is easily damaged, and there are many methods for its reconstruction, such as facial nerve end-to-end anastomosis, great auricular nerve grafting, sural nerve grafting, or hypoglossal-facial nerve anastomosis. However, great auricular-facial nerve neurorrhaphy has received little study. The aim of the present study was to identify the role of great auricular-facial nerve neurorrhaphy and its mechanism. Rat models of facial nerve cut (FC), facial nerve end-to-end anastomosis (FF), facial-great auricular neurorrhaphy (FG), and control (Ctrl) were established. Apex nasi amesiality observation, electrophysiology, and immunofluorescence assays were employed to investigate the function and mechanism. In apex nasi amesiality observation, it was found that apex nasi amesiality of the FG group was partly recovered. Additionally, electrophysiology and immunofluorescence assays revealed that facial-great auricular neurorrhaphy could transfer nerve impulses and express AChR, performing better than facial nerve cut and worse than facial nerve end-to-end anastomosis. The present study indicated that great auricular-facial nerve neurorrhaphy is a substantial solution for facial lesion repair, as it efficiently prevents facial muscle atrophy by generating a neurotransmitter like ACh.

  20. Facial blanching after inferior alveolar nerve block anesthesia: an unusual complication

    OpenAIRE

    Kang, Sang-Hoon; Won, Yu-Jin

    2017-01-01

    The present case report describes a complication involving facial blanching symptoms occurring during inferior alveolar nerve block anesthesia (IANBA). Facial blanching after IANBA can be caused by the injection of an anesthetic into the maxillary artery area, affecting the infraorbital artery.

  1. Facial blanching after inferior alveolar nerve block anesthesia: an unusual complication.

    Science.gov (United States)

    Kang, Sang-Hoon; Won, Yu-Jin

    2017-12-01

    The present case report describes a complication involving facial blanching symptoms occurring during inferior alveolar nerve block anesthesia (IANBA). Facial blanching after IANBA can be caused by the injection of an anesthetic into the maxillary artery area, affecting the infraorbital artery.

  2. Anaplastology in times of facial transplantation: Still a reasonable treatment option?

    Science.gov (United States)

    Toso, Sabine Maria; Menzel, Kerstin; Motzkus, Yvonne; Klein, Martin; Menneking, Horst; Raguse, Jan-Dirk; Nahles, Susanne; Hoffmeister, Bodo; Adolphs, Nicolai

    2015-09-01

    Optimum functional and aesthetic facial reconstruction is still a challenge in patients who suffer from inborn or acquired facial deformity. It is known that functional and aesthetic impairment can result in significant psychosocial strain, leading to the social isolation of patients who are affected by major facial deformities. Microvascular techniques and increasing experience in facial transplantation certainly contribute to better restorative outcomes. However, these technologies also have some drawbacks, limitations and unsolved problems. Extensive facial defects which include several aesthetic units and dentition can be restored by combining dental prostheses and anaplastology, thus providing an adequate functional and aesthetic outcome in selected patients without the drawbacks of major surgical procedures. Referring to some representative patient cases, it is shown how extreme facial disfigurement after oncological surgery can be palliated by combining intraoral dentures with extraoral facial prostheses using individualized treatment and without the need for major reconstructive surgery. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  3. Nerve crush but not displacement-induced stretch of the intra-arachnoidal facial nerve promotes facial palsy after cerebellopontine angle surgery.

    Science.gov (United States)

    Bendella, Habib; Brackmann, Derald E; Goldbrunner, Roland; Angelov, Doychin N

    2016-10-01

    Little is known about the reasons for the occurrence of facial nerve palsy after removal of cerebellopontine angle tumors. Since the intra-arachnoidal portion of the facial nerve is considered to be so vulnerable that even the slightest tension or pinch may result in ruptured axons, we tested whether a graded stretch or controlled crush would affect the postoperative motor performance of the facial (vibrissal) muscle in rats. Thirty Wistar rats, divided into five groups (one with intact controls and four with facial nerve lesions), were used. Under inhalation anesthesia, the occipital squama was opened, the cerebellum gently retracted to the left, and the intra-arachnoidal segment of the right facial nerve exposed. A mechanical displacement of the brainstem by 1 or 3 mm toward the midline or an electromagnet-controlled crush of the facial nerve with tweezers at a closure velocity of 50 or 100 mm/s was applied. On the next day, whisking motor performance was determined by video-based motion analysis. Even the larger (3 mm) mechanical displacement of the brainstem had no harmful effect: the amplitude of the vibrissal whisks was in the normal range of 50°-60°. On the other hand, even the light nerve crush (50 mm/s) injured the facial nerve and resulted in paralyzed vibrissal muscles (amplitude of 10°-15°). We conclude that, contrary to the generally acknowledged assumptions, it is the nerve crush, not the displacement-induced stretching of the intra-arachnoidal facial trunk, that promotes facial palsy after cerebellopontine angle surgery in rats.

  4. Frame-Based Facial Expression Recognition Using Geometrical Features

    Directory of Open Access Journals (Sweden)

    Anwar Saeed

    2014-01-01

    Full Text Available To improve human-computer interaction (HCI) to be as good as human-human interaction, an efficient approach for human emotion recognition is required. These emotions could be fused from several modalities such as facial expression, hand gesture, acoustic data, and biophysiological data. In this paper, we address the frame-based perception of the universal human facial expressions (happiness, surprise, anger, disgust, fear, and sadness) with the help of several geometrical features. Unlike many other geometry-based approaches, the frame-based method does not rely on prior knowledge of a person-specific neutral expression; such knowledge is gained through human intervention and is not available in real scenarios. Additionally, we provide a method to investigate the performance of geometry-based approaches under various facial point localization errors. From an evaluation on two public benchmark datasets, we found that using eight facial points, we can achieve the state-of-the-art recognition rate. However, this state-of-the-art geometry-based approach exploits features derived from 68 facial points and requires prior knowledge of the person-specific neutral expression. The expression recognition rate using geometrical features is adversely affected by errors in facial point localization, especially for expressions with subtle facial deformations.
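The geometry-based pipeline described above can be illustrated with a minimal sketch: distances between a handful of facial points, normalized by the inter-ocular distance so the features are invariant to face scale. The landmark names and coordinates below are hypothetical illustrations, not the paper's 8- or 68-point sets.

```python
import math

# Hypothetical 2D facial landmarks in pixel coordinates (illustrative only).
landmarks = {
    "left_eye": (120.0, 100.0),
    "right_eye": (180.0, 100.0),
    "mouth_left": (130.0, 170.0),
    "mouth_right": (170.0, 170.0),
    "upper_lip": (150.0, 160.0),
    "lower_lip": (150.0, 180.0),
    "left_brow": (118.0, 85.0),
    "right_brow": (182.0, 85.0),
}

def geometric_features(pts):
    """Pairwise distances normalized by the inter-ocular distance,
    making the features independent of how large the face appears."""
    iod = math.dist(pts["left_eye"], pts["right_eye"])
    pairs = [
        ("mouth_left", "mouth_right"),  # mouth width (widens with a smile)
        ("upper_lip", "lower_lip"),     # mouth opening (e.g., surprise)
        ("left_brow", "left_eye"),      # brow raise
        ("right_brow", "right_eye"),
    ]
    return {f"{a}-{b}": math.dist(pts[a], pts[b]) / iod for a, b in pairs}

feats = geometric_features(landmarks)
```

A frame-based classifier would compute such a feature vector for every video frame and feed it to a standard classifier, without needing the subject's neutral-expression frame as a reference.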

  5. Neural correlates of affect processing and aggression in methamphetamine dependence.

    Science.gov (United States)

    Payer, Doris E; Lieberman, Matthew D; London, Edythe D

    2011-03-01

    Methamphetamine abuse is associated with high rates of aggression but few studies have addressed the contributing neurobiological factors. To quantify aggression, investigate function in the amygdala and prefrontal cortex, and assess relationships between brain function and behavior in methamphetamine-dependent individuals. In a case-control study, aggression and brain activation were compared between methamphetamine-dependent and control participants. Participants were recruited from the general community to an academic research center. Thirty-nine methamphetamine-dependent volunteers (16 women) who were abstinent for 7 to 10 days and 37 drug-free control volunteers (18 women) participated in the study; subsets completed self-report and behavioral measures. Functional magnetic resonance imaging (fMRI) was performed on 25 methamphetamine-dependent and 23 control participants. We measured self-reported and perpetrated aggression and self-reported alexithymia. Brain activation was assessed using fMRI during visual processing of facial affect (affect matching) and symbolic processing (affect labeling), the latter representing an incidental form of emotion regulation. Methamphetamine-dependent participants self-reported more aggression and alexithymia than control participants and escalated perpetrated aggression more following provocation. Alexithymia scores correlated with measures of aggression. During affect matching, fMRI showed no differences between groups in amygdala activation but found lower activation in methamphetamine-dependent than control participants in the bilateral ventral inferior frontal gyrus. During affect labeling, participants recruited the dorsal inferior frontal gyrus and exhibited decreased amygdala activity, consistent with successful emotion regulation; there was no group difference in this effect. 
The magnitude of decrease in amygdala activity during affect labeling correlated inversely with self-reported aggression in control participants.

  6. The recognition of facial expressions: an investigation of the influence of age and cognition.

    Science.gov (United States)

    Horning, Sheena M; Cornwell, R Elisabeth; Davis, Hasker P

    2012-11-01

    The present study aimed to investigate changes in facial expression recognition across the lifespan, as well as to determine the influence of fluid intelligence, processing speed, and memory on this ability. Peak performance in the ability to identify facial affect was found to occur in middle-age, with the children and older adults performing the poorest. Specifically, older adults were impaired in their ability to identify fear, sadness, and happiness, but had preserved recognition of anger, disgust, and surprise. Analyses investigating the influence of cognition on emotion recognition demonstrated that cognitive abilities contribute to performance, especially for participants over age 45. However, the cognitive functions did not fully account for the older adults' impairments on expression recognition. Overall, the age-related deficits in facial expression recognition have implications for older adults' use of non-verbal communicative information.

  7. Impaired social brain network for processing dynamic facial expressions in autism spectrum disorders

    Directory of Open Access Journals (Sweden)

    Sato Wataru

    2012-08-01

    Full Text Available Abstract Background Impairment of social interaction via facial expressions represents a core clinical feature of autism spectrum disorders (ASD). However, the neural correlates of this dysfunction remain unidentified. Because this dysfunction is manifested in real-life situations, we hypothesized that the observation of dynamic, compared with static, facial expressions would reveal abnormal brain functioning in individuals with ASD. We presented dynamic and static facial expressions of fear and happiness to individuals with high-functioning ASD and to age- and sex-matched typically developing controls and recorded their brain activities using functional magnetic resonance imaging (fMRI). Results Regional analysis revealed reduced activation of several brain regions in the ASD group compared with controls in response to dynamic versus static facial expressions, including the middle temporal gyrus (MTG), fusiform gyrus, amygdala, medial prefrontal cortex, and inferior frontal gyrus (IFG). Dynamic causal modeling analyses revealed that bi-directional effective connectivity involving the primary visual cortex–MTG–IFG circuit was enhanced in response to dynamic as compared with static facial expressions in the control group. Group comparisons revealed that all these modulatory effects were weaker in the ASD group than in the control group. Conclusions These results suggest that weak activity and connectivity of the social brain network underlie the impairment in social interaction involving dynamic facial expressions in individuals with ASD.

  8. MRI of the facial nerve in idiopathic facial palsy

    International Nuclear Information System (INIS)

    Saatci, I.; Sahintuerk, F.; Sennaroglu, L.; Boyvat, F.; Guersel, B.; Besim, A.

    1996-01-01

    The purpose of this prospective study was to define the enhancement pattern of the facial nerve in idiopathic facial paralysis (Bell's palsy) on magnetic resonance (MR) imaging with routine doses of gadolinium-DTPA (0.1 mmol/kg). Using a 0.5 T imager, 24 patients were examined with a mean interval of 13.7 days between the onset of symptoms and the MR examination. Contralateral asymptomatic facial nerves constituted the control group, and five of the normal facial nerves (20.8%) showed enhancement confined to the geniculate ganglion. Hence, contrast enhancement limited to the geniculate ganglion in the abnormal facial nerve (3 of 24) was regarded as equivocal. Not encountered in any of the normal facial nerves, enhancement of other segments, alone or associated with geniculate ganglion enhancement, was considered abnormal and was noted in 70.8% of the symptomatic facial nerves. The most frequently enhancing segments were the geniculate ganglion and the distal intracanalicular segment. (orig.)

  9. MRI of the facial nerve in idiopathic facial palsy

    Energy Technology Data Exchange (ETDEWEB)

    Saatci, I. [Dept. of Radiology, Hacettepe Univ., Hospital Sihhiye, Ankara (Turkey); Sahintuerk, F. [Dept. of Radiology, Hacettepe Univ., Hospital Sihhiye, Ankara (Turkey); Sennaroglu, L. [Dept. of Otolaryngology, Head and Neck Surgery, Hacettepe Univ., Hospital Sihhiye, Ankara (Turkey); Boyvat, F. [Dept. of Radiology, Hacettepe Univ., Hospital Sihhiye, Ankara (Turkey); Guersel, B. [Dept. of Otolaryngology, Head and Neck Surgery, Hacettepe Univ., Hospital Sihhiye, Ankara (Turkey); Besim, A. [Dept. of Radiology, Hacettepe Univ., Hospital Sihhiye, Ankara (Turkey)

    1996-10-01

    The purpose of this prospective study was to define the enhancement pattern of the facial nerve in idiopathic facial paralysis (Bell's palsy) on magnetic resonance (MR) imaging with routine doses of gadolinium-DTPA (0.1 mmol/kg). Using a 0.5 T imager, 24 patients were examined with a mean interval of 13.7 days between the onset of symptoms and the MR examination. Contralateral asymptomatic facial nerves constituted the control group, and five of the normal facial nerves (20.8%) showed enhancement confined to the geniculate ganglion. Hence, contrast enhancement limited to the geniculate ganglion in the abnormal facial nerve (3 of 24) was regarded as equivocal. Not encountered in any of the normal facial nerves, enhancement of other segments, alone or associated with geniculate ganglion enhancement, was considered abnormal and was noted in 70.8% of the symptomatic facial nerves. The most frequently enhancing segments were the geniculate ganglion and the distal intracanalicular segment. (orig.)

  10. Influence of 125I seed interstitial brachytherapy on recovery of facial nerve function

    International Nuclear Information System (INIS)

    Song Tieli; Zheng Lei; Zhang Jie; Cai Zhigang; Yang Zhaohui; Yu Guangyan; Zhang Jianguo

    2010-01-01

    Objective: To study the influence of 125I seed interstitial brachytherapy in the parotid region on the recovery of facial nerve function. Methods: A total of 21 patients with primary parotid carcinoma were treated with resection and 125I interstitial brachytherapy. None of the patients had facial palsy before the operation, and the prescribed dose was 60 Gy. During 4 years of follow-up, the House-Brackmann grading scales and ENoG were used to evaluate facial nerve function. According to the modified regional House-Brackmann grading scales, the facial nerve branches on the affected side were divided into normal and abnormal groups and compared with those on the contralateral side. Results: Post-operative facial palsy occurred in all the patients but recovered within 6 months. The latency time differences between the affected and contralateral sides were statistically significant in the abnormal group from 1 week to 6 months after treatment (t=2.362, P=0.028), and also in the normal group 1 week after treatment (t=2.522, P=0.027). Conclusions: 125I interstitial brachytherapy has no influence on the recovery of facial nerve function after tumor resection and causes no delayed facial nerve damage. (authors)

  11. The effect of facial makeup on the frequency of drivers stopping for hitchhikers.

    Science.gov (United States)

    Guéguen, Nicolas; Lamy, Lubomir

    2013-08-01

    Judgments of photographs have shown that makeup enhances ratings of women's facial attractiveness. The present study assessed whether makeup affects the stopping behavior of drivers in response to a hitchhiker's signal. Four 20- to 22-year-old female confederates wore facial makeup, or not, while pretending to be hitchhiking. Frequency of stopping was compared in 1,600 male and female drivers. Facial makeup was associated with an increase in the number of male drivers who stopped to offer a ride. Makeup did not affect frequency of stopping by female drivers.

  12. Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults.

    Directory of Open Access Journals (Sweden)

    Nicholas Robert Cooper

    2013-04-01

    Full Text Available The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders. In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and, if so, whether this would differ according to the individual level of autistic traits (high versus low AQ score). Participants were presented with 3-second films of actors opening and closing their hands (classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta ERD to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism.

  13. Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults.

    Science.gov (United States)

    Cooper, Nicholas R; Simpson, Andrew; Till, Amy; Simmons, Kelly; Puzzo, Ignazio

    2013-01-01

    The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and if so, would this differ according to the individual level of autistic traits [high versus low Autism Spectrum Quotient (AQ) score]. Participants were presented with 3 s films of actors opening and closing their hands (classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry, or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta event-related desynchronization (ERD) to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism.
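The event-related desynchronization (ERD) index used in this study is conventionally quantified as the percentage drop in band power from a reference (baseline) window to the event window. A minimal sketch follows; sign conventions vary across labs, and here positive values denote a power decrease (desynchronization), which is an assumption rather than the study's stated formula.

```python
def band_power(samples):
    """Mean squared amplitude of a band-filtered EEG segment."""
    return sum(s * s for s in samples) / len(samples)

def erd_percent(baseline, event):
    """Percentage drop in band power from baseline to event window.
    Positive = desynchronization (e.g., mu suppression during observation)."""
    p_base = band_power(baseline)
    p_event = band_power(event)
    return (p_base - p_event) / p_base * 100.0

# Toy segments: the event window has lower amplitude than baseline.
erd = erd_percent([2.0, 2.0, -2.0, -2.0], [1.0, 1.0, -1.0, -1.0])
```

In practice the power values would come from alpha- or low-beta-filtered EEG averaged over many trials, and group comparisons (e.g., high versus low AQ) would be run on the resulting ERD scores.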

  14. Facial paralysis

    Science.gov (United States)

    ... otherwise healthy, facial paralysis is often due to Bell palsy. This is a condition in which the facial ... speech, or occupational therapist. If facial paralysis from Bell palsy lasts for more than 6 to 12 months, ...

  15. Lateralization for Processing Facial Emotions in Gay Men, Heterosexual Men, and Heterosexual Women.

    Science.gov (United States)

    Rahman, Qazi; Yusuf, Sifat

    2015-07-01

    This study tested whether male sexual orientation and gender nonconformity influenced functional cerebral lateralization for the processing of facial emotions. We also tested for the effects of sex of poser and emotion displayed on putative differences. Thirty heterosexual men, 30 heterosexual women, and 40 gay men completed measures of demographic variables, recalled childhood gender nonconformity (CGN), IQ, and the Chimeric Faces Test (CFT). The CFT depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression and performance is measured using a "laterality quotient" (LQ) score. We found that heterosexual men were significantly more right-lateralized when viewing female faces compared to heterosexual women and gay men, who did not differ significantly from each other. Heterosexual women and gay men were more left-lateralized for processing female faces. There were no significant group differences in lateralization for male faces. These results remained when controlling for age and IQ scores. There was no significant effect of CGN on LQ scores. These data suggest that gay men are feminized in some aspects of functional cerebral lateralization for facial emotion. The results were discussed in relation to the selectivity of functional lateralization and putative brain mechanisms underlying sexual attraction towards opposite-sex and same-sex targets.
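The "laterality quotient" used to score the Chimeric Faces Test is typically a normalized difference between right- and left-biased responses. A minimal sketch of one common formulation, (R − L) / (R + L); the exact convention used by this study is not specified in the abstract, so treat the sign interpretation here as an assumption.

```python
def laterality_quotient(right_bias: int, left_bias: int) -> float:
    """Normalized bias score for a chimeric-faces task.

    right_bias / left_bias: counts of trials on which the judgment followed
    the right / left hemiface. Ranges from -1.0 (complete left-hemiface
    bias, conventionally read as right-hemisphere lateralization) to +1.0.
    """
    total = right_bias + left_bias
    if total == 0:
        raise ValueError("no biased responses to score")
    return (right_bias - left_bias) / total
```

For example, a participant whose choices followed the right hemiface on 30 of 40 biased trials would score +0.5, while the mirror pattern would score -0.5.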

  16. Unwanted facial hair: affects, effects and solutions.

    Science.gov (United States)

    Blume-Peytavi, U; Gieler, U; Hoffmann, R; Lavery, S; Shapiro, J

    2007-01-01

    The following is a review of a satellite symposium held at the EHRS Meeting in June 2006. U.B.P. reminded the audience that unwanted facial hair (UFH) is an important issue; over 40% of the women in the general population have some degree of UFH, and its psychological and psychosocial impact should not be underestimated. The treatment of UFH involves many different disciplines, and the symposium offered the latest thinking in different aspects of the disorder. S.L. outlined the current concepts surrounding polycystic ovarian syndrome, and U.G. addressed the psychological aspects of UFH. J.S. described the current treatment options for UFH, followed by U.B.P.'s evidence-based therapy review. Finally, R.H. reviewed the latest trial results with Trichoscan, a method being investigated for assessing UFH removal. Copyright (c) 2007 S. Karger AG, Basel.

  17. Facial Sports Injuries

    Science.gov (United States)

    ... should receive immediate medical attention. Prevention of facial sports injuries: the best way to treat facial sports ...

  18. Facial Cosmetic Surgery

    Science.gov (United States)

    Extensive education and training in surgical procedures ... to find out more.

  19. Dynamic Facial Expression of Emotion Made Easy

    OpenAIRE

    Broekens, Joost; Qu, Chao; Brinkman, Willem-Paul

    2012-01-01

    Facial emotion expression for virtual characters is used in a wide variety of areas. Often, the primary reason to use emotion expression is not to study emotion expression generation per se, but to use emotion expression in an application or research project. What is then needed is an easy to use and flexible, but also validated mechanism to do so. In this report we present such a mechanism. It enables developers to build virtual characters with dynamic affective facial expressions. The mecha...

  20. Facial biometrics of peri-oral changes in Crohn's disease.

    Science.gov (United States)

    Zou, L; Adegun, O K; Willis, A; Fortune, Farida

    2014-05-01

    Crohn's disease is a chronic relapsing and remitting inflammatory condition which can affect any part of the gastrointestinal tract. In the oro-facial region, patients can present with peri-oral swelling, which results in severe facial disfigurement. To date, assessing the degree of facial change and evaluating treatment outcomes have relied on clinical observation and semi-quantitative methods. In this paper, we describe the development of a robust and reproducible measurement strategy using 3-D facial biometrics to objectively quantify the extent and progression of oro-facial Crohn's disease. Using facial laser scanning, 32 serial images from 13 Crohn's patients attending the Oral Medicine clinic were acquired during relapse, remission, and post-treatment phases. Utilising theories of coordinate metrology, the facial images were subjected to registration, identification of regions of interest, and reproducible repositioning prior to obtaining volume measurements. To quantify the changes in tissue volume, scan images from consecutive appointments were compared to the baseline (first) scan image. A reproducibility test was performed to ascertain the degree of uncertainty in the volume measurements. 3-D facial biometric imaging is a reliable method to identify and quantify peri-oral swelling in Crohn's patients. Comparison of facial scan images at different phases of the disease precisely revealed profile and volume changes. The volume measurements were highly reproducible, as adjudged from the 1% standard deviation. 3-D facial biometric measurement in Crohn's patients with oro-facial involvement offers a quick, robust, economical and objective approach for guided therapeutic intervention and routine assessment of treatment efficacy in the clinic.
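The volume-comparison step can be sketched as follows: each follow-up scan's region-of-interest volume is expressed as a percentage change from the baseline scan, and a change is only interpreted as real swelling or resolution when it exceeds the measurement noise (the abstract reports roughly 1% standard deviation on repeated measurements). The 2-sigma threshold and function names below are illustrative assumptions, not the paper's protocol.

```python
REPRO_SD_PCT = 1.0  # reported reproducibility of repeated volume measurements (~1% SD)

def volume_change_pct(baseline_ml: float, followup_ml: float) -> float:
    """Percentage change of a region-of-interest volume versus baseline."""
    return (followup_ml - baseline_ml) / baseline_ml * 100.0

def exceeds_noise(change_pct: float, k: float = 2.0) -> bool:
    """Flag changes larger than k standard deviations of measurement noise."""
    return abs(change_pct) > k * REPRO_SD_PCT

# Toy example: a peri-oral ROI grows from 100 ml to 105 ml between scans.
change = volume_change_pct(100.0, 105.0)
```

A +5% change would clear a 2-sigma noise floor, whereas a +1.5% change would be indistinguishable from repositioning and measurement error.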

  1. Ethnic differences in the structural properties of facial skin.

    Science.gov (United States)

    Sugiyama-Nakagiri, Yoriko; Sugata, Keiichi; Hachiya, Akira; Osanai, Osamu; Ohuchi, Atsushi; Kitahara, Takashi

    2009-02-01

    Conspicuous facial pores are a serious aesthetic concern for many women. However, the mechanisms that underlie the conspicuousness of facial pores remain unclear. We previously characterized the epidermal architecture around facial pores that correlated with the appearance of those pores. A survey was carried out to elucidate ethnicity-dependent differences in facial pore size and epidermal architecture. The subjects included 80 healthy women (aged 30-39: Caucasians, Asians, Hispanics and African Americans) living in Dallas in the USA. First, surface replicas were collected to compare pore sizes of cheek skin. Second, horizontal cross-sectioned images of cheek skin were obtained non-invasively from the same subjects using in vivo confocal laser scanning microscopy (CLSM), and the severity of impairment of the epidermal architecture around facial pores was determined. Finally, to compare racial differences in the architecture of the interfollicular epidermis of facial cheek skin, horizontal cross-sectioned images were obtained and the numbers of dermal papillae were counted. Asians had the smallest pore areas compared with the other racial groups. Regarding the epidermal architecture around facial pores, all ethnic groups observed in this study had similar morphological features, and African Americans showed substantially more severe impairment of the architecture around facial pores than any other racial group. In addition, significant differences were observed in the architecture of the interfollicular epidermis between ethnic groups. These results suggest that facial pore size, the epidermal architecture around facial pores and the architecture of the interfollicular epidermis differ between ethnic groups. This might affect the appearance of facial pores.

  2. Morphological integration of soft-tissue facial morphology in Down Syndrome and siblings.

    Science.gov (United States)

    Starbuck, John; Reeves, Roger H; Richtsmeier, Joan

    2011-12-01

    Down syndrome (DS), resulting from trisomy of chromosome 21, is the most common live-born human aneuploidy. The phenotypic expression of trisomy 21 produces variable, though characteristic, facial morphology. Although certain facial features have been documented quantitatively and qualitatively as characteristic of DS (e.g., epicanthic folds, macroglossia, and hypertelorism), all of these traits occur in other craniofacial conditions with an underlying genetic cause. We hypothesize that the typical DS face is integrated differently than the face of non-DS siblings, and that the pattern of morphological integration unique to individuals with DS will yield information about underlying developmental associations between facial regions. We statistically compared morphological integration patterns of immature DS faces (N = 53) with those of non-DS siblings (N = 54), aged 6-12 years, using 31 distances estimated from 3D coordinate data representing 17 anthropometric landmarks recorded on 3D digital photographic images. Facial features are affected differentially in DS, as evidenced by statistically significant differences in integration both within and between facial regions. Our results suggest a differential effect of trisomy on facial prominences during craniofacial development. © 2011 Wiley Periodicals, Inc.

  3. Polyacrylamide gel for facial wasting rehabilitation: how many milliliters per session?

    Science.gov (United States)

    Rauso, R; Gherardini, G; Parlato, V; Amore, R; Tartaro, G

    2012-02-01

    Facial lipoatrophy is highly distressing for HIV patients undergoing pharmacologic treatment. Nonabsorbable fillers are widely used to restore facial features in these patients. We evaluated the safety and aesthetic outcomes in two samples of HIV+ patients affected by facial wasting who received different filling protocols of the nonabsorbable filler Aquamid®. Thirty-one HIV+ patients affected by facial wasting received injections of the nonabsorbable filler Aquamid for facial wasting rehabilitation. Patients were randomly divided into two groups: A and B. In group A, the facial defect was corrected by injecting up to 8 ml of product in the first session; patients were retreated every 8th week with touch-up procedures until full correction was observed. In group B, facial defects were corrected by injecting 2 ml of product per session; patients were retreated every 8th week until full correction was observed. Patients in group A noted a great improvement after the first filling procedure. Patients in group B noted improvement of their face after four filling procedures on average. Local infection, foreign-body reaction, and migration of the product were not observed in either group during follow-up. Rehabilitation with a megafilling session plus touch-up procedures and rehabilitation with a gradual build-up of the localized soft-tissue loss appear to be equivalent in terms of patient safety. However, with a megafilling session satisfaction is achieved earlier, and hospital costs for gauze, gloves, and other items can be reduced.

  4. Facial emotion recognition in Parkinson's disease: A review and new hypotheses

    Science.gov (United States)

    Vérin, Marc; Sauleau, Paul; Grandjean, Didier

    2018-01-01

    Abstract Parkinson's disease is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional‐processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia‐based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society. PMID:29473661

  5. Deep learning the dynamic appearance and shape of facial action units

    OpenAIRE

    Jaiswal, Shashank; Valstar, Michel F.

    2016-01-01

    Spontaneous facial expression recognition under uncontrolled conditions is a hard task. It depends on multiple factors including shape, appearance and dynamics of the facial features, all of which are adversely affected by environmental noise and low intensity signals typical of such conditions. In this work, we present a novel approach to Facial Action Unit detection using a combination of Convolutional and Bi-directional Long Short-Term Memory Neural Networks (CNN-BLSTM), which jointly lear...
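The CNN-BLSTM idea above, jointly learning per-frame appearance features and their temporal dynamics, can be sketched in miniature with plain NumPy. Everything here (kernel sizes, hidden width, a tanh recurrence standing in for an LSTM, random parameters) is an illustrative assumption, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_features(frames, kernels):
    """Toy spatial feature extractor: valid 2D cross-correlation plus global
    average pooling, giving one scalar feature per kernel per frame."""
    T = frames.shape[0]
    K, kh, kw = kernels.shape
    feats = np.empty((T, K))
    for t in range(T):
        for k in range(K):
            h = frames.shape[1] - kh + 1
            w = frames.shape[2] - kw + 1
            acc = np.empty((h, w))
            for i in range(h):
                for j in range(w):
                    acc[i, j] = np.sum(frames[t, i:i + kh, j:j + kw] * kernels[k])
            feats[t, k] = np.tanh(acc.mean())
    return feats

def rnn_pass(feats, W, U, b):
    """Simple tanh recurrence (an LSTM stand-in) over the frame sequence."""
    h = np.zeros(U.shape[0])
    out = []
    for x in feats:
        h = np.tanh(W @ x + U @ h + b)
        out.append(h)
    return np.array(out)

def au_probabilities(frames, params):
    """Per-frame AU activation probability from a bidirectional recurrent pass."""
    feats = conv_features(frames, params["kernels"])
    fwd = rnn_pass(feats, params["W"], params["U"], params["b"])
    bwd = rnn_pass(feats[::-1], params["W"], params["U"], params["b"])[::-1]
    logits = np.concatenate([fwd, bwd], axis=1) @ params["w_out"]
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid -> probability per frame

# Random toy parameters: 8 frames of 16x16 "face crops", 4 kernels, 8 hidden units.
params = {
    "kernels": rng.normal(size=(4, 3, 3)),
    "W": rng.normal(size=(8, 4)) * 0.1,
    "U": rng.normal(size=(8, 8)) * 0.1,
    "b": np.zeros(8),
    "w_out": rng.normal(size=16) * 0.1,
}
frames = rng.normal(size=(8, 16, 16))
p = au_probabilities(frames, params)
```

The real model learns these parameters end to end from video; the sketch only shows how spatial features and a bidirectional temporal pass combine into a per-frame detection score.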

  6. Effect of a Facial Muscle Exercise Device on Facial Rejuvenation.

    Science.gov (United States)

    Hwang, Ui-Jae; Kwon, Oh-Yun; Jung, Sung-Hoon; Ahn, Sun-Hee; Gwak, Gyeong-Tae

    2018-01-20

    The efficacy of facial muscle exercises (FMEs) for facial rejuvenation is controversial. In the majority of previous studies, nonquantitative assessment tools were used to assess the benefits of FMEs. This study examined the effectiveness of FMEs using a Pao (MTG, Nagoya, Japan) device to quantify facial rejuvenation. Fifty females were asked to perform FMEs using a Pao device for 30 seconds twice a day for 8 weeks. Facial muscle thickness and cross-sectional area were measured sonographically. Facial surface distance, surface area, and volumes were determined using a laser scanning system before and after FME. Facial muscle thickness, cross-sectional area, midfacial surface distances, jawline surface distance, and lower facial surface area and volume were compared bilaterally before and after FME using a paired Student t test. The cross-sectional areas of the zygomaticus major and digastric muscles increased significantly, and the jawline surface distances (right: P = 0.004, left: P = 0.003) decreased significantly after FME using the Pao device. The lower facial surface areas (right: P = 0.005, left: P = 0.006) and volumes (right: P = 0.001, left: P = 0.002) were also significantly reduced after FME using the Pao device. FME using the Pao device can increase facial muscle thickness and cross-sectional area, thus contributing to facial rejuvenation. © 2018 The American Society for Aesthetic Plastic Surgery, Inc.
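The bilateral before/after comparisons reported here rest on a paired Student t test. A minimal sketch of that statistic, with made-up jawline surface distances (the numbers are illustrative, not the study's data):

```python
import math

def paired_t(before, after):
    """Paired Student t statistic and degrees of freedom for before/after
    measurements on the same subjects (e.g., jawline distance pre/post FME)."""
    assert len(before) == len(after)
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1

# Hypothetical jawline surface distances (mm) for 5 subjects, before and after FME.
before = [112.0, 110.5, 114.2, 111.8, 113.0]
after = [110.1, 109.0, 112.5, 110.2, 111.4]
t, df = paired_t(before, after)
```

The resulting t with n-1 degrees of freedom is then compared against the t distribution to obtain the P values quoted in the abstract.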

  7. Exposure to the self-face facilitates identification of dynamic facial expressions: influences on individual differences.

    Science.gov (United States)

    Li, Yuan Hang; Tottenham, Nim

    2013-04-01

    A growing literature suggests that the self-face is involved in processing the facial expressions of others. The authors experimentally activated self-face representations to assess its effects on the recognition of dynamically emerging facial expressions of others. They exposed participants to videos of either their own faces (self-face prime) or faces of others (nonself-face prime) prior to a facial expression judgment task. Their results show that experimentally activating self-face representations results in earlier recognition of dynamically emerging facial expression. As a group, participants in the self-face prime condition recognized expressions earlier (when less affective perceptual information was available) compared to participants in the nonself-face prime condition. There were individual differences in performance, such that poorer expression identification was associated with higher autism traits (in this neurocognitively healthy sample). However, when randomized into the self-face prime condition, participants with high autism traits performed as well as those with low autism traits. Taken together, these data suggest that the ability to recognize facial expressions in others is linked with the internal representations of our own faces. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  8. Comparison of hemihypoglossal-facial nerve transposition with a cross-facial nerve graft and muscle transplant for the rehabilitation of facial paralysis using the facial clima method.

    Science.gov (United States)

    Hontanilla, Bernardo; Vila, Antonio

    2012-02-01

    To compare quantitatively the results obtained after hemihypoglossal nerve transposition and microvascular gracilis transfer associated with a cross facial nerve graft (CFNG) for reanimation of a paralysed face, 66 patients underwent hemihypoglossal transposition (n = 25) or microvascular gracilis transfer and CFNG (n = 41). The commissural displacement (CD) and commissural contraction velocity (CCV) in the two groups were compared using the system known as Facial clima. There was no inter-group variability between the groups (p > 0.10) in either variable. However, intra-group variability was detected between the affected and healthy side in the transposition group (p = 0.036 and p = 0.017, respectively). The transfer group had greater symmetry in displacement of the commissure (CD) and commissural contraction velocity (CCV) than the transposition group and patients were more satisfied. However, the transposition group had correct symmetry at rest but more asymmetry of CCV and CD when smiling.

  9. Positive and negative symptom scores are correlated with activation in different brain regions during facial emotion perception in schizophrenia patients: a voxel-based sLORETA source activity study.

    Science.gov (United States)

    Kim, Do-Won; Kim, Han-Sung; Lee, Seung-Hwan; Im, Chang-Hwan

    2013-12-01

    Schizophrenia is one of the most devastating of all mental illnesses, and has dimensional characteristics that include both positive and negative symptoms. One problem reported in schizophrenia patients is that they tend to show deficits in face emotion processing, on which negative symptoms are thought to have stronger influence. In this study, four event-related potential (ERP) components (P100, N170, N250, and P300) and their source activities were analyzed using EEG data acquired from 23 schizophrenia patients while they were presented with facial emotion picture stimuli. Correlations between positive and negative syndrome scale (PANSS) scores and source activations during facial emotion processing were calculated to identify the brain areas affected by symptom scores. Our analysis demonstrates that PANSS positive scores are negatively correlated with major areas of the left temporal lobule for early ERP components (P100, N170) and with the right middle frontal lobule for a later component (N250), which indicates that positive symptoms affect both early face processing and facial emotion processing. On the other hand, PANSS negative scores are negatively correlated with several clustered regions, including the left fusiform gyrus (at P100), most of which are not overlapped with regions showing correlations with PANSS positive scores. Our results suggest that positive and negative symptoms affect independent brain regions during facial emotion processing, which may help to explain the heterogeneous characteristics of schizophrenia. © 2013 Elsevier B.V. All rights reserved.
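The symptom-activation analysis described here reduces, per region and per ERP component, to correlating each patient's PANSS score with the source activation. A minimal Pearson correlation sketch; the variable names and values below are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between symptom scores and source activations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: higher positive-symptom scores go with lower P100 source
# activation in a left temporal region, i.e., a negative correlation.
panss_positive = [12, 18, 15, 22, 25, 10]
p100_activation = [0.92, 0.71, 0.80, 0.55, 0.48, 0.99]
r = pearson_r(panss_positive, p100_activation)
```

A strongly negative r of this kind is what the abstract means by symptom scores being "negatively correlated" with regional activation.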

  10. Treatment of Facial Pain with I Ching Balance Acupuncture.

    Science.gov (United States)

    Kotlyar, Arkady

    2017-12-01

    Background: Trigeminal neuralgia (TN) is the most common cranial neuralgia in adults, with a slightly higher incidence in women than in men. This chronic pain condition affects the trigeminal nerve, also known as the 5th cranial nerve, one of the most widely distributed nerves in the head. Antiseizure drugs are the main biomedical treatment of TN. However, TN is not the only source of facial pain. Persistent idiopathic facial pain (PIFP) is also a chronic disorder, recurring daily for more than 2 hours per day over more than 3 months. PIFP occurs in the absence of a neurologic deficit. The underlying pathophysiologies of TN and PIFP are still unknown, and treatment options have not been sufficiently evaluated. Nevertheless, neuropathic mechanisms could be relevant in both TN and PIFP. Cases: A 65-year-old Caucasian female with left facial pain was diagnosed by a neurologist with TN ∼2.5 years prior to seeking acupuncture treatment. A 42-year-old Caucasian female with bilateral facial pain was diagnosed by a neurologist with PIFP ∼3 years prior to commencing acupuncture treatment. The facial pain was treated with 60-minute sessions of I Ching Balance Acupuncture (ICBA) twice per week. Prior to each session, the effect of the previous session was recorded carefully in the patients' files. Results: Complete dissipation of pain was achieved after 29 and 60 ICBA sessions in the TN and the PIFP patient, respectively. Conclusions: The present article is one of the first to demonstrate the efficacy of ICBA treatment for refractory facial pain. As it shows, ICBA treatment successfully addressed facial pain of different types. However, additional larger-scale studies are necessary to validate the efficacy of ICBA in TN and PIFP treatment.

  11. Facial Fractures.

    Science.gov (United States)

    Ricketts, Sophie; Gill, Hameet S; Fialkov, Jeffery A; Matic, Damir B; Antonyshyn, Oleh M

    2016-02-01

    After reading this article, the participant should be able to: 1. Demonstrate an understanding of some of the changes in aspects of facial fracture management. 2. Assess a patient presenting with facial fractures. 3. Understand indications and timing of surgery. 4. Recognize exposures of the craniomaxillofacial skeleton. 5. Identify methods for repair of typical facial fracture patterns. 6. Discuss the common complications seen with facial fractures. Restoration of the facial skeleton and associated soft tissues after trauma involves accurate clinical and radiologic assessment to effectively plan a management approach for these injuries. When surgical intervention is necessary, timing, exposure, sequencing, and execution of repair are all integral to achieving the best long-term outcomes for these patients.

  12. Holistic Processing of Static and Moving Faces

    Science.gov (United States)

    Zhao, Mintao; Bülthoff, Isabelle

    2017-01-01

    Humans' face ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect 1 core aspect of face ability--holistic face processing--remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based…

  13. Expanded flap to repair facial scar left by radiotherapy of hemangioma.

    Science.gov (United States)

    Zhao, Donghong; Ma, Xinrong; Li, Jiang; Zhang, Lingfeng; Zhu, Baozhen

    2014-09-01

    This study explored the feasibility and clinical efficacy of expanded flap to repair facial scar left by radiotherapy of hemangioma. From March 2000 to April 2011, 13 cases of facial cicatrices left by radiotherapy of hemangioma have been treated with implantation surgery of facial skin dilator under local anesthesia. After water flood expansion for 1-2 months, resection of facial scar was performed, and wound repairing with expansion flap transfer was done. Thirteen patients were followed up from 5 months to 3 years. All patients tolerated flap transfer well; no contracture occurred during the facial expansion flap transfer. The incision scar was not obvious, and its color and texture were identical to surrounding skin. In conclusion, the use of expanded flap transfer to repair the facial scar left by radiotherapy of hemangioma is advantageous due to its simplicity, flexibility, and large area of repairing. This method does not affect the subsequent facial appearance.

  14. The Prevalence of Cosmetic Facial Plastic Procedures among Facial Plastic Surgeons.

    Science.gov (United States)

    Moayer, Roxana; Sand, Jordan P; Han, Albert; Nabili, Vishad; Keller, Gregory S

    2018-04-01

    This is the first study to report on the prevalence of cosmetic facial plastic surgery use among facial plastic surgeons. The aim of this study is to determine the frequency with which facial plastic surgeons have cosmetic procedures themselves. A secondary aim is to determine whether trends in usage of cosmetic facial procedures among facial plastic surgeons are similar to those of nonsurgeons. The study design was an anonymous, five-question Internet survey distributed via email, set in a single academic institution. Board-certified members of the American Academy of Facial Plastic and Reconstructive Surgery (AAFPRS) were included in this study. Self-reported history of cosmetic facial plastic surgery or minimally invasive procedures was recorded. The survey also queried participants for demographic data. A total of 216 members of the AAFPRS responded to the questionnaire. Ninety percent of respondents were male (n = 192) and 10.3% were female (n = 22). Thirty-three percent of respondents were aged 31 to 40 years (n = 70), 25% were aged 41 to 50 years (n = 53), 21.4% were aged 51 to 60 years (n = 46), and 20.5% were older than 60 years (n = 44). Thirty-six percent of respondents had had a surgical cosmetic facial procedure and 75% had had at least one minimally invasive cosmetic facial procedure. Facial plastic surgeons are frequent users of cosmetic facial plastic surgery. This finding may be due to access, knowledge base, values, or attitudes. By better understanding surgeon attitudes toward facial plastic surgery, we can improve communication with patients and delivery of care. This study is a first step in understanding the use of facial plastic procedures among facial plastic surgeons.

  15. Psychopathic traits in adolescents and recognition of emotion in facial expressions

    Directory of Open Access Journals (Sweden)

    Silvio José Lemos Vasconcellos

    2014-12-01

    Recent studies have investigated the ability of adult psychopaths and children with psychopathic traits to identify specific facial expressions of emotion. Conclusive results have not yet been found regarding whether psychopathic traits are associated with a specific deficit in the ability to identify negative emotions such as fear and sadness. This study compared 20 adolescents with psychopathic traits and 21 adolescents without these traits on their ability to recognize facial expressions of emotion, using facial stimuli presented for 200 ms, 500 ms, and 1 s. Analyses indicated significant differences between the two groups' performance only for fear, and only at the 200 ms exposure. This finding is consistent with findings from other studies in the field and suggests that controlling the duration of exposure to affective stimuli in future studies may help to clarify the mechanisms underlying the facial affect recognition deficits of individuals with psychopathic traits.

  16. Trisomy 21 and facial developmental instability.

    Science.gov (United States)

    Starbuck, John M; Cole, Theodore M; Reeves, Roger H; Richtsmeier, Joan T

    2013-05-01

    The most common live-born human aneuploidy is trisomy 21, which causes Down syndrome (DS). Dosage imbalance of genes on chromosome 21 (Hsa21) affects complex gene-regulatory interactions and alters development to produce a wide range of phenotypes, including characteristic facial dysmorphology. Little is known about how trisomy 21 alters craniofacial morphogenesis to create this characteristic appearance. Proponents of the "amplified developmental instability" hypothesis argue that trisomy 21 causes a generalized genetic imbalance that disrupts evolutionarily conserved developmental pathways by decreasing developmental homeostasis and precision throughout development. Based on this model, we test the hypothesis that DS faces exhibit increased developmental instability relative to euploid individuals. Developmental instability was assessed by a statistical analysis of fluctuating asymmetry. We compared the magnitude and patterns of fluctuating asymmetry among siblings using three-dimensional coordinate locations of 20 anatomic landmarks collected from facial surface reconstructions in four age-matched samples ranging from 4 to 12 years: (1) DS individuals (n = 55); (2) biological siblings of DS individuals (n = 55); and (3, 4) two samples of typically developing individuals (n = 55 each), who are euploid siblings and age-matched to the DS individuals and their euploid siblings (samples 1 and 2). Identification in the DS sample of facial prominences exhibiting increased fluctuating asymmetry during facial morphogenesis provides evidence for increased developmental instability in DS faces. We found the highest developmental instability in facial structures derived from the mandibular prominence and lowest in facial regions derived from the frontal prominence. Copyright © 2013 Wiley Periodicals, Inc.
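Fluctuating asymmetry of this kind is typically quantified from bilaterally paired landmarks. A simplified sketch, assuming the midsagittal plane lies at x = 0 and using made-up landmark coordinates (the study's actual Procrustes-based FA analysis is considerably more elaborate):

```python
import math

def fluctuating_asymmetry(landmarks, pairs, midline_x=0.0):
    """Mean unsigned left-right discrepancy after reflecting the right-side
    landmark across the facial midline: a simple FA index over paired
    3D landmarks."""
    total = 0.0
    for li, ri in pairs:
        lx, ly, lz = landmarks[li]
        rx, ry, rz = landmarks[ri]
        # Reflect the right-side landmark across the plane x = midline_x.
        rx_ref = 2 * midline_x - rx
        total += math.dist((lx, ly, lz), (rx_ref, ry, rz))
    return total / len(pairs)

# Hypothetical landmarks: indices 0/1 = left/right exocanthion,
# 2/3 = left/right cheilion (coordinates in mm).
landmarks = [(-45.0, 30.0, 10.0), (45.5, 30.2, 10.1),
             (-25.0, -35.0, 20.0), (24.6, -35.3, 19.8)]
pairs = [(0, 1), (2, 3)]
fa = fluctuating_asymmetry(landmarks, pairs)
```

A perfectly symmetric configuration yields an index of zero; larger values indicate greater developmental instability in this simplified sense.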

  17. Reproducibility of the dynamics of facial expressions in unilateral facial palsy.

    Science.gov (United States)

    Alagha, M A; Ju, X; Morley, S; Ayoub, A

    2018-02-01

    The aim of this study was to assess the reproducibility of non-verbal facial expressions in unilateral facial paralysis using dynamic four-dimensional (4D) imaging. The Di4D system was used to record five facial expressions of 20 adult patients. The system captured 60 three-dimensional (3D) images per second; each facial expression took 3-4 seconds and was recorded in real time. Thus a set of 180 3D facial images was generated for each expression. The procedure was repeated after 30 min to assess the reproducibility of the expressions. A mathematical facial mesh consisting of thousands of quasi-point 'vertices' was conformed to the face in order to determine the morphological characteristics in a comprehensive manner. The vertices were tracked throughout the sequence of the 180 images. Five key 3D facial frames from each sequence of images were analyzed. Comparisons were made between the first and second capture of each facial expression to assess the reproducibility of facial movements. Corresponding images were aligned using partial Procrustes analysis, and the root mean square distance between them was calculated and analyzed statistically (paired Student t-test). Facial expressions of lip purse, cheek puff, and raising of the eyebrows were reproducible. Facial expressions of maximum smile and forceful eye closure were not reproducible. The limited coordination of various groups of facial muscles contributed to the lack of reproducibility of these facial expressions. 4D imaging is a useful clinical tool for the assessment of facial expressions. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
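The alignment-then-distance step described here can be sketched as follows: a partial Procrustes superimposition (translation and rotation only, no scaling) via the Kabsch algorithm, followed by the root mean square distance between corresponding landmarks. The landmark data below are randomly generated for illustration:

```python
import numpy as np

def procrustes_rms(A, B):
    """Partial Procrustes superimposition of landmark set B onto A
    (translation + rotation, no scaling), then the RMS distance between them."""
    A = A - A.mean(axis=0)
    B = B - B.mean(axis=0)
    # Optimal rotation via SVD of the cross-covariance (Kabsch algorithm).
    U, _, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0] * (A.shape[1] - 1) + [d])  # guard against reflections
    R = U @ D @ Vt
    diff = A - B @ R
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

# A rotated, translated copy of the same "expression frame" should align
# to essentially zero RMS distance.
rng = np.random.default_rng(1)
A = rng.normal(size=(10, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
B = A @ Rz.T + np.array([5.0, -2.0, 1.0])
rms = procrustes_rms(A, B)
```

In the study, a nonzero RMS between the first and second capture of an expression quantifies how poorly that expression was reproduced.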

  18. Psychopathy and facial emotion recognition ability in patients with bipolar affective disorder with or without delinquent behaviors.

    Science.gov (United States)

    Demirel, Husrev; Yesilbas, Dilek; Ozver, Ismail; Yuksek, Erhan; Sahin, Feyzi; Aliustaoglu, Suheyla; Emul, Murat

    2014-04-01

    It is well known that patients with bipolar disorder are more prone to violence and criminal behavior than the general population. A strong relationship between criminal behavior and an inability to empathize or to perceive other people's feelings and facial expressions increases the risk of delinquent behaviors. In this study, we aimed to investigate deficits in facial emotion recognition ability in euthymic bipolar patients who had committed an offense and to compare them with non-delinquent euthymic patients with bipolar disorder. Fifty-five euthymic patients with delinquent behaviors and 54 non-delinquent euthymic bipolar patients as a control group were included in the study. Ekman's Facial Emotion Recognition Test, sociodemographic data, the Hare Psychopathy Checklist, the Hamilton Depression Rating Scale, and the Young Mania Rating Scale were applied to both groups. There were no significant differences between the case and control groups in terms of average age, gender, level of education, mean age of disease onset, or suicide attempts (p>0.05). The three most commonly committed delinquent behaviors in patients with euthymic bipolar disorder were injury (30.8%), threat or insult (20%), and homicide (12.7%). The most accurately identified facial emotion was "happy" (>99% in both groups), while the most frequently misidentified facial emotion was "fear" in both groups; recognition was poorer and response times longer in patients with delinquent behaviors than in non-delinquent ones. We have shown that patients with bipolar disorder who had delinquent behaviors may have some social interaction problems, i.e., misrecognizing fearful and, modestly, angry facial emotions and needing more time to respond to facial emotions even in remission. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Automatic three-dimensional quantitative analysis for evaluation of facial movement.

    Science.gov (United States)

    Hontanilla, B; Aubá, C

    2008-01-01

    The aim of this study is to present a new 3D capture system for facial movements called FACIAL CLIMA. It is an automatic optical motion system that involves placing special reflecting dots on the subject's face and video recording, with three infrared-light cameras, the subject performing several facial movements such as smiling, mouth puckering, eye closure, and forehead elevation. Images from the cameras are automatically processed with a software program that generates customised information such as 3D data on velocities and areas. The study was performed in 20 healthy volunteers. The accuracy of the measurement process and the intrarater and interrater reliabilities were evaluated. Comparison of a known distance and angle with those obtained by FACIAL CLIMA shows that this system is accurate to within 0.13 mm and 0.41°. In conclusion, the accuracy of the FACIAL CLIMA system for the evaluation of facial movements is demonstrated, as is its high intrarater and interrater reliability. It has advantages over other systems developed for the evaluation of facial movements, such as short calibration time, short measuring time, and ease of use, and it provides not only distances but also velocities and areas. The FACIAL CLIMA system can therefore be considered an adequate tool to assess the outcome of facial paralysis reanimation surgery, allowing patients with facial paralysis to be compared between surgical centres so that the effectiveness of facial reanimation operations can be evaluated.
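Deriving velocities from tracked marker positions amounts to finite differences at the capture frame rate. A minimal sketch with a hypothetical mouth-commissure trajectory (the FACIAL CLIMA software's actual processing is not described at this level of detail, so this is only the underlying idea):

```python
import math

def marker_speeds(trajectory, fps=60.0):
    """Instantaneous speed of a tracked facial marker (e.g., a mouth-commissure
    dot) from successive 3D positions, via forward finite differences at
    `fps` frames per second."""
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(trajectory, trajectory[1:]):
        step = math.dist((x0, y0, z0), (x1, y1, z1))  # mm per frame
        speeds.append(step * fps)                      # mm per second
    return speeds

# Hypothetical commissure trajectory (mm) during a smile, sampled at 60 frames/s.
traj = [(0.0, 0.0, 0.0), (0.5, 0.0, 0.0), (1.5, 0.0, 0.0), (2.0, 0.0, 0.0)]
v = marker_speeds(traj)  # [30.0, 60.0, 30.0] mm/s
```

Peak or mean values of such a speed profile correspond to the velocity measures the system reports for each facial movement.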

  20. Impaired recognition of happy facial expressions in bipolar disorder.

    Science.gov (United States)

    Lawlor-Savage, Linette; Sponheim, Scott R; Goghari, Vina M

    2014-08-01

    The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control. Bipolar patients' overall facial recognition ability was unimpaired. However, patients' specific ability to judge happy expressions under time constraints was impaired. Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.

  1. Cross-cultural evaluations of avatar facial expressions designed by Western and Japanese Designers

    DEFF Research Database (Denmark)

    Koda, Tomoko; Rehm, Matthias; André, Elisabeth

    2008-01-01

    The goal of the study is to investigate cultural differences in avatar expression evaluation and apply findings from psychological studies of human facial expression recognition. Our previous study using Japanese-designed avatars showed that there are cultural differences in interpreting avatar facial expressions, and that the psychological theory suggesting physical proximity affects facial expression recognition accuracy is also applicable to avatar facial expressions. This paper summarizes the early results of the successive experiment that uses Western-designed avatars. We observed tendencies of cultural...

  2. Emotions over time: synchronicity and development of subjective, physiological, and facial affective reactions to music.

    Science.gov (United States)

    Grewe, Oliver; Nagel, Frederik; Kopiez, Reinhard; Altenmüller, Eckart

    2007-11-01

    Most people are able to identify basic emotions expressed in music and experience affective reactions to music. But does music generally induce emotion? Does it elicit subjective feelings, physiological arousal, and motor reactions reliably in different individuals? In this interdisciplinary study, measurement of skin conductance, facial muscle activity, and self-monitoring were synchronized with musical stimuli. A group of 38 participants listened to classical, rock, and pop music and reported their feelings in a two-dimensional emotion space during listening. The first entrance of a solo voice or choir and the beginning of new sections were found to elicit interindividual changes in subjective feelings and physiological arousal. Quincy Jones' "Bossa Nova" motivated movement and laughing in more than half of the participants. Bodily reactions such as "goose bumps" and "shivers" could be stimulated by the "Tuba Mirum" from Mozart's Requiem in 7 of 38 participants. In addition, the authors repeated the experiment seven times with one participant to examine intraindividual stability of effects. This exploratory combination of approaches throws a new light on the astonishing complexity of affective music listening.

  3. Right Hemispheric Dominance in Processing of Unconscious Negative Emotion

    Science.gov (United States)

    Sato, Wataru; Aoki, Satoshi

    2006-01-01

    Right hemispheric dominance in unconscious emotional processing has been suggested, but remains controversial. This issue was investigated using the subliminal affective priming paradigm combined with unilateral visual presentation in 40 normal subjects. In either left or right visual fields, angry facial expressions, happy facial expressions, or…

  4. The influence of different facial components on facial aesthetics.

    NARCIS (Netherlands)

    Faure, J.C.; Rieffe, C.; Maltha, J.C.

    2002-01-01

    Facial aesthetics have an important influence on social behaviour and perception in our society. The purpose of the present study was to evaluate the effect of facial symmetry and inter-ocular distance on the assessment of facial aesthetics, factors that are often suggested as major contributors to

  5. Emotion in Stories: Facial EMG Evidence for Both Mental Simulation and Moral Evaluation

    Directory of Open Access Journals (Sweden)

    Björn 't Hart

    2018-04-01

    Facial electromyography research shows that corrugator supercilii (“frowning muscle”) activity tracks the emotional valence of linguistic stimuli. Grounded or embodied accounts of language processing take such activity to reflect the simulation or “reenactment” of emotion, as part of the retrieval of word meaning (e.g., of “furious”) and/or of building a situation model (e.g., for “Mark is furious”). However, the same muscle also expresses our primary emotional evaluation of things we encounter. Language-driven affective simulation can easily be at odds with the reader's affective evaluation of what language describes (e.g., when we like Mark being furious). To examine what happens in such cases, we independently manipulated simulation valence and moral evaluative valence in short narratives. Participants first read about characters behaving in a morally laudable or objectionable fashion: this immediately led to corrugator activity reflecting positive or negative affect. Next, and critically, a positive or negative event befell these same characters. Here, the corrugator response did not track the valence of the event, but reflected both simulation and moral evaluation. This highlights the importance of unpacking coarse notions of affective meaning in language processing research into components that reflect simulation and evaluation. Our results also call for a re-evaluation of the interpretation of corrugator EMG, as well as other affect-related facial muscles and other peripheral physiological measures, as unequivocal indicators of simulation. Research should explore how such measures behave in richer and more ecologically valid language processing, such as narrative, refining our understanding of simulation within a framework of grounded language comprehension.

  6. Extraction and representation of common feature from uncertain facial expressions with cloud model.

    Science.gov (United States)

    Wang, Shuliang; Chi, Hehua; Yuan, Hanning; Geng, Jing

    2017-12-01

    Human facial expressions are a key means of conveying an individual's emotions in communication. However, variation in facial expressions affects the reliable identification of human emotions. In this paper, we present a cloud model to extract facial features for representing human emotion. First, the uncertainties in facial expression are analyzed in the context of the cloud model. The feature extraction and representation algorithm is established using cloud generators. With a forward cloud generator, facial expression images can be re-generated in any quantity to visually represent the three extracted features, each of which plays a different role. The effectiveness of the computing model is tested on the Japanese Female Facial Expression database. Three common features are extracted from seven facial expression images. Finally, the paper closes with conclusions and remarks.

  7. Prosthetic management of mid-facial defect with magnet-retained silicone prosthesis

    OpenAIRE

    Buzayan, M. M.

    2014-01-01

    Background and aim: Mid-facial defect is one of the most disfiguring and impairing defects. A design of prosthesis that is aesthetic and stable can be precious to a patient who has lost part of his face due to surgical excision. Prosthesis can restore the patients' self-esteem and confidence, which affects the patients and their life style. The aim of this case report is to describe a technique of mid-facial silicone prosthesis fabrication. Technique: To provide an aesthetic and stable facial...

  8. Greater perceptual sensitivity to happy facial expression.

    Science.gov (United States)

    Maher, Stephen; Ekstrom, Tor; Chen, Yue

    2014-01-01

    Perception of subtle facial expressions is essential for social functioning; yet it is unclear if human perceptual sensitivities differ in detecting varying types of facial emotions. Evidence diverges as to whether salient negative versus positive emotions (such as sadness versus happiness) are preferentially processed. Here, we measured perceptual thresholds for the detection of four types of emotion in faces--happiness, fear, anger, and sadness--using psychophysical methods. We also evaluated the association of the perceptual performances with facial morphological changes between neutral and respective emotion types. Human observers were highly sensitive to happiness compared with the other emotional expressions. Further, this heightened perceptual sensitivity to happy expressions can be attributed largely to the emotion-induced morphological change of a particular facial feature (end-lip raise).
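A perceptual threshold of the kind measured here is the stimulus intensity at which detection performance crosses a criterion level. A simplified sketch using linear interpolation over hypothetical hit rates (real psychophysics would fit a psychometric function to many trials; the intensities and rates below are invented):

```python
def detection_threshold(intensities, hit_rates, criterion=0.5):
    """Perceptual threshold as the stimulus intensity at which the psychometric
    function crosses `criterion`, by linear interpolation between samples."""
    points = list(zip(intensities, hit_rates))
    for (i0, p0), (i1, p1) in zip(points, points[1:]):
        if p0 <= criterion <= p1:
            return i0 + (criterion - p0) * (i1 - i0) / (p1 - p0)
    raise ValueError("criterion not bracketed by the sampled hit rates")

# Hypothetical hit rates for increasing emotion-morph intensities.
intensities = [0.1, 0.2, 0.3, 0.4, 0.5]
happy_hits = [0.10, 0.30, 0.60, 0.85, 0.95]
sad_hits = [0.05, 0.15, 0.40, 0.70, 0.90]
t_happy = detection_threshold(intensities, happy_hits)
t_sad = detection_threshold(intensities, sad_hits)
```

A lower threshold for happy expressions, as in this toy example, is exactly what "heightened perceptual sensitivity to happiness" means in the abstract.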

  9. The role of great auricular-facial nerve neurorrhaphy in facial nerve damage

    OpenAIRE

    Sun, Yan; Liu, Limei; Han, Yuechen; Xu, Lei; Zhang, Daogong; Wang, Haibo

    2015-01-01

    Background: The facial nerve is easily damaged, and many reconstructive methods exist, such as end-to-end facial nerve anastomosis, the great auricular nerve graft, the sural nerve graft, and hypoglossal-facial nerve anastomosis. However, great auricular-facial nerve neurorrhaphy has received little study. The aim of the present study was to identify the role of great auricular-facial nerve neurorrhaphy and its mechanism. Methods: Rat models of facia...

  10. A comparison of facial emotion processing in neurological and psychiatric conditions

    Directory of Open Access Journals (Sweden)

    Benoit eBediou

    2012-04-01

    Full Text Available Investigating the relative severity of emotion recognition deficits across different clinical and high-risk populations has potential implications not only for the prevention, diagnosis and treatment of these diseases, but also for our understanding of the neurobiological mechanisms of emotion perception itself. We reanalyzed data from 4 studies in which we examined facial expression and gender recognition using the same tasks and stimuli. We used a standardized and bias-corrected measure of effect size (Cohen's d) to assess the extent of impairments in frontotemporal dementia (FTD), Parkinson's disease treated by L-DOPA (PD-ON) or not (PD-OFF), amnestic Mild Cognitive Impairment (aMCI), Alzheimer's disease at mild dementia stage (AD), major depressive disorder (MDD), remitted schizophrenia (SCZ-rem), first-episode schizophrenia before (SCZ-OFF) and after (SCZ-ON) medication, as well as unaffected siblings of patients with schizophrenia (SIB). Analyses revealed a pattern of differential impairment of emotion (but not gender) recognition, consistent with the extent of impairment of the fronto-temporal neural networks involved in the processing of faces and facial expressions. Our transnosographic approach combining clinical and high-risk populations with the impact of medication brings new information on the trajectory of impaired emotion perception in neuropsychiatric conditions, and on the neural networks and neurotransmitter systems subserving emotion perception.
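    The "standardized and bias-corrected measure of effect size" described here is commonly computed as Hedges' g, i.e. Cohen's d multiplied by a small-sample correction factor. A minimal sketch, assuming the usual pooled-SD formulation (the study's exact computation may differ):

    ```python
    import math

    def hedges_g(m1, s1, n1, m2, s2, n2):
        """Bias-corrected standardized mean difference (Hedges' g)
        between a patient group (m1, s1, n1) and a control group
        (m2, s2, n2)."""
        # Pooled standard deviation across the two groups.
        sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                       / (n1 + n2 - 2))
        d = (m1 - m2) / sp
        # Small-sample bias correction factor J.
        J = 1 - 3 / (4 * (n1 + n2) - 9)
        return d * J
    ```

    With equal SDs the correction shrinks d slightly toward zero, which matters most for the small clinical samples typical of such studies.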

  11. Stereotypes and prejudice affect the recognition of emotional body postures.

    Science.gov (United States)

    Bijlstra, Gijsbert; Holland, Rob W; Dotsch, Ron; Wigboldus, Daniel H J

    2018-03-26

    Most research on emotion recognition focuses on facial expressions. However, people communicate emotional information through bodily cues as well. Prior research on facial expressions has demonstrated that emotion recognition is modulated by top-down processes. Here, we tested whether this top-down modulation generalizes to the recognition of emotions from body postures. We report three studies demonstrating that stereotypes and prejudice about men and women may affect how fast people classify various emotional body postures. Our results suggest that gender cues activate gender associations, which affect the recognition of emotions from body postures in a top-down fashion. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. Facial soft tissue analysis among various vertical facial patterns

    International Nuclear Information System (INIS)

    Jeelani, W.; Fida, M.; Shaikh, A.

    2016-01-01

    Background: The emergence of soft tissue paradigm in orthodontics has made various soft tissue parameters an integral part of the orthodontic problem list. The purpose of this study was to determine and compare various facial soft tissue parameters on lateral cephalograms among patients with short, average and long facial patterns. Methods: A cross-sectional study was conducted on the lateral cephalograms of 180 adult subjects divided into three equal groups, i.e., short, average and long face according to the vertical facial pattern. Incisal display at rest, nose height, upper and lower lip lengths, degree of lip procumbency and the nasolabial angle were measured for each individual. The gender differences for these soft tissue parameters were determined using Mann-Whitney U test while the comparison among different facial patterns was performed using Kruskal-Wallis test. Results: Significant differences in the incisal display at rest, total nasal height, lip procumbency, the nasolabial angle and the upper and lower lip lengths were found among the three vertical facial patterns. A significant positive correlation of nose and lip dimensions was found with the underlying skeletal pattern. Similarly, the incisal display at rest, upper and lower lip procumbency and the nasolabial angle were significantly correlated with the lower anterior facial height. Conclusion: Short facial pattern is associated with minimal incisal display, recumbent upper and lower lips and acute nasolabial angle while the long facial pattern is associated with excessive incisal display, procumbent upper and lower lips and obtuse nasolabial angle. (author)
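    The Kruskal-Wallis comparison across the three facial-pattern groups can be reproduced in outline with SciPy. The nasolabial-angle values below are invented for illustration; only the test procedure mirrors the study:

    ```python
    from scipy import stats

    # Hypothetical nasolabial-angle values (degrees) for the three
    # vertical facial patterns; the numbers are illustrative only.
    short_face = [110, 112, 108, 115, 111]
    average_face = [100, 102, 98, 103, 101]
    long_face = [88, 90, 85, 92, 89]

    # Non-parametric one-way comparison across the three groups.
    H, p = stats.kruskal(short_face, average_face, long_face)
    ```

    The Mann-Whitney U test mentioned for gender differences is the two-group analogue (`stats.mannwhitneyu`).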

  13. Neural and behavioral associations of manipulated determination facial expressions.

    Science.gov (United States)

    Price, Tom F; Hortensius, Ruud; Harmon-Jones, Eddie

    2013-09-01

    Past research associated relative left frontal cortical activity with positive affect and approach motivation, or the urge to move toward a stimulus. Less work has examined relative left frontal activity and positive emotions ranging from low to high approach motivation, to test whether positive affects that differ in approach motivational intensity influence relative left frontal cortical activity. Participants in the present experiment adopted determination (high approach positive), satisfaction (low approach positive), or neutral facial expressions while electroencephalographic (EEG) activity was recorded. Next, participants completed a task measuring motivational persistence behavior and then they completed self-report emotion questionnaires. Determination compared to satisfaction and neutral facial expressions caused greater relative left frontal activity relative to baseline EEG recordings. Facial expressions did not directly influence task persistence. However, relative left frontal activity correlated positively with persistence on insolvable tasks in the determination condition. These results extend embodiment theories and motivational interpretations of relative left frontal activity. Published by Elsevier B.V.
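    "Relative left frontal activity" in this literature is typically indexed as the difference in log alpha-band power between homologous right and left frontal electrodes, since alpha power is inversely related to cortical activity. A minimal sketch of that index, assuming a simple periodogram estimate rather than the study's actual EEG pipeline:

    ```python
    import numpy as np

    def alpha_power(signal, fs, band=(8.0, 13.0)):
        """Mean power in the alpha band via the periodogram."""
        freqs = np.fft.rfftfreq(len(signal), 1 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    def frontal_asymmetry(left, right, fs):
        """ln(right alpha) - ln(left alpha). A positive score indexes
        greater relative LEFT frontal activity (less left alpha)."""
        return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))
    ```

    On this index, the determination condition in the study would show a larger positive score than the satisfaction and neutral conditions.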

  14. Superficial chemical peeling with salicylic acid in facial dermatoses

    International Nuclear Information System (INIS)

    Bari, A.U.; Iqbal, Z.; Rahman, S.B.

    2007-01-01

    To determine the effectiveness of salicylic acid chemical peeling in common dermatological conditions affecting the face in people with predominantly Fitzpatrick skin types IV and V. A total of 167 patients of either gender, aged 13 to 60 years, presenting with facial dermatoses (melasma, acne vulgaris, post-inflammatory hyperpigmentation, freckles, fine lines and wrinkles, post-inflammatory scars, actinic keratoses, and plane facial warts) were included. A series of eight weekly hospital-based peeling sessions was conducted in all patients under standardized conditions with 30% salicylic acid. Clinical improvement in the different disorders was evaluated by the change in MASI score, the decrease in the size of the affected area, and the percentage reduction in lesion count. The McNemar test was applied for data analysis. The majority of patients showed a moderate to excellent response, with 35% to 63% improvement (p<0.05) across all dermatoses. Significant side effects, as feared in Asian skin, were not observed. Chemical peeling with salicylic acid is an effective and safe treatment modality for many superficial facial dermatoses. (author)
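    The McNemar test used here compares paired pre/post binary outcomes and depends only on the discordant pairs. A sketch of the exact (binomial) version, with hypothetical counts; the study's actual cell counts are not reported in the abstract:

    ```python
    from scipy.stats import binom

    def mcnemar_exact(b, c):
        """Exact McNemar test for paired pre/post binary outcomes.

        b: cases positive before treatment but cleared after
        c: cases negative before treatment but positive after
        Returns the two-sided exact p-value."""
        n = b + c
        k = min(b, c)
        # Two-sided exact binomial test with p = 0.5 on the
        # discordant pairs only.
        p = 2 * binom.cdf(k, n, 0.5)
        return min(1.0, p)
    ```

    For example, 15 patients improving versus 2 worsening yields a clearly significant result, while a 5/5 split does not.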

  15. Positive, but Not Negative, Facial Expressions Facilitate 3-Month-Olds' Recognition of an Individual Face

    Science.gov (United States)

    Brenna, Viola; Proietti, Valentina; Montirosso, Rosario; Turati, Chiara

    2013-01-01

    The current study examined whether and how the presence of a positive or a negative emotional expression may affect the face recognition process at 3 months of age. Using a familiarization procedure, Experiment 1 demonstrated that positive (i.e., happiness), but not negative (i.e., fear and anger) facial expressions facilitate infants' ability to…

  16. Emotional facial expression detection in the peripheral visual field.

    Directory of Open Access Journals (Sweden)

    Dimitri J Bayle

    Full Text Available BACKGROUND: In everyday life, signals of danger, such as aversive facial expressions, usually appear in the peripheral visual field. Although facial expression processing in central vision has been extensively studied, this processing in peripheral vision has been poorly studied. METHODOLOGY/PRINCIPAL FINDINGS: Using behavioral measures, we explored the human ability to detect fear and disgust vs. neutral expressions and compared it to the ability to discriminate between genders at eccentricities up to 40°. Responses were faster for the detection of emotion compared to gender. Emotion was detected from fearful faces up to 40° of eccentricity. CONCLUSIONS: Our results demonstrate the human ability to detect facial expressions presented in the far periphery up to 40° of eccentricity. The increasing advantage of emotion compared to gender processing with increasing eccentricity might reflect a major implication of the magnocellular visual pathway in facial expression processing. This advantage may suggest that emotion detection, relative to gender identification, is less impacted by visual acuity and within-face crowding in the periphery. These results are consistent with specific and automatic processing of danger-related information, which may drive attention to those messages and allow for a fast behavioral reaction.

  17. Does Gaze Direction Modulate Facial Expression Processing in Children with Autism Spectrum Disorder?

    Science.gov (United States)

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2009-01-01

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9-14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent…

  18. The Impact of Sex Differences on Odor Identification and Facial Affect Recognition in Patients with Schizophrenia Spectrum Disorders

    OpenAIRE

    Mossaheb, Nilufar; Kaufmann, Rainer M.; Schlögelhofer, Monika; Aninilkumparambil, Thushara; Himmelbauer, Claudia; Gold, Anna; Zehetmayer, Sonja; Hoffmann, Holger; Traue, Harald C.; Aschauer, Harald

    2018-01-01

    Background Social interactive functions such as facial emotion recognition and smell identification have been shown to differ between women and men. However, little is known about how these differences are mirrored in patients with schizophrenia and how these abilities interact with each other and with other clinical variables in patients vs. healthy controls. Methods Standardized instruments were used to assess facial emotion recognition [Facially Expressed Emotion Labelling (FEEL)] and smel...

  20. Dynamic Facial Prosthetics for Sufferers of Facial Paralysis

    Directory of Open Access Journals (Sweden)

    Fergal Coulter

    2011-10-01

    Full Text Available Background: This paper discusses the various methods and materials for the fabrication of active artificial facial muscles. The primary use for these will be the reanimation of paralysed or atrophied muscles in sufferers of non-recoverable unilateral facial paralysis. Method: The prosthetic solution described in this paper is based on sensing muscle motion of the contralateral healthy muscles and replicating that motion across a patient's paralysed side of the face, via solid-state and thin-film actuators. The development of this facial prosthetic device focused on recreating a varying-intensity smile, with emphasis on timing, displacement, and the appearance of the wrinkles and folds that commonly appear around the nose and eyes during the expression. An animatronic face was constructed with actuations being made to a silicone representation of the musculature, using multiple shape-memory alloy cascades. Alongside the artificial muscle physical prototype, a facial expression recognition software system was constructed. This forms the basis of an automated calibration and reconfiguration system for the artificial muscles following implantation, so as to suit the implantee's unique physiognomy. Results: An animatronic model face with silicone musculature was designed and built to evaluate the performance of shape-memory alloy artificial muscles, their power control circuitry, and software control systems. A dual facial motion sensing system was designed to allow real-time control over the model: a piezoresistive flex sensor to measure physical motion, and a computer vision system to evaluate real-to-artificial muscle performance. Analysis of various facial expressions in real subjects was made, which gives useful data upon which to base the system's parameter limits. Conclusion: The system performed well, and the various strengths and shortcomings of the materials and methods are reviewed and considered for the next research phase, when new polymer-based artificial muscles are constructed.

  1. Facial Fractures.

    Science.gov (United States)

    Ghosh, Rajarshi; Gopalkrishnan, Kulandaswamy

    2018-06-01

    The aim of this study is to retrospectively analyze the incidence of facial fractures along with age, gender predilection, etiology, commonest site, associated dental injuries, and any complications in patients operated on in the Craniofacial Unit of SDM College of Dental Sciences and Hospital. This retrospective study was conducted at the Department of OMFS, SDM College of Dental Sciences, Dharwad, from January 2003 to December 2013. Data were recorded for the cause of injury, age and gender distribution, frequency and type of injury, localization and frequency of soft tissue injuries, dentoalveolar trauma, facial bone fractures, complications, concomitant injuries, and different treatment protocols. All data were analyzed using the chi-squared test. A total of 1146 patients presented to our unit with facial fractures during these 10 years. Males accounted for a higher frequency of facial fractures (88.8%). The mandible was the commonest facial bone to be fractured (71.2%). Maxillary central incisors were the teeth most commonly injured (33.8%), and avulsion was the most common type of injury (44.6%). The commonest postoperative complication was plate infection (11%), leading to plate removal. Other injuries associated with facial fractures included rib fractures, head injuries, and upper and lower limb fractures; among these, rib fractures were seen most frequently (21.6%). This study was performed to compare the different etiologic factors leading to diverse facial fracture patterns. Statistical analysis of these records clarified the relationship of facial fractures with gender, age, associated comorbidities, etc.
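    The chi-squared analysis reported here is a standard test of independence on a contingency table. A sketch with invented counts (the abstract gives percentages, not the full table):

    ```python
    from scipy.stats import chi2_contingency

    # Hypothetical counts: fracture site (rows) by gender (columns),
    # illustrating the chi-squared test of independence only.
    table = [[720, 96],   # mandible:  male, female
             [210, 40],   # midface
             [88,  12]]   # other sites

    chi2, p, dof, expected = chi2_contingency(table)
    ```

    `expected` holds the counts implied by independence of site and gender; the test statistic summarizes the observed-versus-expected discrepancy with (rows−1)×(cols−1) degrees of freedom.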

  2. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry?

    Directory of Open Access Journals (Sweden)

    Krystyna Rymarczyk

    Full Text Available Facial mimicry is the spontaneous response to others' facial expressions, mirroring or matching the expression of the interaction partner. Recent evidence suggests that mimicry may not be only an automatic reaction but can depend on many factors, including social context, the type of task in which the participant is engaged, or stimulus properties (dynamic vs. static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgments of emotional intensity. Electromyographic (EMG) activity was recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. Ratings of the emotional intensity of the facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major activity in response to dynamic happiness stimuli than to static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing.
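    Facial-mimicry effects in such EMG studies are usually quantified as stimulus-period activity relative to a pre-stimulus baseline. A minimal sketch of that baseline correction (rectified-mean version; the study's exact preprocessing is not specified in the abstract):

    ```python
    import numpy as np

    def mimicry_response(emg, fs, baseline_s=1.0):
        """Baseline-corrected EMG response: mean rectified activity
        during stimulus presentation minus mean rectified activity in
        the pre-stimulus baseline. Positive values indicate activation
        (e.g. zygomaticus major to happy faces); negative values
        indicate relaxation (e.g. corrugator to happy faces)."""
        emg = np.abs(np.asarray(emg, float))   # full-wave rectify
        n_base = int(baseline_s * fs)
        baseline = emg[:n_base].mean()
        return emg[n_base:].mean() - baseline
    ```

    One such score per muscle, trial, and condition would then feed the static-versus-dynamic and sex-difference comparisons.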

  3. Eagle's syndrome with facial palsy

    Directory of Open Access Journals (Sweden)

    Mohammed Al-Hashim

    2017-01-01

    Full Text Available Eagle's syndrome (ES) is a rare disease in which an elongated styloid process compresses adjacent structures. We describe a rare presentation of ES in which the patient presented with facial palsy. Facial palsy as a presentation of ES is very rare; a review of the English literature revealed only one previously reported case. Our case is a 39-year-old male who presented with left facial palsy and a 9-year history of the classical symptoms of ES. A computed tomography scan with three-dimensional reconstruction confirmed the diagnosis. He was started on conservative management but without significant improvement. Surgical intervention was offered, but the patient refused. It is important for otolaryngologists, dentists, and other specialists who deal with head and neck problems to be able to recognize ES despite its rarity. Although the patient responded to a treatment similar to that of Bell's palsy because of the clinical features and imaging, ES was most likely the cause of his facial palsy.

  4. The influence of attention toward facial expressions on size perception.

    Science.gov (United States)

    Choi, Jeong-Won; Kim, Kiho; Lee, Jang-Han

    2016-01-01

    According to the New Look theory, size perception is affected by emotional factors. Although previous studies have attempted to explain the effects of both emotion and motivation on size perception, they have failed to identify the underlying mechanisms. This study aimed to investigate the underlying mechanisms of size perception by applying attention toward facial expressions using the Ebbinghaus illusion as a measurement tool. The participants, female university students, were asked to judge the size of a target stimulus relative to the size of facial expressions (i.e., happy, angry, and neutral) surrounding the target. The results revealed that the participants perceived angry and neutral faces to be larger than happy faces. This finding indicates that individuals pay closer attention to neutral and angry faces than happy ones. These results suggest that the mechanisms underlying size perception involve cognitive processes that focus attention toward relevant stimuli and block out irrelevant stimuli.

  5. Neural Mechanism of Inferring Person's Inner Attitude towards Another Person through Observing the Facial Affect in an Emotional Context.

    Science.gov (United States)

    Kim, Ji-Woong; Kim, Jae-Jin; Jeong, Bumseok; Kim, Sung-Eun; Ki, Seon Wan

    2010-03-01

    The goal of the present study was to identify the brain mechanism involved in the attribution of a person's attitude toward another person, using facial affective pictures and pictures displaying an affectively loaded situation. Twenty-four right-handed healthy subjects volunteered for our study. We used functional magnetic resonance imaging (fMRI) to examine brain activation during an attitude attribution task compared to a gender matching task. We identified activation in the left inferior frontal cortex, left superior temporal sulcus, and left inferior parietal lobule during the attitude attribution task, compared to the gender matching task. This study suggests that the mirror neuron system and the ventrolateral inferior frontal cortex play a critical role in the attribution of a person's inner attitude towards another person in an emotional situation.

  6. Fixation to features and neural processing of facial expressions in a gender discrimination task.

    Science.gov (United States)

    Neath, Karly N; Itier, Roxane J

    2015-10-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it also appears on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected a general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect from ∼150 ms until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye-sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. Copyright © 2015 Elsevier Inc. All rights reserved.
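    ERP component effects like those described are commonly quantified as the mean amplitude within a latency window at selected electrodes. A minimal sketch (the window bounds below are illustrative N170 values, not taken from the study):

    ```python
    import numpy as np

    def mean_amplitude(erp, times, window):
        """Mean ERP amplitude within a latency window.

        erp: single-channel average waveform (µV)
        times: sample times in ms, same length as erp
        window: (start_ms, end_ms), inclusive"""
        times = np.asarray(times, float)
        erp = np.asarray(erp, float)
        mask = (times >= window[0]) & (times <= window[1])
        return float(erp[mask].mean())
    ```

    Comparing such window means across fixation locations and expressions (per component and electrode site) is the kind of measure the reported analyses rest on.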

  7. Affect perception across cultures: the role of cognitive mechanisms

    Directory of Open Access Journals (Sweden)

    Jan B Engelmann

    2013-03-01

    Full Text Available Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to affect perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of affect intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Recent findings indicating high levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions in Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception has identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive style, such as attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations.

  8. Virtual Characters: Visual Realism Affects Response Time and Decision-Making

    Science.gov (United States)

    Sibuma, Bernadette

    2012-01-01

    This study integrates agent research with a neurocognitive technique to study how character faces affect cognitive processing. The N170 event-related potential (ERP) was used to study face processing during simple decision-making tasks. Twenty-five adults responded to facial expressions (fear/neutral) presented in three designs…

  9. Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions

    Science.gov (United States)

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2018-01-01

    Facial mimicry (FM) is an automatic response that imitates the facial expressions of others. However, the neural correlates of the phenomenon are not yet well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS) and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., inferior frontal gyrus, a classical Mirror Neuron System (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions. PMID:29467691

  10. Recognition of Facial Expressions of Different Emotional Intensities in Patients with Frontotemporal Lobar Degeneration

    Directory of Open Access Journals (Sweden)

    Roy P. C. Kessels

    2007-01-01

    Full Text Available Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD. Also, FTLD patients show impairments in emotion processing. Specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which thus may have been a confounding factor in previous studies. Also, ceiling effects are often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at the recognition of the emotion anger. Also, the patients performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

  11. Effect of Facial Cosmetic Acupuncture on Facial Elasticity: An Open-Label, Single-Arm Pilot Study

    Directory of Open Access Journals (Sweden)

    Younghee Yun

    2013-01-01

    Full Text Available Background. The use of acupuncture for cosmetic purposes has gained popularity worldwide. Facial cosmetic acupuncture (FCA) is applied to the head, face, and neck. However, little evidence supports the efficacy and safety of FCA. We hypothesized that FCA affects facial elasticity by restoring resting mimetic muscle tone through the insertion of needles into the muscles of the head, face, and neck. Methods. This open-label, single-arm pilot study was implemented at Kyung Hee University Hospital at Gangdong from August through September 2011. Participants were women aged 40 to 59 years with a Glogau photoaging scale III. Participants received five treatment sessions over three weeks and were measured before and after FCA. The primary outcome was the Moire topography criteria; the secondary outcome was a patient-oriented self-assessment scale of facial elasticity. Results. Among 50 women screened, 28 were eligible and 27 completed the five FCA treatment sessions. A significant improvement after FCA treatment was evident according to the mean change in Moire topography criteria (from 1.70 ± 0.724 to 2.26 ± 1.059, P<0.0001). The most common adverse event was mild bruising at the needle site. Conclusions. In this pilot study, FCA showed promising results as a therapy for facial elasticity. However, further large-scale trials with a controlled design and objective measurements are needed.

  12. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    Science.gov (United States)

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust influence the processing of disgust-related information. In this preregistered study, we tested whether the classification of dynamic facial expressions as happy or disgusted, and the emotional evaluation of these expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces, even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect, and suggest that further studies should focus on reproducibility, specifying the experimental circumstances under which odor effects on facial expressions may be present versus absent.

  13. Visuo-spatial interference affects the identification of emotional facial expressions in unmedicated Parkinson's patients.

    Science.gov (United States)

    García-Rodríguez, Beatriz; Guillén, Carmen Casares; Barba, Rosa Jurado; Rubio Valladolid, Gabriel; Arjona, José Antonio Molina; Ellgring, Heiner

    2012-02-15

    There is evidence that visuo-spatial capacity can become overloaded when processing a secondary visual task (Dual Task, DT), as occurs in daily life. Hence, we investigated the influence of visuo-spatial interference on the identification of emotional facial expressions (EFEs) in the early stages of Parkinson's disease (PD). We compared the identification of 24 emotional faces illustrating six basic emotions in unmedicated, recently diagnosed PD patients (n = 16) and healthy adults (n = 20), under two different conditions: a) simple EFE identification, and b) identification with a concurrent visuo-spatial task (Corsi Blocks). EFE identification by PD patients was significantly worse than that of healthy adults when combined with another visual stimulus. Published by Elsevier B.V.

  14. Dissociable roles of internal feelings and face recognition ability in facial expression decoding.

    Science.gov (United States)

    Zhang, Lin; Song, Yiying; Liu, Ling; Liu, Jia

    2016-05-15

    The problem of emotion recognition has been tackled by researchers in both affective computing and cognitive neuroscience. While affective computing relies on analyzing visual features from facial expressions, it has been proposed that humans recognize emotions by internally simulating the emotional states conveyed by others' expressions, in addition to perceptual analysis of facial features. Here we investigated whether and how our internal feelings contributed to the ability to decode facial expressions. In two independent large samples of participants, we observed that individuals who generally experienced richer internal feelings exhibited a higher ability to decode facial expressions, and the contribution of internal feelings was independent of face recognition ability. Further, using voxel-based morphometry, we found that the gray matter volume (GMV) of bilateral superior temporal sulcus (STS) and the right inferior parietal lobule was associated with facial expression decoding through the mediating effect of internal feelings, while the GMV of bilateral STS, precuneus, and the right central opercular cortex contributed to facial expression decoding through the mediating effect of face recognition ability. In addition, the clusters in bilateral STS involved in the two components were neighboring yet separate. Our results may provide clues about the mechanism by which internal feelings, in addition to face recognition ability, serve as an important instrument for humans in facial expression decoding. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. When your face describes your memories: facial expressions during retrieval of autobiographical memories.

    Science.gov (United States)

    El Haj, Mohamad; Daoudi, Mohamed; Gallouj, Karim; Moustafa, Ahmed A; Nandrino, Jean-Louis

    2018-05-11

    Thanks to current advances in the software analysis of facial expressions, there is burgeoning interest in understanding the emotional facial expressions observed during the retrieval of autobiographical memories. This review describes research on facial expressions during autobiographical retrieval showing distinct emotional facial expressions according to the characteristics of the retrieved memories. More specifically, this research demonstrates that the retrieval of emotional memories can trigger corresponding emotional facial expressions (e.g. positive memories may trigger positive facial expressions). It also demonstrates variations in facial expressions according to the specificity, self-relevance, or past versus future direction of memory construction. Besides linking research on facial expressions during autobiographical retrieval to the cognitive and affective characteristics of autobiographical memory in general, this review positions the research within the broader context of research on the physiologic characteristics of autobiographical retrieval. We also provide several perspectives for clinical studies to investigate facial expressions in populations with deficits in autobiographical memory (e.g. whether autobiographical overgenerality in neurologic and psychiatric populations may trigger fewer emotional facial expressions). In sum, this review demonstrates how the evaluation of facial expressions during autobiographical retrieval may help us understand the functioning and dysfunctioning of autobiographical memory.

  16. What Is Expected from a Facial Trauma Caused by Violence?

    Directory of Open Access Journals (Sweden)

    Douglas Rangel Goulart

    2014-12-01

    Full Text Available Objectives: The aim of this retrospective study was to compare the peculiarities of maxillofacial injuries caused by interpersonal violence with those of other etiologic factors. Material and Methods: Medical records of 3,724 patients with maxillofacial injuries in São Paulo state (Brazil) were retrospectively analyzed. The data were submitted to statistical analysis (simple descriptive statistics and the Chi-squared test) using SPSS 18.0 software. Results: Data from 612 patients with facial injuries caused by violence were analyzed. The majority of the patients were male (81%; n = 496), with a mean age of 31.28 years (standard deviation of 13.33 years). These patients were more affected by mandibular and nose fractures when compared with all other patients (P < 0.01), although fewer injuries were recorded in other body parts (χ2 = 17.54; P < 0.01). Victims of interpersonal violence exhibited more injuries when the neurocranium was analyzed in isolation (χ2 = 6.85; P < 0.01). Conclusions: Facial trauma due to interpersonal violence seems to be related to a higher rate of facial fractures and lacerations when compared to all patients with facial injuries. Prominent areas of the face and neurocranium were more affected by injuries.
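
    The χ2 statistics reported above come from chi-squared tests of independence run in SPSS. As a hedged illustration only, the same kind of test can be reproduced with SciPy; the 2×2 contingency table below is invented for demonstration and is not the study's data.

    ```python
    # Illustrative chi-squared test of independence, mirroring the kind of
    # analysis reported above. The counts are invented, not the study's data.
    from scipy.stats import chi2_contingency

    # Rows: violence vs. other etiologies;
    # columns: injuries in other body parts (yes / no).
    table = [[120, 492],
             [980, 2132]]

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
    ```

    A small p-value would indicate that injury distribution is not independent of etiology, which is the form of the comparisons reported in the abstract.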

  17. Comparison of Direct Side-to-End and End-to-End Hypoglossal-Facial Anastomosis for Facial Nerve Repair.

    Science.gov (United States)

    Samii, Madjid; Alimohamadi, Maysam; Khouzani, Reza Karimi; Rashid, Masoud Rafizadeh; Gerganov, Venelin

    2015-08-01

    The hypoglossal facial anastomosis (HFA) is the gold standard for facial reanimation in patients with severe facial nerve palsy. The major drawbacks of the classic HFA technique are lingual morbidities due to hypoglossal nerve transection. The side-to-end HFA is a modification of the classic technique with fewer tongue-related morbidities. In this study we compared the outcomes of the classic end-to-end and the direct side-to-end HFA surgeries performed at our center with regard to the facial reanimation success rate and tongue-related morbidities. Twenty-six consecutive cases of HFA were enrolled. In 9 of them end-to-end anastomoses were performed, and 17 had direct side-to-end anastomoses. The House-Brackmann (HB) and Pitty and Tator (PT) scales were used to document surgical outcome. Hemiglossal atrophy, swallowing, and hypoglossal nerve function were assessed at follow-up. The original pathology was vestibular schwannoma in 15, meningioma in 4, brain stem glioma in 4, and other pathologies in 3. The mean interval between facial palsy and HFA was 18 months (range: 0-60). The median follow-up period was 20 months. The PT grade at follow-up was worse in patients with a longer interval between facial palsy and HFA (P value: 0.041). The lesion type was the only other factor that affected PT grade (the best results in vestibular schwannoma and the worst in the other pathologies group, P value: 0.038). The recovery period for facial tonicity was longer in patients with radiation therapy before HFA (13.5 vs. 8.5 months) and those with a longer than 2-year interval from facial palsy to HFA (13.5 vs. 8.5 months). Although no significant difference between the side-to-end and the end-to-end groups was seen in terms of facial nerve functional recovery, patients from the side-to-end group had a significantly lower rate of lingual morbidities (tongue hemiatrophy: 100% vs. 5.8%, swallowing difficulty: 55% vs. 11.7%, speech disorder 33% vs. 0%). With the side-to-end HFA

  18. The Turner Syndrome: Cognitive Deficits, Affective Discrimination, and Behavior Problems.

    Science.gov (United States)

    McCauley, Elizabeth; And Others

    1987-01-01

    The study attempted to link the cognitive and social problems seen in girls with Turner syndrome by assessing the girls' ability to process affective cues. Seventeen 9- to 17-year-old girls diagnosed with Turner syndrome were compared to a matched control group on a task that required interpretation of affective intention from facial expression.…

  19. Adolescents with HIV and facial lipoatrophy: response to facial stimulation

    Directory of Open Access Journals (Sweden)

    Jesus Claudio Gabana-Silveira

    2014-08-01

    Full Text Available OBJECTIVES: This study evaluated the effects of facial stimulation of the superficial muscles of the face in individuals with facial lipoatrophy associated with human immunodeficiency virus (HIV) and with no indication for treatment with polymethyl methacrylate. METHOD: The study sample comprised four adolescents of both genders ranging from 13 to 17 years of age. To participate in the study, participants had to score six points or fewer on the Facial Lipoatrophy Index. The facial stimulation program used in our study consisted of 12 weekly 30-minute therapy sessions. The therapy consisted of intra- and extra-oral muscle contraction and stretching maneuvers of the zygomaticus major and minor and the masseter muscles. Pre- and post-treatment results were obtained using anthropometric static measurements of the face and the Facial Lipoatrophy Index. RESULTS: The results suggest that the therapeutic program effectively improved the volume of the buccinator muscles. No significant differences were observed for the measurements of the medial portion of the face, the lateral portion of the face, the volume of the masseter muscle, or Facial Lipoatrophy Index scores. CONCLUSION: The results of our study suggest that facial maneuvers applied to the superficial muscles of the face of adolescents with facial lipoatrophy associated with HIV improved the facial volume related to the buccinator muscles. We believe that our results will encourage future research with HIV patients, especially patients who do not have the possibility of receiving an alternative aesthetic treatment.

  20. The effect of affective context on visuocortical processing of neutral faces in social anxiety - An ERP study

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2015-11-01

    Full Text Available It has been demonstrated that verbal context information alters the neural processing of ambiguous faces, such as faces with no apparent facial expression. In social anxiety, neutral faces may be implicitly threatening for socially anxious individuals due to their ambiguous nature, but even more so if these neutral faces are put in self-referential negative contexts. Therefore, we measured event-related brain potentials (ERPs) in response to neutral faces that were preceded by affective verbal information (negative, neutral, positive). Participants with low social anxiety (LSA; n = 23) and high social anxiety (HSA; n = 21) were asked to watch and rate the valence and arousal of the respective faces while continuous EEG was recorded. ERP analysis revealed that HSA participants showed elevated P100 amplitudes in response to faces, but reduced structural encoding of faces as indexed by reduced N170 amplitudes. In general, affective context led to an enhanced early posterior negativity (EPN) for negative compared to neutral facial expressions. Moreover, HSA compared to LSA participants showed enhanced late positive potentials (LPP) to negatively contextualized faces, whereas in LSA participants this effect was found for faces in positive contexts. Also, HSA participants rated faces in negative contexts as more negative compared to LSA participants. These results point to enhanced vigilance for neutral faces regardless of context in HSA, while structural encoding seems to be diminished (avoidance). Interestingly, later components of sustained processing (LPP) indicate that LSA individuals show enhanced visuocortical processing for faces in positive contexts (happy bias), whereas this seems to be the case for negatively contextualized faces in HSA (threat bias). Finally, our results add new evidence that top-down information, in interaction with individual anxiety levels, can influence early-stage aspects of visual perception.

  1. Sad Facial Expressions Increase Choice Blindness

    Directory of Open Access Journals (Sweden)

    Yajie Wang

    2018-01-01

    Full Text Available Previous studies have discovered a fascinating phenomenon known as choice blindness—individuals fail to detect mismatches between the face they choose and the face replaced by the experimenter. Although previous studies have reported a couple of factors that can modulate the magnitude of choice blindness, the potential effect of facial expression on choice blindness has not yet been explored. Using faces with sad and neutral expressions (Experiment 1) and faces with happy and neutral expressions (Experiment 2) in the classic choice blindness paradigm, the present study investigated the effects of facial expressions on choice blindness. The results showed that the detection rate was significantly lower for sad faces than neutral faces, whereas no significant difference was observed between happy faces and neutral faces. An exploratory analysis of verbal reports found that participants who reported fewer facial features for sad (as compared to neutral) expressions also tended to show a lower detection rate for sad (as compared to neutral) faces. These findings indicated that sad facial expressions increased choice blindness, which might have resulted from inhibition of further processing of the detailed facial features by the less attractive sad expressions (as compared to neutral expressions).

  2. Sad Facial Expressions Increase Choice Blindness.

    Science.gov (United States)

    Wang, Yajie; Zhao, Song; Zhang, Zhijie; Feng, Wenfeng

    2017-01-01

    Previous studies have discovered a fascinating phenomenon known as choice blindness-individuals fail to detect mismatches between the face they choose and the face replaced by the experimenter. Although previous studies have reported a couple of factors that can modulate the magnitude of choice blindness, the potential effect of facial expression on choice blindness has not yet been explored. Using faces with sad and neutral expressions (Experiment 1) and faces with happy and neutral expressions (Experiment 2) in the classic choice blindness paradigm, the present study investigated the effects of facial expressions on choice blindness. The results showed that the detection rate was significantly lower for sad faces than neutral faces, whereas no significant difference was observed between happy faces and neutral faces. The exploratory analysis of verbal reports found that participants who reported fewer facial features for sad (as compared to neutral) expressions also tended to show a lower detection rate for sad (as compared to neutral) faces. These findings indicated that sad facial expressions increased choice blindness, which might have resulted from inhibition of further processing of the detailed facial features by the less attractive sad expressions (as compared to neutral expressions).

  3. Processing of individual items during ensemble coding of facial expressions

    Directory of Open Access Journals (Sweden)

    Huiyun Li

    2016-09-01

    Full Text Available There is growing evidence that human observers are able to extract the mean emotion or other types of information from a set of faces. The most intriguing aspect of this phenomenon is that observers often fail to identify or form a representation for individual faces in a face set. However, most of these results were based on judgments made under limited processing resources. We examined a wider range of exposure times and observed how the relationship between the extraction of a mean and the representation of individual facial expressions changed. The results showed that with an exposure time of 50 milliseconds, observers were more sensitive to the mean representation than to individual representations, replicating the typical findings in the literature. With longer exposure times, however, observers were able to extract both individual and mean representations more accurately. Furthermore, diffusion model analysis revealed that the mean representation is more prone to suffer from noise accumulated during redundant processing time and leads to a more conservative decision bias, whereas individual representations seem more resistant to this noise. The results suggest that the encoding of emotional information from multiple faces may take two forms: single-face processing and crowd-face processing.

  4. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Decoding facial blends of emotion: visual field, attentional and hemispheric biases.

    Science.gov (United States)

    Ross, Elliott D; Shayya, Luay; Champlain, Amanda; Monnot, Marilee; Prodan, Calin I

    2013-12-01

    Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention when facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower facial emotions, more so than upper facial emotions, are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced. Published by Elsevier Inc.

  6. Interest and attention in facial recognition.

    Science.gov (United States)

    Burgess, Melinda C R; Weaver, George E

    2003-04-01

    When applied to facial recognition, the levels of processing paradigm has yielded consistent results: faces processed in deep conditions are recognized better than faces processed under shallow conditions. However, there are multiple explanations for this occurrence. The own-race advantage in facial recognition, the tendency to recognize faces from one's own race better than faces from another race, is also consistently shown but not clearly explained. This study was designed to test the hypothesis that the levels of processing findings in facial recognition are a result of interest and attention, not differences in processing. This hypothesis was tested for both own and other faces with 105 Caucasian general psychology students. Levels of processing was manipulated as a between-subjects variable; students were asked to answer one of four types of study questions, e.g., "deep" or "shallow" processing questions, while viewing the study faces. Students' recognition of a subset of previously presented Caucasian and African-American faces from a test-set with an equal number of distractor faces was tested. They indicated their interest in and attention to the task. The typical levels of processing effect was observed with better recognition performance in the deep conditions than in the shallow conditions for both own- and other-race faces. The typical own-race advantage was also observed regardless of level of processing condition. For both own- and other-race faces, level of processing explained a significant portion of the recognition variance above and beyond what was explained by interest in and attention to the task.

  7. Korelasi indeks morfologi wajah dengan sudut interinsisal dan tinggi wajah secara sefalometri (Cephalometric correlation of facial morphology index with interincisal angle and facial height)

    Directory of Open Access Journals (Sweden)

    Pricillia Priska Sianita K

    2013-12-01

    Full Text Available Background: In a disaster or criminal case, comprehensive information is needed for the identification of each victim, especially in cases that leave only a skull without any other information to aid the identification process, so that facial reconstruction is required. Identification can draw on specific facial characteristics, race, and head-neck measurements such as the facial morphology index, the interincisal angle, and facial height. Purpose: The aim of this study was to determine the correlation of the facial morphology index with the interincisal angle and facial height through cephalometric measurement. Methods: The samples were cephalograms of 31 subjects of the Deutro-Malayid race who met the inclusion criteria. Cephalometric analysis was performed on all samples, followed by the Pearson correlation statistical test. Results: A correlation was found between the facial morphology index and facial height, but no correlation was found between the facial morphology index and the interincisal angle. Conclusion: The study showed that cephalometric measurements of the facial morphology index and facial height could be used as additional information for the identification process.
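
    The analysis named above is a standard Pearson correlation test. As a hedged sketch only, it can be reproduced with SciPy; the measurement values below are invented for illustration and are not the study's cephalometric data.

    ```python
    # Illustrative Pearson correlation between two cephalometric measures.
    # All values are invented for demonstration; they are not study data.
    from scipy.stats import pearsonr

    facial_morphology_index = [82.1, 85.4, 88.0, 90.2, 84.7, 87.3, 91.5, 86.0]
    facial_height_mm = [110.2, 113.5, 116.0, 119.1, 112.4, 115.2, 120.3, 114.0]

    r, p = pearsonr(facial_morphology_index, facial_height_mm)
    print(f"r = {r:.3f}, p = {p:.4f}")
    ```

    A significant positive r between the two measures would mirror the correlation the abstract reports between the facial morphology index and facial height.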

  8. Facial Expression Recognition using Multiclass Ensemble Least-Square Support Vector Machine

    Science.gov (United States)

    Lawi, Armin; Sya'Rani Machrizzandi, M.

    2018-03-01

    Facial expression is one of the behavioral characteristics of human beings. The use of a biometric technology system with facial expression characteristics makes it possible to recognize a person’s mood or emotion. The basic components of a facial expression analysis system are face detection, face image extraction, facial classification, and facial expression recognition. This paper uses the Principal Component Analysis (PCA) algorithm to extract facial features for the expression parameters happy, sad, neutral, angry, fear, and disgusted. A Multiclass Ensemble Least-Squares Support Vector Machine (MELS-SVM) is then used for the classification of facial expressions. The MELS-SVM model, evaluated on 185 expression images of 10 persons, achieved a high accuracy of 99.998% using an RBF kernel.
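
    The PCA-plus-SVM pipeline described above can be sketched in outline. This is a minimal illustration, not the authors' implementation: a standard scikit-learn SVC with an RBF kernel stands in for their ensemble least-squares SVM (MELS-SVM), and the image data, dimensions, and hyperparameters are invented assumptions.

    ```python
    # Minimal PCA -> RBF-kernel SVM sketch of a facial-expression classifier.
    # Random arrays stand in for real face images; all sizes are illustrative.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_samples, n_pixels, n_classes = 185, 64 * 64, 6   # 6 expression classes
    X = rng.random((n_samples, n_pixels))              # flattened face images
    y = rng.integers(0, n_classes, size=n_samples)     # happy, sad, neutral, ...

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # PCA compresses each image to a short "eigenface" feature vector;
    # the RBF-kernel SVM then classifies the expression (one-vs-one multiclass).
    model = make_pipeline(PCA(n_components=40), SVC(kernel="rbf"))
    model.fit(X_train, y_train)
    predictions = model.predict(X_test)
    ```

    On real data the inputs would be cropped and aligned face regions, and the PCA dimensionality and SVM hyperparameters would be tuned by cross-validation.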

  9. [Facial nerve neurinomas].

    Science.gov (United States)

    Sokołowski, Jacek; Bartoszewicz, Robert; Morawski, Krzysztof; Jamróz, Barbara; Niemczyk, Kazimierz

    2013-01-01

    The main purpose of this study was to evaluate the diagnostics, surgical technique, and treatment results of facial nerve neurinomas, and to compare them with the literature. Seven patients (2005-2011) with facial nerve schwannomas treated in the Department of Otolaryngology, Medical University of Warsaw, were included in a retrospective analysis. All patients were assessed with the history of the disease, physical examination, hearing tests, computed tomography and/or magnetic resonance imaging, and electronystagmography. Cases were observed for potential complications and recurrences. Neurinomas of the facial nerve occurred in the vertical segment (n=2), the facial nerve geniculum (n=1), and the internal auditory canal (n=4). The symptoms observed in patients were facial nerve paresis (n=3), hearing loss (n=2), and dizziness (n=1). Magnetic resonance imaging and computed tomography confirmed the presence of the tumor and allowed its staging to be assessed. The schwannomas of the facial nerve were surgically removed using the middle fossa approach (n=5) or antromastoidectomy (n=2). Anatomical continuity of the facial nerve was achieved in 3 cases. Twelve months after surgery, facial nerve paresis was rated at level II-III° HB. There was no recurrence of the tumor on radiological observation. Facial nerve neurinoma is a rare tumor. Current surgical techniques allow, in most cases, radical removal of the lesion and reconstruction of VII nerve function. The rate of recurrence is low. A tumor of the facial nerve should be considered in the differential diagnosis of nerve VII paresis. Copyright © 2013 Polish Otorhinolaryngology - Head and Neck Surgery Society. Published by Elsevier Urban & Partner Sp. z.o.o. All rights reserved.

  10. Contralateral botulinum toxin injection to improve facial asymmetry after acute facial paralysis.

    Science.gov (United States)

    Kim, Jin

    2013-02-01

    The application of botulinum toxin to the healthy side of the face in patients with long-standing facial paralysis has been shown to be a minimally invasive technique that improves facial symmetry at rest and during facial motion, but our experience using botulinum toxin therapy for facial sequelae prompted the idea that botulinum toxin might also be useful in acute cases of facial paralysis to improve facial asymmetry. In cases in which medical or surgical treatment options are limited because of existing medical problems or advanced age, most patients with acute facial palsy are advised to await spontaneous recovery or are informed that no effective intervention exists. The purpose of this study was to evaluate the effect of botulinum toxin treatment for facial asymmetry in 18 patients after acute facial palsy who could not be optimally treated by medical or surgical management because of severe medical or other problems. From 2009 to 2011, nine patients with Bell's palsy, 5 with herpes zoster oticus, and 4 with traumatic facial palsy (10 men and 8 women; age range, 22-82 yr; mean, 50.8 yr) participated in this study. Botulinum toxin A (Botox; Allergan Incorporated, Irvine, CA, USA) was injected using a tuberculin syringe with a 27-gauge needle. The amount injected per site varied from 2.5 to 3 U, and the total dose used per patient was 32 to 68 U (mean, 47.5 +/- 8.4 U). After administration of a single dose of botulinum toxin A on the nonparalyzed side of 18 patients with acute facial paralysis, marked relief of facial asymmetry was observed in 8 patients within 1 month of injection. Decreased facial asymmetry and strengthened facial function on the paralyzed side led to increased HB and SB grades within 6 months after injection. The use of botulinum toxin in acute facial palsy is of great value. Such therapy decreases the relative hyperkinesis contralateral to the paralysis, leading to more symmetric function. Especially in patients with medical

  11. Rejuvenecimiento facial

    Directory of Open Access Journals (Sweden)

    L. Daniel Jacubovsky, Dr.

    2010-01-01

    Full Text Available Facial aging is a process unique and particular to each individual, governed above all by his or her genetic makeup. The facelift is a complex technique, developed in our specialty since the beginning of the century, to reverse the principal signs of this process. The secondary factors that bear on facial aging are numerous, and the cervicofacial rhytidectomies or facelifts described have therefore sought to correct the physiognomic changes of aging by working, as described, through all the tissue planes involved. This surgery therefore demands thorough knowledge of surgical anatomy, along with skill and experience, to reduce complications, surgical stigmata, and secondary revisions. Facial rhytidectomy has evolved toward a simpler procedure, with shorter incisions and less extensive dissections. Muscle suspensions have varied in their execution, and the vectors of lift and skin resection are crucial to the aesthetic results of cervicofacial surgery. Today these traction vectors are more vertical. Correction of flaccidity is accompanied by an interest in restoring volume to the surface of the face, especially the middle third. Surgical rejuvenation techniques, especially the facelift, require planning tailored to each patient. Techniques adjunct to the facelift, such as blepharoplasty, mentoplasty, neck liposuction, facial implants, and others, have also evolved positively toward reduced risk and greater aesthetic success.

  12. Cognitive penetrability and emotion recognition in human facial expressions

    Directory of Open Access Journals (Sweden)

    Francesco Marchi

    2015-06-01

    Full Text Available Do our background beliefs, desires, and mental images influence our perceptual experience of the emotions of others? In this paper, we will address the possibility of cognitive penetration of perceptual experience in the domain of social cognition. In particular, we focus on emotion recognition based on the visual experience of facial expressions. After introducing the current debate on cognitive penetration, we review examples of perceptual adaptation for facial expressions of emotion. This evidence supports the idea that facial expressions are perceptually processed as wholes. That is, the perceptual system integrates lower-level facial features, such as eyebrow orientation, mouth angle etc., into facial compounds. We then present additional experimental evidence showing that in some cases, emotion recognition on the basis of facial expression is sensitive to and modified by the background knowledge of the subject. We argue that such sensitivity is best explained as a difference in the visual experience of the facial expression, not just as a modification of the judgment based on this experience. The difference in experience is characterized as the result of the interference of background knowledge with the perceptual integration process for faces. Thus, according to the best explanation, we have to accept cognitive penetration in some cases of emotion recognition. Finally, we highlight a recent model of social vision in order to propose a mechanism for cognitive penetration used in the face-based recognition of emotion.

  13. IncobotulinumtoxinA treatment of facial nerve palsy after neurosurgery.

    Science.gov (United States)

    Akulov, Mihail A; Orlova, Ol'ga R; Orlova, Aleksandra S; Usachev, Dmitrij J; Shimansky, Vadim N; Tanjashin, Sergey V; Khatkova, Svetlana E; Yunosha-Shanyavskaya, Anna V

    2017-10-15

    This study evaluates the effect of incobotulinumtoxinA in the acute and chronic phases of facial nerve palsy after neurosurgical interventions. Patients received incobotulinumtoxinA injections (active treatment group) or standard rehabilitation treatment (control group). Functional efficacy was assessed using House-Brackmann, Yanagihara System and Sunnybrook Facial Grading scales, and Facial Disability Index self-assessment. Significant improvements on all scales were seen after 1 month of incobotulinumtoxinA treatment (active treatment group, p<0.05), but only after 3 months of rehabilitation treatment (control group, p<0.05). At 1 and 2 years post-surgery, the prevalence of synkinesis was significantly higher in patients in the control group compared with those receiving incobotulinumtoxinA treatment (p<0.05 and p<0.001, respectively). IncobotulinumtoxinA treatment resulted in significant improvements in facial symmetry in patients with facial nerve injury following neurosurgical interventions. Treatment was effective for the correction of the compensatory hyperactivity of mimic muscles on the unaffected side that develops in the acute period of facial nerve palsy, and for the correction of synkinesis on the affected side that develops in the long-term period. Appropriate dosing and patient education to perform exercises to restore mimic muscle function should be considered in multimodal treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Do facial movements express emotions or communicate motives?

    Science.gov (United States)

    Parkinson, Brian

    2005-01-01

    This article addresses the debate between emotion-expression and motive-communication approaches to facial movements, focusing on Ekman's (1972) and Fridlund's (1994) contrasting models and their historical antecedents. Available evidence suggests that the presence of others either reduces or increases facial responses, depending on the quality and strength of the emotional manipulation and on the nature of the relationship between interactants. Although both display rules and social motives provide viable explanations of audience "inhibition" effects, some audience facilitation effects are less easily accommodated within an emotion-expression perspective. In particular, emotion is not a sufficient condition for a corresponding "expression," even discounting explicit regulation, and, apparently, "spontaneous" facial movements may be facilitated by the presence of others. Further, there is no direct evidence that any particular facial movement provides an unambiguous expression of a specific emotion. However, information communicated by facial movements is not necessarily extrinsic to emotion. Facial movements not only transmit emotion-relevant information but also contribute to ongoing processes of emotional action in accordance with pragmatic theories.

  15. Mere social categorization modulates identification of facial expressions of emotion.

    Science.gov (United States)

    Young, Steven G; Hugenberg, Kurt

    2010-12-01

    The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  16. Peripheral facial palsy: Speech, communication and oral motor function.

    Science.gov (United States)

    Movérare, T; Lohmander, A; Hultcrantz, M; Sjögreen, L

    2017-02-01

    The aim of the present study was to examine the effect of acquired unilateral peripheral facial palsy on speech, communication and oral functions and to study the relationship between the degree of facial palsy and articulation, saliva control, eating ability and lip force. In this descriptive study, 27 patients (15 men and 12 women, mean age 48 years) with unilateral peripheral facial palsy were included if they were graded under 70 on the Sunnybrook Facial Grading System. The assessment was carried out in connection with customary visits to the ENT Clinic and comprised lip force, articulation and intelligibility, together with perceived ability to communicate and ability to eat and control saliva conducted through self-response questionnaires. The patients with unilateral facial palsy had significantly lower lip force, poorer articulation and ability to eat and control saliva compared with reference data in healthy populations. The degree of facial palsy correlated significantly with lip force but not with articulation, intelligibility, perceived communication ability or reported ability to eat and control saliva. Acquired peripheral facial palsy may affect communication and the ability to eat and control saliva. Physicians should be aware that there is no direct correlation between the degree of facial palsy and the possible effect on communication, eating ability and saliva control. Physicians are therefore recommended to ask specific questions relating to problems with these functions during customary medical visits and offer possible intervention by a speech-language pathologist or a physiotherapist. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  17. Fusiform Correlates of Facial Memory in Autism

    Directory of Open Access Journals (Sweden)

    Nicholas Lange

    2013-07-01

    Full Text Available Prior studies have shown that performance on standardized measures of memory in children with autism spectrum disorder (ASD is substantially reduced in comparison to matched typically developing controls (TDC. Given reported deficits in face processing in autism, the current study compared performance on an immediate and delayed facial memory task for individuals with ASD and TDC. In addition, we examined volumetric differences in classic facial memory regions of interest (ROI between the two groups, including the fusiform, amygdala, and hippocampus. We then explored the relationship between ROI volume and facial memory performance. We found larger volumes in the autism group in the left amygdala and left hippocampus compared to TDC. In contrast, TDC had larger left fusiform gyrus volumes when compared with ASD. Interestingly, we also found significant negative correlations between delayed facial memory performance and volume of the left and right fusiform and the left hippocampus for the ASD group but not for TDC. The possibility of larger fusiform volume as a marker of abnormal connectivity and decreased facial memory is discussed.

  18. The visibility of social class from facial cues.

    Science.gov (United States)

    Bjornsdottir, R Thora; Rule, Nicholas O

    2017-10-01

    Social class meaningfully impacts individuals' life outcomes and daily interactions, and the mere perception of one's socioeconomic standing can have significant ramifications. To better understand how people infer others' social class, we therefore tested the legibility of class (operationalized as monetary income) from facial images, finding across 4 participant samples and 2 stimulus sets that perceivers categorized the faces of rich and poor targets significantly better than chance. Further investigation showed that perceivers categorize social class using minimal facial cues and employ a variety of stereotype-related impressions to make their judgments. Of these, attractiveness accurately cued higher social class in self-selected dating profile photos. However, only the stereotype that well-being positively relates to wealth served as a valid cue in neutral faces. Indeed, neutrally posed rich targets displayed more positive affect relative to poor targets and perceivers used this affective information to categorize their social class. Impressions of social class from these facial cues also influenced participants' evaluations of the targets' employability, demonstrating that face-based perceptions of social class may have important downstream consequences. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
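
    The record above reports that perceivers categorized rich and poor faces "significantly better than chance". As an illustration (not the paper's actual analysis), accuracy in a two-alternative categorization task can be checked against the 50% chance level with an exact one-sided binomial test; the trial counts below are hypothetical.

```python
from math import comb

def binom_p_greater(k: int, n: int, p: float = 0.5) -> float:
    """One-sided exact binomial test: P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Hypothetical example: 60 correct rich/poor judgments out of 100
# two-choice trials, tested against the 50% chance level.
p_val = binom_p_greater(60, 100)
print(p_val < 0.05)  # -> True (p is roughly 0.03)
```

A normal approximation would give a similar answer here, but the exact test is just as easy for trial counts of this size.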

  19. ExpNet: Landmark-Free, Deep, 3D Facial Expressions

    OpenAIRE

    Chang, Feng-Ju; Tran, Anh Tuan; Hassner, Tal; Masi, Iacopo; Nevatia, Ram; Medioni, Gerard

    2018-01-01

    We describe a deep learning based method for estimating 3D facial expression coefficients. Unlike previous work, our process does not rely on facial landmark detection methods as a proxy step. Recent methods have shown that a CNN can be trained to regress accurate and discriminative 3D morphable model (3DMM) representations directly from image intensities. By foregoing facial landmark detection, these methods were able to estimate shapes for occluded faces appearing in unprecedented in-the-wild conditions.
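
    The expression coefficients such a network regresses drive a linear 3D morphable model: a face shape is the mean shape plus a weighted sum of expression blendshapes. The following is a minimal numpy sketch of that linear model only (not ExpNet's CNN); the tiny dimensions and random basis are stand-ins for a real 3DMM's tens of thousands of vertices and ~29 expression components.

```python
import numpy as np

# Toy dimensions standing in for a real 3DMM (hypothetical values).
n_vertices, n_exp = 5, 3
rng = np.random.default_rng(0)

mean_shape = rng.normal(size=3 * n_vertices)          # neutral face, flattened (x, y, z)
exp_basis = rng.normal(size=(3 * n_vertices, n_exp))  # expression blendshape basis

def apply_expression(coeffs: np.ndarray) -> np.ndarray:
    """Reconstruct a face shape from expression coefficients (linear 3DMM)."""
    return mean_shape + exp_basis @ coeffs

# With all-zero coefficients the model returns the neutral (mean) shape.
neutral = apply_expression(np.zeros(n_exp))
assert np.allclose(neutral, mean_shape)
```

In ExpNet's setting, the CNN's job is to predict `coeffs` directly from image pixels; the reconstruction step itself stays this simple.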

  20. A neurophysiological study of facial numbness in multiple sclerosis: Integration with clinical data and imaging findings.

    Science.gov (United States)

    Koutsis, Georgios; Kokotis, Panagiotis; Papagianni, Aikaterini E; Evangelopoulos, Maria-Eleftheria; Kilidireas, Constantinos; Karandreas, Nikolaos

    2016-09-01

    To integrate neurophysiological findings with clinical and imaging data in a consecutive series of multiple sclerosis (MS) patients developing facial numbness during the course of an MS attack. Nine consecutive patients with MS and recent-onset facial numbness were studied clinically, imaged with routine MRI, and assessed neurophysiologically with trigeminal somatosensory evoked potential (TSEP), blink reflex (BR), masseter reflex (MR), facial nerve conduction, facial muscle and masseter EMG studies. All patients had unilateral facial hypoesthesia on examination and lesions in the ipsilateral pontine tegmentum on MRI. All patients had abnormal TSEPs upon stimulation of the affected side, excepting one that was tested following remission of numbness. BR was the second most sensitive neurophysiological method with 6/9 examinations exhibiting an abnormal R1 component. The MR was abnormal in 3/6 patients, always on the affected side. Facial conduction and EMG studies were normal in all patients but one. Facial numbness was always related to abnormal TSEPs. A concomitant R1 abnormality on BR allowed localization of the responsible pontine lesion, which closely corresponded with MRI findings. We conclude that neurophysiological assessment of MS patients with facial numbness is a sensitive tool, which complements MRI, and can improve lesion localization. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Facial attractiveness of skeletal class I and class II malocclusion as perceived by laypeople, patients and clinicians.

    Science.gov (United States)

    Pace, Michela; Cioffi, Iacopo; D'antò, Vincenzo; Valletta, Alessandra; Valletta, Rosa; Amato, Massimo

    2018-06-01

    Physical attractiveness is dependent on facial appearance. The facial profile plays a crucial role in facial attractiveness and can be improved with orthodontic treatment. The aesthetic assessment of facial appearance may be influenced by the cultural background and education of the assessor and dependent upon the experience level of dental professionals. This study aimed to evaluate how the sagittal jaw relationship in Class I and Class II individuals affects facial attractiveness, and whether the assessor's professional education and background affect the perception of facial attractiveness. Facial silhouettes simulating mandibular retrusion, maxillary protrusion, mandibular retrusion combined with maxillary protrusion, bimaxillary protrusion and severe bimaxillary protrusion in Class I and Class II patients were assessed by five groups of people with different backgrounds and education levels (i.e., 23 expert orthodontists, 21 orthodontists, 15 maxillofacial surgeons, 19 orthodontic patients and 28 laypeople). Straight facial profiles were judged to be more attractive than convex profiles due to severe mandibular retrusion and to mandibular retrusion combined with maxillary protrusion (all P < .05). Convex profiles were rated less attractive by clinicians than by patients and laypeople (all P < .05), and Class II profiles were judged less attractive than Class I profiles. The assessment of facial attractiveness is dependent on the assessor's education and background. Laypeople and patients are considerably less sensitive to abnormal sagittal jaw relationships than orthodontists.

  2. [Neurological disease and facial recognition].

    Science.gov (United States)

    Kawamura, Mitsuru; Sugimoto, Azusa; Kobayakawa, Mutsutaka; Tsuruya, Natsuko

    2012-07-01

    To discuss the neurological basis of facial recognition, we present our case reports of impaired recognition and a review of previous literature. First, we present a case of infarction and discuss prosopagnosia, which has had a large impact on face recognition research. From a study of patient symptoms, we assume that prosopagnosia may be caused by unilateral right occipitotemporal lesion and right cerebral dominance of facial recognition. Further, circumscribed lesion and degenerative disease may also cause progressive prosopagnosia. Apperceptive prosopagnosia is observed in patients with posterior cortical atrophy (PCA), pathologically considered as Alzheimer's disease, and associative prosopagnosia in frontotemporal lobar degeneration (FTLD). Second, we discuss face recognition as part of communication. Patients with Parkinson disease show social cognitive impairments, such as difficulty in facial expression recognition and deficits in theory of mind as detected by the reading the mind in the eyes test. Pathological and functional imaging studies indicate that social cognitive impairment in Parkinson disease is possibly related to damages in the amygdalae and surrounding limbic system. The social cognitive deficits can be observed in the early stages of Parkinson disease, and even in the prodromal stage, for example, patients with rapid eye movement (REM) sleep behavior disorder (RBD) show impairment in facial expression recognition. Further, patients with myotonic dystrophy type 1 (DM 1), which is a multisystem disease that mainly affects the muscles, show social cognitive impairment similar to that of Parkinson disease. Our previous study showed that facial expression recognition impairment of DM 1 patients is associated with lesion in the amygdalae and insulae. Our study results indicate that behaviors and personality traits in DM 1 patients, which are revealed by social cognitive impairment, are attributable to dysfunction of the limbic system.

  3. Facial exercises for facial rejuvenation: a control group study.

    Science.gov (United States)

    De Vos, Marie-Camille; Van den Brande, Helen; Boone, Barbara; Van Borsel, John

    2013-01-01

    Facial exercises are a noninvasive alternative to medical approaches to facial rejuvenation. Logopedists could be involved in providing these exercises. Little research has been conducted, however, on the effectiveness of exercises for facial rejuvenation. This study assessed the effectiveness of 4 exercises purportedly reducing wrinkles and sagging of the facial skin. A control group study was conducted with 18 participants, 9 of whom (the experimental group) underwent daily training for 7 weeks. Pictures of 5 facial areas (forehead, nasolabial folds, area above the upper lip, jawline and area under the chin), taken before and after the 7 weeks, were evaluated by a panel of laypersons. In addition, the participants of the experimental group evaluated their own pictures. Evaluation included the pairwise presentation of pictures before and after 7 weeks and scoring of the same pictures by means of visual analogue scales in a random presentation. Only one significant difference was found between the control and experimental groups: in the experimental group, the post-therapy picture of the upper lip area was more frequently chosen by the panel as the younger-looking one. It cannot be concluded that facial exercises are effective. More systematic research is needed. © 2013 S. Karger AG, Basel.

  4. Automated facial acne assessment from smartphone images

    Science.gov (United States)

    Amini, Mohammad; Vasefi, Fartash; Valdebran, Manuel; Huang, Kevin; Zhang, Haomiao; Kemp, William; MacKinnon, Nicholas

    2018-02-01

    A smartphone mobile medical application is presented, that provides analysis of the health of skin on the face using a smartphone image and cloud-based image processing techniques. The mobile application employs the use of the camera to capture a front face image of a subject, after which the captured image is spatially calibrated based on fiducial points such as position of the iris of the eye. A facial recognition algorithm is used to identify features of the human face image, to normalize the image, and to define facial regions of interest (ROI) for acne assessment. We identify acne lesions and classify them into two categories: those that are papules and those that are pustules. Automated facial acne assessment was validated by performing tests on images of 60 digital human models and 10 real human face images. The application was able to identify 92% of acne lesions within five facial ROIs. The classification accuracy for separating papules from pustules was 98%. Combined with in-app documentation of treatment, lifestyle factors, and automated facial acne assessment, the app can be used in both cosmetic and clinical dermatology. It allows users to quantitatively self-measure acne severity and treatment efficacy on an ongoing basis to help them manage their chronic facial acne.
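
    The abstract describes spatial calibration of the image from fiducial points such as the iris. One common way to do this (a minimal sketch under stated assumptions, not necessarily the app's method) exploits the near-constant adult horizontal iris diameter of about 11.7 mm as a physical ruler, converting pixel measurements to millimetres; the function names and the example numbers below are hypothetical.

```python
# Minimal sketch of fiducial-based spatial calibration, assuming the iris
# has already been located and its width measured in pixels.
IRIS_DIAMETER_MM = 11.7  # approximate adult average; an assumption of this sketch

def mm_per_pixel(iris_diameter_px: float) -> float:
    """Derive a physical scale for the image from the detected iris width."""
    if iris_diameter_px <= 0:
        raise ValueError("iris diameter must be positive")
    return IRIS_DIAMETER_MM / iris_diameter_px

def lesion_area_mm2(area_px: float, iris_diameter_px: float) -> float:
    """Convert a lesion's pixel area to mm^2 using the calibrated scale."""
    scale = mm_per_pixel(iris_diameter_px)
    return area_px * scale * scale

# Hypothetical example: a 400-pixel lesion in an image where the iris
# spans 117 px gives a scale of 0.1 mm/px and an area of 4.0 mm^2.
print(lesion_area_mm2(400, 117))
```

A scale derived this way lets lesion counts and sizes be compared across photos taken at different distances, which is what makes longitudinal self-tracking meaningful.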

  5. Comparative evaluation between facial attractiveness and subjective analysis of Facial Pattern

    Directory of Open Access Journals (Sweden)

    Olívia Morihisa

    2009-12-01

    Full Text Available AIM: To study two subjective facial analyses commonly used in orthodontic diagnosis, the evaluation of facial attractiveness and the definition of Facial Pattern, and to verify the association between them. METHODS: Two hundred and eight standardized face photographs (104 lateral and 104 frontal) of 104 randomly chosen individuals were used in the present study. They were classified as "pleasant", "acceptable" or "not pleasant" by two distinct groups: "Lay people" and "Orthodontists". The individuals were also classified according to their Facial Pattern by three calibrated examiners, using the lateral view only. RESULTS AND CONCLUSION: After statistical analysis, a strongly positive association was noted between facial attractiveness in the lateral view and Facial Pattern; the frontal-view attractiveness classification, however, did not agree well with Facial Pattern, as individuals tended to be classified as attractive even in Facial Pattern II.

  6. Retrospective case series of the imaging findings of facial nerve hemangioma.

    Science.gov (United States)

    Yue, Yunlong; Jin, Yanfang; Yang, Bentao; Yuan, Hui; Li, Jiandong; Wang, Zhenchang

    2015-09-01

    The aim was to compare high-resolution computed tomography (HRCT) and thin-section magnetic resonance imaging (MRI) findings of facial nerve hemangioma. The HRCT and MRI characteristics of 17 facial nerve hemangiomas diagnosed between 2006 and 2013 were retrospectively analyzed. All patients included in the study suffered from a space-occupying lesion of soft tissues at the geniculate ganglion fossa. The affected nerve was compared for size and shape with the contralateral unaffected nerve. HRCT showed irregular expansion and broadening of the facial nerve canal, damage of the bone wall and destruction of adjacent bone, with "point"-like or "needle"-like calcifications in 14 cases. The average CT value was 320.9 ± 141.8 Hu. Fourteen patients had a widened labyrinthine segment; 6/17 had tympanic segment widening; 2/17 had greater superficial petrosal nerve canal involvement, and 2/17 had an affected internal auditory canal (IAC) segment. On MRI, all lesions were significantly enhanced due to high blood supply. Using 2D FSE T2WI, the lesion detection rate was 82.4% (14/17). 3D fast imaging employing steady-state acquisition (3D FIESTA) revealed the lesions in all patients. HRCT showed that the average number of involved segments in the facial nerve canal was 2.41, while MRI revealed an average of 2.70 segments (P < 0.05). The HRCT and MRI findings of facial nerve hemangioma were typical, revealing irregular masses growing along the facial nerve canal, with calcifications and rich blood supply. Thin-section enhanced MRI was more accurate in lesion detection and assessment compared with HRCT.

  7. Facial trauma.

    Science.gov (United States)

    Peeters, N; Lemkens, P; Leach, R; Gemels, B; Schepers, S; Lemmens, W

    Patients with facial trauma must be assessed in a systematic way so as to avoid missing any injury. Severe and disfiguring facial injuries can be distracting. However, clinicians must first focus on the basics of trauma care, following the Advanced Trauma Life Support (ATLS) system of care. Maxillofacial trauma occurs in a significant number of severely injured patients. Life- and sight-threatening injuries must be excluded during the primary and secondary surveys. Special attention must be paid to sight-threatening injuries in stabilized patients through early referral to an appropriate specialist or the early initiation of emergency care treatment. The gold standard for the radiographic evaluation of facial injuries is computed tomography (CT) imaging. Nasal fractures are the most frequent isolated facial fractures. Isolated nasal fractures are principally diagnosed through history and clinical examination. Closed reduction is the most frequently performed treatment for isolated nasal fractures, with a fractured nasal septum as a predictor of failure. Ear, nose and throat surgeons, maxillofacial surgeons and ophthalmologists must all develop an adequate treatment plan for patients with complex maxillofacial trauma.

  8. Effect of an observer's presence on facial behavior during dyadic communication.

    Science.gov (United States)

    Yamamoto, K; Suzuki, N

    2012-06-01

    In everyday life, people communicate not only with another person but also in front of other people. How do people behave during communication when observed by others? Effects of an observer (presence vs absence) and interpersonal relationship (friends vs strangers vs alone) on facial behavior were examined. Participants viewed film clips that elicited positive affect (film presentation) and discussed their impressions about the clips (conversation). Participants rated their subjective emotions and social motives. Durations of smiles, gazes, and utterances of each participant were coded. The presence of an observer did not affect facial behavior during the film presentation, but did affect gazes during conversation. Whereas the presence of an observer seemed to facilitate affiliation in pairs of strangers, communication between friends was exclusive and not affected by an observer.

  9. An analysis of facial nerve function in irradiated and unirradiated facial nerve grafts

    International Nuclear Information System (INIS)

    Brown, Paul D.; Eshleman, Jeffrey S.; Foote, Robert L.; Strome, Scott E.

    2000-01-01

    Purpose: The effect of high-dose radiation therapy on facial nerve grafts is controversial. Some authors believe radiotherapy is so detrimental to the outcome of facial nerve graft function that dynamic or static slings should be performed instead of facial nerve grafts in all patients who are to receive postoperative radiation therapy. Unfortunately, the facial function achieved with dynamic and static slings is almost always inferior to that after facial nerve grafts. In this retrospective study, we compared facial nerve function in irradiated and unirradiated nerve grafts. Methods and Materials: The medical records of 818 patients with neoplasms involving the parotid gland who received treatment between 1974 and 1997 were reviewed, of whom 66 underwent facial nerve grafting. Fourteen patients who died or had a recurrence less than a year after their facial nerve graft were excluded. The median follow-up for the remaining 52 patients was 10.6 years. Cable nerve grafts were performed in 50 patients and direct anastomoses of the facial nerve in two. Facial nerve function was scored by means of the House-Brackmann (H-B) facial grading system. Twenty-eight of the 52 patients received postoperative radiotherapy. The median time from nerve grafting to start of radiotherapy was 5.1 weeks. The median and mean doses of radiation were 6000 and 6033 cGy, respectively, for the irradiated grafts. One patient received preoperative radiotherapy to a total dose of 5000 cGy in 25 fractions and underwent surgery 1 month after the completion of radiotherapy. This patient was placed, by convention, in the irradiated facial nerve graft cohort. Results: Potential prognostic factors for facial nerve function such as age, gender, extent of surgery at the time of nerve grafting, preoperative facial nerve palsy, duration of preoperative palsy if present, or number of previous operations in the parotid bed were relatively well balanced between irradiated and unirradiated patients. 

  10. Oro-facial-digital syndrome Type 1: A case report

    Directory of Open Access Journals (Sweden)

    Kanika Singh Dhull

    2014-01-01

    Full Text Available Oro-Facial Digital Syndrome (OFDS) is a generic term for a group of apparently distinctive genetic diseases that affect the development of the oral cavity, facial features, and digits. One of these is OFDS type I (OFDS-I), which has rarely been reported in Asian countries. This is the case report of a 13-year-old patient with OFDS type I who reported to the Department of Pedodontics and Preventive Dentistry with the complaint of discolored upper front teeth.

  11. Urinary oxytocin positively correlates with performance in facial visual search in unmarried males, without specific reaction to infant face.

    Science.gov (United States)

    Saito, Atsuko; Hamada, Hiroki; Kikusui, Takefumi; Mogi, Kazutaka; Nagasawa, Miho; Mitsui, Shohei; Higuchi, Takashi; Hasegawa, Toshikazu; Hiraki, Kazuo

    2014-01-01

    The neuropeptide oxytocin plays a central role in prosocial and parental behavior in non-human mammals as well as humans. It has been suggested that oxytocin may affect visual processing of infant faces and emotional reaction to infants. Healthy male volunteers (N = 13) were tested for their ability to detect infant or adult faces among adult or infant faces (facial visual search task). Urine samples were collected from all participants before the study to measure the concentration of oxytocin. Urinary oxytocin positively correlated with performance in the facial visual search task. However, task performance and its correlation with oxytocin concentration did not differ between infant faces and adult faces. Our data suggest that endogenous oxytocin is related to facial visual cognition, but does not promote infant-specific responses in unmarried men who are not fathers.

  12. Perceptual Processing Affects Conceptual Processing

    Science.gov (United States)

    van Dantzig, Saskia; Pecher, Diane; Zeelenberg, Rene; Barsalou, Lawrence W.

    2008-01-01

    According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task…

  13. In the face of threat: neural and endocrine correlates of impaired facial emotion recognition in cocaine dependence.

    Science.gov (United States)

    Ersche, K D; Hagan, C C; Smith, D G; Jones, P S; Calder, A J; Williams, G B

    2015-05-26

    The ability to recognize facial expressions of emotion in others is a cornerstone of human interaction. Selective impairments in the recognition of facial expressions of fear have frequently been reported in chronic cocaine users, but the nature of these impairments remains poorly understood. We used the multivariate method of partial least squares and structural magnetic resonance imaging to identify gray matter brain networks that underlie facial affect processing in both cocaine-dependent (n = 29) and healthy male volunteers (n = 29). We hypothesized that disruptions in neuroendocrine function in cocaine-dependent individuals would explain their impairments in fear recognition by modulating the relationship with the underlying gray matter networks. We found that cocaine-dependent individuals not only exhibited significant impairments in the recognition of fear, but also for facial expressions of anger. Although recognition accuracy of threatening expressions co-varied in all participants with distinctive gray matter networks implicated in fear and anger processing, in cocaine users it was less well predicted by these networks than in controls. The weaker brain-behavior relationships for threat processing were also mediated by distinctly different factors. Fear recognition impairments were influenced by variations in intelligence levels, whereas anger recognition impairments were associated with comorbid opiate dependence and related reduction in testosterone levels. We also observed an inverse relationship between testosterone levels and the duration of crack and opiate use. Our data provide novel insight into the neurobiological basis of abnormal threat processing in cocaine dependence, which may shed light on new opportunities facilitating the psychosocial integration of these patients.

  14. Facial aging: A clinical classification

    Directory of Open Access Journals (Sweden)

    Melvin Shiffman

    2007-01-01

    Full Text Available The purpose of this classification of facial aging is to provide a simple clinical method for determining the severity of the aging process in the face. This allows a quick estimate of the types of procedures the patient would need for the best results. Procedures presently used for facial rejuvenation include laser, chemical peels, suture lifts, fillers, modified facelift and full facelift. The physician already uses his or her best judgment to determine which procedure would be best for any particular patient. This classification may help to refine these decisions.

  15. Facial Expression Recognition via Non-Negative Least-Squares Sparse Coding

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2014-05-01

    Full Text Available Sparse coding is an active research subject in signal processing, computer vision, and pattern recognition. A novel method of facial expression recognition via non-negative least-squares (NNLS) sparse coding is presented in this paper. The NNLS sparse coding is used to form a facial expression classifier. To test the performance of the presented method, local binary patterns (LBP) and raw pixels are extracted for facial feature representation. Facial expression recognition experiments are conducted on the Japanese Female Facial Expression (JAFFE) database. Compared with other widely used methods such as linear support vector machines (SVM), the sparse representation-based classifier (SRC), the nearest subspace classifier (NSC), K-nearest neighbor (KNN) and radial basis function neural networks (RBFNN), the experimental results indicate that the presented NNLS method performs better on facial expression recognition tasks.
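The NNLS classifier described above can be sketched as follows: a test face is approximated as a non-negative combination of each class's training features, and the class with the smallest reconstruction residual wins. This is a minimal illustration of the idea, not the authors' implementation; it assumes feature extraction (e.g. LBP histograms) has already been performed, and the function name `nnls_classify` is hypothetical.

```python
import numpy as np
from scipy.optimize import nnls

def nnls_classify(train_X, train_y, test_x):
    """Classify test_x by class-wise non-negative least-squares residual.

    train_X: (n_samples, n_features) feature matrix (e.g. LBP histograms)
    train_y: (n_samples,) class labels
    test_x:  (n_features,) feature vector of the test face
    """
    best_label, best_resid = None, np.inf
    for label in np.unique(train_y):
        A = train_X[train_y == label].T      # columns = this class's samples
        coef, resid = nnls(A, test_x)        # min ||A c - x||  s.t.  c >= 0
        if resid < best_resid:
            best_label, best_resid = label, resid
    return best_label
```

The class whose training samples best reconstruct the test vector under the non-negativity constraint is returned; LBP, raw pixels, or any other feature could be plugged in unchanged.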

  16. Effects of strong bite force on the facial vertical dimension of pembarong performers

    Directory of Open Access Journals (Sweden)

    C. Christina

    2017-06-01

    Full Text Available Background: A pembarong performer is a reog dancer who bites on a piece of wood inserted into his/her mouth in order to support a 60 kg Barongan or Dadak Merak mask. The teeth supporting this large and heavy mask are directly affected, as the strong bite force exerted during a dance could affect the vertical and sagittal facial dimensions. Purpose: This study aimed to examine the influence of the bite force of pembarong performers on their vertical and sagittal facial dimensions. Methods: The study involved fifteen pembarong performers and thirteen individuals with normal occlusion (meeting specific criteria). The bite force of these subjects was measured with a dental prescale sensor during centric occlusion. A cephalometric variation measurement was subsequently performed on all subjects, with its effects on their vertical and sagittal facial dimensions being measured. Results: The bite force value of the pembarong performers was 394.3816 ± 7.68787 Newtons, while that of the normal occlusion group was 371.7784 ± 4.77791 Newtons. There was no correlation between bite force and the facial sagittal dimension of these subjects. However, a significant correlation did exist between bite force and the lower facial height/total facial height (LFH/TFH) ratio (p = 0.013). Conversely, no significant correlation between bite force and the posterior facial height/total facial height (PFH/TFH) ratio (p = 0.785) was detected. There was an inverse correlation between bite force and the LFH/TFH ratio (r = -0.464). Conclusion: Bite force is directly related to the decrease in the LFH/TFH ratio. Occlusal pressure exerted by the posterior teeth on the alveolar bone may increase bone density at the endosteal surface of cortical bone.

  17. Dimensional Information-Theoretic Measurement of Facial Emotion Expressions in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Jihun Hamm

    2014-01-01

    Full Text Available Altered facial expressions of emotions are characteristic impairments in schizophrenia. Ratings of affect have traditionally been limited to clinical rating scales and facial muscle movement analysis, which require extensive training and have limitations based on methodology and ecological validity. To improve reliable assessment of dynamic facial expression changes, we have developed automated measurements of facial emotion expressions based on information-theoretic measures of expressivity: the ambiguity and distinctiveness of facial expressions. These measures were examined in matched groups of persons with schizophrenia (n=28) and healthy controls (n=26) who underwent video acquisition to assess expressivity of basic emotions (happiness, sadness, anger, fear, and disgust) in evoked conditions. Persons with schizophrenia scored higher on ambiguity, the measure of conditional entropy within the expression of a single emotion, and they scored lower on distinctiveness, the measure of mutual information across expressions of different emotions. The automated measures compared favorably with observer-based ratings. This method can be applied for delineating dynamic emotional expressivity in healthy and clinical populations.
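The two information-theoretic measures named in the abstract can be illustrated on discretized data. The sketch below assumes each video frame has already been reduced to a discrete facial state (the abstract does not specify this preprocessing); ambiguity is then the conditional entropy of states within an emotion, and distinctiveness the mutual information between states and emotions. The authors' exact estimators may differ.

```python
import numpy as np
from collections import Counter

def entropy(p):
    """Shannon entropy (bits) of a probability vector, ignoring zeros."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def ambiguity_and_distinctiveness(sequences):
    """sequences: dict mapping emotion label -> list of discrete facial states.

    Ambiguity       ~ conditional entropy H(state | emotion)
    Distinctiveness ~ mutual information  I(state ; emotion)
    """
    emotions = list(sequences)
    states = sorted({s for seq in sequences.values() for s in seq})
    # joint frequency table over (emotion, state)
    joint = np.zeros((len(emotions), len(states)))
    for i, e in enumerate(emotions):
        counts = Counter(sequences[e])
        for j, s in enumerate(states):
            joint[i, j] = counts[s]
    joint /= joint.sum()
    p_e = joint.sum(axis=1)                  # marginal over emotions
    p_s = joint.sum(axis=0)                  # marginal over states
    h_cond = sum(p_e[i] * entropy(joint[i] / p_e[i])
                 for i in range(len(emotions)))
    mi = entropy(p_s) - h_cond               # I(S;E) = H(S) - H(S|E)
    return h_cond, mi
```

On this toy scale, perfectly stereotyped expressions (each emotion producing its own state) give zero ambiguity and maximal distinctiveness, matching the direction of the group differences reported above.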

  18. Multispace Behavioral Model for Face-Based Affective Social Agents

    Directory of Open Access Journals (Sweden)

    DiPaola Steve

    2007-01-01

    Full Text Available This paper describes a behavioral model for affective social agents based on three independent but interacting parameter spaces: knowledge, personality, and mood. These spaces control a lower-level geometry space that provides parameters at the facial feature level. Personality and mood use findings in behavioral psychology to relate the perception of personality types and emotional states to the facial actions and expressions through two-dimensional models for personality and emotion. Knowledge encapsulates the tasks to be performed and the decision-making process using a specially designed XML-based language. While the geometry space provides an MPEG-4 compatible set of parameters for low-level control, the behavioral extensions available through the triple spaces provide flexible means of designing complicated personality types, facial expression, and dynamic interactive scenarios.

  20. Neural Mechanism of Facial Expression Perception in Intellectually Gifted Adolescents

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

    The current study investigated the relationship between general intelligence and the three stages of facial expression processing. Two groups of adolescents with different levels of general intelligence were required to identify three types of facial expressions (happy, sad, and neutral faces...

  1. EXPRESS METHOD OF BARCODE GENERATION FROM FACIAL IMAGES

    Directory of Open Access Journals (Sweden)

    G. A. Kukharev

    2014-03-01

    Full Text Available In this paper a method of generating standard-type linear barcodes from facial images is proposed. The method is based on the brightness histogram of the facial image, averaging the histogram over a limited number of intervals, quantizing the results into a range of decimal numbers from 0 to 9, and table conversion into the final barcode. The proposed solution is computationally low-cost and does not require specialized image-processing software, which allows generation of facial barcodes in mobile systems; the proposed method can thus be regarded as an express method. Results of tests on the Face94 and CUHK Face Sketch FERET databases showed that the proposed method is a viable solution for real-world practice and that the generated barcodes are stable under changes of scale, pose and mirroring of a facial image, as well as changes of facial expression and shadows on faces from local lighting. Because the barcode is generated directly from the facial image, it carries subject-specific information about a person's face.
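The pipeline sketched in the abstract (histogram, interval averaging, quantization to 0-9) can be written in a few lines. This is an assumed reading of the method: the number of intervals, the quantization rule, and the function name `face_to_digits` are illustrative choices, and the final table conversion to a printable barcode symbology (e.g. Code 128 or EAN) is omitted.

```python
import numpy as np

def face_to_digits(gray_img, n_intervals=12, n_bins=256):
    """Sketch of the histogram-based 'express' barcode method.

    1. compute the brightness histogram of the grayscale face image;
    2. average the histogram over n_intervals equal segments;
    3. quantize the segment means to decimal digits 0-9.
    The resulting digit string would then be table-converted into a
    standard linear barcode (step omitted here).
    """
    hist, _ = np.histogram(gray_img.ravel(), bins=n_bins, range=(0, 256))
    segments = np.array_split(hist, n_intervals)
    means = np.array([seg.mean() for seg in segments], float)
    if means.max() > 0:
        digits = np.round(9 * means / means.max()).astype(int)
    else:
        digits = np.zeros(n_intervals, int)
    return "".join(map(str, digits))
```

Because the digits depend only on the global brightness distribution, they are unchanged by mirroring and largely insensitive to scale, which is consistent with the stability properties claimed above.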

  2. Slowing down Presentation of Facial Movements and Vocal Sounds Enhances Facial Expression Recognition and Induces Facial-Vocal Imitation in Children with Autism

    Science.gov (United States)

    Tardif, Carole; Laine, France; Rodriguez, Melissa; Gepner, Bruno

    2007-01-01

    This study examined the effects of slowing down presentation of facial expressions and their corresponding vocal sounds on facial expression recognition and facial and/or vocal imitation in children with autism. Twelve autistic children and twenty-four normal control children were presented with emotional and non-emotional facial expressions on…

  3. Capturing Physiology of Emotion along Facial Muscles: A Method of Distinguishing Feigned from Involuntary Expressions

    Science.gov (United States)

    Khan, Masood Mehmood; Ward, Robert D.; Ingleby, Michael

    The ability to distinguish feigned from involuntary expressions of emotions could help in the investigation and treatment of neuropsychiatric and affective disorders and in the detection of malingering. This work investigates differences in emotion-specific patterns of thermal variations along the major facial muscles. Using experimental data extracted from 156 images, we attempted to classify patterns of emotion-specific thermal variations into neutral, and voluntary and involuntary expressions of positive and negative emotive states. Initial results suggest (i) each facial muscle exhibits a unique thermal response to various emotive states; (ii) the pattern of thermal variances along the facial muscles may assist in classifying voluntary and involuntary facial expressions; and (iii) facial skin temperature measurements along the major facial muscles may be used in automated emotion assessment.

  4. Prosthetic management of mid-facial defect with magnet-retained silicone prosthesis.

    Science.gov (United States)

    Buzayan, Muaiyed M

    2014-02-01

    Mid-facial defect is one of the most disfiguring and impairing defects. A prosthesis design that is aesthetic and stable can be invaluable to a patient who has lost part of his face to surgical excision. A prosthesis can restore the patient's self-esteem and confidence, which affects the patient's lifestyle. The aim of this case report is to describe a technique of mid-facial silicone prosthesis fabrication. To provide an aesthetic and stable facial prosthesis, the extra-oral prosthesis was fabricated using silicone material, while the intra-oral defect was restored with an obturator prosthesis; both prostheses were then connected and attached to each other using magnets. This clinical report describes the rehabilitation of a large mid-facial defect with a two-piece prosthesis. The silicone facial prosthesis was made hollow and lighter by using an acrylic framework. Two acrylic channels were included within the facial prosthesis to provide the patient with clean and patent airways. A sectional mid-facial prosthesis was made and retained in place by using magnets, which resulted in a significant improvement in the aesthetic and functional outcome without the need for plastic surgery. Silicone prostheses are reliable alternatives to surgery and should be considered in selected cases.

  5. Emotion and sex of facial stimuli modulate conditional automaticity in behavioral and neuronal interference in healthy men.

    Science.gov (United States)

    Kohn, Nils; Fernández, Guillén

    2017-12-06

    Our surroundings provide a host of sensory input, which we cannot fully process without streamlining and automatic processing. Levels of automaticity differ for different cognitive and affective processes. Situational and contextual interactions between cognitive and affective processes in turn influence the level of automaticity. Automaticity can be measured by interference in Stroop tasks. We applied an emotional version of the Stroop task to investigate how stress, as a contextual factor, influences the affective valence-dependent level of automaticity. 120 young, healthy men were investigated for behavioral and brain interference following a stress induction or control procedure in a counterbalanced cross-over design. Although Stroop interference was always observed, the sex and emotion of the face strongly modulated interference, which was larger for fearful and male faces. These effects suggest higher automaticity when processing happy and female faces. Supporting the behavioral patterns, brain data show lower interference-related brain activity in executive-control regions in response to happy and female faces. In the absence of behavioral stress effects, congruent compared to incongruent trials (reverse interference) showed little to no deactivation under stress in response to happy-female and fearful-male trials. These congruency effects are potentially based on altered, stress-related contextual facial processing that interacts with sex-emotion stereotypes. The results indicate that sex and facial emotion modulate Stroop interference in brain and behavior. These effects can be explained by altered response difficulty as a consequence of the contextual and stereotype-related modulation of automaticity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Traumatic facial diplegia: a case report

    Directory of Open Access Journals (Sweden)

    J. Fortes-Rego

    1975-12-01

    Full Text Available A case of incomplete bilateral facial paralysis associated with left partial hearing loss following head injury is reported. X-rays showed fractures of the occipital and left temporal bones. Some considerations are offered relating these manifestations to fractures of the temporal bone, and a review of traumatic facial paralysis is presented.

  7. Facial Expression Recognition Teaching to Preschoolers with Autism

    DEFF Research Database (Denmark)

    Christinaki, Eirini; Vidakis, Nikolaos; Triantafyllidis, Georgios

    2013-01-01

    The recognition of facial expressions is important for the perception of emotions. Understanding emotions is essential in human communication and social interaction. Children with autism have been reported to exhibit deficits in the recognition of affective expressions. Their difficulties...

  8. Facial preservation following extreme mummification: Shrunken heads.

    Science.gov (United States)

    Houlton, Tobias M R; Wilkinson, Caroline

    2018-05-01

    Shrunken heads are a mummification phenomenon unique to South America. Ceremonial tsantsa are ritually reduced heads from enemy victims of the Shuar, Achuar, Awajún (Aguaruna), Wampís (Huambisa), and Candoshi-Shapra cultures. Commercial shrunken heads are comparatively modern and fraudulently produced for the curio-market, often using stolen bodies from hospital mortuaries and graves. To achieve shrinkage and desiccation, heads undergo skinning, simmering (in water) and drying. Considering the intensive treatments applied, this research aims to identify how the facial structure can alter and impact identification using post-mortem depiction. Sixty-five human shrunken heads were assessed: 6 ceremonial, 36 commercial, and 23 ambiguous. Investigations included manual inspection, multi-detector computerised tomography, infrared reflectography, ultraviolet fluorescence and microscopic hair analysis. The mummification process disfigures the outer face, cheeks, nasal root and bridge form, including brow ridge, eyes, ears, mouth, and nose projection. Melanin depletion, epidermal degeneration, and any applied staining changes the natural skin complexion. Papillary and reticular dermis separation is possible. Normal hair structure (cuticle, cortex, medulla) is retained. Hair appears longer (unless cut) and more profuse following shrinkage. Significant features retained include skin defects, facial creases, hairlines and earlobe form. Hair conditions that only affect living scalps are preserved (e.g. nits, hair casts). Ear and nose cartilage helps to retain some morphological information. Commercial heads appear less distorted than ceremonial tsantsa, often presenting a definable eyebrow shape, vermillion lip shape, lip thickness (if mouth is open), philtrum form, and palpebral slit angle. Facial identification capabilities are considered limited, and only perceived possible for commercial heads. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.

    Science.gov (United States)

    Ponari, Marta; Conson, Massimiliano; D'Amico, Nunzia Pina; Grossi, Dario; Trojano, Luigi

    2012-12-01

    We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in upper or lower half of participants' face on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, while both manipulations affected recognition of fear; recognition of surprise and sadness were not affected by either blocking manipulations. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in comprehension of others' emotional facial expressions. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  10. Multiple dental anomalies accompany unilateral disturbances in abducens and facial nerves: A case report

    Directory of Open Access Journals (Sweden)

    Elham Talatahari

    2016-01-01

    Full Text Available This article describes the oral rehabilitation of an 8-year-old girl with extensively affected primary and permanent dentition. This report is unique in that distinct dental anomalies including enamel hypoplasia, irregular dentin formation, taurodontism, hypodontia and dens in dente accompany a unilateral disturbance of the abducens and facial nerves, which control lateral eye movement and facial expression, respectively. Keywords: enamel hypoplasia; irregular dentin formation; taurodontism; hypodontia; dens in dente; abducens and facial nerves

  11. [Peripheral facial nerve lesion induced long-term dendritic retraction in pyramidal cortico-facial neurons].

    Science.gov (United States)

    Urrego, Diana; Múnera, Alejandro; Troncoso, Julieta

    2011-01-01

    Little evidence is available concerning the morphological modifications of motor cortex neurons associated with peripheral nerve injuries, and the consequences of those injuries for post-lesion functional recovery. Dendritic branching of cortico-facial neurons was characterized with respect to the effects of irreversible facial nerve injury. Twenty-four adult male rats were distributed into four groups: sham (no lesion surgery) and lesion groups with dendritic assessment at 1, 3 and 5 weeks post surgery. Eighteen lesioned animals underwent surgical transection of the mandibular and buccal branches of the facial nerve. Dendritic branching was examined in contralateral primary motor cortex slices stained with the Golgi-Cox technique. Layer V pyramidal (cortico-facial) neurons from sham and injured animals were reconstructed and their dendritic branching was compared using Sholl analysis. Animals with facial nerve lesions displayed persistent vibrissal paralysis throughout the five-week observation period. Compared with control animal neurons, cortico-facial pyramidal neurons of surgically injured animals displayed shrinkage of their dendritic branches at statistically significant levels. This shrinkage persisted for at least five weeks after facial nerve injury. Irreversible facial motoneuron axonal damage induced persistent dendritic arborization shrinkage in contralateral cortico-facial neurons. This morphological reorganization may be the physiological basis of the functional sequelae observed in peripheral facial palsy patients.

  12. [Clinical experience in facial nerve tumors: a review of 27 cases].

    Science.gov (United States)

    Zhang, Fan; Wang, Yucheng; Dai, Chunfu; Chi, Fanglu; Zhou, Liang; Chen, Bing; Li, Huawei

    2010-01-01

    To analyze the clinical manifestations and diagnosis of facial nerve tumors according to the clinical information, and to evaluate the different surgical approaches depending on tumor location. Twenty-seven cases of facial nerve tumors with general clinical information available, treated from September 1999 to December 2006 in the Shanghai EENT Hospital, were reviewed retrospectively. Twenty (74.1%) schwannomas, 4 (14.8%) neurofibromas, and 3 (11.1%) hemangiomas were identified histopathologically after surgery. During the course of the disease, 23 patients (85.2%) suffered facial paralysis, both hearing loss and tinnitus affected 11 (40.7%) cases, 5 (18.5%) manifested an infra-auricular mass, and the others showed otalgia, vertigo, ear fullness, or facial numbness/twitches. CT or/and MRI results in 24 cases indicated that the tumors originated from the facial nerve. Intra-operative findings showed that 24 (88.9%) cases involved no fewer than 2 segments of the facial nerve; of these 24 cases, 87.5% (21/24) involved the mastoid portion, 70.8% (17/24) the tympanic portion, 62.5% (15/24) the geniculate ganglion, and only 4.2% (1/24) the internal acoustic canal (IAC); 3 cases (11.1%) had only one segment involved. In all 27 cases the tumors were completely excised; in 13 cases resection was followed by immediate facial nerve reconstruction, including 11 sural nerve cable grafts, 1 facial nerve end-to-end anastomosis and 1 hypoglossal-facial nerve end-to-end anastomosis. Tumors were removed with preservation of facial nerve continuity in 2 cases. Facial nerve tumor is a rare, benign lesion with numerous clinical manifestations. CT and MRI can help surgeons make a correct diagnosis preoperatively. When and how to operate depends on the individual patient.

  13. Facial Pain Followed by Unilateral Facial Nerve Palsy: A Case Report with Literature Review

    OpenAIRE

    GV, Sowmya; BS, Manjunatha; Goel, Saurabh; Singh, Mohit Pal; Astekar, Madhusudan

    2014-01-01

    Peripheral facial nerve palsy is the commonest cranial nerve motor neuropathy. The causes range from cerebrovascular accident to iatrogenic damage, but there are few reports of facial nerve paralysis attributable to odontogenic infections. In majority of the cases, recovery of facial muscle function begins within first three weeks after onset. This article reports a unique case of 32-year-old male patient who developed facial pain followed by unilateral facial nerve paralysis due to odontogen...

  14. Urinary oxytocin positively correlates with performance in facial visual search in unmarried males, without specific reaction to infant face

    Directory of Open Access Journals (Sweden)

    Atsuko eSaito

    2014-07-01

    Full Text Available The neuropeptide oxytocin plays a central role in prosocial and parental behavior in non-human mammals as well as humans. It has been suggested that oxytocin may affect visual processing of infant faces and emotional reactions to infants. Healthy male volunteers (N = 13) were tested for their ability to detect infant or adult faces among adult or infant faces (facial visual search task). Urine samples were collected from all participants before the study to measure the concentration of oxytocin. Urinary oxytocin positively correlated with performance in the facial visual search task. However, task performance and its correlation with oxytocin concentration did not differ between infant faces and adult faces. Our data suggest that endogenous oxytocin is related to facial visual cognition, but does not promote infant-specific responses in unmarried men who are not fathers.

  15. The Influence of Facial Signals on the Automatic Imitation of Hand Actions.

    Science.gov (United States)

    Butler, Emily E; Ward, Robert; Ramsey, Richard

    2016-01-01

    Imitation and facial signals are fundamental social cues that guide interactions with others, but little is known regarding the relationship between these behaviors. It is clear that during expression detection, we imitate observed expressions by engaging similar facial muscles. It is proposed that a cognitive system, which matches observed and performed actions, controls imitation and contributes to emotion understanding. However, there is little known regarding the consequences of recognizing affective states for other forms of imitation, which are not inherently tied to the observed emotion. The current study investigated the hypothesis that facial cue valence would modulate automatic imitation of hand actions. To test this hypothesis, we paired different types of facial cue with an automatic imitation task. Experiments 1 and 2 demonstrated that a smile prompted greater automatic imitation than angry and neutral expressions. Additionally, a meta-analysis of this and previous studies suggests that both happy and angry expressions increase imitation compared to neutral expressions. By contrast, Experiments 3 and 4 demonstrated that invariant facial cues, which signal trait-levels of agreeableness, had no impact on imitation. Despite readily identifying trait-based facial signals, levels of agreeableness did not differentially modulate automatic imitation. Further, a Bayesian analysis showed that the null effect was between 2 and 5 times more likely than the experimental effect. Therefore, we show that imitation systems are more sensitive to prosocial facial signals that indicate "in the moment" states than enduring traits. These data support the view that a smile primes multiple forms of imitation including the copying actions that are not inherently affective. The influence of expression detection on wider forms of imitation may contribute to facilitating interactions between individuals, such as building rapport and affiliation.

  16. Can the usage of human growth hormones affect facial appearance and the accuracy of face recognition systems?

    Science.gov (United States)

    Rose, Jake; Martin, Michael; Bourlai, Thirimachos

    2014-06-01

    In law enforcement and security applications, the acquisition of face images is critical in producing key trace evidence for the successful identification of potential threats. The goal of the study is to demonstrate that steroid usage significantly affects human facial appearance and hence the performance of commercial and academic face recognition (FR) algorithms. In this work, we evaluate the performance of state-of-the-art FR algorithms on two unique face image datasets of subjects before (gallery set) and after (probe set) steroid (or human growth hormone) usage. For the purpose of this study, datasets of 73 subjects were created from multiple sources found on the Internet, containing images of men and women before and after steroid usage. Next, we geometrically pre-processed all images of both face datasets. Then, we applied image restoration techniques to the same face datasets, and finally, we applied FR algorithms to match the pre-processed face images of our probe datasets against the face images of the gallery set. Experimental results demonstrate that only a specific set of FR algorithms obtains the most accurate results (in terms of the rank-1 identification rate). This is because several factors influence the efficiency of face matchers, including (i) the time lapse between the before and after face photos, (ii) the usage of different drugs (e.g. Dianabol, Winstrol, and Decabolan), (iii) the usage of different cameras to capture face images, and (iv) the variability of standoff distance, illumination and other noise factors (e.g. motion noise). All of these complicated scenarios make clear that cross-scenario matching is a very challenging problem and, thus, further investigation is required.
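The rank-1 identification rate used as the evaluation metric above has a simple definition: the fraction of probes whose single best-scoring gallery match is the correct identity. A minimal sketch (the function name and the similarity-matrix input format are illustrative, not from the paper):

```python
import numpy as np

def rank1_rate(similarity, probe_ids, gallery_ids):
    """Rank-1 identification rate from a probe x gallery similarity matrix.

    similarity:  (n_probes, n_gallery) score matrix, higher = more similar
    probe_ids:   subject identity of each probe image
    gallery_ids: subject identity of each gallery image
    A probe counts as correct when its highest-scoring gallery entry
    belongs to the same subject.
    """
    best = np.argmax(similarity, axis=1)     # index of top match per probe
    return float(np.mean(np.asarray(gallery_ids)[best] == np.asarray(probe_ids)))
```

In the study's setup, the gallery would hold the "before" images and the probes the "after" images, so a drop in this rate directly quantifies the effect of the appearance change.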

  17. Extracranial Facial Nerve Schwannoma Treated by Hypo-fractionated CyberKnife Radiosurgery.

    Science.gov (United States)

    Sasaki, Ayaka; Miyazaki, Shinichiro; Hori, Tomokatsu

    2016-09-21

    Facial nerve schwannoma is a rare intracranial tumor. Treatment for this benign tumor has been controversial. Here, we report a case of extracranial facial nerve schwannoma treated successfully by hypo-fractionated CyberKnife (Accuray, Sunnyvale, CA) radiosurgery and discuss the efficacy of this treatment. A 34-year-old female noticed a swelling in her right mastoid process. The lesion enlarged over a seven-month period, and she experienced facial spasm on the right side. She was diagnosed with a facial schwannoma via a magnetic resonance imaging (MRI) scan of the head and neck and was advised to wait until the facial nerve palsy subsided. She was referred to our hospital for radiation therapy. We planned fractionated CyberKnife radiosurgery over three consecutive days. After CyberKnife radiosurgery, the mass in the right parotid gradually decreased in size, and the facial nerve palsy disappeared. At her eight-month follow-up, her facial spasm had completely disappeared. There has been no recurrence, and facial nerve function has been normal. We successfully demonstrated the efficacy of CyberKnife radiosurgery as an alternative treatment that also preserves neurofunction for facial nerve schwannomas.

  18. Dental extraction in odontogenic facial cellulitis

    Directory of Open Access Journals (Sweden)

    Pedro A Ducasse Olivera

    2004-08-01

    Full Text Available A retrospective study of patients admitted to the "Héroes del Baire" Hospital and diagnosed with odontogenic facial cellulitis was undertaken to characterize facial cellulitis in our setting, as well as the level of knowledge that dentists and the population have about this entity. The results were as follows: males, and the mandibular region in 15-29-year-old patients, were the most affected; penicillin was the most used antibiotic; and moderate and mild cases predominated. The level of knowledge among dentists was adequate; however, that of the population was poor.

  19. Chondromyxoid fibroma of the mastoid facial nerve canal mimicking a facial nerve schwannoma.

    Science.gov (United States)

    Thompson, Andrew L; Bharatha, Aditya; Aviv, Richard I; Nedzelski, Julian; Chen, Joseph; Bilbao, Juan M; Wong, John; Saad, Reda; Symons, Sean P

    2009-07-01

    Chondromyxoid fibroma of the skull base is a rare entity. Involvement of the temporal bone is particularly rare. We present an unusual case of progressive facial nerve paralysis with imaging and clinical findings most suggestive of a facial nerve schwannoma. The lesion was tubular in appearance, expanded the mastoid facial nerve canal, protruded out of the stylomastoid foramen, and enhanced homogeneously. The only unusual imaging feature was minor calcification within the tumor. Surgery revealed an irregular, cystic lesion. Pathology diagnosed a chondromyxoid fibroma involving the mastoid portion of the facial nerve canal, destroying the facial nerve.

  20. Functional integration of the posterior superior temporal sulcus correlates with facial expression recognition.

    Science.gov (United States)

    Wang, Xu; Song, Yiying; Zhen, Zonglei; Liu, Jia

    2016-05-01

    Face perception is essential for daily and social activities. Neuroimaging studies have revealed a distributed face network (FN) consisting of multiple regions that exhibit preferential responses to invariant or changeable facial information. However, our understanding about how these regions work collaboratively to facilitate facial information processing is limited. Here, we focused on changeable facial information processing, and investigated how the functional integration of the FN is related to the performance of facial expression recognition. To do so, we first defined the FN as voxels that responded more strongly to faces than objects, and then used a voxel-based global brain connectivity method based on resting-state fMRI to characterize the within-network connectivity (WNC) of each voxel in the FN. By relating the WNC and performance in the "Reading the Mind in the Eyes" Test across participants, we found that individuals with stronger WNC in the right posterior superior temporal sulcus (rpSTS) were better at recognizing facial expressions. Further, the resting-state functional connectivity (FC) between the rpSTS and right occipital face area (rOFA), early visual cortex (EVC), and bilateral STS were positively correlated with the ability of facial expression recognition, and the FCs of EVC-pSTS and OFA-pSTS contributed independently to facial expression recognition. In short, our study highlights the behavioral significance of intrinsic functional integration of the FN in facial expression processing, and provides evidence for the hub-like role of the rpSTS for facial expression recognition. Hum Brain Mapp 37:1930-1940, 2016. © 2016 Wiley Periodicals, Inc.
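The voxel-based global brain connectivity measure described above reduces to a simple computation: for each face-network voxel, average its time-series correlation with every other in-network voxel, then relate that per-voxel WNC to behavior across participants. The sketch below is an illustrative reconstruction with synthetic data, not the authors' pipeline; the array shapes, participant count, and variable names are assumptions.

```python
import numpy as np

def within_network_connectivity(ts):
    """ts: (timepoints, voxels) resting-state series for face-network voxels.
    Returns each voxel's mean correlation with all other in-network voxels."""
    r = np.corrcoef(ts, rowvar=False)   # voxel-by-voxel correlation matrix
    np.fill_diagonal(r, np.nan)         # exclude each voxel's self-correlation
    return np.nanmean(r, axis=1)

# Synthetic example: 30 participants, 120 timepoints, 50 face-network voxels.
rng = np.random.default_rng(0)
wnc = np.array([within_network_connectivity(rng.standard_normal((120, 50)))
                for _ in range(30)])            # (participants, voxels)
rmet = rng.standard_normal(30)                  # stand-in "Eyes" Test scores

# Correlate WNC with behavior across participants, one voxel at a time;
# voxels with high |r| would be candidates like the rpSTS in the study.
brain_behavior_r = np.array([np.corrcoef(wnc[:, v], rmet)[0, 1]
                             for v in range(wnc.shape[1])])
```

In the actual study this per-voxel brain-behavior correlation map is what singled out the rpSTS; here the random data merely demonstrate the shape of the computation.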

  1. Cocaine users manifest impaired prosodic and cross-modal emotion processing

    Directory of Open Access Journals (Sweden)

    Lea M Hulka

    2013-09-01

    Full Text Available Background: A small number of previous studies have provided evidence that cocaine users exhibit impairments in complex social cognition tasks, while more basic facial emotion recognition is widely unaffected. However, prosody and cross-modal emotion processing have not been systematically investigated in cocaine users so far. Therefore, the aim of the present study was to assess complex multisensory emotion processing in cocaine users in comparison to controls and to examine a potential association with drug use patterns. Method: The abbreviated version of the Comprehensive Affect Testing System (CATS-A) was used to measure emotion perception across the three channels of facial affect, prosody, and semantic content in 58 cocaine users and 48 healthy control subjects who were matched for age, sex, verbal intelligence, and years of education. Results: Cocaine users had significantly lower scores than controls on the quotient scales of Emotion Recognition and Prosody Recognition and on the subtests Conflicting Prosody/Meaning – Attend to Prosody and Match Emotional Prosody to Emotional Face, which require either attending to prosody or integrating cross-modal information. In contrast, no group difference emerged for the Affect Recognition Quotient. Cumulative cocaine doses and duration of cocaine use correlated negatively with emotion processing. Conclusion: Cocaine users show impaired cross-modal integration of different emotion processing channels, particularly with regard to prosody, whereas more basic aspects of emotion processing such as facial affect perception are comparable to the performance of healthy controls.

  2. MRI enhancement of the facial nerve with Gd-DTPA, 1

    International Nuclear Information System (INIS)

    Yanagida, Masahiro

    1993-01-01

    Although there have recently been numerous reports of enhanced MRI in patients with facial palsy, the mechanism of enhancement remains largely unknown. In the present study, animal models with experimentally induced facial paralysis were prepared, and the vascular permeabilities of normal and damaged facial nerves were assessed using Evans blue albumin (EBA) as a tracer. The Gd-DTPA contents of normal and compressively damaged facial nerves were also investigated. In the normal intratemporal facial nerve, EBA remained in the vessels and did not leak into the endoneurium. In contrast, vascular permeability was very high in the epineurium and the geniculate ganglion, which showed leakage of large amounts of EBA from vessels. At the site of compression in the damaged nerve, EBA leakage was also seen in the endoneurium, indicating accentuated vascular permeability. This accentuation of vascular permeability shifted toward the distal side. However, no EBA leakage was seen on the side proximal to the site of compression. Significantly higher Gd-DTPA contents were obtained in the facial nerve on the paralytic side than in that on the normal side (p<0.001). As for differences between the distal and proximal sides, the distal side had a significantly higher Gd-DTPA content (p<0.01). Assessment of vascular permeability with EBA revealed accentuated vascular permeability on the side distal to the site of compression. These results showed the presence of a blood-nerve barrier (BNB) in the facial nerve. Furthermore, the present findings suggest that the enhancement of the facial nerve on the affected side is caused by BNB destruction due to nerve damage and subsequent Gd-DTPA leakage from the vessels, and that this enhancement occurs mainly on the distal side of the damaged portion of the nerve. (author)

  3. Common and distinct neural correlates of facial emotion processing in social anxiety disorder and Williams syndrome: A systematic review and voxel-based meta-analysis of functional resonance imaging studies.

    Science.gov (United States)

    Binelli, C; Subirà, S; Batalla, A; Muñiz, A; Sugranyés, G; Crippa, J A; Farré, M; Pérez-Jurado, L; Martín-Santos, R

    2014-11-01

    Social Anxiety Disorder (SAD) and Williams-Beuren Syndrome (WS) are two conditions which seem to be at opposite ends in the continuum of social fear but show compromised abilities in some overlapping areas, including some social interactions, gaze contact and processing of facial emotional cues. The increase in the number of neuroimaging studies has greatly expanded our knowledge of the neural bases of facial emotion processing in both conditions. However, to date, SAD and WS have not been compared. We conducted a systematic review of functional magnetic resonance imaging (fMRI) studies comparing SAD and WS cases to healthy control participants (HC) using facial emotion processing paradigms. Two researchers conducted comprehensive PubMed/Medline searches to identify all fMRI studies of facial emotion processing in SAD and WS. The following search keywords were used: "emotion processing"; "facial emotion"; "social anxiety"; "social phobia"; "Williams syndrome"; "neuroimaging"; "functional magnetic resonance"; "fMRI" and their combinations, as well as terms specifying individual facial emotions. We extracted spatial coordinates from each study and conducted two separate voxel-wise activation likelihood estimation meta-analyses, one for SAD and one for WS. Twenty-two studies met the inclusion criteria: 17 studies of SAD and five of WS. We found evidence for both common and distinct patterns of neural activation. Limbic engagement was common to SAD and WS during facial emotion processing, although we observed opposite patterns of activation for each disorder. Compared to HC, SAD cases showed hyperactivation of the amygdala, the parahippocampal gyrus and the globus pallidus. Compared to controls, participants with WS showed hypoactivation of these regions. Differential activation in a number of regions specific to either condition was also identified: SAD cases exhibited greater activation of the insula, putamen, the superior temporal gyrus, medial frontal regions and

  4. Web-based Visualisation of Head Pose and Facial Expressions Changes

    DEFF Research Database (Denmark)

    Kalliatakis, Grigorios; Vidakis, Nikolaos; Triantafyllidis, Georgios

    2016-01-01

    Despite significant recent advances in the field of head pose estimation and facial expression recognition, raising the cognitive level when analysing human activity presents serious challenges to current concepts. Motivated by the need of generating comprehensible visual representations from...... and accurately estimate head pose changes in unconstrained environment. In order to complete the secondary process of recognising four universal dominant facial expressions (happiness, anger, sadness and surprise), emotion recognition via facial expressions (ERFE) was adopted. After that, a lightweight data...

  5. The Emotional Modulation of Facial Mimicry: A Kinematic Study

    Directory of Open Access Journals (Sweden)

    Antonella Tramacere

    2018-01-01

    Full Text Available It is well-established that the observation of emotional facial expression induces facial mimicry responses in the observers. However, how the interaction between emotional and motor components of facial expressions can modulate the motor behavior of the perceiver is still unknown. We have developed a kinematic experiment to evaluate the effect of different oro-facial expressions on perceiver's face movements. Participants were asked to perform two movements, i.e., lip stretching and lip protrusion, in response to the observation of four meaningful (i.e., smile, angry-mouth, kiss, and spit) and two meaningless mouth gestures. All the stimuli were characterized by different motor patterns (mouth aperture or mouth closure). Response Times and kinematics parameters of the movements (amplitude, duration, and mean velocity) were recorded and analyzed. Results evidenced a dissociated effect on reaction times and movement kinematics. We found shorter reaction time when a mouth movement was preceded by the observation of a meaningful and motorically congruent oro-facial gesture, in line with facial mimicry effect. On the contrary, during execution, the perception of smile was associated with the facilitation, in terms of shorter duration and higher velocity of the incongruent movement, i.e., lip protrusion. The same effect resulted in response to kiss and spit that significantly facilitated the execution of lip stretching. We called this phenomenon facial mimicry reversal effect, intended as the overturning of the effect normally observed during facial mimicry. In general, the findings show that both motor features and types of emotional oro-facial gestures (conveying positive or negative valence) affect the kinematics of subsequent mouth movements at different levels: while congruent motor features facilitate a general motor response, motor execution could be speeded by gestures that are motorically incongruent with the observed one. Moreover, valence

  6. The Emotional Modulation of Facial Mimicry: A Kinematic Study.

    Science.gov (United States)

    Tramacere, Antonella; Ferrari, Pier F; Gentilucci, Maurizio; Giuffrida, Valeria; De Marco, Doriana

    2017-01-01

    It is well-established that the observation of emotional facial expression induces facial mimicry responses in the observers. However, how the interaction between emotional and motor components of facial expressions can modulate the motor behavior of the perceiver is still unknown. We have developed a kinematic experiment to evaluate the effect of different oro-facial expressions on perceiver's face movements. Participants were asked to perform two movements, i.e., lip stretching and lip protrusion, in response to the observation of four meaningful (i.e., smile, angry-mouth, kiss, and spit) and two meaningless mouth gestures. All the stimuli were characterized by different motor patterns (mouth aperture or mouth closure). Response Times and kinematics parameters of the movements (amplitude, duration, and mean velocity) were recorded and analyzed. Results evidenced a dissociated effect on reaction times and movement kinematics. We found shorter reaction time when a mouth movement was preceded by the observation of a meaningful and motorically congruent oro-facial gesture, in line with facial mimicry effect. On the contrary, during execution, the perception of smile was associated with the facilitation, in terms of shorter duration and higher velocity of the incongruent movement, i.e., lip protrusion. The same effect resulted in response to kiss and spit that significantly facilitated the execution of lip stretching. We called this phenomenon facial mimicry reversal effect, intended as the overturning of the effect normally observed during facial mimicry. In general, the findings show that both motor features and types of emotional oro-facial gestures (conveying positive or negative valence) affect the kinematics of subsequent mouth movements at different levels: while congruent motor features facilitate a general motor response, motor execution could be speeded by gestures that are motorically incongruent with the observed one. Moreover, valence effect depends on

  7. How Do We Update Faces? Effects of Gaze Direction and Facial Expressions on Working Memory Updating

    OpenAIRE

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enh...

  8. Recognition of computerized facial approximations by familiar assessors.

    Science.gov (United States)

    Richard, Adam H; Monson, Keith L

    2017-11-01

    Studies testing the effectiveness of facial approximations typically involve groups of participants who are unfamiliar with the approximated individual(s). This limitation requires the use of photograph arrays including a picture of the subject for comparison to the facial approximation. While this practice is often necessary due to the difficulty in obtaining a group of assessors who are familiar with the approximated subject, it may not accurately simulate the thought process of the target audience (friends and family members) in comparing a mental image of the approximated subject to the facial approximation. As part of a larger process to evaluate the effectiveness and best implementation of the ReFace facial approximation software program, the rare opportunity arose to conduct a recognition study using assessors who were personally acquainted with the subjects of the approximations. ReFace facial approximations were generated based on preexisting medical scans, and co-workers of the scan donors were tested on whether they could accurately pick out the approximation of their colleague from arrays of facial approximations. Results from the study demonstrated an overall poor recognition performance (i.e., where a single choice within a pool is not enforced) for individuals who were familiar with the approximated subjects. Out of 220 recognition tests only 10.5% resulted in the assessor selecting the correct approximation (or correctly choosing not to make a selection when the array consisted only of foils), an outcome that was not significantly different from the 9% random chance rate. When allowed to select multiple approximations the assessors felt resembled the target individual, the overall sensitivity for ReFace approximations was 16.0% and the overall specificity was 81.8%. These results differ markedly from the results of a previous study using assessors who were unfamiliar with the approximated subjects. Some possible explanations for this disparity in
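The key statistical comparison above — an observed hit rate of 10.5% against a 9% random-chance rate over 220 tests — can be checked with an exact binomial test. The sketch below uses only the numbers quoted in the abstract; the two-sided test is a generic stdlib implementation (the "small p-values" method), not necessarily the procedure the authors used.

```python
from math import comb

def binom_test_two_sided(k, n, p):
    """Exact two-sided binomial test: sum the probabilities of all outcomes
    that are no more likely than the observed count k under Binomial(n, p)."""
    probs = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    observed = probs[k]
    return sum(q for q in probs if q <= observed * (1 + 1e-9))

n_tests = 220
hits = round(0.105 * n_tests)   # ~23 correct selections reported
p_chance = 0.09                 # 9% random-chance rate from the array design

p_value = binom_test_two_sided(hits, n_tests, p_chance)
# A large p-value here is consistent with the abstract's report that the
# recognition rate did not differ significantly from chance.
```

With these inputs the p-value is well above conventional significance thresholds, matching the reported null result.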

  9. Caricaturing facial expressions.

    Science.gov (United States)

    Calder, A J; Rowland, D; Young, A W; Nimmo-Smith, I; Keane, J; Perrett, D I

    2000-08-14

    The physical differences between facial expressions (e.g. fear) and a reference norm (e.g. a neutral expression) were altered to produce photographic-quality caricatures. In Experiment 1, participants rated caricatures of fear, happiness and sadness for their intensity of these three emotions; a second group of participants rated how 'face-like' the caricatures appeared. With increasing levels of exaggeration the caricatures were rated as more emotionally intense, but less 'face-like'. Experiment 2 demonstrated a similar relationship between emotional intensity and level of caricature for six different facial expressions. Experiments 3 and 4 compared intensity ratings of facial expression caricatures prepared relative to a selection of reference norms - a neutral expression, an average expression, or a different facial expression (e.g. anger caricatured relative to fear). Each norm produced a linear relationship between caricature and rated intensity of emotion; this finding is inconsistent with two-dimensional models of the perceptual representation of facial expression. An exemplar-based multidimensional model is proposed as an alternative account.
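The caricaturing manipulation described above — exaggerating the physical differences between an expression and a reference norm — amounts to linear extrapolation away from the norm. A hypothetical numpy sketch of the idea in landmark space (the landmark arrays and the `level` convention are assumptions; the original study produced photographic-quality caricatures via image warping, not raw coordinates):

```python
import numpy as np

def caricature(face_landmarks, norm_landmarks, level):
    """Move each landmark away from (level > 1) or toward (level < 1)
    the reference norm; level == 1 reproduces the original face."""
    face = np.asarray(face_landmarks, dtype=float)
    norm = np.asarray(norm_landmarks, dtype=float)
    return norm + level * (face - norm)

# Toy example: three (x, y) landmarks for a fearful face and a neutral norm.
fear = np.array([[10.0, 12.0], [20.0, 18.0], [30.0, 11.0]])
neutral = np.array([[10.0, 10.0], [20.0, 20.0], [30.0, 10.0]])

exaggerated = caricature(fear, neutral, 1.5)   # 150% caricature
anti = caricature(fear, neutral, 0.5)          # anti-caricature, nearer the norm
```

Swapping in a different reference (an average expression, or another emotion such as anger) changes only `norm_landmarks`, which is exactly the manipulation compared across Experiments 3 and 4.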

  10. Effects of Early Neglect Experience on Recognition and Processing of Facial Expressions: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Victoria Doretto

    2018-01-01

    Full Text Available Background: Child neglect is highly prevalent and associated with a series of biological and social consequences. Early neglect may alter the recognition of emotional faces, but its precise impact remains unclear. We aim to review and analyze data from recent literature about recognition and processing of facial expressions in individuals with a history of childhood neglect. Methods: We conducted a systematic review using the PubMed, PsycINFO, SciELO and EMBASE databases to find studies from the past 10 years. Results: In total, 14 studies were selected and critically reviewed. Heterogeneity was detected across methods and sample frames, and results were mixed across studies. Different forms of alterations to the perception of facial expressions were found across 12 studies. There were alterations to the recognition and processing of both positive and negative emotions, but for emotional face processing the alterations were predominantly toward negative emotions. Conclusions: This is the first review to examine specifically the effects of early neglect experience as a prevalent condition of child maltreatment. The results of this review are inconclusive due to methodological diversity, the use of distinct instruments and differences in the composition of the samples. Despite these limitations, some studies support our hypothesis that individuals with a history of early neglect may present alterations in the ability to perceive facial expressions of emotions. The article brings relevant information that can help in the development of more effective therapeutic strategies to reduce the impact of neglect on the cognitive and emotional development of the child.

  11. Parotidectomy and facial vein

    Directory of Open Access Journals (Sweden)

    F. Hernández Altemir

    2009-10-01

    Full Text Available Surgery for benign parotid tumors is, above all, surgery of the relationships with nervous structures whose damage represents, to put it generically, a very serious psychosomatic problem. To aid the surgical handling of the peripheral facial nerve, this article emphasizes the importance of the facial vein in the dissection and preservation of the nerve, precisely where its dissection tends to be most compromised, that is, in the most caudal branches. The study we develop should therefore be seen as praise for the venous structures in tracing and controlling the peripheral facial nerve and, why not, for the great auricular nerve, which is not always sufficiently valued in parotid surgery because it loses prominence to the facial nerve.

  12. Automatic Facial Expression Recognition and Operator Functional State

    Science.gov (United States)

    Blanson, Nina

    2012-01-01

    The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.

  13. Automatic Facial Expression Recognition and Operator Functional State

    Science.gov (United States)

    Blanson, Nina

    2011-01-01

    The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.

  14. Facial reanimation by muscle-nerve neurotization after facial nerve sacrifice. Case report.

    Science.gov (United States)

    Taupin, A; Labbé, D; Babin, E; Fromager, G

    2016-12-01

    Recovering a certain degree of mimicry after sacrifice of the facial nerve is a clinically recognized finding. The authors report a case of hemifacial reanimation suggesting a phenomenon of muscle-to-nerve neurotization. A woman underwent a parotidectomy with sacrifice of the left facial nerve, indicated for a recurrent tumor in the gland. The distal branches of the facial nerve, isolated at the time of resection, were buried in the masseter muscle underneath. The patient recovered voluntary hemifacial motility. Electromyographic analysis of the motor activity of the zygomaticus major before and after block of the masseter nerve showed a dependence between the mimic muscles and the masseter muscle. Several hypotheses have been advanced to explain the spontaneous reanimation of facial paralysis. This clinical case argues in favor of muscle-to-nerve neurotization from the masseter muscle to the distal branches of the facial nerve, and it illustrates the quality of motricity that can be obtained with this procedure. The authors describe a simple technique of implanting the distal branches of the facial nerve in the masseter muscle during radical parotidectomy with facial nerve sacrifice, with recovery of resting tone as well as good-quality voluntary mimicry. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  15. Hemispheric and facial asymmetry: faces of academe.

    Science.gov (United States)

    Smith, W M

    1998-11-01

    Facial asymmetry (facedness) of selected academic faculty members was studied in relation to brain asymmetry and cognitive specialization. Comparisons of facedness were made among humanities faculty (H), faculty members of mathematics and physics (M-P), psychologists (P), and a group of randomly selected individuals (R). Facedness was defined in terms of the relative sizes (in square centimeters) of the two hemifaces. It was predicted that the four groups would show differences in facedness, namely, H, right face bias; M-P, left face bias; P, no bias; and R, no bias. The predictions were confirmed, and the results interpreted in terms of known differences in hemispheric specialization of cognitive functions as they relate to the dominant cognitive activity of each of the different groups. In view of the contralateral control of the two hemifaces (below the eyes) by the two hemispheres of the brain, the two sides of the face undergo differential muscular development, thus creating facial asymmetry. Other factors, such as gender, also may affect facial asymmetry. Suggestions for further research on facedness are discussed.

  16. Cues of fatigue: effects of sleep deprivation on facial appearance.

    Science.gov (United States)

    Sundelin, Tina; Lekander, Mats; Kecklund, Göran; Van Someren, Eus J W; Olsson, Andreas; Axelsson, John

    2013-09-01

    To investigate the facial cues by which one recognizes that someone is sleep deprived versus not sleep deprived. Experimental laboratory study. Karolinska Institutet, Stockholm, Sweden. Forty observers (20 women, mean age 25 ± 5 y) rated 20 facial photographs with respect to fatigue, 10 facial cues, and sadness. The stimulus material consisted of 10 individuals (five women) photographed at 14:30 after normal sleep and after 31 h of sleep deprivation following a night with 5 h of sleep. Ratings of fatigue, fatigue-related cues, and sadness in facial photographs. The faces of sleep-deprived individuals were perceived as having more hanging eyelids, redder eyes, more swollen eyes, darker circles under the eyes, paler skin, more wrinkles/fine lines, and more droopy corners of the mouth (effects ranging from b = +3 ± 1 to b = +15 ± 1 mm on 100-mm visual analog scales). Some cues were not significantly affected by sleep deprivation, nor associated with judgements of fatigue. In addition, sleep-deprived individuals looked sadder than after normal sleep, and sadness was related to looking fatigued. The results show that sleep deprivation affects features relating to the eyes, mouth, and skin, and that these features function as cues of sleep loss to other people. Because these facial regions are important in the communication between humans, facial cues of sleep deprivation and fatigue may carry social consequences for the sleep deprived individual in everyday life.

  17. The role of the posed smile in overall facial esthetics.

    Science.gov (United States)

    Havens, David C; McNamara, James A; Sigler, Lauren M; Baccetti, Tiziano

    2010-03-01

    To evaluate the role of the posed smile in overall facial esthetics, as determined by laypersons and orthodontists. Twenty orthodontists and 20 lay evaluators were asked to perform six Q-sorts on different photographs of 48 white female subjects. The six Q-sorts consisted of three different photographs for each of two time points (pre- and posttreatment), as follows: (1) smile-only, (2) face without the smile, and (3) face with the smile. The evaluators determined a split-line for attractive and unattractive images at the end of each Q-sort. The proportions of attractive patients were compared across Q-sorts using a Wilcoxon signed-rank test for paired data. The evaluators also ranked nine facial/dental characteristics at the completion of the six Q-sorts. Evaluators found the pretreatment face without the smile to be significantly more attractive than the face with the smile or the smile-only photographs. Dissimilar results were seen posttreatment; there was not a significant difference between the three posttreatment photographs. The two panels agreed on the proportion of "attractive" subjects but differed on the attractiveness level of each individual subject. The presence of a malocclusion has a negative impact on facial attractiveness. Orthodontic correction of a malocclusion affects overall facial esthetics positively. Laypeople and orthodontists agree on what is attractive. Overall facial harmony is the most important characteristic used in deciding facial attractiveness.

  18. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    Science.gov (United States)

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown whether a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand whether individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models, since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  19. Retention interval affects visual short-term memory encoding.

    Science.gov (United States)

    Bankó, Eva M; Vidnyánszky, Zoltán

    2010-03-01

    Humans can efficiently store fine-detailed facial emotional information in visual short-term memory for several seconds. However, an unresolved question is whether the same neural mechanisms underlie high-fidelity short-term memory for emotional expressions at different retention intervals. Here we show that retention interval affects the neural processes of short-term memory encoding using a delayed facial emotion discrimination task. The early sensory P100 component of the event-related potentials (ERP) was larger in the 1-s interstimulus interval (ISI) condition than in the 6-s ISI condition, whereas the face-specific N170 component was larger in the longer ISI condition. Furthermore, the memory-related late P3b component of the ERP responses was also modulated by retention interval: it was reduced in the 1-s ISI as compared with the 6-s condition. The present findings cannot be explained based on differences in sensory processing demands or overall task difficulty because there was no difference in the stimulus information and subjects' performance between the two different ISI conditions. These results reveal that encoding processes underlying high-precision short-term memory for facial emotional expressions are modulated depending on whether information has to be stored for one or for several seconds.
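
    ERP component amplitudes such as the P100, N170, and P3b are typically quantified as the mean voltage within a latency window; a toy sketch on a synthetic waveform (the windows and peak shapes are illustrative assumptions, not the study's parameters):

    ```python
    import numpy as np

    fs = 1000                              # hypothetical sampling rate (Hz)
    t = np.arange(-0.1, 0.8, 1 / fs)       # epoch from -100 ms to 800 ms
    # Toy ERP: a P100-like positive peak, an N170-like negative peak,
    # and a broad P3b-like positivity.
    erp = (4 * np.exp(-((t - 0.10) / 0.02) ** 2)
           - 6 * np.exp(-((t - 0.17) / 0.02) ** 2)
           + 3 * np.exp(-((t - 0.45) / 0.10) ** 2))

    def mean_amplitude(signal, times, start, end):
        """Mean voltage in [start, end) seconds -- the usual component measure."""
        mask = (times >= start) & (times < end)
        return signal[mask].mean()

    p100 = mean_amplitude(erp, t, 0.08, 0.13)
    n170 = mean_amplitude(erp, t, 0.14, 0.20)
    p3b = mean_amplitude(erp, t, 0.30, 0.60)
    print(p100, n170, p3b)
    ```

    Comparing such window means between the 1-s and 6-s ISI conditions is what the reported component modulations amount to.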

  20. When is facial paralysis Bell palsy? Current diagnosis and treatment.

    Science.gov (United States)

    Ahmed, Anwar

    2005-05-01

    Bell palsy is largely a diagnosis of exclusion, but certain features in the history and physical examination help distinguish it from facial paralysis due to other conditions: e.g., abrupt onset with complete, unilateral facial weakness at 24 to 72 hours, and, on the affected side, numbness or pain around the ear, a reduction in taste, and hypersensitivity to sounds. Corticosteroids and antivirals given within 10 days of onset have been shown to help, but Bell palsy resolves spontaneously without treatment in most patients within 6 months.

  1. Recurrent unilateral facial nerve palsy in a child with dehiscent facial nerve canal

    Directory of Open Access Journals (Sweden)

    Christopher Liu

    2016-12-01

    Full Text Available Objective: The dehiscent facial nerve canal has been well documented in histopathological studies of temporal bones as well as in the clinical setting. We describe the clinical and radiologic features of a child with recurrent facial nerve (FN) palsy and a dehiscent facial nerve canal. Methods: Retrospective chart review. Results: A 5-year-old male was referred to the otolaryngology clinic for evaluation of recurrent acute otitis media and hearing loss. He also developed recurrent left peripheral FN palsy associated with episodes of bilateral acute otitis media. High-resolution computed tomography of the temporal bones revealed incomplete bony coverage of the tympanic segment of the left facial nerve. Conclusions: Recurrent peripheral FN palsy may occur in children with recurrent acute otitis media in the presence of a dehiscent facial nerve canal. Facial nerve canal dehiscence should be considered in the differential diagnosis of children with recurrent peripheral FN palsy.

  2. Pediatric facial injuries: It's management

    OpenAIRE

    Singh, Geeta; Mohammad, Shadab; Pal, U. S.; Hariram,; Malkunje, Laxman R.; Singh, Nimisha

    2011-01-01

    Background: Facial injuries in children always present a challenge with respect to their diagnosis and management. Since these children are still growing, every care should be taken so that the overall growth pattern of the facial skeleton is not jeopardized later. Purpose: To assess the most feasible method for the management of facial injuries in children without hampering facial growth. Materials and Methods: Sixty child patients with facial trauma were selected rando...

  3. Recognition of schematic facial displays of emotion in parents of children with autism.

    Science.gov (United States)

    Palermo, Mark T; Pasqualetti, Patrizio; Barbati, Giulia; Intelligente, Fabio; Rossini, Paolo Maria

    2006-07-01

    Performance on an emotional labeling task in response to schematic facial patterns representing five basic emotions without the concurrent presentation of a verbal category was investigated in 40 parents of children with autism and 40 matched controls. 'Autism fathers' performed worse than 'autism mothers', who performed worse than controls in decoding displays representing sadness or disgust. This indicates the need to include facial expression decoding tasks in genetic research of autism. In addition, emotional expression interactions between parents and their children with autism, particularly through play, where affect and prosody are 'physiologically' exaggerated, may stimulate development of social competence. Future studies could benefit from a combination of stimuli including photographs and schematic drawings, with and without associated verbal categories. This may allow the subdivision of patients and relatives on the basis of the amount of information needed to understand and process social-emotionally relevant information.

  4. Facial Sports Injuries

    Science.gov (United States)

    ... the patient has HIV or hepatitis. Facial Fractures Sports injuries can cause potentially serious broken bones or fractures of the face. Common symptoms of facial fractures include: swelling and bruising, ...

  5. Judgment of facial expressions and depression persistence

    NARCIS (Netherlands)

    Hale, WW

    1998-01-01

    In research it has been demonstrated that cognitive and interpersonal processes play significant roles in depression development and persistence. The judgment of emotions displayed in facial expressions by depressed patients allows for a better understanding of these processes. In this study, 48

  6. Outcome of a graduated minimally invasive facial reanimation in patients with facial paralysis.

    Science.gov (United States)

    Holtmann, Laura C; Eckstein, Anja; Stähr, Kerstin; Xing, Minzhi; Lang, Stephan; Mattheis, Stefan

    2017-08-01

    Peripheral paralysis of the facial nerve is the most frequent of all cranial nerve disorders. Despite advances in facial surgery, the functional and aesthetic reconstruction of a paralyzed face remains a challenge. Graduated minimally invasive facial reanimation is based on a modular principle. According to the patients' needs, precondition, and expectations, the following modules can be performed: temporalis muscle transposition and facelift, nasal valve suspension, endoscopic brow lift, and eyelid reconstruction. Applying a concept of graduated minimally invasive facial reanimation may help minimize surgical trauma and reduce morbidity. Twenty patients underwent a graduated minimally invasive facial reanimation. A retrospective chart review was performed with a follow-up examination between 1 and 8 months after surgery. The FACEgram software was used to calculate pre- and postoperative eyelid closure, the level of the brows, nasal and philtral symmetry, and oral commissure position at rest, as well as oral commissure excursion with smile. As a patient-oriented outcome parameter, the Glasgow Benefit Inventory questionnaire was applied. There was a statistically significant improvement in the postoperative scores for eyelid closure, brow asymmetry, nasal asymmetry, and philtral asymmetry, as well as oral commissure symmetry at rest. When facial nerve repair or microneurovascular tissue transfer cannot be applied, graduated minimally invasive facial reanimation is a promising option to restore facial function and symmetry at rest.

  7. Identity modulates short-term memory for facial emotion.

    Science.gov (United States)

    Galster, Murray; Kahana, Michael J; Wilson, Hugh R; Sekuler, Robert

    2009-12-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects' similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces' perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces' perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.
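
    The MDS step — turning pairwise similarity/dissimilarity judgments into a 2-D perceptual map — can be sketched with scikit-learn; the dissimilarity matrix below is hypothetical, standing in for averaged judgments over face pairs:

    ```python
    import numpy as np
    from sklearn.manifold import MDS

    # Hypothetical 6x6 dissimilarity matrix for six face stimuli
    # (0 = identical; symmetric, as averaged pairwise judgments would be).
    rng = np.random.default_rng(1)
    d = rng.uniform(0.2, 1.0, size=(6, 6))
    d = (d + d.T) / 2            # symmetrize
    np.fill_diagonal(d, 0.0)

    # Embed the faces into a 2-D space, as in classic MDS analyses
    # of perceptual similarity.
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(d)
    print(coords.shape)  # → (6, 2)
    ```

    Distances between points in `coords` play the role of the inter-stimulus distances that predicted recognition performance in Experiment 1.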

  8. When Age Matters: Differences in Facial Mimicry and Autonomic Responses to Peers' Emotions in Teenagers and Adults

    Science.gov (United States)

    Ardizzi, Martina; Sestito, Mariateresa; Martini, Francesca; Umiltà, Maria Alessandra; Ravera, Roberto; Gallese, Vittorio

    2014-01-01

    Age-group membership effects on explicit emotional facial expression recognition have been widely demonstrated. In this study we investigated whether age-group membership could also affect implicit physiological responses, such as facial mimicry and autonomic regulation, to the observation of emotional facial expressions. To this aim, facial electromyography (EMG) and respiratory sinus arrhythmia (RSA) were recorded from teenage and adult participants during the observation of facial expressions performed by teenage and adult models. Results highlighted that teenagers exhibited greater facial EMG responses to peers' facial expressions, whereas adults showed higher RSA responses to adult facial expressions. The different physiological modalities through which young people and adults respond to peers' emotional expressions are likely to reflect two different ways of engaging in social interactions with peers. Findings confirmed that age is an important and powerful social feature that modulates interpersonal interactions by influencing low-level physiological responses. PMID:25337916
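
    RSA is commonly indexed as the natural log of high-frequency (0.15–0.40 Hz) spectral power in the inter-beat-interval series. A rough sketch on a simulated series (real pipelines first detect R-peaks and resample onto a uniform grid; the 0.25-Hz respiratory modulation here is synthetic):

    ```python
    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import trapezoid

    # Simulated inter-beat-interval (IBI) series with a 0.25-Hz respiratory
    # modulation around a 0.8-s baseline.
    fs = 4.0                               # resampling rate of the IBI series (Hz)
    t = np.arange(0, 120, 1 / fs)          # two minutes
    rng = np.random.default_rng(5)
    ibi = (0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * t)
           + 0.005 * rng.normal(size=t.size))

    # RSA index: ln of power in the high-frequency (0.15-0.40 Hz) band.
    freqs, psd = welch(ibi, fs=fs, nperseg=256)
    hf = (freqs >= 0.15) & (freqs <= 0.40)
    rsa = np.log(trapezoid(psd[hf], freqs[hf]))
    print(f"ln HF power (RSA index): {rsa:.2f}")
    ```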

  9. Facial Expression Recognition By Using Fisherface Methode With Backpropagation Neural Network

    Directory of Open Access Journals (Sweden)

    Zaenal Abidin

    2011-01-01

    Full Text Available Abstract— In daily life, especially in interpersonal communication, the face is often used for expression. Facial expressions give information about the emotional state of the person and are one of the behavioral characteristics. The components of a basic facial expression analysis system are face detection, face data extraction, and facial expression recognition. The Fisherface method with a backpropagation artificial neural network can be used for facial expression recognition. This method consists of a two-stage process, namely PCA and LDA: PCA is used to reduce dimensionality, while LDA is used to extract features of facial expressions. The system was tested with two databases, the JAFFE database and the MUG database. The system correctly classified expressions with an accuracy of 86.85% (25 false positives) for image type I of JAFFE, 89.20% (15 false positives) for image type II, and 87.79% (16 false positives) for type III; for the MUG images, accuracy was 98.09% (5 false positives). Keywords— facial expression, fisherface method, PCA, LDA, backpropagation neural network.
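
    The PCA→LDA (Fisherface) pipeline with a backpropagation-trained classifier can be sketched in scikit-learn. The data below are synthetic stand-ins for flattened face images, and every dimension (image size, component counts, network size) is an illustrative assumption:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    # Synthetic stand-ins for 120 flattened 16x16 "face" images,
    # with 4 expression classes separated by distinct class means.
    rng = np.random.default_rng(42)
    y = rng.integers(0, 4, size=120)
    class_means = rng.normal(scale=3.0, size=(4, 256))
    X = class_means[y] + rng.normal(size=(120, 256))

    # Fisherface: PCA reduces dimensionality, LDA extracts class-discriminative
    # features; a small MLP (trained by backpropagation) does the classification.
    model = make_pipeline(
        PCA(n_components=40),
        LinearDiscriminantAnalysis(n_components=3),
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    model.fit(X, y)
    acc = model.score(X, y)
    print(f"training accuracy: {acc:.2f}")
    ```

    On real image data the PCA stage is what makes LDA tractable: it removes the rank deficiency that arises when the number of pixels exceeds the number of training images.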

  10. Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson’s Disease

    Science.gov (United States)

    Argaud, Soizic; Delplanque, Sylvain; Houvenaghel, Jean-François; Auffret, Manon; Duprez, Joan; Vérin, Marc; Grandjean, Didier; Sauleau, Paul

    2016-01-01

    According to embodied simulation theory, understanding other people’s emotions is fostered by facial mimicry. However, studies assessing the effect of facial mimicry on the recognition of emotion are still controversial. In Parkinson’s disease (PD), one of the most distinctive clinical features is facial amimia, a reduction in facial expressiveness, but patients also show emotional disturbances. The present study used the pathological model of PD to examine the role of facial mimicry in emotion recognition by investigating EMG responses in PD patients during a facial emotion recognition task (anger, joy, neutral). Our results showed a significant decrease in facial mimicry for joy in PD, essentially linked to the absence of reaction of the zygomaticus major and the orbicularis oculi muscles in response to happy avatars, whereas facial mimicry for expressions of anger was relatively preserved. We also confirmed that PD patients were less accurate in recognizing positive and neutral facial expressions and highlighted a beneficial effect of facial mimicry on the recognition of emotion. We thus provide additional arguments for embodied simulation theory, suggesting that facial mimicry is a potential lever for therapeutic action in PD, even if it does not appear to be strictly necessary for recognizing emotion as such. PMID:27467393

  11. Enhanced MRI in patients with facial palsy; Study of time-related enhancement

    Energy Technology Data Exchange (ETDEWEB)

    Yanagida, Masahiro; Kato, Tsutomu; Ushiro, Koichi; Kitajiri, Masanori; Yamashita, Toshio; Kumazawa, Tadami; Tanaka, Yoshimasa (Kansai Medical School, Moriguchi, Osaka (Japan))

    1991-03-01

    We performed Gd-DTPA-enhanced magnetic resonance imaging (MRI) examinations at several stages in 40 patients with peripheral facial nerve palsy (Bell's palsy and Ramsay Hunt syndrome). In 38 of the 40 patients, one or more enhanced regions could be seen in certain portions of the facial nerve in the temporal bone on the affected side, whereas no enhanced regions were seen on the intact side. Correlations between the timing of the MRI examination and the location of the enhanced regions were analysed. In all 6 patients examined by MRI within 5 days after the onset of facial nerve palsy, enhanced regions were present in the meatal portion. In 3 of the 8 patients (38%) examined by MRI 6 to 10 days after the onset of facial palsy, enhanced areas were seen in both the meatal and labyrinthine portions. In 8 of the 9 patients (89%) tested 11 to 20 days after the onset of palsy, the vertical portion was enhanced. In the 12 patients examined by MRI 21 to 40 days after the onset of facial nerve palsy, the meatal portion was not enhanced, while the labyrinthine, horizontal, and vertical portions were enhanced in 5 (42%), 8 (67%), and 11 (92%), respectively. Enhancement in the vertical portion was observed in all 5 patients examined more than 41 days after the onset of facial palsy. These results suggest that the central portion of the facial nerve in the temporal bone tends to be enhanced in the early stage of facial nerve palsy, while the peripheral portion is enhanced in the late stage. These changes in the Gd-DTPA-enhanced regions of the facial nerve may indicate dromic degeneration of the facial nerve in peripheral facial nerve palsy.

  12. The Associations between Visual Attention and Facial Expression Identification in Patients with Schizophrenia.

    Science.gov (United States)

    Lin, I-Mei; Fan, Sheng-Yu; Huang, Tiao-Lai; Wu, Wan-Ting; Li, Shi-Ming

    2013-12-01

    Visual search is an important attentional process that precedes information processing. Visual search also mediates the relationship between cognitive function (attention) and social cognition (such as facial expression identification). However, the association between visual attention and social cognition in patients with schizophrenia remains unknown. The purposes of this study were to examine the differences in visual search performance and facial expression identification between patients with schizophrenia and normal controls, and to explore the relationship between visual search performance and facial expression identification in patients with schizophrenia. Fourteen patients with schizophrenia (mean age = 46.36 ± 6.74) and 15 normal controls (mean age = 40.87 ± 9.33) participated in this study. The visual search task, including feature search and conjunction search, and the Japanese and Caucasian Facial Expressions of Emotion were administered. Patients with schizophrenia had worse visual search performance in both feature search and conjunction search than normal controls, as well as worse facial expression identification, especially for surprise and sadness. In addition, there were negative associations between visual search performance and facial expression identification in patients with schizophrenia, especially for surprise and sadness; this phenomenon was not observed in normal controls. Patients with schizophrenia who had visual search deficits showed impairment in facial expression identification. Increasing the ability of visual search and facial expression identification may improve their social function and interpersonal relationships.
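
    The reported negative association between visual search performance and expression identification can be sketched as a rank correlation on patient-level data; the values below are hypothetical:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical patient-level data: conjunction-search reaction time (s) and
    # facial expression identification accuracy (proportion correct), n = 14.
    rng = np.random.default_rng(7)
    search_rt = rng.uniform(1.0, 3.0, size=14)
    accuracy = 0.9 - 0.2 * search_rt + rng.normal(0, 0.03, size=14)

    # Slower search going with lower accuracy yields a negative correlation.
    rho, p = spearmanr(search_rt, accuracy)
    print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
    ```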

  13. Extracranial Facial Nerve Schwannoma Treated by Hypo-fractionated CyberKnife Radiosurgery

    OpenAIRE

    Sasaki, Ayaka; Miyazaki, Shinichiro; Hori, Tomokatsu

    2016-01-01

    Facial nerve schwannoma is a rare intracranial tumor. Treatment for this benign tumor has been controversial. Here, we report a case of extracranial facial nerve schwannoma treated successfully by hypo-fractionated CyberKnife (Accuray, Sunnyvale, CA) radiosurgery, and discuss the efficacy of this treatment. A 34-year-old female noticed a swelling in her right mastoid process. The lesion enlarged over a seven-month period, and she experienced facial spasm on the right side. She was diagnosed wi...

  14. Sad or fearful? The influence of body posture on adults' and children's perception of facial displays of emotion.

    Science.gov (United States)

    Mondloch, Catherine J

    2012-02-01

    The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body posing fear). Adults and 8-year-olds made more errors and had longer reaction times on incongruent trials than on congruent trials when judging sad versus fearful facial expressions, an effect that was larger in 8-year-olds. The congruency effect was reduced when faces and bodies were misaligned, providing some evidence for holistic processing. Neither adults nor 8-year-olds were affected by congruency when judging sad versus happy expressions. Evidence that congruency effects vary with age and with similarity of emotional expressions is consistent with dimensional theories and "emotional seed" models of emotion perception.

  15. Development and validation of an Argentine set of facial expressions of emotion.

    Science.gov (United States)

    Vaiman, Marcelo; Wagner, Mónica Anna; Caicedo, Estefanía; Pereno, Germán Leandro

    2017-02-01

    Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion research is receiving in this region. Here we present the development and validation of the Universidad Nacional de Cordoba, Expresiones de Emociones Faciales (UNCEEF), a Facial Action Coding System (FACS)-verified set of pictures of Argentineans expressing the six basic emotions, plus neutral expressions. FACS scores, recognition rates, Hu scores, and discrimination indices are reported. Evidence of convergent validity was obtained using the Pictures of Facial Affect in an Argentine sample. However, recognition accuracy was greater for UNCEEF. The importance of local sets of emotion pictures is discussed.
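
    Recognition rates and Hu scores for a stimulus-set validation are computed from the stimulus-by-response confusion matrix. A sketch of Wagner's (1993) unbiased hit rate, which corrects raw hit rates for response bias (the confusion matrix below is hypothetical):

    ```python
    import numpy as np

    def unbiased_hit_rates(confusion):
        """Wagner's (1993) unbiased hit rate:
        Hu_i = hits_i^2 / (stimuli of category i * total uses of response i)."""
        confusion = np.asarray(confusion, dtype=float)
        hits = np.diag(confusion)
        stim_totals = confusion.sum(axis=1)    # rows: stimulus category shown
        resp_totals = confusion.sum(axis=0)    # columns: response chosen
        return hits ** 2 / (stim_totals * resp_totals)

    # Hypothetical confusion matrix for three expressions
    # (rows = shown, columns = response): happy, sad, neutral.
    cm = [[18, 1, 1],
          [2, 14, 4],
          [1, 5, 14]]
    print(unbiased_hit_rates(cm).round(3))
    ```

    A Hu score of 1.0 requires both perfect hits and a response used only for its own category, which is why it is preferred over raw accuracy when validating emotion stimulus sets.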

  16. Incongruence Between Observers’ and Observed Facial Muscle Activation Reduces Recognition of Emotional Facial Expressions From Video Stimuli

    Directory of Open Access Journals (Sweden)

    Tanja S. H. Wingenbach

    2018-06-01

    Full Text Available According to embodied cognition accounts, viewing others’ facial emotion can elicit the respective emotion representation in observers which entails simulations of sensory, motor, and contextual experiences. In line with that, published research found viewing others’ facial emotion to elicit automatic matched facial muscle activation, which was further found to facilitate emotion recognition. Perhaps making congruent facial muscle activity explicit produces an even greater recognition advantage. If there is conflicting sensory information, i.e., incongruent facial muscle activity, this might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) experimental conditions (a) and (b) result in greater facial muscle activity than (c), (2) experimental condition (a) increases emotion recognition accuracy from others’ faces compared to (c), (3) experimental condition (b) lowers recognition accuracy for expressions with a salient facial feature in the lower, but not the upper, face area, compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The experimental conditions’ order was counter-balanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower face region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed.

  19. Facial orientation and facial shape in extant great apes: a geometric morphometric analysis of covariation.

    Science.gov (United States)

    Neaux, Dimitri; Guy, Franck; Gilissen, Emmanuel; Coudyzer, Walter; Vignaud, Patrick; Ducrocq, Stéphane

    2013-01-01

    The organization of the bony face is complex, its morphology being influenced in part by the rest of the cranium. Characterizing facial morphological variation and craniofacial covariation patterns in extant hominids is fundamental to the understanding of their evolutionary history. Numerous studies on hominid facial shape have proposed hypotheses concerning the relationship between anterior facial shape, facial block orientation, and basicranial flexion. In this study we test these hypotheses in a sample of adult specimens belonging to three extant hominid genera (Homo, Pan and Gorilla). Intraspecific variation and covariation patterns are analyzed using geometric morphometric methods and multivariate statistics, such as partial least squares on three-dimensional landmark coordinates. Our results indicate significant intraspecific covariation between facial shape, facial block orientation and basicranial flexion. Hominids share similar characteristics in the relationship between anterior facial shape and facial block orientation. Modern humans exhibit a specific pattern in the covariation between anterior facial shape and basicranial flexion. This peculiar feature underscores the role of modern humans' highly flexed basicranium in the overall integration of the cranium. Furthermore, our results are consistent with the hypothesis of a relationship between the reduction of the value of the cranial base angle and a downward rotation of the facial block in modern humans, and to a lesser extent in chimpanzees.
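
    The two-block partial least squares step — extracting maximally covarying axes between two blocks of variables, here standing in for facial and basicranial landmark data — reduces to an SVD of the cross-covariance matrix. A sketch on synthetic data (block sizes and the shared latent factor are illustrative assumptions):

    ```python
    import numpy as np

    def two_block_pls(X, Y):
        """First pair of PLS axes: SVD of the cross-covariance between two
        centred blocks, as in two-block PLS of landmark configurations."""
        Xc = X - X.mean(axis=0)
        Yc = Y - Y.mean(axis=0)
        cross_cov = Xc.T @ Yc / (len(X) - 1)
        U, s, Vt = np.linalg.svd(cross_cov, full_matrices=False)
        # Scores on the first singular axes of each block, plus singular values.
        return Xc @ U[:, 0], Yc @ Vt[0], s

    # Synthetic blocks: 30 specimens, 6 "facial" and 4 "basicranial" variables
    # covarying through one shared latent factor.
    rng = np.random.default_rng(3)
    latent = rng.normal(size=30)
    X = np.outer(latent, rng.normal(size=6)) + 0.3 * rng.normal(size=(30, 6))
    Y = np.outer(latent, rng.normal(size=4)) + 0.3 * rng.normal(size=(30, 4))

    sx, sy, singvals = two_block_pls(X, Y)
    r = np.corrcoef(sx, sy)[0, 1]
    print(f"correlation of first PLS axes: r = {r:.2f}")
    ```

    The correlation between the paired axis scores is the statistic usually reported for PLS covariation analyses of this kind.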

  20. Facial dysmorphopsia: a notable variant of the "thin man" phenomenon?

    Science.gov (United States)

    Ganssauge, Martin; Papageorgiou, Eleni; Schiefer, Ulrich

    2012-10-01

    The aim of this work is to investigate the facial distortion (dysmorphopsia) experienced by patients with homonymous paracentral scotomas and to analyze the interrelationship with the previously described "thin man" phenomenon. Routine neuro-ophthalmological examination and brain MRI in three patients who suffered from small homonymous paracentral scotomas due to infarction or arteriovenous malformations of the occipital lobe. They all complained of distortion and shrinkage of their interlocutor's face contralateral to the brain lesion. The phenomenon appeared some seconds after steady fixation on the interlocutor's nose and was evident with both left and right homonymous scotomas. The patients did not notice a gap in the area corresponding to the scotoma and objects other than faces were perceived normally. Homonymous paracentral scotomas can lead to focal displacement of facial features towards the center of the field defect with resulting distortion of the face on the affected side. This so-called "dysmorphopsia" makes faces appear regionally narrower than they are in reality and may be induced even by visual field defects that remain undetected by conventional perimetry using 6° × 6° grids. Predilection for faces is probably associated with the superior location of scotomas or specific impairment of face processing abilities related to the lesion site. Facial dysmorphopsia is most probably associated with cortical "filling-in" and spatial distortion, and can hence be regarded as a special entity of the "thin man" phenomenon.

  1. Evaluating Posed and Evoked Facial Expressions of Emotion from Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Faso, Daniel J.; Sasson, Noah J.; Pinkham, Amy E.

    2015-01-01

    Though many studies have examined facial affect perception by individuals with autism spectrum disorder (ASD), little research has investigated how facial expressivity in ASD is perceived by others. Here, naïve female observers (n = 38) judged the intensity, naturalness and emotional category of expressions produced by adults with ASD (n = 6) and…

  2. Comparison of emotion recognition from facial expression and music.

    Science.gov (United States)

    Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija

    2011-01-01

    The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not clearly determined, as their presentation is usually very short (micro-expressions), and the recognition itself does not have to be a conscious process. We hypothesized that recognition of emotions from facial expressions would be favored over recognition of emotions communicated through music. To compare the success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey that included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and, consequently, evaluated differently as relevant emotional cues.

  3. Gender differences in emotion experience perception under different facial muscle manipulations.

    Science.gov (United States)

    Wang, Yufeng; Zhang, Dongjun; Zou, Feng; Li, Hao; Luo, Yanyan; Zhang, Meng; Liu, Yijun

    2016-04-01

    According to embodied emotion theory, facial manipulations should modulate and initiate particular emotions. However, whether there are gender differences in emotion experience perception under different facial muscle manipulations is not clear. We therefore conducted two behavioral experiments to examine gender differences in emotional perception in response to facial expressions (sad, neutral, and happy) under three conditions: (1) holding a pen using only the teeth (HPT), which facilitates the muscles typically associated with smiling; (2) holding a pen using only the lips (HPL), which inhibits the muscles typically associated with smiling; and (3) a control condition of holding no pen (HNP). We found that HPT made emotional feelings more positive, and that females' ratings of sad facial expressions changed more between the HPL and HPT conditions than males' did. These results suggest that cognition can be affected by the interaction of the stimuli and the body, especially in females. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Oral-facial-digital syndrome type 1: Report of a case

    Directory of Open Access Journals (Sweden)

    Peter W Duda

    2014-01-01

    Oral-facial-digital syndrome (OFD) is a collective term describing thirteen distinctive, rare genetic disorders classified by inheritance pattern and phenotypic expression. OFD is characterized by malformations of the oral cavity, the maxillofacial region, and the arms and legs. Central nervous system anomalies include intracerebral cysts, agenesis of the corpus callosum, hydrocephalus, cerebral/cerebellar atrophy, and berry aneurysms. Some degree of compromised intellectual ability and speech is present in affected individuals, correlating with the degree of central nervous system involvement. Furthermore, renal involvement in the form of polycystic kidney disease becomes evident in affected individuals in adulthood. In this article, we present a 37-year-old female patient with oral-facial-digital syndrome type 1 who presented to the Rutgers School of Dental Medicine.

  5. Hypoglossal-facial nerve "side"-to-side neurorrhaphy for facial paralysis resulting from closed temporal bone fractures.

    Science.gov (United States)

    Su, Diya; Li, Dezhi; Wang, Shiwei; Qiao, Hui; Li, Ping; Wang, Binbin; Wan, Hong; Schumacher, Michael; Liu, Song

    2018-06-06

    Closed temporal bone fractures due to cranial trauma often result in facial nerve injury, frequently inducing incomplete facial paralysis. Conventional hypoglossal-facial nerve end-to-end neurorrhaphy may not be suitable for these injuries because sacrificing the lesioned facial nerve for neurorrhaphy destroys the remnant axons and/or potential spontaneous reinnervation. We therefore modified the classical method with hypoglossal-facial nerve "side"-to-side neurorrhaphy using an interpositional predegenerated nerve graft to treat these injuries. Five patients who experienced facial paralysis resulting from closed temporal bone fractures due to cranial trauma were treated with the "side"-to-side neurorrhaphy. An additional 4 patients did not receive the neurorrhaphy and served as controls. Before treatment, all patients had suffered House-Brackmann (H-B) grade V or VI facial paralysis for a mean of 5 months. During the 12-30 month follow-up period, no further detectable deficits were observed, and an improvement in facial nerve function was evident over time in the 5 neurorrhaphy-treated patients. At the end of follow-up, the improved facial function reached H-B grade II in 3, grade III in 1 and grade IV in 1 of the 5 patients, consistent with the electrophysiological examinations. In the control group, two patients showed slight spontaneous reinnervation, with facial function improving from H-B grade VI to V, and the other patients remained unchanged at H-B grade V or VI. We conclude that hypoglossal-facial nerve "side"-to-side neurorrhaphy can preserve the injured facial nerve and is suitable for treating significant incomplete facial paralysis resulting from closed temporal bone fractures, providing an evident beneficial effect. Moreover, this treatment may be performed earlier after the onset of facial paralysis in order to reduce the unfavorable changes to the injured facial nerve and atrophy of its target muscles due to long-term denervation and allow axonal

  6. [Descending hypoglossal branch-facial nerve anastomosis in treating unilateral facial palsy after acoustic neuroma resection].

    Science.gov (United States)

    Liang, Jiantao; Li, Mingchu; Chen, Ge; Guo, Hongchuan; Zhang, Qiuhang; Bao, Yuhai

    2015-12-15

    To evaluate the efficacy of descending hypoglossal branch-facial nerve anastomosis for severe facial palsy after acoustic neuroma resection, the clinical data of 14 patients (6 males, 8 females, average age 45.6 years) who underwent descending hypoglossal branch-facial nerve anastomosis for treatment of unilateral facial palsy were analyzed retrospectively. All patients had previously undergone resection of a large acoustic neuroma. The House-Brackmann (H-B) grading system was used to evaluate pre-operative, post-operative and follow-up facial nerve function. Twelve cases (85.7%) had long-term follow-up, with an average follow-up period of 24.6 months. Six patients had a good outcome (H-B grade 2-3); 5 patients had a fair outcome (H-B grade 3-4) and 1 patient had a poor outcome (H-B grade 5). Only 1 patient suffered hemitongue myoparalysis as a result of the operation. Descending hypoglossal branch-facial nerve anastomosis is effective for facial reanimation, and it has little impact on chewing, swallowing and pronunciation compared with traditional hypoglossal-facial nerve anastomosis.

  7. The face is not an empty canvas: how facial expressions interact with facial appearance.

    Science.gov (United States)

    Hess, Ursula; Adams, Reginald B; Kleck, Robert E

    2009-12-12

    Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

  8. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    Science.gov (United States)

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels, which correspond to discrete emotion concepts, affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia, who have difficulty labeling their own emotions, struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. (c) 2015 APA, all rights reserved.
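
    Signal-detection sensitivity of the kind analyzed in Study 1 is conventionally summarized by d′, the difference between the z-transformed hit and false-alarm rates. A minimal stdlib sketch; the log-linear correction and the example counts are illustrative assumptions, not values from the study:

```python
from statistics import NormalDist

def dprime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate),
    with a log-linear correction to avoid rates of exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts for one observer classifying "sad" vs. other faces.
print(round(dprime(45, 5, 10, 40), 2))
```

    Higher d′ means better discrimination of signal from noise independent of response bias, which is why the study could separate sensitivity from mere shifts in willingness to report an emotion.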

  9. From Facial Emotional Recognition Abilities to Emotional Attribution: A Study in Down Syndrome

    Science.gov (United States)

    Hippolyte, Loyse; Barisnikov, Koviljka; Van der Linden, Martial; Detraux, Jean-Jacques

    2009-01-01

    Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences…

  10. Accurate landmarking of three-dimensional facial data in the presence of facial expressions and occlusions using a three-dimensional statistical facial feature model.

    Science.gov (United States)

    Zhao, Xi; Dellandréa, Emmanuel; Chen, Liming; Kakadiaris, Ioannis A

    2011-10-01

    Three-dimensional face landmarking aims at automatically localizing facial landmarks and has a wide range of applications (e.g., face recognition, face tracking, and facial expression analysis). Existing methods assume neutral facial expressions and unoccluded faces. In this paper, we propose a general learning-based framework for reliable landmark localization on 3-D facial data under challenging conditions (i.e., facial expressions and occlusions). Our approach relies on a statistical model, called the 3-D statistical facial feature model, which learns both the global variations in configurational relationships between landmarks and the local variations of texture and geometry around each landmark. Based on this model, we further propose an occlusion classifier and a fitting algorithm. Results from experiments on three publicly available 3-D face databases (FRGC, BU-3DFE, and Bosphorus) demonstrate the effectiveness of our approach, in terms of landmarking accuracy and robustness, in the presence of expressions and occlusions.

  11. Unattractive infant faces elicit negative affect from adults.

    Science.gov (United States)

    Schein, Stevie S; Langlois, Judith H

    2015-02-01

    We examined the relationship between infant attractiveness and adult affect by investigating whether differing levels of infant facial attractiveness elicit facial muscle movement correlated with positive and negative affect from adults (N=87) using electromyography. Unattractive infant faces evoked significantly more corrugator supercilii and levator labii superioris movement (physiological correlates of negative affect) than attractive infant faces. These results suggest that unattractive infants may be at risk for negative affective responses from adults, though the relationship between those responses and caregiving behavior remains elusive. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. The Role of the Amygdala in Facial Trustworthiness Processing: A Systematic Review and Meta-Analyses of fMRI Studies

    Science.gov (United States)

    Oliveiros, Bárbara

    2016-01-01

    Background Faces play a key role in signaling social cues such as signals of trustworthiness. Although several studies identify the amygdala as a core brain region in social cognition, quantitative approaches evaluating its role are scarce. Objectives This review aimed to assess the role of the amygdala in the processing of facial trustworthiness by analyzing the polarity of its BOLD response amplitude to untrustworthy versus trustworthy facial signals in fMRI tasks, through a meta-analysis of effect sizes (MA). Activation Likelihood Estimation (ALE) analyses were also conducted. Data sources Articles were retrieved from MEDLINE, ScienceDirect and Web-of-Science in January 2016. Following the PRISMA statement guidelines, a systematic review of original research articles in English using the search string “(face OR facial) AND (trustworthiness OR trustworthy OR untrustworthy OR trustee) AND fMRI” was conducted. Study selection and data extraction The MA concerned amygdala responses to facial trustworthiness for the contrast untrustworthy vs. trustworthy faces, and included whole-brain and ROI studies. To prevent potential bias, results were considered even when, at the single-study level, they did not survive correction for multiple comparisons or were non-significant. The ALE considered whole-brain studies, using the same methodology to prevent bias. A summary of the methodological options (design and analysis) described in the articles was then used to gain further insight into the characteristics of the studies and to perform a subgroup analysis. Data were extracted by two authors and checked independently. Data synthesis Twenty fMRI studies were considered for the systematic review. An MA of effect sizes with 11 articles (12 studies) showed high heterogeneity between studies [Q(11) = 265.68, p < .0001; I2 = 95.86%, 95% confidence interval (CI) 94.20% to 97.05%]. Random effects analysis [RE(183) = 0.851, .422 to .969, 95% CI] supported the
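
    The heterogeneity figures quoted in this abstract are linked by Higgins' I² statistic, I² = max(0, (Q − df)/Q) × 100, where df is one less than the number of studies. A quick check with the reported values (Q(11) = 265.68) reproduces the reported I² of about 95.86%:

```python
def i_squared(Q, df):
    """Higgins' I^2: percentage of total variation across studies
    attributable to heterogeneity rather than chance."""
    return max(0.0, (Q - df) / Q) * 100.0

# Values reported in the meta-analysis: Q(11) = 265.68, i.e. df = 11.
print(round(i_squared(265.68, 11), 2))  # ~95.86, matching the reported I^2
```

    When Q is at or below its degrees of freedom, I² is clamped to 0, indicating no detectable excess heterogeneity.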

  13. Digital analysis of facial landmarks in determining facial midline among Punjabi population

    Directory of Open Access Journals (Sweden)

    Nirmal Kurian

    2018-01-01

    Introduction: Prosthodontic rehabilitation aims to achieve the best possible facial esthetic appearance for a patient. Attaining facial symmetry forms the basic element of esthetics, and knowledge of the midline of the face results in a better understanding of dentofacial esthetics. Currently, there are no guidelines that direct the choice of specific anatomic landmarks to determine the midline of the face or mouth. Most clinicians choose one specific anatomic landmark and an imaginary line passing through it, and are thus left with no established guidelines to determine the facial midline. Objective: The purpose of the study was to digitally determine the relationship of facial landmarks with the midline of the face and formulate a guideline for choosing anatomic landmarks in a Punjabi population. Materials and Methods: Three commonly used anatomic landmarks, namely the nasion, the tip of the nose, and the tip of the philtrum, were marked clinically on 100 participants (age range: 21–45 years). Frontal full-face digital images of the participants in smile were then made under standardized conditions. Midline analysis was carried out digitally using image-analysis software. The entire process of midline analysis was done by a single observer and repeated twice. Reliability analysis and one-sample t-tests were conducted. Results: The results indicated that each of the four landmarks deviated uniquely and significantly (P < 0.001) from the midlines of the face as well as the mouth. Conclusions: Within the limitations of the study, the hierarchy of anatomic landmarks closest to the midline of the face in smile was as follows: (1) intercommissural midline, (2) tip of the philtrum, (3) nasion, (4) tip of the nose, and (5) dental midline. The hierarchy of anatomic landmarks closest to the intercommissural/mouth midline was: (1) tip of the philtrum, (2) tip of the nose, (3) nasion, and (4) dental midline.
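
    The one-sample t-tests reported here ask whether the mean signed deviation of a landmark from the midline differs from zero: t = (x̄ − μ₀)/(s/√n). A minimal stdlib sketch; the deviation values are purely illustrative, not the study's data:

```python
import math
from statistics import mean, stdev

def one_sample_t(xs, mu0=0.0):
    """t statistic and degrees of freedom for H0: population mean equals mu0."""
    n = len(xs)
    return (mean(xs) - mu0) / (stdev(xs) / math.sqrt(n)), n - 1

# Hypothetical signed deviations (mm) of one landmark from the facial midline
# for a handful of participants (illustrative numbers only).
deviations = [0.8, 1.2, 0.5, 0.9, 1.1, 0.7, 1.0, 0.6]
t, df = one_sample_t(deviations)
print(round(t, 2), df)  # t ≈ 9.81 with df = 7
```

    A large |t| relative to the t distribution with n − 1 degrees of freedom corresponds to the small P values (P < 0.001) reported for each landmark.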

  14. Personality Trait and Facial Expression Filter-Based Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Seongah Chin

    2013-02-01

    In this paper, we present technical approaches that bridge the gap in research on the use of brain-computer interfaces for entertainment and facial expressions. Facial expressions that reflect an individual's personal traits can be used to better realize artificial facial expressions in a gaming environment based on a brain-computer interface. First, an emotion extraction filter is introduced in order to classify emotions on the basis of the users' brain signals in real time. Next, a personality trait filter is defined to classify extrovert and introvert types, which manifest as five levels: very extrovert, extrovert, medium, introvert and very introvert. In addition, facial expressions derived from expression rates are obtained by an extrovert-introvert fuzzy model through its defuzzification process. Finally, we validate this approach via an analysis of variance of the personality trait filter, a k-fold cross-validation of the emotion extraction filter, an accuracy analysis, a user study of facial synthesis and a test-case game.
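
    k-fold cross-validation, as used to validate the emotion extraction filter, partitions the data into k disjoint folds, trains on k − 1 of them, scores on the held-out fold, and averages accuracy over the folds. A minimal sketch of the harness; the majority-class "classifier" is a toy stand-in, not the paper's filter:

```python
import random

def k_fold_indices(n, k, seed=0):
    """Split range(n) into k disjoint folds for cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(samples, labels, train_fn, predict_fn, k=5):
    """Average held-out accuracy over k folds."""
    folds = k_fold_indices(len(samples), k)
    accs = []
    for held_out in folds:
        held = set(held_out)
        train = [i for i in range(len(samples)) if i not in held]
        model = train_fn([samples[i] for i in train], [labels[i] for i in train])
        correct = sum(predict_fn(model, samples[i]) == labels[i] for i in held_out)
        accs.append(correct / len(held_out))
    return sum(accs) / k

# Trivial majority-class "classifier" just to exercise the harness.
train_fn = lambda xs, ys: max(set(ys), key=ys.count)
predict_fn = lambda model, x: model
acc = cross_validate(list(range(20)), [0] * 15 + [1] * 5, train_fn, predict_fn)
print(acc)  # 0.75, the base rate of the majority class
```

    Because every sample is held out exactly once, the averaged accuracy is an estimate of out-of-sample performance rather than training fit.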

  15. Gene expression profile data for mouse facial development

    Directory of Open Access Journals (Sweden)

    Sonia M. Leach

    2017-08-01

    This article contains data related to the research articles "Spatial and Temporal Analysis of Gene Expression during Growth and Fusion of the Mouse Facial Prominences" (Feng et al., 2009) [1] and "Systems Biology of facial development: contributions of ectoderm and mesenchyme" (Hooper et al., 2017, in press) [2]. Embryonic mammalian craniofacial development is a complex process involving the growth, morphogenesis, and fusion of distinct facial prominences into a functional whole. Aberrant gene regulation during this process can lead to severe craniofacial birth defects, including orofacial clefting. As a means to understand the genes involved in facial development, we had previously dissected the embryonic mouse face into distinct prominences: the mandibular, maxillary, or nasal prominences between E10.5 and E12.5. The prominences were then processed intact, or separated into ectoderm and mesenchyme layers, prior to analysis of RNA expression using microarrays (Feng et al., 2009; Hooper et al., 2017, in press) [1,2]. Here, individual gene expression profiles have been built from these datasets that illustrate the timing of gene expression in whole prominences or in the separated tissue layers. The data profiles are presented as an indexed and clickable list of genes, each linked to a graphical image of that gene's expression profile in the ectoderm, mesenchyme, or intact prominence. These data files will enable investigators to obtain a rapid assessment of the relative expression level of any gene on the array with respect to time, tissue, prominence, and expression trajectory.

  16. The Change in Facial Emotion Recognition Ability in Inpatients with Treatment Resistant Schizophrenia After Electroconvulsive Therapy.

    Science.gov (United States)

    Dalkıran, Mihriban; Tasdemir, Akif; Salihoglu, Tamer; Emul, Murat; Duran, Alaattin; Ugur, Mufit; Yavuz, Ruhi

    2017-09-01

    People with schizophrenia have impairments in emotion recognition along with other social cognitive deficits. In the current study, we aimed to investigate the immediate benefits of ECT on facial emotion recognition ability. Thirty-two treatment-resistant patients with schizophrenia for whom ECT was indicated enrolled in the study. Facial emotion stimuli were a set of 56 photographs depicting seven basic emotions: sadness, anger, happiness, disgust, surprise, fear, and neutral faces. The average age of the participants was 33.4 ± 10.5 years. The rate of recognizing the disgusted facial expression increased significantly after ECT (p facial expressions (p > 0.05). After ECT, the response times for the fear and happy facial expressions were significantly shorter (p Facial emotion recognition ability is an important social cognitive skill for social harmony, proper relationships and independent living. At a minimum, ECT sessions do not appear to affect facial emotion recognition ability negatively, and seem to improve identification of the disgusted facial emotion, which is related to dopamine-rich regions of the brain.

  17. Facial Expression Enhances Emotion Perception Compared to Vocal Prosody: Behavioral and fMRI Studies.

    Science.gov (United States)

    Zhang, Heming; Chen, Xuhai; Chen, Shengdong; Li, Yansong; Chen, Changming; Long, Quanshan; Yuan, Jiajin

    2018-05-09

    Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, currently little is known about how emotion perception and its neural substrates differ across facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expression (angry, happy or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primers and bimodal materials as target) and participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for a happy expression. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region of interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on perceived emotion for bimodal targets, irrespective of the target valence. These findings suggest that facial expression is associated with enhanced emotion perception compared to equivalent vocal prosodies.

  18. The Influence of Executive Functioning on Facial and Subjective Pain Responses in Older Adults

    Science.gov (United States)

    2016-01-01

    Cognitive decline is known to reduce the reliability of subjective pain reports. Although facial expressions of pain are generally considered to be less affected by this decline, empirical support for this assumption is sparse. The present study therefore examined how cognitive functioning relates to facial expressions of pain and whether cognition acts as a moderator between nociceptive intensity and facial reactivity. Facial and subjective responses of 51 elderly participants to mechanical stimulation at three intensity levels (50 kPa, 200 kPa, and 400 kPa) were assessed. Moreover, participants completed a neuropsychological examination of executive functioning (planning, cognitive inhibition, and working memory), episodic memory, and psychomotor speed. The results showed that executive functioning has a unique relationship with facial reactivity at low pain intensity levels (200 kPa). Moreover, cognitive inhibition (but not other executive functions) moderated the effect of pressure intensity on facial pain expressions, suggesting that the relationship between pressure intensity and facial reactivity was less pronounced in participants with high levels of cognitive inhibition. A similar interaction effect was found for cognitive inhibition and subjective pain report. Consequently, caution is needed when interpreting facial (as well as subjective) pain responses in individuals with a high level of cognitive inhibition. PMID:27274618
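
    Moderation of the kind described, cognitive inhibition weakening the effect of pressure intensity on facial reactivity, is conventionally tested as an interaction term in a regression model: reactivity ~ intensity + inhibition + intensity × inhibition, with a negative interaction coefficient indicating a weaker intensity effect at high inhibition. A sketch on simulated data with the same qualitative pattern; every coefficient and variable here is an illustrative assumption, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Simulated data: the effect of stimulus intensity on facial reactivity
# shrinks as (standardized) cognitive inhibition rises.
intensity = rng.choice([50.0, 200.0, 400.0], size=n)   # kPa levels, as in the study
inhibition = rng.normal(size=n)                        # hypothetical standardized score
reactivity = (0.01 * intensity
              - 0.006 * intensity * inhibition
              + rng.normal(scale=0.5, size=n))

# Moderation model: reactivity ~ 1 + intensity + inhibition + intensity:inhibition.
X = np.column_stack([np.ones(n), intensity, inhibition, intensity * inhibition])
beta, *_ = np.linalg.lstsq(X, reactivity, rcond=None)
print(beta.round(3))  # beta[3] < 0: the intensity effect weakens with inhibition
```

    The sign and size of the fitted interaction coefficient (beta[3]) carry the moderation claim; in the actual study this would be accompanied by a significance test and centered predictors.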

  19. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    Science.gov (United States)

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  20. A Real-Time Interactive System for Facial Makeup of Peking Opera

    Science.gov (United States)

    Cai, Feilong; Yu, Jinhui

    In this paper we present a real-time interactive system for creating the facial makeup of Peking Opera. First, we analyze the process of drawing facial makeup and the characteristics of the patterns used in it, and then construct an SVG pattern bank based on local features such as the eyes, nose and mouth. Next, we pick some SVG patterns from the pattern bank and compose them into a new facial makeup. We offer a vector-based free-form deformation (FFD) tool to edit patterns and, based on this editing, our system automatically creates texture maps for a template head model. Finally, the facial makeup is rendered on the 3D head model in real time. Our system offers flexibility in designing and synthesizing various 3D facial makeups. Potential applications of the system include decoration design, digital museum exhibitions and education about Peking Opera.
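
    A free-form deformation in the spirit of the FFD editing tool embeds each point of a pattern in a parametric lattice and moves it with the control points via Bernstein-polynomial weights. A minimal 2D sketch; the lattice size and the nudged control point are illustrative choices, not the system's actual implementation:

```python
from math import comb

def bernstein(i, n, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t) ** (n - i)

def ffd_2d(point, grid):
    """Map a point in [0,1]^2 through an (m+1) x (n+1) lattice of 2D
    control points. With the identity lattice grid[i][j] = (i/m, j/n)
    the deformation leaves every point where it is."""
    m, n = len(grid) - 1, len(grid[0]) - 1
    s, t = point
    x = y = 0.0
    for i in range(m + 1):
        for j in range(n + 1):
            w = bernstein(i, m, s) * bernstein(j, n, t)
            x += w * grid[i][j][0]
            y += w * grid[i][j][1]
    return x, y

# Start from a 3 x 3 identity lattice, then nudge the centre control point
# to warp the patch (and any pattern drawn on it) toward that corner.
grid = [[(i / 2, j / 2) for j in range(3)] for i in range(3)]
grid[1][1] = (0.6, 0.6)
print(ffd_2d((0.5, 0.5), grid))  # the patch centre is pulled toward (0.6, 0.6)
```

    Because the Bernstein weights are smooth in s and t, moving one control point deforms the pattern smoothly, which is what makes FFD convenient for interactive editing of vector makeup patterns.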

  1. Survey of methods of facial palsy documentation in use by members of the Sir Charles Bell Society

    NARCIS (Netherlands)

    Fattah, A.Y.; Gavilan, J.; Hadlock, T.A.; Marcus, J.R.; Marres, H.A.; Nduka, C.; Slattery, W.H.; Snyder-Warwick, A.K.

    2014-01-01

    OBJECTIVES/HYPOTHESIS: Facial palsy manifests a broad array of deficits affecting function, form, and psychological well-being. Assessment scales were introduced to standardize and document the features of facial palsy and to facilitate the exchange of information and comparison of outcomes. The aim

  2. Impaired Overt Facial Mimicry in Response to Dynamic Facial Expressions in High-Functioning Autism Spectrum Disorders

    Science.gov (United States)

    Yoshimura, Sayaka; Sato, Wataru; Uono, Shota; Toichi, Motomi

    2015-01-01

    Previous electromyographic studies have reported that individuals with autism spectrum disorders (ASD) exhibited atypical patterns of facial muscle activity in response to facial expression stimuli. However, whether such activity is expressed in visible facial mimicry remains unknown. To investigate this issue, we videotaped facial responses in…

  4. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Science.gov (United States)

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  5. A dietary supplement improves facial photoaging and skin sebum, hydration and tonicity modulating serum fibronectin, neutrophil elastase 2, hyaluronic acid and carbonylated proteins.

    Science.gov (United States)

    Di Cerbo, Alessandro; Laurino, Carmen; Palmieri, Beniamino; Iannitti, Tommaso

    2015-03-01

    Excessive exposure to the sun can cause severe photoaging as early as the second decade of life resulting in a loss of physiological elastic fiber functions. We designed a first study to assess differences in facial skin pH, sebum, elasticity, hydration and tonicity and serum levels of fibronectin, elastin, neutrophil elastase 2, hyaluronic acid and carbonylated proteins between patients affected by facial photoaging and healthy controls. In a second study we tested the hypothesis that a dietary supplement would improve facial photoaging, also promoting changes in the above mentioned skin and serum parameters. In the first study we enrolled 30 women [age: 47.5 ± 1.6 years (mean ± standard error of the mean)] affected by moderate facial photoaging (4 cm ≤ Visual Analogue Scale (VAS)Skin Tester was used to analyze differences in facial skin parameters between patients affected by facial photoaging and healthy controls. Skin Tester was also used to assess the effect of VISCODERM Pearls on facial skin parameters and compared with placebo 2 weeks after the end of treatment. Serum levels of fibronectin, elastin, neutrophil elastase 2, hyaluronic acid and carbonylated proteins were measured by enzyme-linked immunosorbent assay in the first cohort of patients affected by facial photoaging and healthy controls and, at baseline and 2 weeks after the end of treatment, in the second cohort of patients who underwent treatment with VISCODERM Pearls and placebo. VAS photoaging score was higher in patients affected by photoaging, if compared with healthy controls (p hydration and tonicity were decreased in patients affected by photoaging, if compared with healthy controls (all p hydration and tonicity were increased in the active treatment group vs. placebo (p skin hydration, tonicity and elasticity and increased skin pH and sebum. Treatment with the dietary supplement VISCODERM Pearls significantly improved VAS photoaging score and skin hydration, sebum and tonicity 2 weeks

  6. Facial Emotion Recognition Impairment in Patients with Parkinson's Disease and Isolated Apathy

    Directory of Open Access Journals (Sweden)

    Mercè Martínez-Corral

    2010-01-01

Full Text Available Apathy is a frequent feature of Parkinson's disease (PD), usually related to executive dysfunction. However, in a subgroup of PD patients apathy may represent the only or predominant neuropsychiatric feature. To understand the mechanisms underlying apathy in PD, we investigated emotional processing in PD patients with and without apathy and in healthy controls (HC), assessed by a facial emotion recognition task (FERT). We excluded PD patients with cognitive impairment, depression, other affective disturbances and previous surgery for PD. PD patients with apathy scored significantly worse on the FERT, performing worse in fear, anger, and sadness recognition. No differences, however, were found between nonapathetic PD patients and HC. These findings suggest a disruption of emotional-affective processing in cognitively preserved PD patients with apathy. The identification of specific dysfunction of limbic structures in PD patients with isolated apathy may have therapeutic and prognostic implications.

  7. Bilateral Facial Diplegia: A Rare Presenting Symptom of Lyme

    Directory of Open Access Journals (Sweden)

    John Ashurst

    2017-01-01

Full Text Available Lyme disease is a common disease that is faced by the physician but also acts as a mimicker of many other disease processes. Facial palsies, especially bilateral, are a relatively rare presenting symptom of Lyme disease and may warrant further investigation. A thorough history and physical examination coupled with precision testing may aid the physician when faced with a patient with the diagnostic dilemma of facial diplegia.

  8. Facial EMG responses to dynamic emotional facial expressions in boys with disruptive behavior disorders

    NARCIS (Netherlands)

    Wied, de M.; Boxtel, van Anton; Zaalberg, R.; Goudena, P.P.; Matthys, W.

    2006-01-01

    Based on the assumption that facial mimicry is a key factor in emotional empathy, and clinical observations that children with disruptive behavior disorders (DBD) are weak empathizers, the present study explored whether DBD boys are less facially responsive to facial expressions of emotions than

  9. Case Report: A true median facial cleft (cranio-facial dysraphia ...

    African Journals Online (AJOL)

Case Report: A true median facial cleft (cranio-facial dysraphia, a Tessier type 0) in Bingham University Teaching Hospital, Jos. ... Patient had a multidisciplinary care by the obstetrician, neonatologist, anesthesiologist and the plastic surgery team, who scheduled a soft tissue repair of the upper lip defect, columella and ...

  10. Outcome of different facial nerve reconstruction techniques

    OpenAIRE

    Mohamed, Aboshanif; Omi, Eigo; Honda, Kohei; Suzuki, Shinsuke; Ishikawa, Kazuo

    2016-01-01

    Abstract Introduction: There is no technique of facial nerve reconstruction that guarantees facial function recovery up to grade III. Objective: To evaluate the efficacy and safety of different facial nerve reconstruction techniques. Methods: Facial nerve reconstruction was performed in 22 patients (facial nerve interpositional graft in 11 patients and hypoglossal-facial nerve transfer in another 11 patients). All patients had facial function House-Brackmann (HB) grade VI, either caused by...

  11. Microbial biofilms on silicone facial prostheses

    NARCIS (Netherlands)

    Ariani, Nina

    2015-01-01

    Facial disfigurements can result from oncologic surgery, trauma and congenital deformities. These disfigurements can be rehabilitated with facial prostheses. Facial prostheses are usually made of silicones. A problem of facial prostheses is that microorganisms can colonize their surface. It is hard

  12. Continuous emotion detection using EEG signals and facial expressions

    NARCIS (Netherlands)

    Soleymani, Mohammad; Asghari-Esfeden, Sadjad; Pantic, Maja; Fu, Yun

    Emotions play an important role in how we select and consume multimedia. Recent advances on affect detection are focused on detecting emotions continuously. In this paper, for the first time, we continuously detect valence from electroencephalogram (EEG) signals and facial expressions in response to

  13. Influence of Suboptimally and Optimally Presented Affective Pictures and Words on Consumption-Related Behavior

    Science.gov (United States)

    Winkielman, Piotr; Gogolushko, Yekaterina

    2018-01-01

    Affective stimuli can influence immediate reactions as well as spontaneous behaviors. Much evidence for such influence comes from studies of facial expressions. However, it is unclear whether these effects hold for other affective stimuli, and how the amount of stimulus processing changes the nature of the influence. This paper addresses these issues by comparing the influence on consumption behaviors of emotional pictures and valence-matched words presented at suboptimal and supraliminal durations. In Experiment 1, both suboptimal and supraliminal emotional facial expressions influenced consumption in an affect-congruent, assimilative way. In Experiment 2, pictures of both high- and low-frequency emotional objects congruently influenced consumption. In comparison, words tended to produce incongruent effects. We discuss these findings in light of privileged access theories, which hold that pictures better convey affective meaning than words, and embodiment theories, which hold that pictures better elicit somatosensory and motor responses. PMID:29434556

  14. Multivariate Pattern Classification of Facial Expressions Based on Large-Scale Functional Connectivity.

    Science.gov (United States)

    Liang, Yin; Liu, Baolin; Li, Xianglin; Wang, Peiyuan

    2018-01-01

How human beings achieve efficient recognition of others' facial expressions is an important question in cognitive neuroscience, and previous studies have identified specific cortical regions that show preferential activation to facial expressions. However, the potential contributions of the connectivity patterns in the processing of facial expressions remained unclear. The present functional magnetic resonance imaging (fMRI) study explored whether facial expressions could be decoded from the functional connectivity (FC) patterns using multivariate pattern analysis combined with machine learning algorithms (fcMVPA). We employed a block design experiment and collected neural activities while participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise). Both static and dynamic expression stimuli were included in our study. A behavioral experiment after scanning confirmed the validity of the facial stimuli presented during the fMRI experiment with classification accuracies and emotional intensities. We obtained whole-brain FC patterns for each facial expression and found that both static and dynamic facial expressions could be successfully decoded from the FC patterns. Moreover, we identified the expression-discriminative networks for the static and dynamic facial expressions, which span beyond the conventional face-selective areas. Overall, these results reveal that large-scale FC patterns may also contain rich expression information to accurately decode facial expressions, suggesting a novel mechanism, involving general interactions between distributed brain regions, that contributes to human facial expression recognition.
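The fcMVPA approach described in this record can be illustrated in outline: compute a functional-connectivity pattern per trial, vectorize it, and cross-validate a classifier. The sketch below uses simulated ROI time series rather than fMRI data; the ROI count, the injected coupling, and the choice of a linear SVM are illustrative assumptions, not the authors' pipeline.

```python
# Sketch of fcMVPA: decode condition labels from functional-connectivity
# (FC) patterns. Data are simulated; ROI count, classifier (linear SVM)
# and cross-validation scheme are illustrative assumptions.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_rois, n_timepoints = 60, 10, 100
labels = np.repeat([0, 1], n_trials // 2)  # two facial-expression conditions

def fc_pattern(ts):
    """Vectorize the upper triangle of the ROI-by-ROI correlation matrix."""
    corr = np.corrcoef(ts)
    iu = np.triu_indices_from(corr, k=1)
    return corr[iu]

# Simulate ROI time series; give condition 1 a stronger coupling between
# the first two ROIs so the two classes are separable in FC space.
X = []
for label in labels:
    ts = rng.standard_normal((n_rois, n_timepoints))
    if label == 1:
        ts[1] = 0.6 * ts[0] + 0.8 * ts[1]
    X.append(fc_pattern(ts))
X = np.array(X)  # shape: (n_trials, n_rois * (n_rois - 1) // 2)

scores = cross_val_score(LinearSVC(dual=False), X, labels, cv=5)
print("mean decoding accuracy:", round(scores.mean(), 2))  # well above chance (0.5)
```

The same skeleton applies to real data once `X` holds one FC vector per stimulus block; the reported whole-brain analysis differs mainly in scale and in how the discriminative networks are then identified.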

  15. Automatic change detection to facial expressions in adolescents

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Jiannong, Shi

    2016-01-01

Adolescence is a critical period for the neurodevelopment of social-emotional processing, wherein the automatic detection of changes in facial expressions is crucial for the development of interpersonal communication. Two groups of participants (an adolescent group and an adult group) were...... in facial expressions between the two age groups. The current findings demonstrated that the adolescent group featured more negative vMMN amplitudes than the adult group in the fronto-central region during the 120–200 ms interval. During the time window of 370–450 ms, only the adult group showed better...... automatic processing on fearful faces than happy faces. The present study indicated that adolescents possess stronger automatic detection of changes in emotional expression relative to adults, and sheds light on the neurodevelopment of automatic processes concerning social-emotional information....

  16. [Facial tics and spasms].

    Science.gov (United States)

    Potgieser, Adriaan R E; van Dijk, J Marc C; Elting, Jan Willem J; de Koning-Tijssen, Marina A J

    2014-01-01

Facial tics and spasms are socially incapacitating, but effective treatment is often available. The clinical picture is sufficient for distinguishing between the different diseases that cause this affliction. We describe three cases of patients with facial tics or spasms: one case of tics, which are familiar to many physicians; one case of blepharospasm; and one case of hemifacial spasm. We discuss the differential diagnosis and the treatment possibilities for facial tics and spasms. Early diagnosis and treatment are important because of the associated social incapacitation. Botulinum toxin should be considered as a treatment option for facial tics, and a curative neurosurgical intervention should be considered for hemifacial spasm.

  17. Perceived differences between chimpanzee (Pan troglodytes) and human (Homo sapiens) facial expressions are related to emotional interpretation.

    Science.gov (United States)

    Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C

    2007-11-01

    Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.

  18. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease.

    Science.gov (United States)

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

Altered emotional processing, including reduced emotion facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques. It is not known whether altered facial expression and recognition in PD are related. To investigate possible deficits in facial emotion expression and emotion recognition and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the facial action coding system. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationship between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using the Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients in comparison to healthy controls (all Ps facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05). Finally, no relationship emerged between kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). The results in this study provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.

  19. Síndrome de dolor facial [Facial pain syndrome]

    Directory of Open Access Journals (Sweden)

    DR. F. Eugenio Tenhamm

    2014-07-01

Full Text Available Facial pain constitutes a painful syndrome of the craniofacial structures under which a large number of diseases are grouped. The best way to approach the differential diagnosis of the entities that cause facial pain is to use an algorithm that identifies four main pain syndromes: facial neuralgias, facial pain with neurological symptoms and signs, trigeminal autonomic cephalalgias, and facial pain without neurological symptoms or signs. A detailed clinical evaluation of patients allows an etiological approximation, which guides the diagnostic workup and makes it possible to offer specific therapy in most cases.

  20. Outcome of different facial nerve reconstruction techniques.

    Science.gov (United States)

    Mohamed, Aboshanif; Omi, Eigo; Honda, Kohei; Suzuki, Shinsuke; Ishikawa, Kazuo

    There is no technique of facial nerve reconstruction that guarantees facial function recovery up to grade III. To evaluate the efficacy and safety of different facial nerve reconstruction techniques. Facial nerve reconstruction was performed in 22 patients (facial nerve interpositional graft in 11 patients and hypoglossal-facial nerve transfer in another 11 patients). All patients had facial function House-Brackmann (HB) grade VI, either caused by trauma or after resection of a tumor. All patients were submitted to a primary nerve reconstruction except 7 patients, where late reconstruction was performed two weeks to four months after the initial surgery. The follow-up period was at least two years. For facial nerve interpositional graft technique, we achieved facial function HB grade III in eight patients and grade IV in three patients. Synkinesis was found in eight patients, and facial contracture with synkinesis was found in two patients. In regards to hypoglossal-facial nerve transfer using different modifications, we achieved facial function HB grade III in nine patients and grade IV in two patients. Facial contracture, synkinesis and tongue atrophy were found in three patients, and synkinesis was found in five patients. However, those who had primary direct facial-hypoglossal end-to-side anastomosis showed the best result without any neurological deficit. Among various reanimation techniques, when indicated, direct end-to-side facial-hypoglossal anastomosis through epineural suturing is the most effective technique with excellent outcomes for facial reanimation and preservation of tongue movement, particularly when performed as a primary technique. Copyright © 2016 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  1. Facial gender interferes with decisions about facial expressions of anger and happiness.

    Science.gov (United States)

    Becker, D Vaughn

    2017-04-01

    The confounded signal hypothesis maintains that facial expressions of anger and happiness, in order to more efficiently communicate threat or nurturance, evolved forms that take advantage of older gender recognition systems, which were already attuned to similar affordances. Two unexplored consequences of this hypothesis are (1) facial gender should automatically interfere with discriminations of anger and happiness, and (2) controlled attentional processes (like working memory) may be able to override the interference of these particular expressions on gender discrimination. These issues were explored by administering a Garner interference task along with a working memory task as an index of controlled attention. Results show that those with good attentional control were able to eliminate interference of expression on gender decisions but not the interference of gender on expression decisions. Trials in which the stimulus attributes were systematically correlated also revealed strategic facilitation for participants high in attentional control. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Facial Transplantation Surgery Introduction

    OpenAIRE

    Eun, Seok-Chan

    2015-01-01

    Severely disfiguring facial injuries can have a devastating impact on the patient's quality of life. During the past decade, vascularized facial allotransplantation has progressed from an experimental possibility to a clinical reality in the fields of disease, trauma, and congenital malformations. This technique may now be considered a viable option for repairing complex craniofacial defects for which the results of autologous reconstruction remain suboptimal. Vascularized facial allotranspla...

  3. The Development of Dynamic Facial Expression Recognition at Different Intensities in 4- to 18-Year-Olds

    Science.gov (United States)

    Montirosso, Rosario; Peverelli, Milena; Frigerio, Elisa; Crespi, Monica; Borgatti, Renato

    2010-01-01

    The primary purpose of this study was to examine the effect of the intensity of emotion expression on children's developing ability to label emotion during a dynamic presentation of five facial expressions (anger, disgust, fear, happiness, and sadness). A computerized task (AFFECT--animated full facial expression comprehension test) was used to…

  4. Outcome of different facial nerve reconstruction techniques

    Directory of Open Access Journals (Sweden)

    Aboshanif Mohamed

Full Text Available Abstract Introduction: There is no technique of facial nerve reconstruction that guarantees facial function recovery up to grade III. Objective: To evaluate the efficacy and safety of different facial nerve reconstruction techniques. Methods: Facial nerve reconstruction was performed in 22 patients (facial nerve interpositional graft in 11 patients and hypoglossal-facial nerve transfer in another 11 patients). All patients had facial function House-Brackmann (HB) grade VI, either caused by trauma or after resection of a tumor. All patients were submitted to a primary nerve reconstruction except 7 patients, where late reconstruction was performed two weeks to four months after the initial surgery. The follow-up period was at least two years. Results: For facial nerve interpositional graft technique, we achieved facial function HB grade III in eight patients and grade IV in three patients. Synkinesis was found in eight patients, and facial contracture with synkinesis was found in two patients. In regards to hypoglossal-facial nerve transfer using different modifications, we achieved facial function HB grade III in nine patients and grade IV in two patients. Facial contracture, synkinesis and tongue atrophy were found in three patients, and synkinesis was found in five patients. However, those who had primary direct facial-hypoglossal end-to-side anastomosis showed the best result without any neurological deficit. Conclusion: Among various reanimation techniques, when indicated, direct end-to-side facial-hypoglossal anastomosis through epineural suturing is the most effective technique with excellent outcomes for facial reanimation and preservation of tongue movement, particularly when performed as a primary technique.

  5. Recognizing Facial Expressions Automatically from Video

    Science.gov (United States)

    Shan, Caifeng; Braspenning, Ralph

Facial expressions, resulting from movements of the facial muscles, are the face changes in response to a person's internal emotional states, intentions, or social communications. There is a considerable history associated with the study of facial expressions. Darwin [22] was the first to describe in detail the specific facial expressions associated with emotions in animals and humans, arguing that all mammals show emotions reliably in their faces. Since then, facial expression analysis has been an area of great research interest for behavioral scientists [27]. Psychological studies [48, 3] suggest that facial expressions, as the main mode of nonverbal communication, play a vital role in human face-to-face communication. For illustration, we show some examples of facial expressions in Fig. 1.

  6. Rejuvenecimiento facial en "doble sigma" "Double ogee" facial rejuvenation

    Directory of Open Access Journals (Sweden)

    O. M. Ramírez

    2007-03-01

Full Text Available The subperiosteal techniques described by Tessier revolutionized the treatment of facial aging; he recommended this approach for treating the early signs of aging in young and middle-aged patients. Psillakis refined the technique, and Ramírez described a safer and more effective method of subperiosteal lifting, demonstrating that the subperiosteal facial rejuvenation technique can be applied across the whole spectrum of facial aging. The introduction of the endoscope into the treatment of facial aging has opened a new era in aesthetic surgery. Today, endoscopically assisted subperiosteal dissection of the upper, middle and lower thirds of the face provides an effective means of repositioning the soft tissues, with the possibility of augmenting the craniofacial skeleton, less postoperative facial edema, minimal injury to the branches of the facial nerve, and better treatment of the cheeks. This approach, developed and refined over the last decade, is known as the "double ogee rhytidectomy". The double-ogee Venetian arch, well known in architecture since antiquity, is characterized by a harmonious line of a convex curve followed by a concave curve. When a young face is observed from an oblique angle, it presents a characteristic distribution of tissues, previously described for the midface as an architectural ogee arch or an "S"-shaped curve. However, on closer examination of the young face in the three-quarter view, the full profile reveals a "double ogee arch" or a double "S". To see this reciprocal, multicurvilinear line of beauty, we must view the face obliquely, so that both medial canthi are visible. In this position, the young face presents a characteristic convexity of the tail of the eyebrow that flows into the concavity of the lateral orbital wall, thus forming the first arch (superior

  7. Facial transplantation for massive traumatic injuries.

    Science.gov (United States)

    Alam, Daniel S; Chi, John J

    2013-10-01

    This article describes the challenges of facial reconstruction and the role of facial transplantation in certain facial defects and injuries. This information is of value to surgeons assessing facial injuries with massive soft tissue loss or injury. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Restoration of orbicularis oculi muscle function in rabbits with peripheral facial paralysis via an implantable artificial facial nerve system.

    Science.gov (United States)

    Sun, Yajing; Jin, Cheng; Li, Keyong; Zhang, Qunfeng; Geng, Liang; Liu, Xundao; Zhang, Yi

    2017-12-01

The purpose of the present study was to restore orbicularis oculi muscle function using the implantable artificial facial nerve system (IAFNS). The in vivo part of the IAFNS was implanted into 12 rabbits that were facially paralyzed on the right side of the face to restore the function of the orbicularis oculi muscle, restoration being indicated by closure of the paralyzed eye when the contralateral eye was closed. Wireless communication links were established between the in vivo part (the processing chip and microelectrode) and the external part (System Controller program) of the system, which were used to set the working parameters and indicate the working state of the processing chip and microelectrode implanted in the body. A disturbance field strength test of the IAFNS processing chip was performed in a magnetic field dark room to test its electromagnetic radiation safety. Test distances investigated were 0, 1, 3 and 10 m, and levels of radiation intensity were evaluated in the horizontal and vertical planes. Anti-interference experiments were performed to test the stability of the processing chip under the interference of electromagnetic radiation. The fully implanted IAFNS was run for 5 h per day for 30 consecutive days to evaluate the accuracy and precision as well as the long-term stability and effectiveness of wireless communication. Every 3 days, the stimulus intensity (range, 0-8 mA) was adjusted to determine the minimum stimulation intensity that could elicit movement on the paralyzed side. The effective stimulation rate was also tested by comparing the number of eye-close movements on both sides. The results of the present study indicated that the IAFNS could rebuild the reflex arc, inducing the experimental rabbits to close the eye on the paralyzed side. The System Controller program was able to reflect the in vivo part of the artificial facial nerve system in real-time and adjust the working pattern, stimulation intensity and frequency, range of wave

  9. Clinical outcomes of facial transplantation: a review.

    Science.gov (United States)

    Shanmugarajah, Kumaran; Hettiaratchy, Shehan; Clarke, Alex; Butler, Peter E M

    2011-01-01

    A total of 18 composite tissue allotransplants of the face have currently been reported. Prior to the start of the face transplant programme, there had been intense debate over the risks and benefits of performing this experimental surgery. This review examines the surgical, functional and aesthetic, immunological and psychological outcomes of facial transplantation thus far, based on the predicted risks outlined in early publications from teams around the world. The initial experience has demonstrated that facial transplantation is surgically feasible. Functional and aesthetic outcomes have been very encouraging with good motor and sensory recovery and improvements to important facial functions observed. Episodes of acute rejection have been common, as predicted, but easily controlled with increases in systemic immunosuppression. Psychological improvements have been remarkable and have resulted in the reintegration of patients into the outside world, social networks and even the workplace. Complications of immunosuppression and patient mortality have been observed in the initial series. These have highlighted rigorous patient selection as the key predictor of success. The overall early outcomes of the face transplant programme have been generally more positive than many predicted. This initial success is testament to the robust approach of teams. Dissemination of outcomes and ongoing refinement of the process may allow facial transplantation to eventually become a first-line reconstructive option for those with extensive facial disfigurements. Copyright © 2011 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  10. Oral contraceptive use in women changes preferences for male facial masculinity and is associated with partner facial masculinity.

    Science.gov (United States)

    Little, Anthony C; Burriss, Robert P; Petrie, Marion; Jones, Benedict C; Roberts, S Craig

    2013-09-01

    Millions of women use hormonal contraception and it has been suggested that such use may alter mate preferences. To examine the impact of oral contraceptive (pill) use on preferences, we tested for within-subject changes in preferences for masculine faces in women initiating pill use. Between two sessions, initiation of pill use significantly decreased women's preferences for male facial masculinity but did not influence preferences for same-sex faces. To test whether altered preference during pill use influences actual partner choice, we examined facial characteristics in 170 age-matched male partners of women who reported having either been using or not using the pill when the partnership was formed. Both facial measurements and perceptual judgements demonstrated that partners of women who used the pill during mate choice have less masculine faces than partners of women who did not use hormonal contraception at this time. Our data (A) provide the first experimental evidence that initiation of pill use in women causes changes in facial preferences and (B) documents downstream effects of these changes on real-life partner selection. Given that hormonal contraceptive use is widespread, effects of pill use on the processes of partner formation have important implications for relationship stability and may have other biologically relevant consequences. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Quality of life assessment in facial palsy: validation of the Dutch Facial Clinimetric Evaluation Scale.

    Science.gov (United States)

    Kleiss, Ingrid J; Beurskens, Carien H G; Stalmeier, Peep F M; Ingels, Koen J A O; Marres, Henri A M

    2015-08-01

    This study aimed to validate an existing health-related quality of life questionnaire for patients with facial palsy for implementation in the Dutch language and culture. The Facial Clinimetric Evaluation (FaCE) Scale was translated into Dutch using a forward-backward translation method. A pilot test with the translated questionnaire was performed in 10 patients with facial palsy and 10 normal subjects. Finally, cross-cultural adaptation was accomplished at our outpatient clinic for facial palsy. Analyses of internal consistency, test-retest reliability, construct validity and responsiveness were performed. Ninety-three patients completed the Dutch Facial Clinimetric Evaluation Scale, the Dutch Facial Disability Index, and the Dutch Short Form (36) Health Survey. Cronbach's α, representing internal consistency, was 0.800. Test-retest reliability was shown by an intraclass correlation coefficient of 0.737. Correlations with the House-Brackmann score, Sunnybrook score, Facial Disability Index physical function, and social/well-being function were -0.292, 0.570, 0.713, and 0.575, respectively. The SF-36 domains correlated best with the FaCE social function domain, with the strongest correlation between the two social function domains (r = 0.576). The FaCE score increased statistically significantly in 35 patients receiving botulinum toxin type A (P = 0.042, Student t-test). The domains 'facial comfort' and 'social function' improved statistically significantly as well (P = 0.022 and P = 0.046, respectively, Student t-test). The Dutch Facial Clinimetric Evaluation Scale shows good psychometric values and can be implemented in the management of Dutch-speaking patients with facial palsy in the Netherlands. Translation of the instrument into other languages may lead to widespread use, making evaluation and comparison possible among different providers.
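    For context on the internal-consistency figure reported above: Cronbach's α for a set of k items is α = k/(k−1) · (1 − Σσ²_item / σ²_total). A minimal Python sketch, using made-up scores rather than the study's data (the function name is illustrative):

    ```python
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha for a (persons x items) score matrix."""
        k = scores.shape[1]                          # number of items
        item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of each person's total score
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Illustrative data only: 4 respondents x 3 perfectly consistent items.
    scores = np.array([[1, 1, 1],
                       [2, 2, 2],
                       [3, 3, 3],
                       [4, 4, 4]], dtype=float)
    print(round(cronbach_alpha(scores), 3))  # → 1.0 (perfect internal consistency)
    ```

    With real questionnaire data the items are never perfectly consistent, and values around 0.8, as reported for the Dutch FaCE Scale, are conventionally taken as good internal consistency.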

  12. Facial nerve conduction after sclerotherapy in children with facial lymphatic malformations: report of two cases.

    Science.gov (United States)

    Lin, Pei-Jung; Guo, Yuh-Cherng; Lin, Jan-You; Chang, Yu-Tang

    2007-04-01

    Surgical excision is thought to be the standard treatment of choice for lymphatic malformations. However, when the lesions are limited to the face, surgical scarring and facial nerve injury may impair cosmesis and facial expression. Sclerotherapy, an injection of a sclerosing agent directly through the skin into a lesion, is an alternative method. By evaluating facial nerve conduction, we observed the long-term effect of intralesional OK-432 injection on facial lymphatic malformations and correlated the findings with anatomic outcomes. A 12-year-old boy with a lesion over the right preauricular area adjacent to the main trunk of the facial nerve and a 5-year-old boy with a lesion in the left cheek involving the buccinator muscle were enrolled. Follow-up data covering more than one year, including clinical appearance, computed tomography (CT) scan and facial nerve evaluation, were collected. The facial nerve conduction study was normal in both cases. Blink reflex testing in both children revealed normal results as well. Complete resolution was noted on outward appearance and CT scan. The neurophysiologic data were compatible with good anatomic and functional outcomes. Our report suggests that the inflammatory reaction of OK-432 did not interfere with adjacent facial nerve conduction.

  13. Dissociation in Rating Negative Facial Emotions between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    Science.gov (United States)

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    2016-11-01

    Features of behavioral variant frontotemporal dementia (bvFTD) such as executive dysfunction, apathy, and impaired empathic abilities are also observed in major depressive disorder (MDD). This may help explain why early-stage bvFTD is often misdiagnosed as MDD. New assessment tools are thus needed to improve early diagnosis of bvFTD. Although emotion processing is affected in both bvFTD and MDD, growing evidence indicates that the pattern of emotion processing deficits varies between the two disorders. As such, emotion processing paradigms have substantial potential to distinguish bvFTD from MDD. The current study compared 25 patients with bvFTD, 21 patients with MDD, 21 patients with Alzheimer disease (AD) dementia, and 31 healthy participants on a novel facial emotion intensity rating task. Stimuli comprised morphed faces from the Ekman and Friesen stimulus set containing faces of each sex with two different degrees of emotion intensity for each of the six basic emotions. Analyses of covariance uncovered a significant dissociation between bvFTD and MDD patients in rating the intensity of negative emotions overall (i.e., bvFTD patients underrated negative emotions overall, whereas MDD patients overrated negative emotions overall compared with healthy participants). In contrast, AD dementia patients rated negative emotions similarly to healthy participants, suggesting no impact of cognitive deficits on rating facial emotions. By strongly differentiating bvFTD and MDD patients through negative facial emotions, this sensitive and short rating task might help improve the early diagnosis of bvFTD. Copyright © 2016 American Association for Geriatric Psychiatry. All rights reserved.

  14. Facial Nerve Paralysis due to a Pleomorphic Adenoma with the Imaging Characteristics of a Facial Nerve Schwannoma.

    Science.gov (United States)

    Nader, Marc-Elie; Bell, Diana; Sturgis, Erich M; Ginsberg, Lawrence E; Gidley, Paul W

    2014-08-01

    Background: Facial nerve paralysis in a patient with a salivary gland mass usually denotes malignancy. However, facial paralysis can also be caused by benign salivary gland tumors. Methods: We present a case of facial nerve paralysis due to a benign salivary gland tumor that had the imaging characteristics of an intraparotid facial nerve schwannoma. Results: The patient presented to our clinic 4 years after the onset of facial nerve paralysis initially diagnosed as Bell palsy. Computed tomography demonstrated filling and erosion of the stylomastoid foramen with a mass on the facial nerve. Postoperative histopathology showed the presence of a pleomorphic adenoma. Facial paralysis was thought to be caused by extrinsic nerve compression. Conclusions: This case illustrates the difficulty of accurate preoperative diagnosis of a parotid gland mass and reinforces the concept that facial nerve paralysis in the context of salivary gland tumors may not always indicate malignancy.

  15. Advances in facial reanimation.

    Science.gov (United States)

    Tate, James R; Tollefson, Travis T

    2006-08-01

    Facial paralysis often has a significant emotional impact on patients. Along with the myriad of new surgical techniques in managing facial paralysis comes the challenge of selecting the most effective procedure for the patient. This review delineates common surgical techniques and reviews state-of-the-art techniques. The options for dynamic reanimation of the paralyzed face must be examined in the context of several patient factors, including age, overall health, and patient desires. The best functional results are obtained with direct facial nerve anastomosis and interpositional nerve grafts. In long-standing facial paralysis, temporalis muscle transfer gives a dependable and quick result. Microvascular free tissue transfer is a reliable technique with reanimation potential whose results continue to improve as microsurgical expertise increases. Postoperative results can be improved with ancillary soft tissue procedures, as well as botulinum toxin. The paper provides an overview of recent advances in facial reanimation, including preoperative assessment, surgical reconstruction options, and postoperative management.

  16. Relative preservation of the recognition of positive facial expression "happiness" in Alzheimer disease.

    Science.gov (United States)

    Maki, Yohko; Yoshida, Hiroshi; Yamaguchi, Tomoharu; Yamaguchi, Haruyasu

    2013-01-01

    Positivity recognition bias has been reported for facial expression as well as memory and visual stimuli in aged individuals, whereas emotional facial recognition in Alzheimer disease (AD) patients is controversial, with possible involvement of confounding factors such as deficits in spatial processing of non-emotional facial features and in verbal processing to express emotions. Thus, we examined whether recognition of positive facial expressions was preserved in AD patients, adapting a new method that eliminated the influence of these confounding factors. Sensitivity to six basic facial expressions (happiness, sadness, surprise, anger, disgust, and fear) was evaluated in 12 outpatients with mild AD, 17 aged normal controls (ANC), and 25 young normal controls (YNC). To eliminate factors related to non-emotional facial features, averaged faces were prepared as stimuli. To eliminate factors related to verbal processing, participants were required to match the stimulus and answer images, avoiding the use of verbal labels. In recognition of happiness, there was no difference in sensitivity between YNC and ANC, or between ANC and AD patients. AD patients were less sensitive than ANC in recognition of sadness, surprise, and anger. ANC were less sensitive than YNC in recognition of surprise, anger, and disgust. Within the AD patient group, sensitivity to happiness was significantly higher than that to the other five expressions. In AD patients, recognition of happiness was thus relatively preserved: it was the most sensitive of the six expressions and withstood the influences of both age and disease.

  17. Gender identity rather than sexual orientation impacts on facial preferences.

    Science.gov (United States)

    Ciocca, Giacomo; Limoncin, Erika; Cellerino, Alessandro; Fisher, Alessandra D; Gravina, Giovanni Luca; Carosa, Eleonora; Mollaioli, Daniele; Valenzano, Dario R; Mennucci, Andrea; Bandini, Elisa; Di Stasi, Savino M; Maggi, Mario; Lenzi, Andrea; Jannini, Emmanuele A

    2014-10-01

    Differences in facial preferences between heterosexual men and women are well documented. It is still a matter of debate, however, how variations in sexual identity/sexual orientation may modify facial preferences. This study aims to investigate the facial preferences of male-to-female (MtF) individuals with gender dysphoria (GD) and the influence of short-term/long-term relationships on facial preference, in comparison with healthy subjects. Eighteen untreated MtF subjects, 30 heterosexual males, 64 heterosexual females, and 42 homosexual males, recruited from university students/staff, at gay events, and in gender clinics, were shown a composite male or female face. The sexual dimorphism of these pictures was stressed or reduced in a continuous fashion through an open-source morphing program (gtkmorph, based on the X-Morph algorithm), using a sequence of 21 pictures of the same face warped from a feminized to a masculinized shape. MtF GD subjects and heterosexual females showed the same pattern of preferences: a clear preference for less dimorphic (more feminized) faces for both short- and long-term relationships. Conversely, both heterosexual and homosexual men selected significantly more dimorphic faces, showing a preference for hyperfeminized and hypermasculinized faces, respectively. These data show that the facial preferences of MtF GD individuals mirror those of the sex congruent with their gender identity. Conversely, heterosexual males trace the facial preferences of homosexual men, indicating that changes in sexual orientation do not substantially affect preference for the most attractive faces. © 2014 International Society for Sexual Medicine.
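    At the landmark level, the 21-step warp described above amounts to linear interpolation between two shape configurations (full morphing software such as gtkmorph also cross-dissolves pixel values, which is omitted here). A purely illustrative sketch, assuming NumPy and made-up landmark coordinates rather than the study's stimuli:

    ```python
    import numpy as np

    def morph_sequence(shape_a, shape_b, steps=21):
        """Linearly interpolate between two facial landmark arrays.

        Returns `steps` shapes running from shape_a (t = 0) to shape_b (t = 1),
        mirroring a 21-frame feminized-to-masculinized warp.
        """
        a, b = np.asarray(shape_a, float), np.asarray(shape_b, float)
        return [(1 - t) * a + t * b for t in np.linspace(0.0, 1.0, steps)]

    # Illustrative 2-D landmarks (not real face data): a single point per shape.
    feminized = [[0.0, 0.0]]
    masculinized = [[10.0, 10.0]]
    frames = morph_sequence(feminized, masculinized)
    print(len(frames), frames[10])  # middle frame sits halfway between the endpoints
    ```

    Showing participants such a sequence lets them pick the frame they find most attractive, which maps their choice onto a continuous femininity-masculinity axis.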

  18. Bell’s Palsy: Symptoms Preceding and Accompanying the Facial Paresis

    Directory of Open Access Journals (Sweden)

    Daniele De Seta

    2014-01-01

    This individual prospective cohort study aims to report and analyze the symptoms preceding and accompanying the facial paresis in Bell's palsy (BP). Two hundred sixty-nine patients affected by BP, with a maximum delay of 48 hours from onset, were enrolled in the study. The evolution of the facial paresis, expressed as House-Brackmann grade, over the first 10 days and its correlation with symptoms were analyzed. At onset, 136 patients presented postauricular pain, 114 were affected by dry eye, and 94 reported dysgeusia. Dry mouth was present in 54 patients (19.7%); facial pain, hyperlacrimation, aural fullness, and hyperacusis represented smaller percentages of the reported symptoms. After 10 days, 39.9% of the group had a severe paresis while 10.2% reached a complete recovery. Dry mouth at onset was correlated with a severe grade of palsy and was prognostic for poor recovery in the early period. These findings suggest that the nervus intermedius plays an important role in the presentation of BP and might be responsible for much of the accompanying symptomatology of the paresis. They could help direct BP patients early to further examinations and subsequent therapy.

  19. Correlation between facial types and muscle TMD in women: an anthropometric approach

    Directory of Open Access Journals (Sweden)

    Ronaldo Pacheco de ARAUJO

    2015-01-01

    Temporomandibular disorders (TMD) affecting the articular disc and/or the facial muscles are common among the population, with a higher incidence in women aged 20-40 years. The aim of this study was to investigate the correlation between facial types and muscle TMD in women. This study comprised 56 women aged 18 to 49 years seeking treatment for TMD at the School of Medicine, Federal University of São Paulo. All of the study individuals were diagnosed with muscle TMD based on the Research Diagnostic Criteria (RDC). Facial type was determined using the Facial Brugsch Index and classified as euryprosopic (short and/or broad), mesoprosopic (average width), or leptoprosopic (long and/or narrow). The data were submitted to the Chi-square test and ANOVA with Tukey's test for statistical analysis. The faces of 27 individuals were classified as euryprosopic (48%), 18 as mesoprosopic (32%), and 11 as leptoprosopic (20%). A statistically significant difference (Chi-square, p = 0.032) was found among the facial types, with leptoprosopic facial types showing the lowest values for muscle TMD. A greater number (p = 0.0007) of cases of muscle TMD were observed in the 20 to 39 year-old subjects than in subjects of other age segments. In conclusion, women with euryprosopic facial types could be more susceptible to muscle TMD. Further studies are needed to investigate this hypothesis.
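    The Pearson chi-square statistic behind tests like the one above is Σ(O − E)²/E. As an illustration only (assuming a uniform expected distribution over the three facial types, which may differ from the contingency-table analysis the authors actually ran), the reported counts 27/18/11 give:

    ```python
    # Chi-square goodness-of-fit on the reported facial-type counts,
    # against a uniform expected distribution (illustrative assumption only).
    observed = [27, 18, 11]   # euryprosopic, mesoprosopic, leptoprosopic
    expected = [sum(observed) / len(observed)] * len(observed)

    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    print(round(chi2, 2))  # → 6.89; exceeds the 5.99 critical value for df = 2 at p = 0.05
    ```

    Under this toy null the three facial types would not be equally common in the TMD sample, broadly consistent with the significant difference the study reports.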

  20. Examining speed of processing of facial emotion recognition in individuals at ultra-high risk for psychosis

    DEFF Research Database (Denmark)

    Glenthøj, Louise Birkedal; Fagerlund, Birgitte; Bak, Nikolaj

    2018-01-01

    Emotion recognition is an aspect of social cognition that may be a key predictor of functioning and transition to psychosis in individuals at ultra-high risk (UHR) for psychosis (Allott et al., 2014). UHR individuals exhibit deficits in accurately identifying facial emotions (van Donkersgoed et al., 2015), but other potential anomalies in facial emotion recognition are largely unexplored. This study aimed to extend current knowledge on emotion recognition deficits in UHR individuals by examining: 1) whether UHR individuals would display significantly slower facial emotion recognition than healthy controls, 2) whether an association between emotion recognition accuracy and emotion recognition latency is present in UHR, and 3) the relationships between emotion recognition accuracy, neurocognition and psychopathology in UHR.