Digitizing the moving face during dynamic displays of emotion
Introduction
Facial expressions are complex signals produced by rapid changes in facial muscular activity. They are brief, rarely persisting for more than five seconds or for less than 250 ms. Over the years, many studies have shown that the left side of the face is more emotionally expressive than the right [5], [31], [47], [54], [60], [61] (for reviews, see [6], [15], [29]). This asymmetry is more pronounced for right- than for left-handers and appears to occur for both negative and positive expressions (for a review, see [9]). Even nonhuman primates such as the rhesus monkey have been reported to display more intense expressions over the left hemiface [33]. Although the basis for these asymmetries is unclear, one popular neuropsychological interpretation is that they reflect a greater contribution of right hemisphere systems to emotional processing via contralateral hemispheric innervation of the left side of the face. This is known as the right hemisphere emotional priming hypothesis.
Previous research examining expressive asymmetries among normal individuals has largely relied on subjective ratings by human judges who view still photographs or a video frame of normal individuals posing different facial emotions (e.g. fear, anger). In some studies, pictures of right and left hemicomposite faces are created by photographically merging one side of the face with its mirror reversal [7]. Naive raters view these static stimuli and judge their intensity or overall emotionality using various ordinal rating scales. A variant of this approach employs highly trained raters who infer which muscle groups on the face contract during the facial expression, for example, Ekman's Facial Action Coding System (FACS) [24] or Izard's MAX [38]. In these latter studies, facial expressive asymmetries have been less reliably observed [11], [13], [30], [46].
In vivo, human observers typically encounter facial signals not as static stimuli, but during dynamic interactions in which the face moves and transitions from one expression to another. Bassili [4] pasted light dots onto darkened faces and found that normal adults easily identified expressions of fear, anger, and happiness solely from the motion of the light-dot patterns, in the absence of information about the shape and position of facial features. Such findings are consistent with clinical observations of patients with focal brain lesions involving either the ventral or the dorsal visual processing systems [39]. Humphreys et al. [37] suggested that the processing of dynamic facial expressions is mediated by the same neural pathways in the brain that detect movement (dorsal), whereas static facial expressions are mediated by more ventral systems.
Although humans in naturalistic settings do not observe light dots pasted onto the face, they do detect changes in the surface patterns of light reflectance across the face as it moves. Leonard et al. [45] digitized video images of normal subjects who posed smiles and then examined changes in signal value over a series of consecutive video frames. Each frame was approximately 33 ms in duration (i.e. the industry-standard duration of a video frame) and consisted of a 240×360 pixel array depicted at 256 levels of grey scale. Changes in signal value were derived by subtracting corresponding pixel intensities between adjacent frames (i.e. Σ_{i,j} |P_{i,j,k} − P_{i,j,k−1}|, where i = horizontal pixel location, j = vertical pixel location, and k = frame number). Thus, changes in pixel intensity on a frame-by-frame basis represented the 'signal' detected by the human viewer. This signal corresponded to movement over the face. Using this procedure, Leonard et al. [45] found that an individual's categorical perception of an emotional expression directly corresponded to the point in time when the most rapid changes in signal value occurred across the face. Of interest, the temporal characteristics of these changes were similar to the response latencies of face-sensitive neurons in the region of the temporal cortex [32], [44].
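The frame-differencing computation described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' actual code; the array names, the random test clip, and the use of absolute differences are our assumptions.

```python
import numpy as np

# Illustrative clip: K greyscale frames, each a 240x360 array
# of 8-bit intensities (0-255), as in Leonard et al.'s setup.
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(5, 240, 360), dtype=np.int16)

def frame_signal(frames):
    """Per-transition 'signal': summed absolute pixel-intensity change
    between adjacent frames, signal[k] = sum_ij |P[i,j,k+1] - P[i,j,k]|."""
    diffs = np.abs(np.diff(frames.astype(np.int64), axis=0))
    return diffs.sum(axis=(1, 2))

signal = frame_signal(frames)  # K-1 values, one per adjacent-frame pair
```

Plotting `signal` against frame index would reveal the moments of most rapid facial movement, which is the quantity the perceptual judgements were compared against.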
In the present study, we adopted the computer imaging methodology of Leonard et al. [45] to revisit the study of facial asymmetry. This technique afforded a unique opportunity to objectively quantify and analyze movement changes across the face over time. Importantly, it bypassed the need for human raters to make subjective judgements about emotional intensity or about the contraction of specific facial muscles, which characterized most previous research [3], [6], [7], [13], [14], [23], [30], [38], [47], [54].
Our overall aim was to use this computer methodology to determine whether the two sides of the face differed in the extent of movement during the unfolding of an expression, either emotional or nonemotional. To address this question, we videotaped 40 right-handed male college students while they produced facial expressions at the request of the examiner. These video images were then digitized, frame by frame, and analyzed offline for temporal changes in pixel intensity that occurred over the face from a resting state to a peak expression. We referred to these signal changes as entropy. Separate entropy scores were derived for the upper and lower regions of the left and right hemiface.
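The regional entropy scores described above (upper/lower crossed with left/right hemiface) might be computed as in the following sketch. Splitting at the image midlines is an illustrative simplification; the actual study presumably aligned the facial midline to anatomical landmarks.

```python
import numpy as np

def hemiface_entropy(frames):
    """Sum absolute frame-to-frame intensity changes within each face quadrant.

    Note: the subject's left hemiface appears on the right half of the image.
    Splitting at the image midlines is an illustrative assumption.
    """
    K, H, W = frames.shape
    mid_y, mid_x = H // 2, W // 2
    diffs = np.abs(np.diff(frames.astype(np.int64), axis=0))
    return {
        "upper_right_hemiface": int(diffs[:, :mid_y, :mid_x].sum()),
        "upper_left_hemiface": int(diffs[:, :mid_y, mid_x:].sum()),
        "lower_right_hemiface": int(diffs[:, mid_y:, :mid_x].sum()),
        "lower_left_hemiface": int(diffs[:, mid_y:, mid_x:].sum()),
    }
```

Comparing the left- vs right-hemiface sums then gives a direct, rater-free index of movement asymmetry for each expression.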
Three specific issues were addressed, corresponding to those examined in earlier face asymmetry studies with subjective raters. First, we wanted to learn whether emotional expressions induced more asymmetric movement than nonemotional expressions (e.g. purse the lips, lower the brow). If so, this would suggest that emotionality per se may be more crucial for inducing movement asymmetries than other factors (e.g. face size, muscle mass, neural innervation).
Second, we wanted to learn whether emotional expression asymmetries, if observed, might be influenced by the valence of the emotion (e.g. positive, negative). In a pilot study, we found that 10 dextral males displayed more activity over the left side of the face during the expression of negative (fear, anger) than positive (happiness) emotions. These observations are consistent with the view that the right hemisphere may be involved in negative, aversive emotional states [21], [22]. In contrast, no systematic asymmetries occurred for 10 dextral females. Although intriguing, these preliminary findings required replication, and the current study was undertaken to do so in a larger sample of 40 subjects. This was particularly important in light of the contrasting results obtained in studies using human raters to judge facial asymmetries [9].
Third, we wanted to learn whether hemifacial asymmetries differed in the upper vs lower face. This question is based on well-documented differences in cortical innervation between the upper and lower facial muscles [42], [51]. During posed, voluntary expressions, the lower two-thirds of each hemiface is primarily innervated by projections from the frontal motor regions of the contralateral hemisphere (i.e. right hemisphere to left lower hemiface, left hemisphere to right lower hemiface). In contrast, each side of the upper face is innervated by both hemispheres. Based on these innervation patterns, we predicted that asymmetries of hemifacial movement should be greater for the lower than the upper hemiface during voluntary facial expressions.
Subjects
Forty right-handed males between the ages of 18 and 31 were recruited from the University of Florida student population. Subjects (Ss) with a self-reported history of learning disability or of psychiatric or neurologic disorders (e.g. head injury) were excluded, as were individuals with facial hair or facial jewelry. Informed consent was obtained according to the University of Florida's Institutional Review Board guidelines. Because of gender differences in the degree to which specific cognitive
Whole face entropy
An initial set of analyses evaluated which facial expressions resulted in the greatest overall amount of change or movement by examining entropy scores from the entire face. Emotional (fear, anger, sadness, happiness) and nonemotional (lower brow, purse lips, show teeth, wrinkle brow) facial expressions were analyzed in separate analyses of variance (ANOVAs).
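For readers unfamiliar with the analysis, a one-way ANOVA across expression conditions reduces to an F ratio of between-condition to within-condition variance. The sketch below shows that simpler between-subjects case with made-up data; the study's own repeated-measures design is more involved, and these are not the study's entropy scores.

```python
import numpy as np

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    k, n = len(groups), all_vals.size
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Usage: F = one_way_anova_F([fear_scores, anger_scores,
#                             sadness_scores, happiness_scores])
```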
Table 1 shows the mean adjusted entropy scores and standard deviations for each facial expression in the emotional and
Discussion
In this study, we digitized real time video signals in order to examine movement asymmetries across the face during emotional and nonemotional expressions. In doing so, several assumptions were made regarding dynamic changes in facial expression. First, it was assumed that changes in surface light reflectance across a moving face could be indexed by changes in pixel greyness values over time. Presumably, changes in pixel intensity represented the ‘signal’ detected by the human observer. Another
Acknowledgements
This research was completed in partial fulfillment of the requirements for a doctoral degree by CKR. We gratefully acknowledge research support from the National Institutes of Health (NS93211, MH54623, MHP50-52384), the University of Florida Office of Research, Teaching, and Education (1997 Faculty Development Award to DB and CML), and the University of Florida Clinical Research Center (RR00082).
References (61)
Sex differences in lateral facial facility: the effects of habitual concealment. Neuropsychologia, 1983.
Hemiface mobility and facial expression. Cortex, 1983.
Facial asymmetry during emotional expression: gender, valence, and measurement techniques. Neuropsychologia, 1998.
Using FACS vs communication scores to measure spontaneous facial expression of emotion in brain-damaged patients: a reply to Mammacuri et al. (1988). Cortex, 1990.
Posed emotional expression in unilateral brain damaged patients. Cortex, 1989.
Craniofacial morphometry by photographic evaluation. American Journal of Orthodontics and Dentofacial Orthopedics, 1993.
The Venus of Milo and the dawn of facial asymmetry research. Brain and Cognition, 1991.
Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia, 1993.
Manual activity during speaking: right handers. Neuropsychologia, 1973.
Neurons in the amygdala of the monkey with responses selective for faces. Behavioral Brain Research, 1985.
Spontaneous expression of emotions in brain damaged patients. Cortex.
Asymmetries in spontaneous facial expression and their possible relation to hemispheric specialization. Neuropsychologia.
Morphological asymmetry of the face: a review. Brain and Cognition.
Left sided oral asymmetries in spontaneous but not posed smiles. Neuropsychologia.
Sex differences in asymmetry in the facial expression of emotion. Neuropsychologia.
Hemiregional variations in facial expression of emotions. British Journal of Psychology.
Emotion recognition: the role of facial movement and the relative importance of upper and lower areas of the face. Journal of Personality and Social Psychology.
Cerebral mechanisms underlying facial, prosodic, and lexical emotional expression. Neuropsychology.
Neuropsychological aspects of facial asymmetry during emotional expression: a review of the normal adult literature. Neuropsychology Review.
Lateralization for facial emotional behavior: a methodological perspective. International Journal of Psychology.
Asymmetries of expression in facial movements during induced positive vs negative states: a video-analytic study. Cognition and Emotion.
Stereophotogrammetric measurement of visual face asymmetry in children. Human Biology.
Asymmetries of facial action: some facts and fancy of normal face movement.
The lateralization of emotion: a critical review. International Journal of Psychology.
Asymmetries of facial motility during the dissimulation of emotion. Perceptual and Motor Skills.
Functional asymmetry of the face. Acta Anatomica.
Indices of craniofacial asymmetry. The Angle Orthodontist.
Role of the ipsilateral motor cortex in voluntary movement. Canadian Journal of Neurological Sciences.
Experimental and clinical studies of central connections and central relations of the facial nerve. Annals of Otology, Rhinology, and Laryngology.
Asymmetrical brain activity discriminates between positive and negative affective stimuli in human infants. Science.