Research Report

Electrophysiological correlates of perceiving and evaluating static and dynamic facial emotional expressions
Research Highlights
- Motion in a face increases the impact of emotional expressions.
- Dynamic faces provide richer information and augment the perceived intensity of expressions.
- EPN and LPC components are enhanced and prolonged for dynamic relative to static facial expressions.
- Motion seems to enhance visual attention and consolidation in working memory.
- Facial expressions that develop dynamically over time are more ecologically valid.
Introduction
Social communication is a dynamic process in which rapidly changing auditory and visual inputs need to be quickly evaluated. In the context of social interactions, human faces provide an extraordinarily important source of information. For instance, lip movements support speech comprehension, gaze direction informs about spatial attention, and facial expressions communicate the emotional state of others. Thus, it seems that we are geared to quickly recognize subtle changes in the facial composure of conspecifics. Although some studies have indeed shown a particular sensitivity for dynamic facial movements, for example, in learning faces (Pilz et al., 2006), identifying persons (O'Toole et al., 2002), recognizing emotional expressions (Ambadar et al., 2005, Bassili, 1978, Kamachi et al., 2001), and in the perceived intensity of expressions (e.g., Biele and Grabowska, 2006), most studies on emotional facial expression rely on static images (e.g., Adolphs, 2002). However, in reality, emotional expressions usually occur as characteristic changes of the facial configuration when coordinated muscle contractions unfold over time.
Neuroimaging studies have shown activation in several brain areas while viewing static expressions of emotion, including the striate cortex, the fusiform face area (FFA), the superior temporal gyrus, the amygdala, the orbitofrontal cortex, the basal ganglia, and the superior temporal sulcus (STS) (for reviews, see Adolphs, 2002, Allison et al., 2000, Blake and Shiffrar, 2007). More recently, several studies have found enhanced and/or more widespread activation patterns in these networks in response to dynamic face stimuli, particularly, in the amygdala, in visual areas (striate and extrastriate cortex, and V5/MT+), fusiform gyrus, STS, inferior frontal cortex, FFA, premotor area, parahippocampal regions, and supplementary motor area (e.g., Kilts et al., 2003, LaBar et al., 2003, Sato et al., 2004, Trautmann et al., 2009). The enhanced activation in striate and extrastriate visual areas has been suggested to reflect augmented selective attention to emotional relative to neutral stimuli at early stages of visual processing (e.g., Kilts et al., 2003). Trautmann et al. (2009) proposed that the higher complexity and rapidly changing cues in dynamic faces might result in activation of wider brain networks. On the other hand, the temporal pattern of structural changes in dynamic facial expressions, their greater liveliness and higher ecological validity, along with increased arousal ratings might improve the three-dimensional perception of faces and facilitate the processing of emotional expressions.
Electrophysiological studies of dynamic facial expressions are even scarcer. Puce, Smith and Allison (2000) found evidence that the amplitude of the face-sensitive N170 component in the event-related potential (ERP) was affected by the direction of gaze and mouth movement in a continuously presented face. Furthermore, there is evidence that dynamic emotional expressions and gaze direction affect ERP components as early as 100 ms after the onset of the event (P1 and subsequent N1 and P3 components) (Fichtenholtz et al., 2007, Fichtenholtz et al., 2009), indicating shifts in attentional orientation. However, these studies made no direct comparison between static and dynamic conditions; therefore, they are not informative about specific differences between these conditions. A recent study using a steady-state stimulation procedure, which directly compared static with dynamic emotional faces, found a late reduction in neural processing in the temporal lobe for dynamic faces (Mayes et al., 2009).
Summarizing previous findings, facial motion seems to improve the perception of emotional expressions (for a review, see Ambadar et al., 2005) and neuronal substrates of perceiving and evaluating facial motion have been described. However, the mechanisms underlying the motion effects and their time course of action remain largely undefined.
It has been suggested that emotional aspects of stimuli facilitate their processing by influencing early perceptual and later elaborative stages (Öhman et al., 2000). Limited attentional resources might be intentionally or reflexively allocated to a given stimulus, depending on, for instance, its salience or intensity (Wickens, 1980). Thus, emotional aspects might enhance the allocation of attention to the stimulus, facilitating perceptual and subsequent recognition processes (Schupp et al., 2003). Attention capture has also been attributed to moving objects, as already suggested by William James (1891/1950) and confirmed by recent studies for translating and looming motion (Franconeri and Simons, 2003), feature changes (Mühlenen et al., 2005), and motion onset (Abrams and Christ, 2003). Therefore, the superiority of dynamic emotional expressions might relate to the augmented capture of attentional resources as compared to static pictures, boosting – among other aspects – the evaluation of the emotional expression.
Emotions in facial expressions have been reported to elicit two ERP components: the early posterior negativity (EPN) and the late positive complex (LPC) (e.g., Holmes et al., 2009, Schacht and Sommer, 2009, Schupp et al., 2004). Both components can be best visualized when the ERP to neutral stimuli is subtracted from the ERP to emotional stimuli. The EPN emerges as early as 150 to 300 ms after stimulus onset as a negative deflection over occipito-parietal electrodes and – as its counterpart – fronto-central positivity and is considered to reflect attention allocation to the stimuli (Junghöfer et al., 2001). If dynamic facial expressions facilitate performance by augmenting attention capture, the EPN to emotional as compared to neutral stimuli should be more pronounced for dynamic stimuli.
The second ERP component modulated by emotional expressions, the LPC, appears at around 500 ms as a long-lasting, enhanced positivity over centro-parietal electrodes, and is suggested to reflect elaborative processing and conscious recognition of the stimulus (Schupp et al., 2003). If dynamic stimuli also augment the elaborative processes following attention capture, the LPC effect might likewise be more prominent for dynamic than static facial expressions.
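As the preceding paragraphs describe, both components are best visualized as difference waves (emotional minus neutral ERP) and are then quantified as mean amplitudes over component-typical time windows and electrode sites. The sketch below illustrates this logic with NumPy; it is not the authors' analysis pipeline, and the array shapes, electrode indices, and time windows are illustrative assumptions.

```python
import numpy as np

def emotion_difference_wave(erp_emotional, erp_neutral):
    """Subtract the neutral-condition ERP from the emotional-condition ERP.

    Both inputs are trial-averaged arrays of shape (n_electrodes, n_samples).
    The EPN and LPC emerge in this difference wave.
    """
    return erp_emotional - erp_neutral

def mean_amplitude(erp, times, window, electrodes):
    """Mean amplitude within a latency window (ms) at selected electrode rows.

    `times` holds the latency (ms) of each sample relative to stimulus onset.
    """
    mask = (times >= window[0]) & (times <= window[1])
    return erp[np.ix_(electrodes, np.where(mask)[0])].mean()

# Illustrative data: 64 electrodes, 1000 ms epoch sampled at 1 kHz
rng = np.random.default_rng(0)
times = np.arange(1000)                      # 0..999 ms post-onset
emo = rng.normal(size=(64, 1000))
neu = rng.normal(size=(64, 1000))
diff = emotion_difference_wave(emo, neu)

# EPN: occipito-parietal negativity, roughly 150-300 ms post-onset
# (electrode row indices here are made up for the example)
epn = mean_amplitude(diff, times, (150, 300), electrodes=[60, 61, 62, 63])
# LPC: centro-parietal positivity from about 500 ms onward
lpc = mean_amplitude(diff, times, (500, 900), electrodes=[30, 31, 32, 33])
```

On real data, a more negative `epn` and a more positive `lpc` for the dynamic than the static condition would correspond to the predictions stated above.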
Here, we presented face stimuli with happy, angry, or neutral expressions in either static or dynamic presentation modes while participants explicitly categorized these expressions. If the dynamic presentation is responsible for improved emotion evaluation, motion should facilitate the typical emotion effect in the ERPs. Of primary interest was the time after stimulus onset at which ERPs to dynamic and static facial expressions would begin to differ from each other and from ERPs to neutral expressions. Given previous evidence that both emotional images (e.g., Schupp et al., 2003) and moving objects (Abrams and Christ, 2003, Franconeri and Simons, 2003, Mühlenen et al., 2005) guide stimulus-driven selective attention, we expected enhanced EPN amplitudes for dynamic faces. Moreover, we expected this boosted visual attention to facilitate emotion evaluation, which should be reflected in more pronounced LPC effects. If motion in the face increases the intensity of the facial expression (e.g., Biele and Grabowska, 2006), which in turn facilitates the perception and evaluation of emotional expressions, the LPC effect should be enhanced in the dynamic condition. Furthermore, we expected the scalp distributions of the observed emotion effects to reflect enhanced visual processing for dynamic emotional faces at posterior electrode positions, in line with the more widespread activation patterns reported in fMRI studies suggesting shifts of attention to dynamic stimuli (e.g., Kilts et al., 2003).
Behavioral data
Behavioral data are presented in Table 1. Relative to static pictures, dynamic emotional expressions were recognized faster, F(1,20) = 41.8, p < .001, and more accurately, F(1,20) = 10.9, p < .01. Moreover, interactions between presentation mode and emotional expression were significant for both RTs, F(2,40) = 19.6, p < .001, and error rates, F(2,40) = 15.4, p < .001. Pairwise comparisons revealed that happy faces particularly benefited from dynamic presentation: whereas responses to static faces were
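The F-statistics above come from repeated-measures ANOVAs, where the degrees of freedom follow from the design (21 participants; 2 presentation modes or, for the interaction, 3 expressions × 2 modes yielding 2 and 40 df). As a minimal sketch, not the authors' actual analysis, a one-way repeated-measures F-ratio can be computed by hand, using the subject-by-condition interaction as the error term; the simulated data below are purely illustrative.

```python
import numpy as np

def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA.

    `data` has shape (n_subjects, n_conditions). Returns (F, df1, df2),
    where the error term is the subject-by-condition interaction.
    """
    n, k = data.shape
    grand = data.mean()
    cond_means = data.mean(axis=0)
    subj_means = data.mean(axis=1)

    ss_cond = n * np.sum((cond_means - grand) ** 2)
    ss_subj = k * np.sum((subj_means - grand) ** 2)
    ss_total = np.sum((data - grand) ** 2)
    ss_error = ss_total - ss_cond - ss_subj   # residual: subject x condition

    df_cond = k - 1
    df_error = (n - 1) * (k - 1)
    F = (ss_cond / df_cond) / (ss_error / df_error)
    return F, df_cond, df_error

# Illustrative: 21 simulated participants, RTs for static vs. dynamic mode
rng = np.random.default_rng(1)
rts = rng.normal(loc=700.0, scale=50.0, size=(21, 2))
rts[:, 1] -= 40.0                             # dynamic condition faster
F, df1, df2 = rm_anova_oneway(rts)            # df1 = 1, df2 = 20 as in the text
```

With 21 participants and two presentation modes this reproduces the F(1,20) layout reported above; the interaction tests in the paper require the corresponding two-way decomposition.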
Discussion
In the present study, we compared dynamic and static faces displaying positive, negative, or neutral facial expressions that were morphed from portraits of neutral faces. Our main interest was to assess whether emotion-related ERP components to static facial expressions, as obtained in previous research, would also extend to dynamic displays, or whether such dynamically developing expressions would lead to qualitatively different patterns of brain activity. In general, the present findings
Participants, stimuli, and procedure
Participants were 21 healthy adults (7 female) between 20 and 34 years of age (M = 24.14 years, SD = 3.3) with normal or corrected-to-normal vision. Apart from one male left-hander, all participants were right-handed (Oldfield, 1971). Laterality quotients for handedness were M = 84.5, SD = 18.9 for female participants; and M = 69.6, SD = 50.4 for males.
None of them reported a history of neurological or neuropsychological problems. Prior to the experimental session all participants gave written informed
Acknowledgments
This research was supported by the Cluster of Excellence 302 “Languages of Emotion”, Grant 209 to AS and WS. We thank Marina Palazova, Julian Rellecke, and Olga Shmuilovich for assistance in data collection, and Thomas Pinkpank and Rainer Kniesche for technical support.
References

- et al. Social perception from visual cues: role of the STS region. Trends Cogn. Sci. (2000)
- et al. Early processing of the six basic facial emotional expressions. Brain Res. Cogn. Brain Res. (2003)
- et al. Brain potentials in affective picture processing: covariation with autonomic arousal and affective report. Biol. Psychol. (2000)
- et al. Event-related brain potential correlates of emotional face processing. Neuropsychologia (2007)
- et al. The effect of dynamics on identifying basic emotions from synthetic and natural faces. Int. J. Hum. Comput. Stud. (2008)
- et al. Effects of emotional arousal in the cerebral hemispheres: a study of oscillatory brain activity and event-related potentials. Clin. Neurophysiol. (2001)
- et al. Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage (2003)
- et al. Task instructions modulate neural responses to fearful facial expressions. Biol. Psychiatry (2003)
- et al. Reference-free identification of components of checkerboard-evoked multichannel potential fields. Electroencephalogr. Clin. Neurophysiol. (1980)
- et al. Scalp distributions of event-related potentials: an ambiguity associated with analysis of variance models. Electroencephalogr. Clin. Neurophysiol. (1985)