
DOSSIER: COGNITION AND HUMAN DEVELOPMENT

Recognition of static and dynamic facial expressions: a study review

Reconhecimento de expressões faciais estáticas e dinâmicas: um estudo de revisão

Nelson Torro Alves

Federal University of Paraíba

ABSTRACT

In emotion research, criticism has been directed at the use of static facial expressions, especially concerning their supposedly low ecological validity. In the present work, we performed a review of studies that directly compared the recognition of emotions using static and dynamic facial expressions. Behavioral, neuroimaging, brain damage, and facial electromyography studies published since 1993 were included. Overall, facial motion seems to promote emotional recognition. Neuroimaging and brain damage studies sustain the idea of a dissociation between the systems responsible for processing static and dynamic expressions. Electromyography studies indicated that dynamic expressions tend to elicit more intense facial mimicry responses and are related to higher physiological activation. Those findings support the hypothesis that dynamic facial expressions are ecologically more valid and therefore more appropriate for emotion research.

Keywords: facial expression; perception; emotions.

RESUMO

Na literatura de estudo das emoções, críticas têm sido dirigidas ao uso de expressões faciais estáticas, principalmente no que se refere a sua suposta baixa validade ecológica. No presente trabalho, foi realizada uma revisão de estudos que compararam diretamente o reconhecimento de emoções usando expressões faciais estáticas e dinâmicas. Foram incluídos estudos comportamentais, de neuroimagem, lesão cerebral e eletromiografia facial, publicados a partir de 1993. De um modo geral, o movimento facial parece promover o reconhecimento de emoções. Estudos de neuroimagem e lesão cerebral sustentam a ideia de uma dissociação entre os sistemas responsáveis por processar expressões estáticas e dinâmicas. Estudos de eletromiografia indicaram que expressões dinâmicas tendem a eliciar respostas mais intensas de mímica facial e estão relacionadas a uma maior ativação fisiológica. Estes achados sustentam a hipótese de que expressões faciais dinâmicas são ecologicamente mais válidas e, portanto, mais adequadas à pesquisa com emoções.

Palavras-chave: expressão facial; percepção; emoções.

In recent years, the use of static facial expressions in emotion recognition studies has been questioned (Roark, Barrett, Spence, Abdi, & O'Toole, 2003). Much of the criticism is directed at the supposedly low ecological validity of static stimuli, since they fail to capture the temporal aspects of facial motion, which are relevant for emotional recognition in everyday interactions.

Static facial stimuli have been predominantly used in emotion research. Ekman and Friesen (1976) created an important set of standardized photographs of facial expressions of happiness, sadness, fear, anger, disgust, surprise and a neutral face, which was widely used in research over the past decades. Since then, other sets of static faces have been created, such as the "Japanese and Caucasian Facial Expressions of Emotion - JACFEE" (Biehl et al., 1997), the "Montreal Set of Facial Displays of Emotion - MSFDE" (Beaupré & Hess, 2005) and the "NimStim Face Stimulus Set" (Tottenham et al., 2009).

More recently, sets of dynamic expressions have been validated for emotion research, such as the "Perception of Emotion Test - POET" (Kilts, Egan, Gideon, Ely, & Hoffman, 2003) and the "Amsterdam Dynamic Facial Expression Set - ADFES" (van der Schalk, Hawk, Fischer, & Doosje, 2011). An alternative way to create dynamic expressions is to animate morph sequences that vary in emotional intensity (Biele & Grabowska, 2006).
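As an illustration of this morph-animation approach, the following is a minimal sketch, assuming Python with the Pillow imaging library, that builds intermediate-intensity frames by cross-dissolving a neutral photograph into an emotional one. Real morphing pipelines also warp facial landmarks before blending; the file names used here are hypothetical.

```python
# Minimal sketch: building a dynamic expression from two photographs by linear
# cross-dissolve. Landmark-based warping, used in real morphing procedures, is
# omitted; only the intensity-interpolation idea is illustrated.
from PIL import Image

def make_morph_frames(neutral_path: str, apex_path: str, n_frames: int = 10):
    """Return frames interpolating from a neutral to an apex (full-emotion) face."""
    neutral = Image.open(neutral_path).convert("RGB")
    apex = Image.open(apex_path).convert("RGB").resize(neutral.size)
    frames = []
    for i in range(n_frames):
        alpha = i / (n_frames - 1)          # 0.0 = neutral, 1.0 = full emotion
        frames.append(Image.blend(neutral, apex, alpha))
    return frames

if __name__ == "__main__":
    # Hypothetical input files; the 33 ms frame duration mirrors values reported
    # in some of the studies reviewed below.
    frames = make_morph_frames("neutral.jpg", "happy_apex.jpg", n_frames=6)
    frames[0].save("morph.gif", save_all=True, append_images=frames[1:],
                   duration=33, loop=0)
```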

Studies have presented conflicting results concerning the role of motion in the recognition of facial expressions. Some behavioral studies, for example, indicate that facial motion improves recognition (Ambadar, Schooler, & Cohn, 2005), whereas others find no differences between static and dynamic conditions (Kätsyri, Saalasti, Tiippana, von Wendt, & Sams, 2008). In this context, the question arises as to whether dynamic facial expressions produce different results from static expressions and, if so, what their impact is on emotion recognition research.

In order to evaluate the influence of facial motion on emotion recognition, we conducted a selective review of studies that compared static and dynamic facial expressions. We searched for works related to the topic published until September 2011, including articles in press, indexed in the Science Direct and PubMed databases. Based on preliminary tests, we chose the search expression "Dynamic AND Facial Expressions", which returned the studies most relevant to the review.

We selected 20 papers published since 1993. Studies that evaluated exclusively dynamic or static faces, without comparing the two conditions, or that investigated non-emotional aspects of face recognition were not included. Interestingly, most of the publications are recent, with 95.5% of the articles published after 2001.

The review showed that the selected works fell into three major groups. One series of studies examined the effects of facial motion on behavioral responses, such as accuracy in identifying emotions, attribution of emotional intensity and response time. Another set of studies investigated patterns of brain activation using neuroimaging techniques. A third group recorded electromyographic responses during the judgment or observation of static and dynamic facial stimuli.

The review indicated that motion affects behavioral responses, enhancing emotional recognition, and that static and dynamic conditions are associated with differential brain activation. In the following sections, we present the main findings of the review and discuss their implications for emotion recognition research.

Recognition of static and dynamic facial expressions: behavioral studies

Throughout the review, we found that many studies support the hypothesis that motion is not redundant information: it improves recognition or, at least, affects behavioral performance.

Recio, Sommer and Schacht (2011) reported an advantage in the recognition of dynamic compared to static facial expressions of happiness, but found no differences between conditions for anger expressions. Fujimura and Suzuki (2010) investigated the recognition of pleasant (joy, excitement and relaxation) and unpleasant facial expressions (fear, anger and sadness) presented in the central and peripheral visual fields under static and dynamic conditions. They found that only dynamic expressions of anger were better recognized in the peripheral region, which would be associated with a greater sensitivity of this region to emotional salience conveyed by motion. In a study combining behavioral responses and neuroimaging, Yoshikawa and Sato (2006) found that dynamic facial expressions, compared with static expressions, increased reports of emotional experience and promoted wider activation of the visual cortices, right inferior frontal gyrus and amygdala.

Brain injury studies have also demonstrated the importance of motion. Adolphs, Tranel and Damasio (2003) reported the case of a patient with extensive bilateral brain damage who could recognize only happiness in static images, whereas in dynamic presentations he was able to recognize different emotions, suggesting the existence of dissociated neural substrates for processing dynamic and static faces. Information regarding action depends mainly on the functioning of the dorsal and parieto-occipital cortices, which were preserved in the patient. His injured regions (bilateral anterior and posterior temporal lobes and medial frontal cortex) could not process static stimuli efficiently enough to provide the information necessary for recognition. Humphreys, Donnelly and Riddoch (1993) reported the case of a patient with a marked impairment in discriminating emotion and sex from static faces, but who could normally identify these characteristics in moving point-light displays. Likewise, that case suggests separate coding of dynamic and static faces.

Some studies indicate that motion affects the recognition of emotions in clinical populations. Uono, Sato and Toichi (2010) found that individuals with Pervasive Developmental Disorder and controls responded similarly, perceiving dynamic expressions as more exaggerated. Other studies, however, failed to find differences between conditions. Kätsyri et al. (2008), for example, observed similar performance in participants with Asperger syndrome and controls in the recognition of static and dynamic faces.

In a study with healthy volunteers, Ambadar et al. (2005) investigated whether differences between static and dynamic conditions resulted from the motion itself or from some other feature of the stimuli. Stimulus presentation was manipulated in four experimental conditions. In the single-static condition, a mask (random visual noise) was presented for 200 ms, followed by a facial expression that remained on screen until the participant judged the emotion. In the dynamic condition, a sequence of three to six frames was presented, beginning with the neutral face and ending with the emotional face, with each frame lasting 33 ms. In the multi-static condition, the same frames of the dynamic condition were presented for 500 ms each, interspersed with visual noise lasting 200 ms. In the first-last condition, the first frame of the sequence (neutral face) was presented, followed by the last one (emotional face). Results indicated a recognition advantage for the dynamic and first-last conditions over the others. The advantage of the dynamic condition over the multi-static condition indicates that the benefit of motion is not due to motion providing a greater amount of static information, since the same number of frames was presented in both conditions.
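To make the four conditions easier to compare, the sketch below expresses them as (stimulus, duration in ms) timelines in Python. Only the durations reported above (200 ms of noise, 33 ms and 500 ms frames) come from the study description; the helper names, frame labels and any unreported durations are hypothetical.

```python
# Illustrative timelines for the four presentation conditions described above.
# Each timeline is a list of (stimulus_label, duration_in_ms) pairs;
# None means the stimulus stays on screen until the participant responds.

def single_static(emotional_frame):
    return [("noise", 200), (emotional_frame, None)]

def dynamic(frames):
    # three to six frames, from neutral to the emotional face, 33 ms each
    return [(frame, 33) for frame in frames]

def multi_static(frames):
    # the same frames, shown for 500 ms each and interspersed with 200 ms of noise
    timeline = []
    for frame in frames:
        timeline += [(frame, 500), ("noise", 200)]
    return timeline

def first_last(frames):
    # first (neutral) and last (emotional) frames; durations for this condition
    # are not detailed above, so the values here are assumptions
    return [(frames[0], 500), (frames[-1], None)]

if __name__ == "__main__":
    frames = ["neutral", "frame_2", "frame_3", "emotional"]
    print(dynamic(frames))
    print(multi_static(frames))
```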

Other studies showed that variables such as participant sex and stimulus presentation speed can affect the recognition of dynamic expressions. Biele and Grabowska (2006), for example, observed that motion increased the perceived emotional intensity of dynamic faces of happiness and anger when judged by women. For men, motion increased only the intensity of anger, but not of happiness. Kamachi et al. (2001) modified the presentation time of dynamic expressions created by animating morph sequences. They found that sadness was identified more precisely at low speed, happiness and surprise were better recognized in fast sequences, and anger at intermediate speed. The difference between conditions was attributed not to the duration of the videos, but to the motion itself.
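Assuming morph frames like those sketched earlier, this kind of speed manipulation amounts to replaying the same frames while changing only the per-frame duration, as in the hypothetical snippet below; the specific durations are illustrative, not those used by Kamachi et al. (2001).

```python
# Hypothetical illustration: the same morph frames replayed at different speeds.
# Only the per-frame duration (in ms) changes, not the frames themselves.
def save_at_speed(frames, ms_per_frame, out_path):
    """Save a list of PIL.Image frames as an animation with a given frame duration."""
    frames[0].save(out_path, save_all=True, append_images=frames[1:],
                   duration=ms_per_frame, loop=0)

# e.g., with frames from make_morph_frames() above (durations are illustrative):
# save_at_speed(frames, 120, "slow.gif")    # slow sequence
# save_at_speed(frames, 60, "medium.gif")   # intermediate sequence
# save_at_speed(frames, 30, "fast.gif")     # fast sequence
```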

Despite evidence that motion improves recognition, some studies found no differences between dynamic and static conditions. Fiorentini and Viviani (2011) created stimuli that blended two different emotional expressions (e.g., anger and fear, fear and sadness, happiness and disgust). Dynamic stimuli were generated by a procedure in which videos of two facial expressions were combined frame by frame to produce an intermediate expression. Participants had to identify which emotion predominated. Results revealed no systematic differences between conditions with respect to accuracy and reaction time. The main findings and characteristics of the behavioral studies are presented in Table 1.

Recognition of static and dynamic facial expressions: neuroimaging and facial electromyography studies

Neuroimaging studies revealed differences in brain activation for static and dynamic faces (Table 2). According to some findings, dynamic faces elicit more widespread patterns of brain activation than static faces. In an fMRI study, Trautmann, Fehr and Herrmann (2009) recorded brain activation while participants passively watched static and dynamic expressions of disgust, happiness and a neutral face. Compared to static faces, dynamic expressions increased activation in the parahippocampal gyrus and amygdala, fusiform gyrus, superior temporal gyrus, inferior frontal gyrus, and occipital and orbitofrontal cortices, regions involved, respectively, in memory encoding, threat assessment, face recognition, biological motion, the mirror neuron system, and increased emotional activity and reward processing. According to the authors, these findings support the idea that dynamic facial expressions are more appropriate for studying the perception of emotional expressions. Sato, Kochiyama, Yoshikawa, Naito and Matsumura (2004) found activation, especially in the right hemisphere, of the inferior occipital gyrus, middle temporal gyrus and fusiform gyrus during the observation of dynamic facial expressions of happiness and fear. Arsalidou, Morris and Taylor (2011), in a meta-analysis and empirical study, found that dynamic facial stimuli tend to increase activity in regions associated with emotional processing and the interpretation of social signals.

Motion also seems to affect how the amygdala integrates information from gaze direction and expression. Sato, Kochiyama, Uono and Yoshikawa (2010) presented dynamic and static expressions of anger and happiness with the eye gaze directed toward the participant or elsewhere. They found that the left amygdala responded to the interaction between expression condition and gaze direction, showing greater activity when dynamic, but not static, faces had their gaze directed toward the participant.

Neuroimaging studies also suggest the existence of distinct neural substrates for the processing of static and dynamic facial stimuli. Kilts et al. (2003), using Positron Emission Tomography (PET), found that the evaluation of static and dynamic happy faces produced differential patterns of activation involving the extrastriate cortex, area V5, spinal cord and temporo-medial cortical regions. During the evaluation of angry faces, differences between static and dynamic conditions involved the activation of the superior temporal sulcus, area V5, periamygdaloid cortex and cerebellum. Kessler et al. (2011) found differential activation during the observation of static and dynamic expressions. Regardless of valence, dynamic faces selectively activated visual area V5, the fusiform gyrus, the thalamus, the bilateral superior temporal sulcus and other frontal and parietal areas. Static expressions of happiness were associated with increased activity in the medial prefrontal cortex. According to the authors, the activity of the superior temporal sulcus and area V5 confirms previous findings indicating the relevant role of these regions in the processing of biological motion.

The review identified another set of studies, which compared facial electromyographic (EMG) responses during the perception of dynamic and static facial expressions. Sato, Fujimura and Suzuki (2008) recorded EMG activity of the corrugator supercilii and zygomaticus major muscles in response to expressions of anger and happiness. Compared to the static condition, dynamic faces of anger induced greater EMG activity in the corrugator supercilii, whereas dynamic faces of happiness promoted greater activity of the zygomaticus major. In other words, dynamic stimuli elicited more intense facial mimicry than static stimuli.

Using a similar procedure, Rymarczyk, Biele, Grabowska and Majczynski (2011) compared EMG activity and attributions of emotional intensity for static and dynamic facial expressions of happiness and anger. They found that dynamic faces received higher scores of emotional intensity and that participants responded to viewing the happy face by increasing zygomaticus major activity and decreasing corrugator supercilii activity. Angry faces caused no changes in the zygomaticus major and promoted small changes in the corrugator supercilii, but without significant differences between static and dynamic conditions.

In another study, Sato and Yoshikawa (2007) unobtrusively filmed the faces of participants as they watched dynamic and static expressions of anger and happiness. Afterwards, they coded the facial reactions using the Facial Action Coding System (FACS) (Ekman & Friesen, 1978), which allows visual identification of patterns of contraction or relaxation of the facial muscles. In the dynamic condition, the participants' facial muscles reacted in accordance with the observed expression, indicating that motion elicits spontaneous facial responses in observers. Thus, Sato and Yoshikawa (2007) showed that dynamic expressions not only elicit more pronounced facial mimicry, but also produce responses intense enough to be visually detected by an outside observer.

Discussion

We conducted a review of studies that directly compared static and dynamic presentation of facial expressions, including behavioral, neuroimaging, brain injury and facial electromyography research.

Most of the publications identified are recent, concentrated in the last decade. We consider that the growth in the number of works employing dynamic faces may represent a response to the criticisms of static stimuli. According to Fiorentini and Viviani (2011), static faces are an impoverished representation of real stimuli, since facial expressions are inherently dynamic. Furthermore, the interpretation of affective states in interpersonal relationships depends on constant monitoring of the changes in facial expressions that occur from moment to moment (Yoshikawa & Sato, 2008). From this perspective, dynamic facial expressions are more similar to natural stimuli and therefore have greater ecological validity (Ambadar et al., 2005).

The body of work investigating behavioral responses indicates that motion is a relevant variable for the recognition of facial emotions. In general, studies show that dynamic expressions, even when produced from morph animations or moving point-light displays, tend to favor the recognition of facial expressions. This advantage has been reported in studies with brain-damaged patients (Adolphs et al., 2003), clinical populations (Uono et al., 2010) and healthy volunteers (Ambadar et al., 2005).

In the emotion recognition literature, we can consider that some of the works conducted exclusively with static faces, such as cross-cultural studies, could produce different results if dynamic stimuli were employed. Despite evidence for the universality of facial expression recognition, criticisms of cross-cultural studies have been raised, such as those concerning methods that could bias participants' judgments and discrepancies in the categorization of emotions (Russell, 1994). For example, studies with isolated peoples, such as the Fore and the Dani of New Guinea, indicate that culture may influence recognition. Compared to Westerners, the Fore tended to confuse expressions of surprise and fear more often, while the Dani confused anger and disgust (Ekman & Friesen, 1971).

Likewise, cultural differences may affect the emotional intensity perceived in the face. Biehl et al. (1997) found that non-Western cultures tend to attribute greater intensity to faces of fear, while Western cultures consider faces of happiness more intense. Differences between countries have also been found: inhabitants of Sumatra perceived contempt with less intensity than Hungarians, disgust less intensely than the Japanese, and happiness with lower intensity than participants from the other countries studied.

Disagreements in the categorization of emotions are usually explained in terms of cultural particularities in the expression and interpretation of facial emotions. Ekman (1999) attributed these differences to "display rules", that is, the norms a culture establishes about when, where and how an expression should occur. Thus, one might ask whether the use of more ecologically valid stimuli, such as dynamic facial expressions, could produce different results compared to static faces. If motion really contributes to recognition, differences between cultures would be attenuated.

In line with the behavioral studies, neuroimaging studies indicate that static and dynamic faces are processed differently. Dynamic stimuli tended to produce more widespread brain activation (Trautmann et al., 2009), to elicit more activity in areas associated with the interpretation of social signals and the processing of emotions (Arsalidou et al., 2011), and to produce faster responses in visual areas compared to static stimuli (Recio et al., 2011). These findings are consistent with brain damage studies, which indicate that the recognition of static and dynamic facial expressions is dissociated (Adolphs et al., 2003; Humphreys et al., 1993).

Electromyography studies showed that dynamic facial expressions tend to elicit more facial mimicry, producing, for example, greater EMG activity in the corrugator supercilii during the viewing of dynamic angry faces and greater zygomaticus major activity during dynamic happy faces (Sato et al., 2008). Moreover, facial responses to dynamic stimuli were intense enough to be perceived by an outside observer (Sato & Yoshikawa, 2007). Facial mimicry may be associated with increased engagement of the mirror neuron system, which fires both when an individual acts and when he or she observes the same action performed by another. Sato et al. (2004) found increased activation of the inferior frontal gyrus during the observation of dynamic faces, the human homologue of the monkey brain region where mirror neurons were discovered (Gallese, Fadiga, Fogassi, & Rizzolatti, 1996). Therefore, EMG studies reinforce the distinction between static and dynamic information processing.

Conclusions

We reviewed behavioral, neuroimaging, brain injury and electromyography studies that compared the recognition of static and dynamic facial expressions. We found that motion influences emotional recognition, in general contributing to better performance in recognition tasks. Neuroimaging and brain lesion studies support the idea of a dissociation in the processing of static and dynamic facial expressions, which would recruit different brain areas. Overall, EMG studies showed that dynamic expressions elicit more intense facial and physiological responses in observers.

These findings reinforce the hypothesis that dynamic expressions are ecologically valid stimuli, closer to real situations of social interaction. We recommend the development of new sets of dynamic expressions, which could contribute significantly to research on emotion recognition.

Received: 06 May 2012

Revised: 13 December 2012

Accepted: 27 December 2012

Nelson Torro Alves, professor of the Department of Psychology and the Graduate Program in Cognitive Neuroscience and Behavior at the Federal University of Paraíba (UFPB). Email: nelsontorro@pq.cnpq.br

  • Adolphs, R., Tranel, D., & Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain and Cognition, 52(1), 61-69.
  • Ambadar, Z., Schooler, J., & Cohn, J. F. (2005). Deciphering the enigmatic face: The importance of facial dynamics to interpreting subtle facial expressions. Psychological Science, 16, 403-410.
  • Arsalidou, M., Morris, D., & Taylor, M. J. (2011). Converging evidence for the advantage of dynamic facial expressions. Brain Topography, 24(2), 149-163.
  • Beaupré, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36, 355-369.
  • Biehl, M., Matsumoto, D., Ekman, P., Hearn, V., Heider, K., Kudoh, T., & Ton, V. (1997). Matsumoto and Ekman's Japanese and Caucasian facial expressions of emotion (JACFEE): Reliability data and cross-national differences. Journal of Nonverbal Behavior, 21, 3-21.
  • Biele, C., & Grabowska, A. (2006). Sex differences in perception of emotion intensity in dynamic and static facial expressions. Experimental Brain Research, 171(1), 1-6.
  • Ekman, P. (1999). Facial expressions. In T. Dalgleish & T. Power (Eds.), The handbook of cognition and emotion. Sussex, UK: John Wiley & Sons.
  • Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17(2), 124-129.
  • Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
  • Ekman, P., & Friesen, W. V. (1978). Facial action coding system. Palo Alto, CA: Consulting Psychologists Press.
  • Fiorentini, C., & Viviani, P. (2011). Is there a dynamic advantage for facial expressions? Journal of Vision, 11(3), 1-15.
  • Fujimura, T., & Suzuki, N. (2010). Recognition of dynamic facial expressions in peripheral and central vision. The Japanese Journal of Psychology, 81(4), 348-355.
  • Gallese, V., Fadiga, L., Fogassi, L., & Rizzolatti, G. (1996). Action recognition in the premotor cortex. Brain, 119, 593-609.
  • Humphreys, G. W., Donnelly, N., & Riddoch, M. J. (1993). Expression is computed separately from facial identity, and it is computed separately for moving and static faces: Neuropsychological evidence. Neuropsychologia, 31(2), 173-181.
  • Kamachi, M., Bruce, V., Mukaida, S., Gyoba, J., Yoshikawa, S., & Akamatsu, S. (2001). Dynamic properties influence the perception of facial expressions. Perception, 30(7), 875-887.
  • Kätsyri, J., Saalasti, S., Tiippana, K., von Wendt, L., & Sams, M. (2008). Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome. Neuropsychologia, 46(7), 1888-1897.
  • Kessler, H., Doyen-Waldecker, C., Hofer, C., Hoffmann, H., Traue, H. C., & Abler, B. (2011). Neural correlates of the perception of dynamic versus static facial expressions of emotion. GMS Psycho-Social Medicine, 8, 1-8.
  • Kilts, C. D., Egan, G., Gideon, D. A., Ely, T. D., & Hoffman, J. M. (2003). Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage, 156-168.
  • Recio, G., Sommer, W., & Schacht, A. (2011). Electrophysiological correlates of perceiving and evaluating static and dynamic facial emotional expressions. Brain Research, 1376(0), 66-75.
  • Roark, D. A., Barrett, S. E., Spence, M. J., Abdi, H., & O'Toole, A. J. (2003). Psychological and neural perspectives on the role of motion in face recognition. Behavioral and Cognitive Neuroscience Reviews, 2(1), 15-46.
  • Russell, J. A. (1994). Is there universal recognition of emotion from facial expression? A review of the cross-cultural studies. Psychological Bulletin, 115(1), 102-141.
  • Rymarczyk, K., Biele, C., Grabowska, A., & Majczynski, H. (2011). EMG activity in response to static and dynamic facial expressions. International Journal of Psychophysiology, 79(2), 330-333.
  • Sato, W., Fujimura, T., & Suzuki, N. (2008). Enhanced facial EMG activity in response to dynamic facial expressions. International Journal of Psychophysiology, 70(1), 70-74.
  • Sato, W., Kochiyama, T., Uono, S., & Yoshikawa, S. (2010). Amygdala integrates emotional expression and gaze direction in response to dynamic facial expressions. Neuroimage, 50(4), 1658-1665.
  • Sato, W., Kochiyama, T., Yoshikawa, S., Naito, E., & Matsumura, M. (2004). Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Cognitive Brain Research, 20(1), 81-91.
  • Sato, W., & Yoshikawa, S. (2007). Spontaneous facial mimicry in response to dynamic facial expressions. Cognition, 104(1), 1-18.
  • Tottenham, N., Tanaka, J., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., ... Nelson, C. (2009). The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Research, 242-249. doi: 10.1016/j.psychres.2008.05.006
  • Trautmann, S. A., Fehr, T., & Herrmann, M. (2009). Emotions in motion: Dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Research, 1284, 100-115.
  • Uono, S., Sato, W., & Toichi, M. (2010). Brief Report: Representational Momentum for Dynamic Facial Expressions in Pervasive Developmental Disorder. Journal of Autism and Developmental Disorders, 40(3), 371-377.
  • van der Schalk, J., Hawk, S. T., Fischer, A. H., & Doosje, B. (2011). Moving faces, looking places: Validation of the Amsterdam Dynamic Facial Expression Set (ADFES). Emotion, 11(4), 907-920.
  • Yoshikawa, S., & Sato, W. (2006). Enhanced perceptual, emotional, and motor processing in response to dynamic facial expressions of emotion. Japanese Psychological Research, 48(3), 213-222.
  • Yoshikawa, S., & Sato, W. (2008). Dynamic facial expressions of emotion induce representational momentum. Cognitive, Affective, & Behavioral Neuroscience, 8, 25-31.

Publication Dates

  • Publication in this collection
    04 July 2019
  • Date of issue
    Mar 2013
