Generating an item pool for translational social cognition research: Methodology and initial validation

Behavior Research Methods

Abstract

Existing sets of social and emotional stimuli suitable for social cognition research are limited in many ways, including size, unimodal stimulus delivery, and restriction to major universal emotions. Existing measures of social cognition could be improved by taking advantage of item response theory and adaptive testing technology to develop instruments that obtain more efficient measures of multimodal social cognition. However, for this to be possible, large pools of emotional stimuli must be obtained and validated. We present the development of a large, high-quality multimedia stimulus set produced by professional adult and child actors (ages 5 to 74) containing both visual and vocal emotional expressions. We obtained over 74,000 audiovisual recordings of a wide array of emotional and social behaviors, including the main universal emotions (happiness, sadness, anger, fear, disgust, and surprise), as well as more complex social expressions (pride, affection, sarcasm, jealousy, and shame). The actors generated a high quantity of technically superior, ecologically valid stimuli that were digitized, archived, and rated for accuracy and intensity of expressions. A subset of these facial and vocal expressions of emotion and social behavior was submitted for quantitative ratings to generate parameters for validity and discriminability. These stimuli are suitable for affective neuroscience-based psychometric tests, functional neuroimaging, and social cognitive rehabilitation programs. The purposes of this report are to describe the method of obtaining and validating this database and to make it accessible to the scientific community. We invite all those interested in participating in the use and validation of these stimuli to access them at www.med.upenn.edu/bbl/actors/index.shtml.
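The abstract's point about item response theory and adaptive testing can be illustrated with a minimal sketch. The following is not the authors' implementation; it assumes a standard two-parameter logistic (2PL) IRT model and a maximum-information item-selection rule, with an entirely hypothetical item pool in which each stimulus has a discrimination parameter a and a difficulty parameter b:

```python
import math

def p_correct(theta, a, b):
    """2PL IRT: probability of a correct response at ability level theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at theta: a^2 * P * (1 - P)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items, used):
    """Adaptive selection: pick the unused item most informative at theta."""
    best, best_info = None, -1.0
    for idx, (a, b) in enumerate(items):
        if idx in used:
            continue
        info = item_information(theta, a, b)
        if info > best_info:
            best, best_info = idx, info
    return best

# Hypothetical pool of validated stimuli: (discrimination a, difficulty b).
pool = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5)]
first = next_item(0.0, pool, used=set())
```

Because each examinee sees only the items that are most informative at his or her current ability estimate, an adaptive test can match the precision of a fixed-form test with fewer items — which is why a large, validated item pool with known discrimination parameters is a prerequisite.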


Fig. 1



Author note

This work was supported by NIH Grants R01-MH060722 and R01-MH084856. We thank our directors, Amy Dugas Brown and David M. O’Connor, and the Brain Behavior Laboratory staff, who edited the stimuli and provided Web programming support. The authors also recognize the work of Raymond P. Hill, who contributed significantly to the manuscript. Ray passed away in January 2013.


Corresponding author

Correspondence to Michael K. Keutmann.

Electronic supplementary material

ESM 1 (PDF 64 kb)


Cite this article

Keutmann, M.K., Moore, S.L., Savitt, A. et al. Generating an item pool for translational social cognition research: Methodology and initial validation. Behav Res 47, 228–234 (2015). https://doi.org/10.3758/s13428-014-0464-0
