
The Rovereto Emotion and Cooperation Corpus: a new resource to investigate cooperation and emotions


Abstract

The Rovereto Emotion and Cooperation Corpus (RECC) is a new resource collected to investigate the relationship between cooperation and emotions in an interactive setting. Previous attempts at collecting corpora to study emotions have shown that such data are often difficult to classify and analyse, and that coding schemes for emotion are often unreliable. We collected a corpus of task-oriented (MapTask-style) dialogues in Italian, in which the segments of emotional interest are identified using highly reliable psycho-physiological indexes (Heart Rate and Galvanic Skin Conductance). We then annotated these segments in accordance with novel multimodal annotation schemes for cooperation (in terms of effort) and facial expressions (an indicator of emotional state). High agreement was obtained among coders on all features. RECC is, to our knowledge, the first resource in which psycho-physiological data are aligned with verbal and nonverbal behaviour data.
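
To make the segment-selection step concrete, the following minimal sketch (our illustration, not the authors' pipeline; the window size and z-score threshold are hypothetical) flags time windows in which a psycho-physiological signal, such as heart rate, departs markedly from its baseline:

    # Illustrative sketch only: flag fixed-size windows whose mean heart-rate
    # (or skin-conductance) level lies far from the whole-recording baseline.
    # Window size and z-score threshold are hypothetical, not from the paper.
    import statistics

    def flag_emotional_windows(signal, window=10, z_threshold=2.0):
        """Return start indices of windows whose mean deviates from the
        baseline by more than z_threshold standard deviations."""
        baseline = statistics.mean(signal)
        spread = statistics.stdev(signal)
        flagged = []
        for start in range(0, len(signal) - window + 1, window):
            chunk = signal[start:start + window]
            z = (statistics.mean(chunk) - baseline) / spread
            if abs(z) > z_threshold:
                flagged.append(start)
        return flagged

    # Heart-rate samples aligned with the dialogue timeline (invented data).
    hr_samples = [72] * 50 + [95] * 10 + [73] * 40
    print(flag_emotional_windows(hr_samples))  # -> [50]

Windows flagged in this way would then be aligned with the transcript so that only the corresponding dialogue segments need to be annotated.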


Notes

  1. Duchenne smiles involve the innervation of the orbicularis oculi, a facial muscle surrounding the eyes that is difficult to control intentionally, and have been empirically shown to correlate with the experience of positive emotion (Frank and Ekman 1993).

  2. “Make your conversational contribution such as is required, at the stage at which it occurs, by the accepted purpose or direction of the talk exchange in which you are engaged”.

  3. There is a lack of consensus on how to interpret inter-annotator agreement scores: some authors consider Kappa values between 0.67 and 0.8 reliable for multimodal annotation, while others accept only values over 0.8 as reliable (for a review of this issue see Artstein and Poesio 2008). A minimal computation sketch is given after these notes.

  4. Davies used her coding scheme for the HCRC map task corpus (Anderson et al. 1991), and the moves she coded were defined for that kind of task.

  5. All the participants had given informed consent and the experimental protocol was approved by the Human Research Ethics Committee of the University of Trento.
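
To make the agreement statistic in note 3 concrete, here is a minimal sketch of Cohen's Kappa for two coders (our illustration, not code from the paper; the labels and example data are invented):

    # Minimal sketch of Cohen's kappa, the chance-corrected agreement
    # statistic behind the 0.67/0.8 reliability thresholds in note 3.
    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        """Kappa for two coders' parallel label sequences."""
        n = len(coder_a)
        observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        # Chance agreement from each coder's marginal label distribution.
        expected = sum(freq_a[lab] * freq_b[lab] for lab in freq_a) / n ** 2
        return (observed - expected) / (1 - expected)

    coder_1 = ["smile", "neutral", "smile", "frown", "smile", "neutral"]
    coder_2 = ["smile", "neutral", "frown", "frown", "smile", "smile"]
    print(round(cohens_kappa(coder_1, coder_2), 2))  # 0.48: below both thresholds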

References

  • Allwood, J., Cerrato, L., Jokinen, K., Navarretta, C., & Paggio, P. (2007). The MUMIN coding scheme for the annotation of feedback, turn management and sequencing phenomena. Language Resources and Evaluation, 41, 273–287.

  • Anderson, A., Bader, M., Bard, E., Boyle, E., Doherty, G. M., Garrod, S., et al. (1991). The HCRC map task corpus. Language and Speech, 34, 351–366.

  • Anderson, J. C., Linden, W., & Habra, M. E. (2005). The importance of examining blood pressure reactivity and recovery in anger provocation research. International Journal of Psychophysiology, 57, 159–163.

  • Artstein, R., & Poesio, M. (2008). Inter-coder agreement for computational linguistics. Computational Linguistics, 34, 555–596.

  • Bianchi, N., & Lisetti, C. L. (2002). Modeling multimodal expression of user’s affective subjective experience. User Modeling and User-Adapted Interaction, an International Journal, 12, 49–84.

  • Bradley, M. M., & Lang, P. J. (1994). Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25, 49–59.

  • Brown, W. M., & Moore, C., (2002). Smile asymmetries and reputation as reliable indicators of likelihood to cooperate: An evolutionary analysis. In S. P. Shohov (Ed.), Advances in psychology research (Vol. 11, pp. 59–78). New York: Nova Science Publishers.

  • Brown, W. M., Palameta, B., & Moore, C. (2003). Are there non-verbal cues to commitment? An exploratory study using the zero-acquaintance video presentation paradigm. Evolutionary Psychology, 1, 42–69.

  • Callejas, Z., & Lopez-Cozar, R. (2008). Influence of contextual information in emotion annotation for spoken dialogue systems. Speech Communication, 50, 416–433.

  • Carletta, J. (2007). Unleashing the killer corpus: experiences in creating the multi-everything AMI meeting corpus. Language Resources and Evaluation, 41, 181–190.

  • Cavicchio, F., & Poesio, M., (2008). Annotation of emotion in dialogue: The emotion in cooperation project. In Multimodal dialogue systems perception. Lecture notes in computer science (pp. 233–239). Heidelberg, Berlin: Springer.

  • Clark, H. H. (1996). Using language. Cambridge: Cambridge University Press.

  • Cowie, R., Douglas-Cowie, E., Savvidou, S., McMahon, E., Sawey, M., & Schroeder, M. (2000). FEELTRACE: An instrument for recording perceived emotion in real time. In ITRW on speech and emotion, September 5–7 (pp. 19–24). Newcastle, Northern Ireland.

  • Craggs, R., & Wood, M. (2004). A categorical annotation scheme for emotion in the linguistic content of dialogue. In Affective dialogue systems (pp. 89–100). Berlin, Heidelberg: Springer.

  • Davies, B. L. (1998). An empirical examination of cooperation, effort and risk in task-oriented dialogue. Unpublished Ph.D. thesis, University of Edinburgh.

  • Davies, B. L. (2006). Leeds working papers in linguistics and phonetics 11. http://www.leeds.ac.uk/linguistics/WPL/WP2006/2.pdf.

  • Dawkins, R. (1976). The selfish gene. New York: Oxford University Press.

  • Devillers, L., Vidrascu, L., & Lamel, L. (2005). Challenges in real-life emotion annotation and machine learning based detection. Neural Networks, 18, 407–422.

  • Duncan, S. (1972). Some signals and rules for taking speaking turns in conversations. Journal of Personality and Social Psychology, 23, 283–292.

  • Duncan, S. (1974). On the structure of speaker-auditor interaction during speaking turns. Language in Society, 3, 161–180.

  • Duncan, S., & Fiske, D. (1977). Face-to-face interaction. Hillsdale, NJ: Erlbaum.

  • Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, N.J.: Prentice-Hall.

  • Ekman, P., & Friesen, W. V. (1978). Facial action coding system: A technique for the measurement of facial movement. Palo Alto, California: Consulting Psychologists Press.

  • Ekman, P., & Friesen, W. V. (1982). Felt, false, and miserable smiles. Journal of Nonverbal Behaviour, 6, 238–252.

  • Fasel, B., & Luettin, J. (2003). Automatic facial expression analysis: A survey. Pattern Recognition, 36, 259–275.

  • Frank, R. (1988). Passions within reason: The strategic role of the emotions. New York: Norton.

  • Frank, M. G., & Ekman, P. (1993). Not all smiles are created equal: The differences between enjoyment and non enjoyment smiles. International Journal of Humor Research, 6, 9–26.

  • Goodwin, C. (1981). Conversational organization: Interaction between speakers and hearers. New York: NY Academic Press.

  • Grice, H. P. (1975). Logic and conversation. In P. Cole & J. L. Morgan (Eds.), Syntax and semantics, Vol. 3: Speech acts (pp. 41–58). New York: Academic Press.

  • Hamilton, W. D. (1964). The genetical evolution of social behavior. Journal of Theoretical Biology, 7, 17–52.

  • Hoque, M. E., el Kaliouby, R., & Picard, R. W. (2009). When human coders (and machines) disagree on the meaning of facial affect in spontaneous videos. In 9th international conference on intelligent virtual agents (IVA). Amsterdam, Netherlands.

  • Kipp, M. (2001). ANVIL—a generic annotation tool for multimodal dialogue. In Eurospeech 2001 Scandinavia 7th European conference on speech communication and technology.

  • Krumhuber, E., Manstead, A. S. R., Cosker, D., Marshall, D., Rosin, P. L., & Kappas, A. (2007). Facial dynamics as indicators of trustworthiness and cooperative behavior. Emotion, 7, 730–735.

  • Levinson, S. C. (2006). On the human “interaction engine”. In N. J. Enfield & S. C. Levinson (Eds.), Roots of human sociality: Culture, cognition and interaction (pp. 39–69). Oxford: Berg.

  • Magno Caldognetto, E., Poggi, I., Cosi, P., Cavicchio, F., & Merola, G. (2004). Multimodal score: An ANVIL-based annotation scheme for multimodal audio-video analysis. In J.-C. Martin, E. D. Os, P. Kühnlein, L. Boves, P. Paggio, & R. Catizone (Eds.), Proceedings of the workshop on multimodal corpora: Models of human behavior for the specification and evaluation of multimodal input and output interfaces (pp. 29–33).

  • Martin, J.-C., Caridakis, G., Devillers, L., Karpouzis, K., & Abrilian, S. (2006). Manual annotation and automatic image processing of multimodal emotional behaviors: Validating the annotation of TV interviews. In Fifth international conference on language resources and evaluation (LREC 2006). Genoa, Italy.

  • Matsumoto, D., Haan, N., Gary, Y., Theodorou, P., & Cooke-Carney, C. (1986). Preschoolers’ moral actions and emotions in prisoner’s dilemma. Developmental Psychology, 22, 663–670.

  • Mehu, M., Little, A. C., & Dunbar, R. I. M. (2007). Duchenne smiles and the perception of generosity and sociability in faces. Journal of Evolutionary Psychology, 5, 133–146.

  • Oda, R., Yamagata, N., Yabiku, Y., & Matsumoto-Oda, A. (2009). Altruism can be assessed correctly based on impression. Human Nature, 20(3), 331–341.

  • Pantic, M., & Rothkrantz, L. J. M. (2003). Toward an affect-sensitive multimodal human-computer interaction. Proceedings of the IEEE, 91, 1370–1390.

  • Pianesi, F., Leonardi, C., & Zancanaro, M. (2006). Multimodal annotated corpora of consensus decision making meetings. In J.-C. Martin, P. Kühnlein, P. Paggio, R. Stiefelhagen, & F. Pianesi (Eds.), Multimodal corpora: From multimodal behavior theories to usable models (pp. 6–19).

  • Picard, R. W. (1997). Affective computing. Cambridge: MIT Press.

  • Pillutla, M. M., & Murnighan, J. K. (1996). Unfairness, anger, and spite: Emotional rejections of ultimatum offers. Organizational Behavior and Human Decision Processes, 68, 208–224.

  • Poggi, I., & Vincze, L. (2008). The persuasive impact of gesture and gaze. In J.-C. Martin, P. Paggio, M. Kipp, & D. Heylen (Eds.), Multimodal corpora: From models of natural interaction to systems and applications (pp. 46–51). Berlin: Springer.

  • Rodríguez, K., Stefan, K. J., Dipper, S., Götze, M., Poesio, M., Riccardi, G., Raymond, C., & Wisniewska, J. (2007). Standoff coordination for multi-tool annotation in a dialogue corpus. In Proceedings of the linguistic annotation workshop at the ACL’07 (LAW-07), Prague, Czech Republic.

  • Sacks, H., Schegloff, E., & Jefferson, G. (1974). A simple systematics for the organization of turn-taking for conversation. Language, 50, 696–735.

  • Sanfey, A. G., Rilling, J. K., Aronson, J. A., Nystrom, L. E., & Cohen, J. D. (2003). The neural basis of economic decision-making in the Ultimatum Game. Science, 300, 1755–1758.

  • Scharlemann, J. P. W., Eckel, C. C., Kacelnik, A., & Wilson, R. K. (2001). The value of a smile: Game theory with a human face. Journal of Economic Psychology, 22, 617–640.

  • Siegel, S., & Castellan, N. J. (1988). Nonparametric statistics for the behavioral sciences. Oxford: McGraw-Hill.

  • Taylor, T. J., & Cameron, D. (1987). Analysing conversation: Rules and units in the structure of talk. Oxford: Pergamon.

  • Trivers, R. L. (1971). The evolution of reciprocal altruism. Quarterly Review of Biology, 46, 35–57.

  • Villon, O., & Lisetti, C. L. (2007). A user model of psycho-physiological measure of emotion. In 11th international conference on user modelling.

  • Wagner, J., Kim, J., & Andre, E. (2005). From physiological signals to emotions: Implementing and comparing selected methods for feature extraction and classification. In Multimedia and expo, ICME 2005 (pp. 940–943).

  • Xiao, E., & Houser, D. (2005). Emotion expression in human punishment behavior. Proceedings of the National Academy of Sciences, 102, 7398–7401.

  • Yngve, V. H. (1970). On getting a word in edgewise. In Papers from the sixth regional meeting of the Chicago Linguistic Society (pp. 567–577).

Acknowledgments

This research was supported by a Ph.D. studentship from the Department of Information Science and Engineering, Università di Trento.

Author information

Corresponding author

Correspondence to Federica Cavicchio.

About this article

Cite this article

Cavicchio, F., Poesio, M. The Rovereto Emotion and Cooperation Corpus: a new resource to investigate cooperation and emotions. Lang Resources & Evaluation 46, 117–130 (2012). https://doi.org/10.1007/s10579-011-9163-y
