An empirical study of machine learning techniques for affect recognition in human–robot interaction

Pattern Analysis and Applications · Theoretical Advances

Abstract

Given the importance of implicit communication in human interaction, it would be valuable to give robotic systems this capability, so that a robot can detect the motivations and emotions of the person it is working with. Recognizing affective states from physiological cues is an effective way of implementing implicit human–robot interaction. Several machine learning techniques have been successfully employed in affect recognition to predict the affective state of an individual given a set of physiological features. However, a systematic comparison of the strengths and weaknesses of these methods has not yet been done. In this paper, we present a comparative study of four machine learning methods, K-Nearest Neighbor (KNN), Regression Tree (RT), Bayesian Network, and Support Vector Machine (SVM), as applied to the domain of affect recognition using physiological signals. The results showed that SVM gave the best classification accuracy, although all the methods performed competitively. RT gave the next best classification accuracy and was the most space- and time-efficient.
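To make the comparison concrete, the sketch below shows one way such a four-way evaluation could be set up in Python with scikit-learn; it is a hypothetical illustration, not the authors' implementation. The feature matrix and labels are synthetic placeholders for extracted physiological features and affective-state annotations, a DecisionTreeClassifier stands in for the regression-tree (RT) classifier, and GaussianNB is used as a simplified substitute for a full Bayesian network.

```python
# Hypothetical sketch (not the paper's code): cross-validated comparison of the
# four classifier families on a synthetic stand-in for physiological features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier   # stand-in for the RT classifier
from sklearn.naive_bayes import GaussianNB        # simplified stand-in for a Bayesian network
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))      # placeholder: 12 physiological features per sample
y = rng.integers(0, 3, size=200)    # placeholder: 3 affective-state labels

classifiers = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RT":  DecisionTreeClassifier(max_depth=5),
    "BN":  GaussianNB(),
    "SVM": SVC(kernel="rbf", C=1.0),
}

# 5-fold cross-validated accuracy for each method, with feature standardization.
for name, clf in classifiers.items():
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")
```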


Author information

Correspondence to Pramila Rani.

Cite this article

Rani, P., Liu, C., Sarkar, N. et al. An empirical study of machine learning techniques for affect recognition in human–robot interaction. Pattern Anal Applic 9, 58–69 (2006). https://doi.org/10.1007/s10044-006-0025-y
