Emotional Postures for the Humanoid-Robot Nao

Abstract

This paper presents the development of emotional postures for the humanoid robot Nao. The approach adapts postures originally developed for a virtual human body model to the physical Nao robot. The paper describes the association between the joints of the human body model and the joints of Nao, and explains how the postures are transformed. The lack of one-to-one correspondence between the joints of the physical robot and those of the human body model was a major challenge in this work; in addition, implementing the postures on the robot was constrained by its physical structure and mass distribution. Postures for three emotions, anger, sadness, and happiness, are studied. Thirty-two postures are generated for each emotion, and the best five per emotion are selected based on the votes of twenty-five external observers. The distribution of the votes indicates that many of the implemented postures do not convey the intended emotions. The emotional content of the selected best five postures is then tested with the votes of forty observers; for each group of selected postures, the intended emotion receives the highest recognition rate. This study constitutes the last step of a general process for developing emotional postures for robots: the process starts with qualitative descriptions of human postures, continues with encoding those descriptions in quantitative terms, and ends with adapting the quantitative values to a specific robot.

Author information

Corresponding author

Correspondence to Mustafa Suphi Erden.

Additional information

This study was conducted while the author was a postdoctoral researcher at the Institute for Intelligent Systems and Robotics (ISIR), University Pierre et Marie Curie, Paris, and at the École Nationale Supérieure de Techniques Avancées (ENSTA), Paris, France. The author is currently with the LASA Laboratory at the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland.

About this article

Cite this article

Erden, M.S. Emotional Postures for the Humanoid-Robot Nao. Int J of Soc Robotics 5, 441–456 (2013). https://doi.org/10.1007/s12369-013-0200-4
