
Interaction of robot with humans by communicating simulated emotional states through expressive movements

  • Original Research Paper
  • Published in Intelligent Service Robotics

Abstract

This paper presents a non-verbal, non-facial method for effective communication by a “mechanoid robot”, conveying emotions through gestures. The research focuses on human–robot interaction using a mechanoid robot that possesses no anthropomorphic facial features for conveying expressions. A further feature of this work is the use of human-like smooth motion, in contrast to the traditional trapezoidal velocity profile, for the robot’s communication. To convey gestures, a connection between the robot’s motion and the emotions it is perceived to express is established by varying the velocity and acceleration of the mechanoid structure. The selected motion parameters are varied systematically to observe the resulting variation in perceived emotion. The perceived emotions are then investigated using three different emotional behavior models: Russell’s circumplex model of affect, the Tellegen–Watson–Clark model, and the PAD model. The results show that the selected motion parameters are linked to changes in perceived emotion. Moreover, the emotions perceived by users are the same across all three models, supporting the reliability of the three emotional scale models and of the emotions perceived by the users.
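The central manipulation described above is the replacement of a trapezoidal velocity profile with human-like smooth motion. As a minimal illustrative sketch (not the authors' implementation), the code below contrasts a trapezoidal profile with the bell-shaped velocity of a minimum-jerk trajectory, a standard model of smooth human point-to-point arm movement; the distance, duration, and equal-thirds phase split are assumptions chosen for illustration.

```python
import numpy as np

def trapezoidal_velocity(t, T, v_max):
    """Traditional trapezoidal profile: ramp up, cruise, ramp down.
    The equal-thirds phase split is an illustrative assumption."""
    t_a = T / 3.0
    if t < t_a:                        # constant-acceleration ramp up
        return v_max * t / t_a
    if t < T - t_a:                    # constant-velocity cruise
        return v_max
    if t <= T:                         # constant-deceleration ramp down
        return v_max * (T - t) / t_a
    return 0.0

def minimum_jerk_velocity(t, T, d):
    """Bell-shaped velocity of a minimum-jerk move of distance d over
    duration T: v(s) = (d/T) * (30 s^2 - 60 s^3 + 30 s^4), s = t/T."""
    s = t / T
    return (d / T) * (30 * s**2 - 60 * s**3 + 30 * s**4)

# Compare the two profiles for a 0.5 m motion lasting 2 s. Shrinking T
# (faster, more abrupt motion) is the kind of systematic parameter
# change the study relates to shifts in perceived emotion.
T, d = 2.0, 0.5
v_max = 3 * d / (2 * T)                # trapezoid covering the same distance
for t in np.linspace(0.0, T, 5):
    print(f"t={t:.1f}s  trapezoid={trapezoidal_velocity(t, T, v_max):.3f}"
          f"  min-jerk={minimum_jerk_velocity(t, T, d):.3f}")
```

The sketch shows only the kinematic contrast; the mapping from such motion profiles to points on the circumplex, Tellegen–Watson–Clark, and PAD scales is established empirically in the paper through participant ratings.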



Author information

Correspondence to Sara Baber Sial.

About this article

Cite this article

Sial, S.B., Sial, M.B., Ayaz, Y. et al. Interaction of robot with humans by communicating simulated emotional states through expressive movements. Intel Serv Robotics 9, 231–255 (2016). https://doi.org/10.1007/s11370-016-0199-0

