Open Access. Published by De Gruyter, July 25, 2018.

How does the robot feel? Perception of valence and arousal in emotional body language

Mina Marmpena, Angelica Lim and Torbjørn S. Dahl

Abstract

Human-robot interaction in social robotics applications could be greatly enhanced by robotic behaviors that incorporate emotional body language. Using as our starting point a set of pre-designed, emotion-conveying animations created by professional animators for the Pepper robot, we seek to explore how humans perceive their affective content, and to increase their usability by annotating them with reliable labels of valence and arousal in a continuous interval space. We conducted an experiment with 20 participants who were presented with the animations and rated them in the two-dimensional affect space. An inter-rater reliability analysis was applied to support the aggregation of the ratings for deriving the final labels. The set of emotional body language animations with the valence and arousal labels is available and can potentially be useful to other researchers as a ground truth for behavioral experiments on robotic expression of emotion, or for the automatic selection of robotic emotional behaviors with respect to valence and arousal. To further utilize the collected data, we analyzed it with an exploratory approach and present some trends in the human perception of Pepper's emotional body language that might be worth further investigation.
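The abstract only outlines the labelling procedure, so the sketch below illustrates one plausible way to aggregate per-participant valence and arousal ratings into a single label per animation and to check inter-rater reliability with Cronbach's alpha, one of the coefficients commonly used for this purpose. This is an illustrative sketch under stated assumptions, not the authors' analysis code; the column names `animation`, `participant`, `valence` and `arousal` are hypothetical.

```python
# Minimal sketch of rating aggregation with an inter-rater reliability check.
# Assumes a long-format table with hypothetical columns:
#   animation, participant, valence, arousal
import pandas as pd

def cronbach_alpha(ratings: pd.DataFrame) -> float:
    """Cronbach's alpha for a wide matrix: one row per animation, one column per rater."""
    k = ratings.shape[1]                      # number of raters
    item_vars = ratings.var(axis=0, ddof=1)   # variance of each rater's scores
    total_var = ratings.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def derive_labels(long_df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate individual ratings into one (valence, arousal) label per animation."""
    labels = long_df.groupby("animation")[["valence", "arousal"]].mean()
    for dim in ("valence", "arousal"):
        wide = long_df.pivot(index="animation", columns="participant", values=dim)
        print(f"Cronbach's alpha ({dim}): {cronbach_alpha(wide):.2f}")
    return labels
```

Given such a table, `derive_labels(df)` would report the reliability coefficient per dimension and return one mean (valence, arousal) pair per animation, which is the kind of label set the abstract describes.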

Received: 2017-11-30
Accepted: 2018-05-29
Published Online: 2018-07-25

© Mina Marmpena et al.

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
