
Recognition of a Robot’s Affective Expressions Under Conditions with Limited Visibility

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12934)

Abstract

The capability to show affective expressions is important for the design of social robots in many contexts in which the robot is meant to communicate with humans. It is reasonable to expect that, as with all other interaction modalities, communicating through affective expressions has its limitations. In this paper, we present two online video studies (with 72 and 50 participants, respectively) that investigate whether, and to what extent, recognition of a zoomorphic robot’s affective displays is affected by different levels of visibility. Recognition of five affective expressions was studied under five visibility effects, whose intensity was more pronounced in the second experiment. While the visual constraints affected recognition, our results showed that the robot’s affective displays, conveyed through its head and body motions, can be robust: recognition rates remained high even under severe visibility constraints. These findings support the effectiveness of affective displays as a complementary communication modality in human-robot interaction under visibility constraints, e.g., for older users with visual impairments, or in outdoor scenarios such as search and rescue.


Notes

  1. Note that we had originally planned to conduct both studies as in-person experiments, but this was not possible due to COVID-19, so we moved the studies online.

  2. A combination of widely available video editing software (namely, iMovie, Wondershare Filmora, and HitFilm Express) was used to create the video effects; the choice of software depended on the effect. Please contact the authors if you are interested in seeing the videos. (A sketch of how comparable effects could be scripted appears after these notes.)

  3. http://app.visgraf.impa.br/database/faces/.

  4. Please contact the authors if you are interested in seeing the videos.
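
For readers who want a reproducible approximation of such visibility effects, the following minimal sketch shows how darkening, blur, and a fog-like overlay could be applied to a stimulus video with Python and OpenCV. The effect names, intensity values, and file names are illustrative assumptions; they are not the parameters or tools used to create the original stimuli.

    # Hypothetical sketch: approximating reduced-visibility effects on a
    # stimulus video with OpenCV. Effects and intensities are illustrative;
    # the paper's stimuli were produced with GUI video editors.
    import cv2
    import numpy as np

    def degrade_frame(frame, effect="fog", intensity=0.6):
        # Apply one simulated visibility constraint to a BGR frame.
        if effect == "darkness":
            # Scale pixel intensities down to mimic low light.
            return cv2.convertScaleAbs(frame, alpha=1.0 - intensity, beta=0)
        if effect == "blur":
            # Gaussian blur; kernel size grows with intensity and must be odd.
            k = 2 * int(1 + 10 * intensity) + 1
            return cv2.GaussianBlur(frame, (k, k), 0)
        if effect == "fog":
            # Blend the frame toward a uniform light-gray layer.
            fog = np.full_like(frame, 220)
            return cv2.addWeighted(frame, 1.0 - intensity, fog, intensity, 0)
        return frame

    def process_video(src_path, dst_path, effect, intensity):
        cap = cv2.VideoCapture(src_path)
        fps = cap.get(cv2.CAP_PROP_FPS)
        width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
        height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
        out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"),
                              fps, (width, height))
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            out.write(degrade_frame(frame, effect, intensity))
        cap.release()
        out.release()

    # Hypothetical usage: a strongly fogged version of one stimulus video.
    # process_video("robot_happy.mp4", "robot_happy_fog.mp4", "fog", 0.8)

Scripting the effects this way makes the degradation level a single numeric parameter, which would allow the two intensity conditions of the experiments to be regenerated exactly.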


Acknowledgment

This research was undertaken, in part, thanks to funding from the Canada 150 Research Chairs Program and funding from the Network for Aging Research at the University of Waterloo. We would like to thank the members of the Social and Intelligent Robotics Research Laboratory (SIRRL) at the University of Waterloo for their comments on the video effects.

Author information

Correspondence to Moojan Ghafurian.


Copyright information

© 2021 IFIP International Federation for Information Processing

About this paper


Cite this paper

Ghafurian, M., Akgun, S.A., Crowley, M., Dautenhahn, K. (2021). Recognition of a Robot’s Affective Expressions Under Conditions with Limited Visibility. In: Ardito, C., et al. (eds.) Human-Computer Interaction – INTERACT 2021. Lecture Notes in Computer Science, vol 12934. Springer, Cham. https://doi.org/10.1007/978-3-030-85613-7_31


  • DOI: https://doi.org/10.1007/978-3-030-85613-7_31

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85612-0

  • Online ISBN: 978-3-030-85613-7

  • eBook Packages: Computer Science, Computer Science (R0)
