DOI: 10.1145/3638380.3638408
OzCHI Conference Proceedings · Short Paper

Designing Emotional Expressions of Autonomous Vehicles for Communication with Pedestrians in Urban Shared Spaces: Use Cases, Modalities, and Considerations

Published: 10 May 2024

ABSTRACT

Autonomous vehicles (AVs) in urban environments are increasingly equipped with external human-machine interfaces (eHMIs) to interact with nearby pedestrians, addressing safety and social needs that arise from the absence of driver-pedestrian interaction such as eye contact and hand waves. In pedestrian-vehicle shared spaces in particular, communication strategies that support social interaction, such as emotional expression, have the potential to improve AV-pedestrian interaction. Emotional expression has been investigated in human-robot interaction but has thus far not been explored as a communication strategy for AVs. To support the integration of AVs into urban areas, especially spaces dominated by pedestrians and shared with AVs, we investigate emotional expressions of AVs as a communication strategy through a focus group study with twelve domain experts from Human-Computer Interaction, User Experience/User Interface Design, and Intelligent Transportation Systems, who collaboratively devised use cases, modalities of emotional expression, and design considerations. The findings contribute to the design of external communication strategies for AVs in urban shared spaces and highlight avenues for future research.

