ABSTRACT
Fully automated vehicles (SAE Level 5) will eliminate the need for a human driver, allowing passengers to focus on non-driving activities, such as those provided by infotainment services. There has been previous research on the use of non-driving-related content on windshield displays. However, the placement area is limited and the content is likely to be viewable only from a single viewpoint (perspective issues). Therefore, we propose to detach content from the windshield and instead use an augmented reality space to provide infotainment content to passengers. In a within-subject virtual reality user study (N=19), we examined how front-seat passengers would place infotainment content windows (and with what properties) in the open space using controllers while seated in a fully automated vehicle (within-subject factor: seating position, driver’s or front passenger’s seat). We also looked at the similarities and differences based on seating position (left or right). We found that most content was not placed on top of the windows/windshield, with the exception of video content, which raises the question of whether windshield displays are the right medium for fully automated vehicles. In addition, between 40% and 63% of content was placed in mirrored positions when comparing the two seating positions.
Index Terms
- Detaching from the Windshield: Augmented Reality Interfaces for Infotainment in Fully Automated Vehicles