Are highly automated vehicles as useful as dishwashers?

Abstract Due to technological improvements, the vehicle of the future is expected to be autonomous. However, vehicle automation is a matter not only of technology but also of the human behind the wheel. Highly automated vehicles have an impact on drivers’ cognitive processes and these should be considered carefully before introducing automation. It is argued here that automation should not be implemented based solely on what is technologically possible. Instead, human-machine cooperation should be considered and automation adapted accordingly.


Introduction
Despite being a subject of interest for several decades, at least in the robotics community (e.g. Moravec, 1985), it is only in recent years that autonomous vehicles have become the subject of special attention. The trend toward ever greater vehicle automation is exemplified by the service provider Google (e.g. Thrun et al., 2006) and has been very extensively reported in the media. The main benefits of autonomous vehicles are said to lie in their potential to reduce pollution, ease congestion, free up urban space (e.g. Alkim, Bootsma, & Hoogendoorn, 2007) and, above all, dramatically reduce human errors and consequently improve safety, given that human error is the leading cause of traffic accidents (e.g. Brookhuis, De Waard, & Janssen, 2001; Dingus et al., 2006). This paper adopts a metaphorical comparison with a widely used autonomous tool, the dishwasher. This comparison was chosen deliberately in order to provoke a debate on the role of humans in connection with highly automated vehicles. We do not question the immense usefulness of vehicle automation itself, and our aim is clearly not to produce a direct comparison between dishwashers and vehicle automation. Instead, we want to illustrate the need to consider drivers even in a context of ever-increasing automation.

ABOUT THE AUTHOR
Jordan Navarro is a lecturer in psychology and cognitive sciences at the University of Lyon and has been appointed a junior member of the Institut Universitaire de France. He was awarded a PhD in cognitive ergonomics from the University of Nantes in 2008 and spent a year as a research fellow at Monash University. His research interests include human-machine cooperation in driving as well as the analysis of driving behavior and visual strategies.

PUBLIC INTEREST STATEMENT
This contribution offers an overview of known psychological phenomena induced by the use of highly automated machines and their impact on "drivers" of highly automated vehicles. The focus is on function delegation automation, and key questions already addressed by empirical data are synthesized. New research directions are also proposed in order to consider humans more directly in human-machine interactions.
The aim of the current article is to discuss the impact of highly automated vehicles on drivers from a technological perspective (section 2) and a psychological perspective (section 3). We then propose new research directions (section 4) before coming to our conclusion (section 5).

Highly automated vehicles: A technological perspective
The driver's psychology involves both physical and cognitive processes. These processes and associated outcomes cause errors that would ideally be eliminated by vehicle automation. Automatic tools are mostly developed because it is technologically possible to do so rather than because their development is necessary in order to meet users' needs (Hancock, 2014). The case of automated vehicles is consistent with this general trend and automation implementation is driven by technology. Evidence of this can be seen in the fact that the chronology of implementation of automation solutions has followed the sequence of technological advances (e.g. Volvo, 2017).
Indeed, the world's three most influential authorities with regard to driving automation have all classified automated driving tools on the basis of current technological advances without paying much attention to drivers (German federal highway administration (BASt): Gasser & Westhoff, 2012; National Highway Traffic Safety Administration (NHTSA), 2013; Society of Automotive Engineers (SAE): SAE International, 2014). In these classifications, drivers are only mentioned in the context of technological capabilities and in order to indicate the role they have to play depending on the level considered. Despite some differences, the three classifications are consistent with one another (see Table A1 for a comparison). In our opinion, similar criteria have been used to elaborate the three classifications. The main criterion is (1) the automated tasks: which of the various driving subtasks are automated? The other two criteria are (2) the capabilities of the automation solution in the traffic environment in which it might be used, and (3) the limits of automation: are drivers expected to manage the situation if the automation solution fails?
The driver and the associated human-machine interactions are not directly considered in these classifications (see Navarro, 2017 for more details). The driver is not completely excluded but simply considered as the agent in the human-machine team who is in charge of the operations that automation is not able to deal with. This becomes even clearer when we consider that highly automated vehicles only process information in the environment outside the vehicle, but no information about what is occurring inside the vehicle.
To continue the comparison, highly automated vehicles have not yet reached the same level of autonomy as dishwashers. In France, for instance, PSA became, in 2017, the first car manufacturer allowed to carry out open-road autonomous driving experiments. These experiments will be continued in 2018 in order to investigate "automatic driving functions under driver supervision". This corresponds to a vehicle that takes over lateral and longitudinal control on motorways, that parks itself, and that identifies and augments the visibility of obstacles at night (Potter, 1999). Fully autonomous vehicles, corresponding to the last level of all three classifications presented in Table A1, are expected in the coming years.
In the early 1980s, Hollnagel and Woods (1983) proposed that human-machine relations should be considered together and as a system. This is indeed the only approach that makes it possible to understand and adequately describe human-machine interactions as a whole. Following this idea, the concept of human-machine cooperation was introduced (Hoc, 2000, 2001) and applied to driving automation (Hoc, Young, & Blosseville, 2009; Navarro, Mars, & Young, 2011). Within this framework, each automation solution is classified in the light of the associated human-machine cooperative activities (i.e. human-machine cooperation modes) rather than of the capabilities of the automation solution alone (Navarro, 2017; Navarro et al., 2011). These classifications are of interest because humans are considered a key element of the human-machine system. From the human perspective, it is not only the techno-"logical" but also the psycho-"logical" elements that are considered.

Highly automated vehicles: A psychological perspective
From a technological perspective, the more automation that can be introduced in a vehicle, the better. This explains the current race toward ever-increasing vehicle automation. But does the same rationale hold for drivers' psychology?
Highly automated vehicles tend to replace drivers in a variety of driving sub-tasks, in particular those involving lateral and longitudinal control (see Table A1). In terms of human-machine cooperation mode, this is referred to as function delegation (Navarro, 2017). The driver no longer controls the vehicle directly but is expected to monitor automation and/or take over control if necessary (SAE levels 2 and 3: "partial automation" and "conditional automation"; BASt levels 3 and 4: "partially automated" and "highly automated"; and NHTSA levels 2 and 3: "combined function automation" and "limited self-driving automation"). Can humans effectively monitor automation and take over control if required?
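To make the correspondence between the three classification systems concrete, the function-delegation levels above can be sketched as a small lookup structure. This is purely an illustrative encoding of the level names and driver roles summarized in Table A1, not an official API or data format of any of the three authorities; the `driver_role` helper is a hypothetical convenience function.

```python
# Illustrative cross-reference of the function-delegation levels of automation
# (keyed by SAE level), with the corresponding NHTSA and BASt level names and
# the driver's expected role. Level names and roles follow Table A1.
FUNCTION_DELEGATION_LEVELS = {
    2: {
        "SAE": "partial automation",
        "NHTSA": "combined function automation",
        "BASt": "partially automated",
        "driver_role": "monitor automation and be prepared to take over complete control at any time",
    },
    3: {
        "SAE": "conditional automation",
        "NHTSA": "limited self-driving automation",
        "BASt": "highly automated",
        "driver_role": "take over complete control when requested, after a certain time lag",
    },
}

def driver_role(sae_level: int) -> str:
    """Return the driver's expected role for a given SAE function-delegation level."""
    if sae_level not in FUNCTION_DELEGATION_LEVELS:
        raise ValueError(f"SAE level {sae_level} is not a function-delegation level")
    return FUNCTION_DELEGATION_LEVELS[sae_level]["driver_role"]

print(driver_role(2))
```

The point the encoding makes visible is that at every function-delegation level, across all three systems, the driver retains a supervisory or fallback role; none of these levels removes the human from the loop entirely.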

General psychological insights
Researchers have identified several psychological concepts that are relevant to this question. Below, we examine how the concepts of human vigilance, complacency, automation bias and out-of-the-loop behavior enrich the debate on highly automated vehicles.
Psychological research into human vigilance, i.e. the ability to sustain attention on a given task, started with the seminal work carried out by Mackworth (1948). Participants were asked to detect a double movement of a clock hand that moved every second. This work provided the first experimental demonstration that human monitoring performance (i.e. the efficiency of double-movement detection) decreases after a certain time, equal to about 30 minutes in Mackworth's experiment. Much research has been conducted on human vigilance since Mackworth's first study, and many factors are known to influence monitoring performance, such as signal frequency, stimulus frequency, signal magnitude, signal distribution and task complexity. Taken together, studies on vigilance show that humans are not very efficient at maintaining their attention in order to detect events, especially after long periods of inactivity (Parasuraman, 1979; see Warm, Parasuraman, & Matthews, 2008, for a review). This is a situation that drivers of highly automated vehicles would certainly face, since automation is expected to be reliable most of the time.
Another observation associated with the vigilance issue is that while monitoring automation, human operators tend to be complacent. Although there is no agreement on a definition of complacency, it could be defined succinctly as "a psychological state characterized by a low index of suspicion" (Wiener, 1981, p. 119). Researchers agree on three of the features of complacency: (1) it emerges when humans monitor automation, (2) the monitoring frequency is lower than optimal, and (3) it results in a fall-off in performance (Parasuraman & Manzey, 2010). A complacent human considers that "all is well" (Parasuraman & Manzey, 2010) while in fact "all is not well".
Automation bias, or "the tendency to use automated cues as a heuristic replacement for vigilant information seeking and processing" (Mosier & Skitka, 1996, p. 205), may also be induced by automation use. Humans have often been found to underutilize (disuse) or rely too heavily on (misuse) automated systems (Parasuraman & Riley, 1997). Misuse refers to human overreliance on automation. An automation failure is problematic if the human trusts the automation to identify and manage all possible events. In this way, overreliance reinforces complacency. Disuse is the neglect of automation. It may be caused by automation inaccuracies, but could also be linked to the human tendency to perform even mundane tasks manually rather than delegate them completely to a more effective automation solution (Navarro & Osiurak, 2015; Osiurak, Wagner, Djerbi, & Navarro, 2013).
The introduction of highly automated machines is also known to result in the out-of-the-loop performance problem (Endsley & Kaber, 1999; Endsley & Kiris, 1995; Kaber & Endsley, 1997). When the machine performs the majority of usual human operations, the human is, to a certain extent, removed from the continuous direct control of the situation. This translates into performance impairments due to a loss of situation awareness, a loss of skills, and a shift from active to passive information processing. This negative psychological effect exacerbates the previously mentioned undesired effects resulting from the introduction of highly automated equipment.
In sum, the available psychological knowledge indicates that monitoring a highly automated machine is a very challenging situation for humans and a source of skill-based errors as well as of rule-based and knowledge-based mistakes (according to Reason's, 1990, classification). The following section deals with the data specifically available regarding vehicle automation.

Specific insights regarding vehicle automation
The growing interest shown in highly automated vehicles has given rise to a number of experiments conducted to examine human and ergonomic factors, in particular after 2010. A special issue of Human Factors devoted to "Human Factors and Automation in Vehicles" was published in 2012. This contained ten articles intended to contribute to the design of "Highly Automated Vehicles With the Driver in Mind". The specific findings were consistent with the general psychological insights presented above. Highly automated vehicles tend to slow down drivers' responses and cause difficulties when they are required to take over control from automation (Merat & Lee, 2012). This led to several further experiments focusing on the transition between highly automated and manual driving. The difficulties faced by drivers when required to take over control from automation have been confirmed and explained in terms of changes in drivers' visual exploration of the driving environment (e.g. Merat, Jamson, Lai, Daly, & Carsten, 2014; Navarro, François, & Mars, 2016). These difficulties can be related to vigilance, complacency, automation bias and the out-of-the-loop phenomenon as described above.
In sum, from the human perspective, monitoring a highly automated vehicle is a different task from driving manually. The experimental data collected in the specific context of vehicle automation are consistent with the general psychological insights. While an increase in the level of automation translates into improved performance, it does not directly result in a simplification of drivers' tasks. Thus, from a human perspective, it can be said that using a highly automated vehicle is a more complicated task than using a dishwasher. Of course, driving is in itself a more complicated task than washing dishes. However, in its current state, and unlike the case of the dishwasher, vehicle automation requires drivers to undertake an automation supervision task they are not always ideally equipped to perform.

Heading toward new research directions
Extensive literature exists on the range of psychological difficulties faced by humans when they have to monitor and/or take over from automation (see Lu, Happee, Cabrall, Kyriakidis, & de Winter, 2016, and Navarro, 2018, for recent reviews of vehicle automation experiments). Given these psychological observations, the whole concept of delegating functions to automation under human supervision is an awkward one. After all, would you be enthusiastic if you had to monitor your dishwasher and/or take over cleaning duties in the case of a malfunction? Would you even think of designing an automation solution of this type?
We believe that this imperfect approach to the way functions are delegated to automation is related to a common misunderstanding of all the tools with which we interact (Osiurak, 2014). Even sophisticated tools, such as vehicle automation, tend to be considered for what they are. This approach is flawed because all tool use is dependent on user-specific, limited and temporary needs. Consequently, any automation solution can be diverted from its intended use to better match the user's needs. It is not possible to imagine all use cases in full before an automation solution enters real-life use. It is clear, however, that the imperfect delegation of functions to automation puts humans in difficult situations that give rise to significant problems and safety issues.
Consequently, research into human-machine cooperation should be oriented toward other, lower levels of automation (e.g. warnings, haptic shared control) or higher levels of automation that do not require any human supervision. More generally, vehicle automation should be conceived of and designed with its possible consequences for the driver in mind. In other words, technology (associated with the machine) and psychology (associated with the driver) should be accorded equal importance during the entire automation design process. This type of systemic (human-machine) approach could prevent some of the ironies of automation (Bainbridge, 1983).
The advent of autonomous vehicles also raises new questions. If humans and tools are considered to interfere with one another in a bidirectional way (Gould, 1987), then it is clear that tools shape us, just as we shape them (Hancock, 2007). Consequently, the way humans are defined should also be reconsidered when we consider completely autonomous tools.

Conclusion
While dishwashers are extremely useful to many of us on a daily basis, autonomous vehicles might be expected to be even more useful for a variety of purposes. The purpose of this comparison was to point out that the statement "the higher the level of automation, the better" is not necessarily correct from the user's perspective. If highly automated vehicles require supervision by drivers and leave them responsible, then they cause more difficulties to users than dishwashers do. It is, however, assumed that autonomous vehicles (i.e. vehicles in which automation is responsible for driving and has the ability, the authority and the control, as defined by Flemisch et al., 2012) will solve these difficulties.
Technological advances offer fantastic possibilities and promising new avenues in many different areas. Vehicle automation is one of these. But it is high time to consider the human as a significant part of the human-machine system. If we fail to do so, the relationships between humans and tools will never be completely understood and optimized. In the case of vehicle automation, failing to do so would compromise the safety of drivers and other road users.
The advent of autonomous vehicles might radically change how we consider mobility. In the past, most people considered driving their own car to be a source of pleasure (e.g. Hagman, 2010), associated with the feeling of being at home (Dubois & Moch, 2006), and not only an effective means of transportation. In the future, vehicles may be considered primarily as a means of transportation that controls every aspect of the journey. In consequence, personal vehicles would no longer be necessary and could be replaced by autonomous vehicles managed, for instance, at municipal level. Simulations have been undertaken in order to anticipate this possibility (e.g. Lima Azevedo et al., 2016; Marczuk, Soh, Azevedo, Lee, & Frazzoli, 2016). We should now ask ourselves what role humans will have to play in such a vision of mobility and, by extension, of society.

Table A1. Summary comparison of the three classification systems proposed by the Society of Automotive Engineers (SAE), the National Highway Traffic Safety Administration (NHTSA) and the German federal highway administration (BASt). The levels of automation are presented in rows and the three classifications in columns, with the name associated with each level in bold. For the SAE classification: in levels 0 to 2 (light gray), the driver monitors the driving environment, whereas in levels 3 to 5 (dark gray), automation monitors the driving environment.

Level 0
SAE — No automation: All tasks performed by the human driver alone or with warnings or intervention systems
NHTSA — No automation: All tasks performed by the human driver alone or with warnings
BASt — Driver only: All tasks performed by the human driver without any assistance

Level 1
SAE — Driver assistance: A specific control function executed by the assistance system (steering OR acceleration/deceleration)
NHTSA — Function-specific automation: Automation may assist or augment the driver in operating either steering OR acceleration/deceleration controls
BASt — Assisted: Automation participates in either lateral OR longitudinal control. The driver must monitor automation and be prepared to take over complete control at any time

Level 2
SAE — Partial automation: A specific control function executed by the assistance system (steering AND acceleration/deceleration)
NHTSA — Combined function automation: Automation relieves drivers of at least two primary control functions (e.g. steering AND acceleration/deceleration controls). Physical disengagement of the driver is allowed, with both hands off the steering wheel AND foot off the pedals
BASt — Partially automated: Automation takes over lateral AND longitudinal control under certain conditions. The driver must monitor automation and be prepared to take over complete control at any time

Level 3
SAE — Conditional automation: Automated conduct of all aspects of driving WITH the driver expected to intervene when requested
NHTSA — Limited self-driving automation: Automation assumes full control of all safety-critical functions under certain conditions. In addition to physical disengagement, the driver is not expected to constantly monitor the roadway
BASt — Highly automated: Automation takes over lateral and longitudinal control under certain conditions. The driver is not required to permanently monitor automation and must take over complete control after a certain time lag