Law and Order: Investigating the Effects of Conflictual Situations in Manual and Automated Driving in a German Sample

Minor violations of traffic regulations are common today and partially socially accepted. Automated vehicles (AVs), however, will be obliged to keep to the letter of the law. This can lead to situations where user requests cause the AV to reach its legal boundaries, creating novel user-vehicle conflicts. To investigate whether traffic-violating driver interests are transferred to the automated context, we conducted an online survey with three conflict-prone scenarios (N=49). The results indicate that legally compliant AV behavior is desired but that users would intervene in the vehicle’s behavior to enforce their interests. In a subsequent Virtual Reality study (N=30), we evaluated the effect of legal boundary-handling strategies (Responsibility and Control Shift, Responsibility Shift, No Shift) and of other traffic participants violating traffic regulations on behavior, conflict, and trust in a legally conflict-prone parking scenario. Results show that conflict is perceived as significantly higher in all strategies compared to the manual baseline, while situational trust in the vehicle is higher in the automated conditions but independent of the handling strategy.


Introduction
People are complex beings whose behavior and decisions depend on various factors such as descriptive and subjective norms or attitude (Forward, 2006). In manual driving, this frequently leads to aberrant behaviors that seem irrational (Chung, 2015), such as violating traffic regulations. Those violations can range from minor offenses, such as parking without a ticket, to offenses that can lead to safety-critical situations or even accidents, such as failing to maintain safe distances, speeding, or spontaneously stopping in no-stopping areas. According to the German Kraftfahrt-Bundesamt (KBA), in 2022 alone, 4,137,831 traffic offenses were penalized (Kraftfahrt-Bundesamt, 2022b), which includes both minor and serious offenses.
Meanwhile, the number of automated driving functions, such as driver assistance systems, steadily increases, and their capabilities improve. The Society of Automotive Engineers (SAE) categorized the extent of vehicle automation into levels ranging from 0 (no automation) to 5 (full automation) (Standard J3016_202104). At Level 0, drivers control all tasks; at Level 1, systems assist with either steering or acceleration/deceleration, and at Level 2 with both, but drivers remain in control. Level 3 introduces the possibility for vehicles to take over control from the driver, i.e., they have the capability to drive automated within a defined Operational Design Domain (ODD). However, the driver must be ready to intervene when requested. At Level 4, vehicles can perform all driving tasks within the ODD without driver intervention, and at Level 5, the vehicle can drive fully automated under any conditions without driver intervention.
Automated vehicles (AVs) (i.e., vehicles of SAE Levels 3-5) develop their own goals and interests (Flemisch et al., 2020), which are subordinated to the greater goal of making road traffic safer and more efficient. True to the adage with great power comes great responsibility (Lee, 1962), this power leads to responsibility towards other road users, such as manual drivers but also pedestrians (Holländer et al., 2021). It can, therefore, be assumed that AVs will behave rationally and in accordance with the law. This may cause drivers to perceive law-abiding AV behavior to be against their interests (e.g., the driver would prefer the AV to drive faster), potentially making them feel disadvantaged relative to other manual drivers with aberrant driving behavior.
Independent of this, it is expected that situations will occur where drivers may want to spontaneously intervene in an AV's behavior without any safety-driven reasons (Wang, 2019; Tscharn et al., 2017). Thus, the automotive industry considers the possibility of taking control in future concepts, even though it would be technically possible to take the driver completely out of the loop (i.e., at SAE Level 4 or 5; e.g., see BMW's Vision M NEXT (BMW Group)). Besides possible driver-initiated takeovers, researchers concurrently examine strategies that allow drivers to spontaneously intervene without having to take over the whole driving task, for example, through speech or multimodal approaches (e.g., speech and gesture) (Wang, 2019; Tscharn et al., 2017).
However, this opportunity could also lead to drivers making requests that violate traffic regulations (e.g., requesting the AV to drive faster than allowed or to stop in an absolute no-stopping zone to pick up a friend). This would inevitably lead to AVs reaching legal boundaries. In this case, the driver and the AV pursue different interests, which can lead to novel driver-vehicle conflicts. It is therefore necessary to examine which scenarios could lead to such requests, how the AV could handle such situations on a driving-task level, and how this affects the driver. Thus, we investigated the following research questions (RQs): RQ 1: What is the attitude of drivers towards law-abiding vehicles in conflict-prone situations? RQ 2: How does a vehicle's legal boundary-handling strategy affect user behavior, perceived conflict, and situational trust in legally conflict-prone situations?
To address the RQs, we conducted two distinct studies. An online survey with 49 German participants investigated whether traffic-violating driver interests carry over from manual to automated driving. To this end, we assessed drivers' attitudes towards law-abiding AVs in three mixed-traffic scenarios in which the law-abiding behavior runs against the driver's goals or is disadvantageous relative to other manual drivers with aberrant driving behavior. Results show that in the considered scenarios, most participants desire legally compliant AV behavior. Nevertheless, a conflict was perceived, and most of the participants stated that they would intervene in the AV's behavior to pursue their interests. In a Virtual Reality (VR) simulator study with 30 German participants, we then explored three AV strategies to handle legal boundaries on the driving-task level and compared them against manual driving, with and without legal assistance. We assessed how those strategies affect the user's behavior, perceived conflict, and situational trust in a parking scenario, which was identified as a conflict-prone situation in the online survey. We further examined how the presence of other vehicles violating traffic regulations affects these outcomes.
The results indicate a heightened potential for conflict when AVs strictly adhere to legal boundaries and reveal that users are inclined to override AV controls to commit traffic violations, corresponding with the findings in the online study. Despite this, our results suggest no significant difference in conflict levels or trust across the different AV legal boundary-handling strategies.
In summary, we make the following contributions: We first provide empirical insights into user attitudes toward law-abiding AVs, suggesting that conflicting legal interests will arise that are likely to lead to traffic-violating AV interventions. Second, we derive possible handling strategies for legal boundaries that AVs reach in the case of traffic-violating user interventions. Third, by empirically examining these handling strategies, we enhance the understanding of user-AV interactions in scenarios that are prone to legal conflicts.

Background
First, we introduce work on traffic-violating driver behavior in manual driving. Second, we present previous work on conflicts of interest and their emergence in the AV context. This is followed by an overview of research on trust.

Driver Behavior and Road Traffic Offenses
The German KBA provides a register of driver fitness (FAER), where "legally binding decisions on traffic violations and driving license measures are stored" (Kraftfahrt-Bundesamt, 2022a). The most common traffic violations include parking (Fidelsberger), distance, and speeding violations (Kraftfahrt-Bundesamt, 2022a).
In examining the motivations of drivers to commit traffic violations, the Theory of Planned Behavior (TPB) provides insight into the behavioral intentions of individuals (Forward, 2006, 2009). In this theory, it is considered that a "person's intention to perform a certain [behavior] [is] determined by attitude, subjective norm and perceived [behavioral] control" (Forward, 2006, p. 413). Parker et al. (1992) were able to explain people's unlawful behavior based on the TPB for four driving violations: (1) drinking and driving, (2) speeding, (3) close following, and (4) overtaking in risky circumstances. Connolly and Åberg (1993) investigated social effects on speeding behavior, proposing the speeding behavior of directly surrounding vehicles as an influencing factor. McNabb et al. (2017) additionally showed that drivers tend to engage in riskier driving behavior when they deliberately follow another vehicle. Lheureux et al. (2016) showed that besides TPB constructs, habits directly impact offending behaviors (i.e., speeding and driving under the influence of alcohol). In some cases, regulatory violations can also result from inattention or distraction (Wundersitz, 2019).
In summary, previous work demonstrates that drivers occasionally behave contrary to traffic regulations. Yet, what remains completely unexamined is whether the underlying traffic-violating interests translate from the manual to the automated context and affect the attitude towards law-abiding AVs.
Conflicts of Interest

In previous work, several definitions of conflict emerged, agreeing that a conflict results from one or more agents pursuing different interests. The disagreement may also concern, for instance, needs, ideas, beliefs, values, goals, or performance strategies (Thakore; DeChurch et al., 2013; Tessier and Dehais, 2012). In the automotive context, interests concerning the driving process can conflict between AVs and passengers. Flemisch et al. (2014) break down the driving process into the levels of navigation (i.e., route planning), maneuver (e.g., overtaking, left/right turns), trajectory (i.e., path-taking and timing, e.g., speeding up, taking a tighter turn in a curve), and control (i.e., being in control of the vehicle dynamics, e.g., braking, controlling lateral/longitudinal acceleration). Interests lead to action intentions (expectations/wishes on how the vehicle should behave) concerning one or more of these driving process levels. Conflicts occur if there is a difference between the passenger's desired output and the AV's expected behavior (Huang et al., 2020).
Various factors can play a role in the emergence of conflict in the interdependence between AVs and users. Examples include (1) a lack of shared situation awareness: the AV and the user perceive and interpret the environment and context differently, which may cause the user to assess a situation differently and thus expect or desire a vehicle behavior contradicting the planned actions of the AV (Woide et al., 2019). Woide et al. (2019) considered such situations and presented a methodical approach to reproduce driver-AV conflicts by gradually reducing visibility with fog in an overtaking maneuver. They found that drivers' takeover behavior is significantly affected by conflicts. (2) Individual factors: the user may have individual and context-dependent interests that the AV may not be able to address due to limited flexibility and capabilities (Wang, 2019) or that are not compatible with the AV's interests. A possible conflict due to different perspectives on traffic regulation violations would fit in here. To date, however, we are not aware of any work that examines conflict development in such situations and the influence of how an AV handles them. Woide et al. (2021) developed the Human-Machine-Interaction-Interdependence Questionnaire (HMII), providing a validated scale for measuring user-vehicle cooperation. This also includes a conflict subscale consisting of five subitems, which are listed in Table 2. Here, they refer to Gerpott et al. (2018), who defined the conflict dimension as the "[d]egree to which the behavior that results in the best outcome for one individual results in the worst outcome for the other" (Gerpott et al., 2018, p. 718). It is measured with a 5-point Likert scale (ranging from 1=Do not agree at all to 5=Fully agree). In our work, we use this scale to assess conflict.

Trust in Automated Vehicles
Prior work has shown that trust in the reliability of AVs is of great importance (Holländer et al., 2021; Molnar et al., 2018; Azevedo-Sa et al., 2021; Li et al., 2019; Kraus et al., 2019) and has presented a variety of human trust models (Akash et al., 2020; Carter and Bélanger, 2005; Yuen et al., 2021). Lee and See define trust as "the attitude that an agent will help achieve an individual's goals in a situation characterized by uncertainty and vulnerability" (Lee and See, 2004, p. 51).
Trust calibration, in particular, plays a major role in the context of automation. Insufficient trust can lead to inordinately skeptical behavior, which in turn may lead to non-use or misuse of the system (Choi and Ji, 2015; Muir and Moray, 1996; Parasuraman and Riley, 1997). Simultaneously, overtrust can cause inappropriate reliance on the automation's capabilities, which can result in dangerous situations or even accidents (Lee and See, 2004; Muir, 1987). Thus, the trust level should correlate with the reliability of the automated system (Akash et al., 2020). However, differences between the expected behavior of an automation and its actual behavior can negatively affect trust, even if the system is reliable (Lee and See, 2004).
Further, trust is highly dependent on the situation and context (Holthausen et al., 2020), as several studies have found that driver trust can vary depending on the environment and the driving scenario (Frison et al., 2019; Li et al., 2019). Thus, Holthausen et al. (2020) presented the Situational Trust Scale for Automated Driving (STS-AD), based on the trust model of Hoff and Bashir (2015), which takes into account the above-mentioned characteristics of trust in the context of automated driving. The questionnaire consists of six items, which are listed in Table 2.
Multiple trust theories underline that human behavior in conflict situations is influenced by the level of trust (Balliet and Van Lange, 2013). Lee and Moray (1992) investigated the effects of operator control strategies on trust. However, to the best of our knowledge, no work has investigated the effects of legal conflict-prone situations, and of how the AV handles them, on the user's trust in the AV. In our study, we used the STS-AD questionnaire to examine this relationship more closely.

Online Study -Attitude toward Law-Abiding Automated Vehicles
As previous work shows, drivers occasionally behave deliberately contrary to traffic regulations (Chung, 2015; Forward, 2006). Traffic-violating interests that transfer from manual to automated driving possibly cause traffic-violating expectations towards AV behavior and conflicts. Thus, we conducted a study to investigate drivers' attitudes towards law-abiding AVs in conflict-prone situations (RQ 1).

Design
The study intended to examine three exemplary traffic scenarios that contribute to traffic-violating behavior in manual driving and assess participants' attitudes and self-perceived behavior towards an AV that behaves law-abidingly in those scenarios. The law-abiding behavior of the AV results in the user's goals not being considered or the user being at a disadvantage compared to other manual drivers with aberrant driving behavior. Thus, to investigate RQ 1, a within-subject online study using the scenario as the within-subject factor (resulting in 3 conditions) was conducted. The conditions were presented in randomized order.

Scenarios
We designed the scenarios based on the three most-sanctioned traffic offenses in Germany, which are speeding, wrong parking, and distance violations (Fidelsberger; Kraftfahrt-Bundesamt, 2022a). Furthermore, due to their increased relevance, we focused on situations where traffic-violating interventions could have safety-critical consequences. Thus, for the design of the scenario where an intervention would lead to wrong parking, we decided not to choose a situation that would, for example, lead to parking without a valid parking ticket, but rather the violation of stopping in an absolute no-stopping zone, as it could lead to rear-end collisions. For the design of the scenarios where an intervention would lead to speeding and distance violations, we considered that driving speed is known to be an important factor for road safety (Aarts and van Schagen, 2006). Not only does speed impact the severity of a crash, but it is also associated with the risk of being involved in an accident. Speeding in combination with shorter headway times and the given variability of reaction times can also lead to rear-end collisions (Chatterjee and Davis, 2016). This led us to the following scenarios, which were recorded as videos of a simulation in Unity version 2020.3.19f1 (Unity Technologies, 2022), from the perspective of the passenger (see Figure 1):

Speeding Scenario
In this scenario, an AV drives on an expressway. The maximum speed on this road is 60 km/h, which is evident from street signs. The AV drives exactly 60 km/h. However, some vehicles nearby do not adhere to this speed limit and overtake the AV at a higher speed. As a non-urban road is considered in this scenario, the fine for speeding by up to 20 km/h over the maximum speed ranges from 45.50€ (up to 10 km/h over) to 88.50€ (at 20 km/h over) as of 2023. Speeding violations above this speed can also be penalized with up to 2 penalty points registered in Flensburg, Germany, and 3 months of driving license suspension (Fidelsberger, 2023). The scenario's overall duration was 8 seconds. It was introduced as: You are driving on an expressway, and the maximum speed in this road area is 60 km/h. Your autonomous vehicle will follow the rules, but most of the manual vehicles in its vicinity drive faster.

Parking Scenario
In the second scenario, an AV drives through an urban area. At the side of the road stands a female person the user wants to pick up. However, as this person is standing in an absolute no-stopping zone, the AV is not allowed to stop there. Unauthorized stopping in an absolute no-stopping zone would lead to a fine of 20€, or 35€ if other road users are obstructed. Therefore, the AV drives past this person to a nearby parking lot, which is 100 m away from the woman. The scenario's overall duration was 18 seconds. It was introduced as: You are on the main road and want to pick up a friend (the woman with the white shirt on the right) on the side of the road. Unfortunately, this is absolutely prohibited, and your autonomous vehicle does not stop. Your vehicle continues and stops a hundred meters further in a parking lot.

Distance Scenario
In the third scenario, an AV drives on a country road. Vehicles drive in front of it, and the AV keeps the mandatory distance from them. Vehicles in the rear take advantage of this gap and repeatedly jump the queue. As the maximum speed in the scenario is 60 km/h, the fine is 53.50€. Distance violations at more than 80 km/h can also lead to penalty points and, at more than 100 km/h, to additional driving license suspensions (Fidelsberger, 2023). The amount of the respective fine depends on the distance. The minimum distance in meters is calculated as (5 · (v/2))/10, where v is the speed in km/h. Thus, the minimum distance at 60 km/h would be (5 · (60/2))/10 = 15 m. The scenario's overall duration was 15 seconds, and it was introduced as follows: You are driving on an expressway, and a car is driving in front of you, so your autonomous vehicle must maintain the required safety distance. However, this is exploited by the other vehicles in your vicinity, and they crowd into the gap in front of you.
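The minimum-distance arithmetic above can be sketched as a small helper (our own illustration of the rule of thumb described in the scenario, not code from the study):

```python
def minimum_distance_m(speed_kmh: float) -> float:
    """Minimum following distance in meters, per the rule used above:
    5/10 of half the speedometer value (speed given in km/h)."""
    return (5 * (speed_kmh / 2)) / 10

# At the scenario's speed limit of 60 km/h, the minimum distance is 15 m.
print(minimum_distance_m(60))  # -> 15.0
```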

Measurements
We examined the scenarios for conflict using the conflict subscale items of the HMII by Woide et al. (2021) (see Table 2). In addition, we assessed the following items: (1) How should the vehicle behave in this situation? (with the response options 'As the vehicle did in the video' or 'Different'), (2) How would you behave in the situation if you were driving manually? (with the response options 'As the vehicle did in the video' or 'Different'), (3) Would you want to take over the control in this situation? (with the response options 'No' or 'Yes'), and (4) Would the cars around you affect your action in this situation? (with the response options 'No' or 'Yes'). For questions (1), (2), and (4), participants were additionally asked 'How?' if they chose the second option (i.e., 'Different'/'Yes').

Procedure
Every session started with a brief introduction, agreeing to the consent form, and a demographic questionnaire. The three scenarios described were presented, each followed by the corresponding question items regarding perceived conflict, attitude, and behavior toward law-abiding AVs. On average, the study lasted 15 minutes.

Participants
Participants were recruited personally, via social media, and at Ulm University. All participants were required to hold a valid driver's license and have good knowledge of English (as the questionnaire was in English). The final sample consisted of 49 participants holding German citizenship (16 female, 33 male) aged 19 to 66 years (M=29.06, SD=9.66). They had held driver's licenses for between 2 and 48 years (M=11.57, SD=9.52). Most of the participants drive an average of 7,000-14,999 km (16 participants) or 15,000-24,999 km (16 participants). A few participants drive an average of less than 7,000 km (14 participants). The others drive 25,000-32,999 km (2 participants) or more (2 participants). They were further asked about their general level of trust in AVs. 32 indicated that they would generally trust an AV, whereas 17 answered with 'No'. When delving into specific scenarios, the level of trust varied. For instance, in moving traffic within urban areas, 25 respondents stated they would trust an AV, while 2 were unsure, and 5 said they would not. In situations of high urban traffic, the level of trust decreased, with 10 participants saying 'Yes', they would trust an AV, 13 being 'Uncertain', and 9 saying 'No'. In non-urban areas with moving traffic, the level of trust was relatively high, with 23 respondents saying they would trust the AV, 5 being uncertain, and only 4 expressing distrust. In high-traffic conditions outside urban areas, the numbers were: 15 said 'Yes', 10 were 'Unsure', and 7 said 'No'.
The study was conducted as an online survey via LimeSurvey (LimeSurvey GmbH, 2023). Participation was voluntary and was not compensated.

Data Analysis
A Friedman test was used to assess group differences on non-parametric data. For binomial data, we used Cochran's Q test and multiple McNemar's tests with Holm correction as post-hoc tests. R in version 4.3.2 and RStudio in version 2023.09.1 were used. All packages were up to date in November 2023.
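For illustration, Cochran's Q (used here for the binary takeover and influence items across the three scenarios) can be computed directly from its closed form. The following pure-Python sketch is our own and not the R code used in the study:

```python
def cochrans_q(rows):
    """Cochran's Q statistic for k related binary samples.
    `rows` holds one tuple of 0/1 responses per participant,
    with one entry per condition (here: the three scenarios)."""
    k = len(rows[0])                                        # number of conditions
    col_totals = [sum(r[j] for r in rows) for j in range(k)]  # successes per condition
    row_totals = [sum(r) for r in rows]                       # successes per participant
    n = sum(row_totals)                                       # total number of successes
    denom = k * n - sum(l * l for l in row_totals)
    # Q = (k-1) * (k * sum(G_j^2) - N^2) / (k*N - sum(L_i^2)),
    # compared against a chi-square distribution with k-1 degrees of freedom.
    return (k - 1) * (k * sum(g * g for g in col_totals) - n * n) / denom
```

For example, six participants answering a yes/no item in three conditions yield a single Q value that is then referred to the chi-square distribution with 2 degrees of freedom.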

Attitude Toward Law-Abiding Automated Vehicle Behavior
Figure 2 shows the items on behavior with the respective number of participants who rated them as positive. Using inductive analysis for the qualitative feedback, two authors read the answers, grouped them into themes, and developed codes. Then, deductively, the authors coded the answers again after discussing and merging the codes. Disagreements were resolved via discussion. The feedback showed that participants who wanted the AV to behave differently indicated that the AV should stop directly (16/17; Parking Scenario), speed up slightly (9/10; Speeding Scenario), and close the gap to the vehicle in front (9/12; Distance Scenario). The same behavior could also be observed in most participants' responses indicating that they would behave differently. In the Parking Scenario, the participants who indicated that they would behave differently stated that they would stop (35/37). In the Speeding Scenario, participants who would behave differently would drive (a bit) faster or as fast as the other cars around (31/32). In the Distance Scenario, most of the participants with presumed different behavior described that they would close the gap in front of them or drive faster (29/35). Three stated that they would increase the distance even more. Where subtotals are missing, the participants did not provide any information.
To compare participants' traffic-regulation-violating interests, we calculated the number of participants who indicated that they would want the AV to behave in a traffic-violating way and the number of participants who indicated that the vehicle should behave as in the video or differently but in a law-abiding way (e.g., indicated that the AV should even increase the distance to the lead vehicle in the Distance Scenario; also including the participant who did not further clarify their answer). The same approach was taken for the question of how the participants themselves would have behaved.
Across the scenarios, a Cochran's Q test shows no significant differences in the number of participants with traffic-regulation-violating interests for the desired AV behavior (χ²(2)=3.39, p=.18) or for their own behavior (χ²(2)=1.85, p=.40). Nevertheless, it shows a significant difference in the number of participants who would take over (χ²(2)=8.86, p=.01, η²=.09). A McNemar's test found that significantly more participants would take over in the Parking Scenario than in the Speeding Scenario (χ²(1)=6.72, p_adj=.03). A Cochran's Q test further found a significant difference across the scenarios in the number of participants who stated that the behavior of surrounding vehicles would affect their decision (χ²(2)=6.47, p=.04, η²=.07). A McNemar's test found that significantly more participants would be influenced by the surrounding vehicles' behavior in the Distance Scenario than in the Speeding Scenario (χ²(1)=5.26, p_adj=.07).

Transfer of Traffic Violating Behavior from Manual to Automated Driving
The descriptive analysis shows that for all scenarios considered, most participants indicated that they would violate the traffic regulations, while comparatively few wanted the AV to act in this way. Further, at least half of the participants (three-quarters in the Parking Scenario) indicated that they would intervene in the AV's behavior.
The fact that most participants did not want the AV to behave against traffic regulations is also consistent with expectations for future AV designs. However, this can lead to mixed-traffic situations in which AV users are disadvantaged compared to manual drivers (e.g., leading to situations such as the Distance Scenario considered). The low number of participants who indicated that they would want the AV to behave differently further suggests that users want to retain responsibility regarding traffic-violating behavior. At the same time, however, it also shows that preferences for AV behavior do not always coincide with the users' own driving behavior. This is shown to be influenced by trust in AVs. For example, a study by Delmas et al. (2023) shows that users with low trust in the AV prefer a slower driving speed than their own, while users with high trust desire a driving speed similar to their own. Therefore, further research is needed to investigate how expectations of traffic-violating AV behavior are influenced by user trust.
Overall, the results show that different legal interests are likely to arise, leading to AV interventions. Based on the SAE Level and AV design, this can appear as a driver-initiated takeover or a traffic-violating request to the AV. The first case can lead to safety-critical situations (Merat et al., 2014; Gold et al., 2016), so further studies should investigate how to deal with such interventions in the AV. The second case leads to the AV reaching its legal boundaries, which it must handle. This supports our assumption that legal boundary-handling strategies are needed to deal with such situations.

Feasibility of the Scenarios for an Investigation of Legal Boundary-Handling Strategies
To further investigate how AVs should handle legal boundaries, we compared the scenarios pairwise to identify differences in conflict-proneness and the potential to lead to interventions in automation. Our results show that all assessed scenarios are prone to constitute legal conflicts that result in the AV reaching its legal boundaries. Thus, each of them would be worth considering. However, significantly more people would interfere with automation in the Parking Scenario, which makes it the scenario for which legal boundary-handling strategies are most likely to take effect.

Limitations
A limitation of the present study is the moderate number of participants (N=49). As mostly younger persons participated, the sample might not be representative. Thus, the question is how the findings transfer to other groups. Additionally, using self-reported measures necessarily implies some level of subjectivity and the risk of social-desirability bias (Lajunen and Summala, 2003), especially for questions related to traffic-violating behavior. Bias could further arise from questions about the behavior of AVs, as the technology has been introduced but is not yet ubiquitous, and expectations are strongly influenced by factors such as trust and acceptance of the technology.

Legal Boundary-Handling Strategies
Our online study showed that traffic-violating interests are transferred from manual to automated driving (see Section 3.6.1). Thus, if the AV is designed in such a way that it can react to spontaneous user requests (which is conceivable for SAE Level 3-5 AVs according to current future vehicle designs), an AV is likely to encounter traffic-violating interests. When the AV is in automated mode, it has control over its behavior and responsibility towards other road users. As such, a traffic-violating interest leads to a legal boundary that the AV must be able to handle in a reliable and trustworthy manner.
At the driving-task level, driver-vehicle interaction approaches have been proposed to deal with situations where AVs reach their limits (for an overview, see Walch et al. (2019b)). These strategies were mainly proposed for SAE Level 3 to deal with situations in which the AV leaves its ODD and cannot handle a situation on its own, i.e., technical boundaries are reached. The basic distinction between these approaches is whether they include a mode shift (i.e., a complete transition of the vehicle control to the driver) or whether the vehicle remains in control but the driver supports the vehicle with the driving task (Walch et al., 2019b).
Based on these approaches, we derived legal boundary-handling strategies (see Figure 3 for an illustration), which are introduced in the following. One possible and extensively researched strategy for dealing with AV boundaries is to ask the driver to take full control of the vehicle, i.e., to change the mode (McCall et al., 2016, 2019). This means that the vehicle evades legal accountability by shifting responsibility and control to the driver (RC-Shift). RC-Shifts are only possible for AVs that contain all necessary hardware for complete takeovers (e.g., a steering wheel, brakes, and a gas pedal). Further, approaches where the driver is allowed to take over driving control can lead to post-automation effects such as unstable lateral control (Merat et al., 2014) and can lead to safety-critical situations (Merat et al., 2014; Gold et al., 2013, 2016). One reason is the "out-of-the-loop" effect, meaning the driver is no longer engaged in the driving task, resulting in decreased situation awareness and skill loss in the long term (Merat et al., 2019; Endsley and Kiris, 1995). Safety-criticality is additionally reinforced by the risk of aberrant driving behavior in general. Hence, it is desired to keep automation activated (Walch et al., 2019b).
Therefore, cooperative methods were proposed in which the passenger helps an AV in a given task without the need to take over vehicle control (Walch et al., 2015, 2017, 2019a, 2019c; Colley et al., 2021a). Transferring such cooperative approaches (Zimmermann et al., 2014; Flemisch et al., 2014) to legal boundaries, the vehicle retains control, but the user could be explicitly asked to take over legal accountability. Thus, the vehicle shifts the responsibility to the user (R-Shift). Although the automation remains activated, R-Shifts cannot prevent aberrant driving behavior.
Compared to scenarios in which the AV reaches technical limits and is unable to handle the situation, in scenarios in which it reaches legal limits, the AV can still handle the situation. Thus, another possible strategy is to retain responsibility and control and not to shift either (No Shift). To meet the legal requirements, the user's request would need to be rejected (see, e.g., Takayama et al. (2009)). This approach can avoid aberrant behavior and, thus, safety-critical situations. However, it means that the user's intentions and goals cannot be met.

Virtual Reality Study on Legal Boundary-Handling Strategies
To investigate RQ 2 and, thus, how legal boundary-handling strategies affect users in terms of (1) Behavior, (2) Perceived Conflict, and (3) Situational Trust in legally conflict-prone situations, we designed and conducted a within-subject VR simulator study with N=30 participants.

Design
We compare the legal boundary-handling strategies with the baseline of manual driving (Manual) and the baseline of manual driving with legal assistance (Manual with Assistance). We consider the latter to determine whether the effects on the user in terms of conflict, behavior, and trust are solely due to the initial automation or are actually caused by the law-abiding vehicle addressing a potential conflict with traffic regulations. Taken together, we speak of the Interaction Strategies considered in the study; they are also illustrated in Figure 3. We further assessed the influence of other vehicles behaving disorderly. This resulted in a total of 5 (Interaction Strategy) x 2 (Influencing Vehicles) = 10 conditions. The participants encountered the conditions according to a balanced Latin square.
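The counterbalancing can be reproduced with the standard construction for balanced Latin squares of even order (a sketch only; the study's actual randomization procedure is not reported in code form):

```python
def balanced_latin_square(n):
    """Generate n condition orderings over conditions 0..n-1 such that
    every condition occurs once per position and every ordered pair of
    adjacent conditions occurs exactly once (requires even n)."""
    assert n % 2 == 0, "the balanced construction requires an even n"
    orders = []
    for participant in range(n):
        row, low, high = [], 0, n - 1
        for i in range(n):
            if i % 2 == 0:            # interleave from the front ...
                base, low = low, low + 1
            else:                     # ... and from the back
                base, high = high, high - 1
            row.append((participant + base) % n)
        orders.append(row)
    return orders

# ten orderings for the 5 x 2 = 10 conditions; with 30 participants,
# each ordering would simply be used three times
orderings = balanced_latin_square(10)
```

Because every condition directly precedes every other condition exactly once across the ten orderings, first-order carryover effects are balanced.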
Our online study showed that all assessed scenarios (see Section 3.5) are prone to constitute legal conflicts that result in the AV reaching its legal boundaries. Thus, each of them would be worth considering. Nevertheless, we decided to assess RQ 2 through the parking scenario because in this scenario, compared to the others, most participants stated that they would wish to overwrite the AV's behavior. Additionally, the parking scenario was deemed favorable for study implementation: its conflict is independent of other vehicles (in contrast to the distance scenario), and the study task of picking someone up is simple (compared to the speeding scenario, which would, for example, require inducing time pressure).

Virtual Reality Simulator
The scenario was modeled in Unity version 2020.3.19f1 (Unity Technologies, 2022). The urban environment was modeled with the Fantastic City Generator Unity asset (MasterPixel3D, 2022). The required traffic and vehicle automation was implemented using the Simple Traffic System Unity asset (TurnTheGameOn, 2022). Further, we used an HTC VIVE Pro Eye and the Thrustmaster T150 Pro steering wheel with pedals for manual driving. As the participants' vehicle, a model of the Mercedes F015 (Benz, 2015) was used. However, we exchanged the vehicle's steering wheel model to better match the shape of the Thrustmaster's wheel. Further, we removed the wheel completely for the conditions in which the participants had no ability to drive manually and, for the conditions where the participant could perform a takeover, added an animation in which the steering wheel is extended (as done by Colley et al. (2021b)).

Scenario
The scenario takes place in a city; thus, the speed is limited to 30 km/h. The participants sit in a vehicle parked in a driveway. Before the participants started into the scene, they were tasked with giving a lift to a person on the right-hand walkway while on their way to work. The road the participants had to drive along was a straight track with six absolute no-stopping signs (sign 283 StVO) scattered along it. According to the German StVO §62 (which can be accessed via (BMDV)), road users are thus not allowed to stop on this road section. Additionally, a parking sign indicates that a nearby parking area is 250 m away. The parking area could also be seen from a distance and, thus, was visible to participants. The complete distance from the starting point to the parking area is about 500 m. Along the route, at a short distance from the parking sign, a male person (the person to be picked up) stands on the walkway facing the participants' vehicle. If the participant drove past the man, he followed the vehicle at a walking pace. The task was finished when the person reached the participant's vehicle. If the participant decided to drive to the parking area, it took about 60 seconds until the person arrived at the vehicle. This forced the participants to wait in the meantime (as they would have to do in real life) to promote realistic decision-making behavior. In the conditions with influencing vehicles, two vehicles are parked on the walkway: one after a third of the total distance and thus approximately 200 m in front of the person to be picked up, and a second shortly after this person. The complete road scenario can be seen in Figure 4.

Study Task
The task and the vehicle with which the participants had to complete the task in the manual baseline conditions were presented as: "You are in a manual vehicle. Thus, you can steer, accelerate, and brake. At the beginning of the scene, you are standing in a driveway. You are on your way to work. You have to drive out of the driveway onto the road and straight along it. There is a male person on the right-hand side of the road. Your task is to give him a lift." (Translated from German) For the automated conditions where the legal boundary-handling strategies apply, the task was set as follows: "You are in an autonomous vehicle and are initially standing in a driveway. The vehicle drives off on its own and is programmed to be on its way to your work. Along the way, there is a male person on the right side of the road. Your task is to give this person a ride. To do this, you can give commands to your vehicle via voice control." (Translated from German)

Conditions
The study leader acted as a Wizard of Oz (Dahlbäck et al., 1993) and executed the participant's voice input via keystroke on a keyboard to create a realistic interaction between the participant and the AV. The specific interaction possibilities for the respective conditions with the given vehicle output are stated in Table 1. In the following, the conditions are described in more detail:

Manual. In this condition, participants drive without automation. Thus, they have to brake, accelerate, and steer manually. Regardless of whether the participant wants to stop in the absolute no-stopping zone or drive to the parking area, the decision is left up to them.
Manual with Assistance. In this second manual driving condition, if a participant slows down to stop in the no-stopping zone, the vehicle assists them by pointing out that they are about to park in a no-stopping zone.

RC-Shift. In this condition, participants drive automated, which means that the AV holds control and responsibility. The steering wheel is retracted and the pedals are inoperative. The participant can intervene in the AV's behavior by making voice requests. If the participant requests traffic-regulation-violating AV behavior, leading to the AV reaching its legal boundaries, the AV pursues the approach of an RC-Shift: the participant is asked to take over responsibility and control. If the participant agrees, the steering wheel is extended, and a complete mode shift (from automated to manual driving) is performed. If the handover is unsuccessful, the steering wheel is retracted again, and the AV drives to the next parking area.

R-Shift. Participants drive automated, which means that the AV holds control and responsibility. The steering wheel is retracted and the pedals are inoperative. If the participant requests traffic-regulation-violating AV behavior, leading to the AV reaching its legal boundaries, the AV pursues an R-Shift by ensuring that the participant is aware of the disorderly request and asking for a confirmation of the request. If the participant actively confirms the request, the AV executes it. If the request is not confirmed, the AV drives to the next parking area.
No Shift. Participants drive automated, which means that the AV holds control and responsibility. The steering wheel is retracted and the pedals are inoperative. If the participant requests traffic-regulation-violating AV behavior, leading to the AV reaching its legal boundaries, the AV pursues the approach of No Shift and, thus, rejects the request but alternatively drives to the next possible parking area.
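The dialogue logic of the three automated strategies can be summarized in a small decision function (a simplified sketch with hypothetical return labels; the actual study used a Wizard-of-Oz setup rather than code):

```python
def resolve_request(strategy, violates_regulations, user_confirms):
    """Sketch of how the AV resolves a user request at a legal boundary.

    strategy             -- "RC-Shift", "R-Shift", or "No Shift"
    violates_regulations -- the request would breach traffic regulations
    user_confirms        -- user confirms the takeover / disorderly request
    """
    if not violates_regulations:
        return "execute request"              # no legal boundary reached
    if strategy == "RC-Shift":
        # shift responsibility AND control: extend wheel, full mode shift
        return "manual takeover" if user_confirms else "drive to parking area"
    if strategy == "R-Shift":
        # vehicle keeps control; user explicitly assumes responsibility
        return "execute request" if user_confirms else "drive to parking area"
    if strategy == "No Shift":
        # reject the request, offer the lawful alternative
        return "drive to parking area"
    raise ValueError(f"unknown strategy: {strategy!r}")
```

Note that only No Shift never produces a traffic-violating outcome, which matters for the safety discussion later on.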
Conditions with Influencing Vehicles. In the conditions with influencing vehicles, two vehicles are parked disorderly on the sidewalk. In conditions without vehicle influence, these vehicles are not present.

Questionnaire
HMII - Conflict Subscale
C1: I reject the system's preferred action.
C2: We can both achieve our preferred outcomes in this situation (reverse scored).
C3: Our preferred outcomes in this situation are in conflict.
C4: The system prefers a different outcome than I do in this situation.
C5: I prefer a different outcome than the system in this situation.

STS-AD
Trust: I trust the automation in this situation.
Performance: I would have performed better than the automated vehicle in this situation (reverse scored).
NDRT: In this situation, the automated vehicle performs well enough for me to engage in other activities (such as reading).
Risk: The situation was risky (reverse scored).
Judgement: The automated vehicle made an unsafe judgement in this situation (reverse scored).
Reaction: The automated vehicle reacted appropriately to the environment.

Own Items
Feasibility 1: I could perform the task well with the vehicle I had.
Feasibility 2: The task was difficult to complete (reverse scored).
Effect of Vehicle Behavior on Decision: The behavior of my vehicle influenced my decision.
Effects of Other Vehicles' Behavior on Decision: The vehicles around me influenced my decision.
Effects of Other Vehicles' Behavior on Emotions: The vehicles around me had a negative influence on my emotional state.

Table 2: Used question items of the conflict subscale of the HMII questionnaire (Woide et al., 2021), the adjusted STS-AD questionnaire (Holthausen et al., 2020), and own items; the latter were translated from German (see Appendix Figure 6 for the original items).

Measurements
Objective dependent variables. The vehicle position was logged. Further, the participants' behavior was logged in terms of whether they stopped the vehicle in the no-stopping zone or at the parking area (as a binary data point) and, additionally, whether the vehicle output made them change their decision and drive to the parking area contrary to their original traffic-violating interest (also as a binary data point).
Subjective dependent variables. After each condition, we measured conflict with the conflict subscale of the HMII (Woide et al., 2021) on a 5-point Likert scale ranging from 1=Do not agree at all to 5=Fully agree. We further measured situational trust based on the Situational Trust Scale for Automated Driving (STS-AD) proposed by Holthausen et al. (2020) on a 5-point Likert scale (1=Do not agree at all to 5=Fully agree). As we also consider manual driving conditions, we adjusted the scale items and replaced "automation" and "automated vehicle" with "vehicle". In addition, the participants rated the feasibility of the study task, the influence of the vehicle's behavior, and the influence of surrounding vehicles on a 5-point Likert scale (1=Do not agree at all to 5=Fully agree). Table 2 shows all items. After completing each of the ten conditions, participants were asked to textually describe whether they were aware (1) that the person stood in a no-stopping zone and (2) that they were violating the law by stopping on the road to give the person a lift, and (3) how their trust in the vehicle was influenced when their request to the vehicle was questioned or rejected. Afterward, the participants had to complete a final questionnaire with demographic questions and questions addressing their general trust in AVs.

Procedure
The study was conducted at Ulm University. Initially, all participants were introduced to the procedure and asked to sign a consent form. Afterward, they encountered the 10 conditions. Each condition was followed by a questionnaire employing the named subjective measurements (see Section 5.2.5). The study was completed with a demographic questionnaire. The complete study duration was about 60 min. Each participant was compensated with 10€.

Participants
The required sample size was calculated via an a-priori power analysis using the R package pwr in version 1.3.0. To detect a suspected large effect in terms of Cohen's f² (0.35) with a significance level of 0.05 and a power of 0.8, 30 participants are required. Thus, 30 participants holding German citizenship (10 female, 19 male, 1 non-binary) aged 18 to 34 years (M=23.77, SD=2.45) were recruited personally, via social media, and at Ulm University. All participants had held a valid driver's license for 1 to 12 years (M=6.47, SD=2.52) and stated a good knowledge of English (as some original questionnaires were in English).
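The a-priori calculation can be checked outside of pwr with the noncentral F distribution. Note that the numerator degrees of freedom u are an assumption here (the paper reports only f², α, and power), and the noncentrality parameter ncp = f²·(u+v+1) mirrors the convention of pwr's f2 test:

```python
from scipy.stats import f, ncf

def f2_power(u, v, f2, alpha=0.05):
    """Power of a general-linear-model F test with numerator df u,
    denominator df v, and effect size Cohen's f2."""
    f_crit = f.ppf(1 - alpha, u, v)                # critical value under H0
    return 1 - ncf.cdf(f_crit, u, v, f2 * (u + v + 1))

def required_n(u, f2, power=0.8, alpha=0.05):
    """Smallest total sample size N (with v = N - u - 1) reaching power."""
    n = u + 2                                      # smallest N with v >= 1
    while f2_power(u, n - u - 1, f2, alpha) < power:
        n += 1
    return n
```

Which u matches the study's design is not stated, so `required_n` is parameterized rather than hard-coded to reproduce N=30.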
Most of the participants (17) drive an average of less than 7,000 km annually. The others drive 7,000-14,999 km (7 participants) or 15,000-24,999 km (6 participants). In our study, participants were asked how law-abiding they consider themselves in road traffic on a 4-point Likert scale (1=Not at all to 4=High). 23 participants rated their law-abidingness as 'moderate', 5 as 'high', and 2 as 'less'.
They were further asked about their level of trust in AVs. Most participants, 26, indicated that they would generally trust an AV. Only 4 answered with 'No'. When delving into specific scenarios, the level of trust varied. For instance, in moving traffic conditions within urban areas, 16 participants stated they would trust an AV, while 9 were uncertain, and 1 said they would not. In situations of high urban traffic, the level of trust decreased slightly, with 7 participants saying 'Yes', 11 being 'Uncertain', and 8 saying 'No'.
In non-urban areas with smooth traffic, the level of trust was relatively high, with 22 participants saying they would trust the AV, 3 being unsure, and only 1 expressing distrust. However, in heavy traffic conditions outside urban areas, the numbers were more balanced: 16 said 'Yes', 10 were 'Uncertain', and 6 said 'No'.
Parking situations also received a mixed response: 9 participants would trust an AV, 12 were unsure, and 5 would not. When asked about trusting the vehicle throughout the entire journey, 7 participants answered 'Yes', 14 were 'Uncertain', and 4 said 'No'.

Data Analysis
As in the online study, we checked the required assumptions (normal distribution and homogeneity of variance) before every statistical test. As the data was always non-normal, we compared the five Interaction Strategies via Friedman tests. We used Dunn's test with Holm correction for post-hoc tests if not stated differently. Further, we employed Wilcoxon signed-rank tests to compare the effects of Influencing Vehicles and calculated rank-biserial correlation effect sizes using the rank_biserial method in R (DataCamp, 2023). Additionally, we used the BayesFactor package (Morey, 2022) with Jeffreys-Zellner-Siow (JZS) priors to compute Bayes factors. In the results, a most supported User model implies evidence against an influence of the considered factors (i.e., Influencing Vehicles and Interaction Strategies). If one of the User + Factors models is favored, this implies that one or both factors have a significant influence. For the factorial analysis of the AV's legal boundary-handling strategies, where the data was always non-normal, we used Aligned Rank Transforms (ART) using ARTool (Kay et al., 2021) implemented as described by Wobbrock et al. (2011).
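The frequentist part of this pipeline can be sketched with scipy (the data here is illustrative random input, not the study's; the matched-pairs rank-biserial correlation is computed by hand since scipy has no direct equivalent of R's rank_biserial):

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata, wilcoxon

rng = np.random.default_rng(42)
# illustrative ratings: 30 participants x 5 interaction strategies
ratings = rng.integers(1, 6, size=(30, 5)).astype(float)

# Friedman test over the five within-subject Interaction Strategies,
# with Kendall's W derived from the chi-square statistic
chi2, p_friedman = friedmanchisquare(*ratings.T)
n, k = ratings.shape
kendalls_w = chi2 / (n * (k - 1))

# Wilcoxon signed-rank test for a paired factor (e.g., Influencing
# Vehicles), plus the matched-pairs rank-biserial correlation
a, b = ratings[:, 0], ratings[:, 1]        # placeholder pairing
stat, p_wilcoxon = wilcoxon(a, b)
nz = (a - b)[(a - b) != 0]                 # drop zero differences
ranks = rankdata(np.abs(nz))
r_rb = (ranks[nz > 0].sum() - ranks[nz < 0].sum()) / ranks.sum()
```

Kendall's W here is the χ²-based effect size for the Friedman test, bounded in [0, 1]; r_rb is bounded in [-1, 1].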

Parking Behavior
None of the participants failed to resolve the task. This means that each participant either stopped along the road or drove to the parking area to pick up the person in each of the ten conditions. Table 3 shows the distribution of parking behavior, defined by stopping in the no-stopping zone or stopping in the parking area, per condition. In addition, it lists how many participants changed their intention during the interaction with the vehicle from stopping in the no-stopping zone to driving to the next available parking area. A change of intention was detected as such if the vehicle first received the request to stop in the no-stopping zone but, in the interaction with the vehicle, the participant did not confirm or negate their request (see the participants' action possibilities in Table 1).

Conflict
Conflict was calculated by determining the mean value across all five items of the conflict subscale of the HMII questionnaire (see Table 2). Positive items (i.e., C2) were reverse scored. Considering Influencing Vehicles, a Wilcoxon signed-rank test found no significant effects on Conflict (V=3530.5, p=.59, r=.06). A non-parametric Friedman test of differences among Interaction Strategies was significant (χ²(4)=60.56, p<.001, Ŵ=0.25). A post-hoc test revealed that Conflict was rated significantly higher in Manual with Assistance (M=3.24, SD=1.15; p<.001), RC-Shift (M=3.15, SD=1.23; p<.001), R-Shift (M=3.32, SD=1.113; p<.001), and No Shift (M=3.29, SD=1.34; p<.001) compared to Manual driving (M=1.80, SD=0.80) (see Figure 5).

Table 3: Overview of the participants' parking behavior.

We computed a Bayes factor analysis to evaluate whether there is evidence that there is no difference between the Interaction Strategy Manual with Assistance and those for automated driving (RC-Shift, R-Shift, and No Shift). Compared to the User model (the most supported model), we found strong evidence (BF = 1/28.834) against the Interaction Strategy + User model, moderate evidence (BF = 1/4.63) against the Influencing Vehicles + User model, extreme evidence (BF = 1/125.31) against the Interaction Strategy + Influencing Vehicles + User model, and extreme evidence (BF = 1/717.48) against the Interaction Strategy + Influencing Vehicles + Interaction Strategy:Influencing Vehicles + User model (the least supported model) with regard to Conflict. This means that, with strong evidence, the considered Interaction Strategies (i.e., Manual with Assistance and the automated conditions) do not influence the perceived conflict.
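The score computation described above (mean over the five conflict items, with the positively worded C2 reversed on the 5-point scale) corresponds to the following minimal sketch; the example ratings are hypothetical:

```python
def hmii_conflict_score(items, reverse=("C2",), scale_max=5):
    """Mean across the HMII conflict items on a 1..scale_max Likert
    scale, reverse-scoring positively worded items (here only C2)."""
    total = 0.0
    for code, rating in items.items():
        if code in reverse:
            rating = scale_max + 1 - rating   # 1<->5, 2<->4, 3 stays 3
        total += rating
    return total / len(items)

# hypothetical response of one participant in one condition
score = hmii_conflict_score({"C1": 4, "C2": 2, "C3": 5, "C4": 4, "C5": 4})
```

Reversing C2 ensures that a higher score consistently means more perceived conflict across all five items.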
A factorial analysis using ART showed no significant effect on Conflict considering the AV's legal boundary-handling strategies (RC-Shift, R-Shift, and No Shift) and Influencing Vehicles.
Descriptive results can be found in Appendix Table 4.
Situational Trust
Considering the AV's Interaction Strategies, compared to the User model (the most supported model), we found moderate evidence (BF = 1/6.65) against the Interaction Strategy + User model, moderate evidence (BF = 1/4.96) against the Influencing Vehicles + User model, very strong evidence (BF = 1/32.30) against the Interaction Strategy + Influencing Vehicles + User model, and extreme evidence (BF = 1/209.75) against the Interaction Strategy + Influencing Vehicles + Interaction Strategy:Influencing Vehicles + User model. This means that, with moderate evidence, the legal boundary-handling strategy (RC-Shift, R-Shift, No Shift) does not influence Situational Trust.
A factorial analysis using ART showed no significant effect on Trust considering the AV's Interaction Strategies (RC-Shift, R-Shift, and No Shift) and Influencing Vehicles. Descriptive results can be found in Appendix Table 4.

Task Feasibility.
Overall Task Feasibility was calculated by determining the mean value across the two Feasibility items. Considering Influencing Vehicles, a Wilcoxon signed-rank test found no significant effects on Task Feasibility (V=1648.0, p=.09, r=.22). A Friedman test found significant differences among Interaction Strategies for Task Feasibility (χ²(4)=14.60, p<.001, Ŵ=.06; see Appendix Figure 8). A post-hoc test found that the study task was rated significantly more feasible in the Manual conditions (M=4.75, SD=0.40) compared to the conditions No Shift (M=4.36, SD=0.85; p=.04) and RC-Shift (M=4.47, SD=0.59; p=.03).

Effects of Other Vehicles' Behavior on Decision.
Considering Influencing Vehicles, a Wilcoxon signed-rank test indicated that the Effect of Other Vehicles' Behavior on Decision was rated significantly greater with influencing vehicles (M=2.15, SD=1.50) than without influencing vehicles (M=1.71, SD=1.27) (V=1511.00, p<.001, r=.50). Among Interaction Strategies, a Friedman test found no significant differences (χ²(4)=7.43, p=.11, Ŵ=.03; see Appendix Figure 8). The Effect of Other Vehicles' Behavior on Decision was rated rather low in general (M=1.93, SD=1.40).

Effects of Other Vehicles' Behavior on Emotions.
Considering Influencing Vehicles, a Wilcoxon signed-rank test indicated that the Effect of Other Vehicles' Behavior on Emotions was rated significantly greater with influencing vehicles (M=1.58, SD=1.06) than without influencing vehicles (M=1.43, SD=0.79) (V=690.5, p=.048, r=.28). Among Interaction Strategies, a Friedman test found no significant differences for Effects of Other Vehicles' Behavior on Emotions (χ²(4)=3.16, p=.53, Ŵ=.01; see Appendix Figure 8).

Qualitative Feedback.
For the analysis of the qualitative feedback, two authors read the answers, separately grouped them into themes, and developed codes inductively. Afterward, the codes were discussed and merged into a final set of codes, and the authors coded the answers again deductively. We asked the participants whether they were aware that the person was standing in a no-stopping zone. 23 participants answered yes, one no, and five indicated that they only recognized this after the vehicle made them aware of it for the first time. We further asked the participants how their trust was influenced if the vehicle questioned or rejected their request. 16 participants stated that their trust was rather not or not at all influenced. Seven participants indicated heightened trust, as the vehicle showed that "it correctly recognized the traffic situation" [P7], "the vehicle prevents [one] from acting unlawfully" [P15], and "it has behaved compliant with the StVO and only my own behavior made the offense possible" [P20]. However, two of those simultaneously criticized the loss of control ("I only feel somewhat powerless when my request to stop is not implemented directly" [P7]). Another participant similarly stated that he "was more likely to comply with the law when the vehicle gave [him] the opportunity to decide" [P6]. One participant indicated that it depended on the situation and that he "either trust[ed] the vehicle or trust[ed] [him]self" [P2]. Four participants stated that this had reduced their trust. They mentioned that "if the car simply continued to drive [it is] not clear when the car then again does not listen to [one's] instructions" [P11] and "the trust decreased (I can not stop even if I say it explicitly)" [P23]. All answers were translated from German with DeepL (DeepL GmbH, 2023) and received the common agreement of three authors.

How Does Automation Affect Behavior in Conflict-Prone Situations Regarding Road Traffic Offenses?
People frequently behave disorderly in manual driving. This is in line with general work showing the frequency of and concerns about offenses (Gössling, 2017; Vardaki and Yannis, 2013). Also, in the main study, most of the participants decided to stop directly at the waiting person (see Section 5.2). In the automated conditions, this led to requests which were in violation of traffic regulations. Although these results cannot be generalized, they support the assumption that users might take advantage of the possibility of overriding the AV when reaching legal boundaries to enforce their interests. Thus, the AV should be able to handle them. Overall, our results are also in line with those of Woide et al. (2019), showing that contradicting interests lead to the wish to override the AV's behavior.
However, it is known that traffic offenses can also result from inattention and distraction (Wundersitz, 2019), leading to a lack of situational awareness. Owing to a lack of attention, for example, speed limits could be overlooked. We, therefore, assume that AVs could lead to more law-abiding behavior in situations where traffic offenses are committed unintentionally in manual driving.

How Does Automation Affect Conflict in Conflict-Prone Situations Regarding Road Traffic Offenses?
Conflict describes the degree to which one's own interests correspond with the interests of the other party (Gerpott et al., 2018). Conflict is low with highly corresponding interests and high if one's own desired outcome means the worst result for the other (Kelley and Thibaut, 1978). Our VR study showed that conflict arises significantly more in conditions where the vehicle is automated and the vehicle's behavior differs from the user's interest (see Section 6.3). Thus, we conclude that participants' traffic-regulation-violating interests were transferred from the manual to the automated driving context, which led to conflicts; this is consistent with the results of our online study. With strong evidence, the legal boundary-handling strategy (RC-Shift, R-Shift, and No Shift) has no effect on perceived conflict. We had expected that conflict would be retrospectively perceived to be greater with the No Shift strategy because one's goals could not be achieved and, thus, one's outcomes are worse than in the other handling strategies (Kelley and Thibaut, 1978).
The HMII conflict subscale (Woide et al., 2021) primarily targets differences in preferred actions/outcomes. It may be that participants only associated "preferred action" and "preferred outcome" with the overall goal of complying with the law. Thus, the fact that the vehicle's goal of law-abiding behavior did not change over the conflict-handling strategies may have resulted in no measurable differences in conflict ratings. In addition, the choice of scenario may have had an influence, as the task (picking up the person) was always achieved. We suggest that future studies should include qualitative feedback in addition to the HMII conflict items and consider scenarios in which the vehicle rejects requests without offering an alternative for solving the task.
Interestingly, compared to the condition Manual, a significantly higher conflict was also perceived in the condition Manual with (legal) Assistance, where the vehicle only alerts the driver to their intended traffic violation. Thus, conflict already arises when the vehicle shows behavior that indicates law-abiding interests. We suspect that participants take the vehicle's statement as a judgment and thus attribute legal interests to the vehicle, which, in turn, conflict with their own interests.

How Does Automation Affect Situational Trust in Conflict-Prone Situations Regarding Road Traffic Offenses?
Whether people trust an automation is a significant predictor of their reliance on it and whether they use it (Lee and Moray, 1992; Biros et al., 2004; Hergeth et al., 2016). The findings of our VR study (see Section 6.4) showed that the overall situational trust (STS-AD score) in the vehicle is rated significantly higher for R-Shift and No Shift than in the manual baseline conditions. To account for this, the subscales of the STS-AD questionnaire from which the score is derived must be considered. First, the results show significant differences regarding the possibility of pursuing NDRTs. This is plausible, as the possibility of pursuing an NDRT is linked to the level of manual control: the higher this level (full control in the manual driving conditions vs. a control shift in the RC-Shift condition vs. no manual control in R-Shift and No Shift), the less capacity remains for an NDRT. Second, the vehicle's reaction was rated significantly more appropriate to the environment when the vehicle indicated law-abiding interests. One explanation could be that in these conditions, the vehicle shows environmental awareness and is able to interpret the situation correctly, whereas in the manual driving condition, no reaction and, thus, no environmental awareness is demonstrated.
Interestingly, our statistical analysis further revealed that, with moderate evidence, there is no difference in overall situational trust between the AV's legal boundary-handling strategies (RC-Shift, R-Shift, and No Shift). This suggests that in conflictual situations in which the AV evidently behaves reliably, situational trust is not affected by whether the user's goals can be achieved. This contrasts with the results of Li et al. (2019), showing that differences between the expected and the actual behavior of an automation can negatively affect trust even if the system is reliable, which is also the case in our study. We suspect that participants wish the AV to behave in a way that violates the traffic regulations but do not expect the AV to actually do so. This was already indicated by the results of our online study (see Section 3.5), in which participants expected the AV to behave as it does in the video (law-abiding), whereas they would have behaved differently themselves.

How Does Social Influence Affect Conflict and Behavior in Conflict-Prone Situations Regarding Road Traffic Offenses in Automated Driving?
In the online study, we found that the influence of disorderly-behaving vehicles in the surroundings is significantly lower in the parking scenario than in the compared scenarios. This is also reflected in the results of our main study (see Section 6), which show that the behavior of other vehicles did not significantly alter behavior, conflict, or situational trust. Nevertheless, the qualitative feedback of the online study allows the conclusion that a disadvantage due to the law-abiding behavior of the AV compared to manually driven, disorderly behaving vehicles may be grounds for intervention. Since other works additionally showed the influence of other vehicles on manual driving behavior, for example, on speeding (Connolly and Åberg, 1993) or generally risky driving behavior (McNabb et al., 2017), we suggest considering this factor in further scenarios to be able to make more general statements in this regard.

Practical Implications
Different interaction strategies, such as cooperative and collaborative approaches, shared control strategies, or takeovers and handovers (see Walch et al. (2019b) for an overview), affect the level of control and responsibility that users have over the AV. Our work shows that there will most likely be legal conflict-prone situations where users take advantage of the possibility of taking over responsibility or control to enforce their interests. This can cause safety-critical situations (see, e.g., Merat et al. (2019); Endsley and Kiris (1995); Merat et al. (2014); Gold et al. (2013, 2016)), especially when handling legal boundaries with RC-Shifts. Further, this can cause inefficient traffic flow (Wang et al., 2017) or economic problems (Gössling et al., 2022).
Our study found that conflict arises as soon as the vehicle shows behavior that indicates law-abiding interests. In the context of the simultaneously increasing situational trust, the question arises whether legal conflicts should actually be avoided or whether they can be accepted and merely need to be managed appropriately. Additionally, we found that neither conflict nor trust changes significantly between the legal boundary-handling strategies. This, combined with the fact that the restrictive No Shift strategy is the best in terms of safety, as it can prevent traffic violations, suggests that rejecting traffic-regulation-violating requests should be preferred. However, future work should investigate the long-term effects of the legal boundary-handling strategies on user acceptance, particularly for the restrictive No Shift strategy.
In general, when designing future interaction strategies, it is necessary to consider possible legal conflictual situations and not to neglect that the capabilities provided to users to intervene in an AV's behavior may be used to request or enforce unlawful behavior. Further, we recommend considering how AVs can interact with users to prevent them from enforcing disorderly interests, for example, through persuasive methods (as already considered in HRI (Babel and Baumann, 2022; Babel et al., 2022b)). Ultimately, it must be considered whether user interventions should be possible at all, and if so, how (see also Colley et al. (2022b)), as interventions maintain user control but can lead to conflict and safety-critical situations.
The practical implications of our studies point to a potential tension between different stakeholders, including legislators, vehicle manufacturers, and users. The current legal situation in Germany is that full liability lies with the driver. Automobile manufacturers strive to act in the best interest of the user, i.e., to avoid conflicts without compromising safety. Therefore, it stands to reason that manufacturers are interested in allowing users to influence AV behavior and maintain control. The results of our studies highlight the need for clear and consistent regulations for AVs in legal conflictual situations. This, in turn, can create tension between the interests of users and the legal constraints of AVs. We see a collaborative effort from all stakeholders as indispensable and believe that Human-Computer Interaction (HCI) can play a significant role in addressing this tension and finding solutions for it.

Limitations and Future Work
Even though the sequence of conditions in the VR study was balanced using a Latin square to reduce order effects, the multiple exposures to the same traffic scenario could impact the rating. A Friedman test found a significant influence of the exposure within individuals on Effect of Vehicle Behavior on Decision (χ²(9)=18.95, p=.03, W=.07). However, a post-hoc test found no significant differences. No exposure effects were found for the remaining objective measurements. For the binomial data of law-abiding parking behavior, we used Cochran's Q test, which also found no significant exposure effect within individuals.
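The exposure-effect checks above can be sketched in code. This is a minimal illustration using SciPy and statsmodels on synthetic data (the actual study data are not reproduced here; the participant counts and rating scale are assumptions for the example):

```python
import numpy as np
from scipy.stats import friedmanchisquare
from statsmodels.stats.contingency_tables import cochrans_q

rng = np.random.default_rng(42)

# Synthetic ordinal ratings: 30 participants x 10 exposures (repeated measures)
ratings = rng.integers(1, 8, size=(30, 10))

# Friedman test: does the exposure (column) influence the rating within individuals?
stat, p = friedmanchisquare(*[ratings[:, i] for i in range(10)])

# Synthetic binary outcome (law-abiding parking: 0/1) per participant per exposure
binary = rng.integers(0, 2, size=(30, 10))

# Cochran's Q test for repeated binomial measurements
res = cochrans_q(binary)

print(f"Friedman: chi2={stat:.2f}, p={p:.3f}")
print(f"Cochran's Q: Q={res.statistic:.2f}, p={res.pvalue:.3f}")
```

With real data, a significant Friedman result would be followed by a post-hoc test (e.g., pairwise Wilcoxon tests with correction), as described above.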
A general limitation of simulator studies is that the risks and consequences to which users would be exposed during intervening maneuvers that violate traffic regulations are not actually experienced, which might influence participants' behavior.
Additionally, only a moderate number of participants took part (N=30). As most participants were younger persons, it remains unclear how transferable this work's findings are to other age groups. For this first evaluation, we minimized cultural bias and focused on one country. However, to improve the generalizability of the results, the behavior and attitudes of users from other countries need to be considered in future investigations: as shown by Warner et al. (2011), there are cross-cultural differences, and the variability of penalty fees across countries could also have an impact. Future research could further explore the influence of factors such as long-term experience with AVs.
In general, the simulation could also benefit from using simulators with higher degrees of freedom (e.g., Colley et al. (2022a) or Hock et al. (2022)).
Our work compared possible handling strategies for legal boundaries on the level of the driving task. It showed that the behavior of the users did not differ between the manual baselines and the AV's legal boundary-handling strategies. Hence, in future work, we want to address the question of how AVs can prevent disorderly requests, change users' interests, and reinforce law-abiding behavior in the long term on a non-driving-related level.

Conclusion
Today, there are situations in which drivers violate traffic regulations. In contrast, AVs will most likely have to comply with traffic regulations. In an online within-subject study with N=49 participants, we evaluated today's attitudes towards law-abiding AVs in conflict-prone situations. We found that for all scenarios considered (Speeding, Parking, and Distance), most participants indicated that they would violate the traffic regulations, while comparatively few wanted the AV to act in this way. Further, at least half of the participants (three-quarters in the parking scenario) indicated that they would intervene in the AV's behavior. We found that law-abiding AV behavior was desired in all considered scenarios but that users would simultaneously intervene in the automation to assert traffic-violating interests. This supported our assumption that in cooperative approaches, traffic-violating requests will occur, which the AV must be able to handle.

Based on the findings of our online study, we conducted a VR simulator study with N=30 participants, examining the legal boundary-handling strategies (1) RC-Shift, (2) R-Shift, and (3) No Shift. We further investigated the influence of other disorderly-behaving vehicles in the prohibited parking scenario, which was identified as conflictual through the online study. Results show that in each condition, most participants stopped in the absolute no-stopping area, which in the automated conditions meant that, if allowed by the vehicle, they took over control or responsibility to intentionally violate the traffic regulation. Further, a significantly higher conflict was perceived for each Interaction Strategy compared to the Manual baseline. However, with strong evidence, there are no differences in the perceived conflict, and, with moderate evidence, there are no differences in the situational trust among the legal boundary-handling strategies. Based on the findings of our studies, we discussed our research question and practical implications.

A.1. Main Study
While some works describe conflict as the state in which one or more agents cannot reach their goal, DeChurch et al. (2013) (referring to a definition of De Dreu and Gelfand (2008)) and McNeese et al. (2021) understand conflict as an (interactive) process.

Figure 1: Scenarios of the online study, showing screenshots of (1) the parking scenario and (2) the country road with overtaking vehicles, as in the speeding and distance scenarios.

Figure 2: Number of participants who agreed to the statement based on the question items presented in Section 3.2.

Figure 3: Illustration of the assessed Interaction Strategies in the VR study, with the two baselines Manual and Manual with Assistance on the left and the three identified legal boundary-handling strategies RC-Shift, R-Shift, and No Shift. The letter R stands for Responsibility, C for Control, and A for Assistance. The position of the letters shows whether the control and responsibility lie with the driver or the vehicle. Further, blue arrows indicate adoptions, while gray arrows indicate no changes.

Figure 4: Overview of the scene. (1) Starting point of the participants' vehicle, (2) absolute no-stopping area marked by six no-stopping signs on the pavement next to the road, (3) vehicles unlawfully parked on the sidewalk (only visible in the conditions with influencing vehicles), (4) person to be taken along, and (5) free parking area.

Figure 6: Own questionnaire items, which have been translated to English for the main paper body.
Table 1: Interaction possibilities with given vehicle output, based on the level of control. All speech outputs are translated from German.

Manual: Manually pull over to the right to pick up the person; manually drive to the parking area and wait for the person to arrive at the vehicle.
Manual with Assistance: Manually pull over to the right to pick up the person. Vehicle output: "Attention! You are currently in a strict no-parking zone."

Table 4: Conflict and Overall Trust for each of the Interaction Strategies.