Human agency beliefs influence behaviour during virtual social interactions

In recent years, with the emergence of relatively inexpensive and accessible virtual reality technologies, it has become possible to deliver compelling and realistic simulations of human-to-human interaction. Neuroimaging studies have shown that, when participants believe they are interacting via a virtual interface with another human agent, they show different patterns of brain activity compared to when they know that their virtual partner is computer-controlled. The suggestion is that users adopt an "intentional stance" by attributing mental states to their virtual partner. However, it remains unclear how beliefs in the agency of a virtual partner influence participants' behaviour and subjective experience of the interaction. We investigated this issue in the context of a cooperative "joint attention" game in which participants interacted via an eye tracker with a virtual onscreen partner, directing each other's eye gaze to different screen locations. Half of the participants were correctly informed that their partner was controlled by a computer algorithm ("Computer" condition). The other half were misled into believing that the virtual character was controlled by a second participant in another room ("Human" condition). Those in the "Human" condition were slower to make eye contact with their partner and more likely to try to guide their partner before they had established mutual eye contact than participants in the "Computer" condition. They also responded more rapidly when their partner was guiding them, although the same effect was also found for a control condition in which they responded to an arrow cue. Results confirm the influence of human agency beliefs on behaviour in this virtual social interaction context. They further suggest that researchers and developers attempting to simulate social interactions should consider the impact of agency beliefs on user experience in other social contexts, and their effect on the achievement of the application's goals.

In the current study, therefore, we investigated whether human agency beliefs have a direct influence on joint attention behaviour. As in the studies reviewed above, participants interacted with a virtual partner in a cooperative joint attention game. Half of the participants believed that their partner was controlled by another human (Human condition). The remainder were correctly informed that their partner was computer-controlled (Computer condition). Their task was to catch a burglar located in one of six houses placed around the edge of the screen (see Figure 1). At the start of each trial, the participant and their partner searched their allotted houses, and whoever found the burglar was then required to look back at the burglar to signal its location. The burglar was caught when both players were looking at the correct location. Unlike previous joint attention studies investigating the influence of human agency beliefs, this task created a context in which the participant sometimes found the burglar and had to "Initiate" joint attention, and on other trials did not find the burglar and had to "Respond" to their partner instead. In addition to this "Social" task, participants also completed a non-social "Control" task in which the virtual character's eyes remained closed and participants completed the same sequence of eye movements in response to geometric shape cues (circles and arrows).
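The trial structure described above can be summarised as a simple state machine. The sketch below is a hypothetical illustration of that logic only; the function and variable names are our assumptions, not the authors' experimental software:

```python
import random

HOUSES = list(range(6))  # six houses arranged around the screen edge


def run_trial(rng, participant_houses=(0, 1, 2)):
    """Simulate the logic of one trial (illustrative sketch only).

    The burglar hides in one of six houses. The participant searches
    their allotted houses and the partner searches the rest; whoever
    finds the burglar makes eye contact, then looks back at the
    burglar's house so the other player can follow their gaze.
    """
    burglar = rng.choice(HOUSES)

    # Role depends on who finds the burglar: the participant must
    # "Initiate" joint attention if they found it, or "Respond" to
    # their partner's gaze otherwise.
    role = "Initiate" if burglar in participant_houses else "Respond"

    # The trial succeeds once mutual eye contact is established and
    # both players then fixate the burglar's location.
    eye_contact = True
    participant_gaze = partner_gaze = burglar
    caught = eye_contact and participant_gaze == partner_gaze == burglar
    return role, caught


if __name__ == "__main__":
    rng = random.Random(0)
    print([run_trial(rng)[0] for _ in range(6)])
```

Splitting each trial into a role assignment followed by a gaze-contingent success check mirrors the Initiate/Respond distinction the task was designed to create.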

We have used this task in previous studies but without the agency manipulation. In other words, all participants believed that they were interacting with a real person. […] makes a single eye-movement on each trial (Caruana et al., 2017b). This suggests that an important part of the joint attention task is determining whether a shift in eye gaze is intended to be communicative or not. If participants know that their partner is not human and, therefore, has no mental states or intentions, they may not evaluate the communicative intent of their partner's behaviour in the same way. We therefore predicted that this effect would be reduced in the Computer condition compared to the Human condition.

Second, on "Initiate" trials, participants discover the burglar and are then required to look back towards the avatar. We find they are slower to do this in the Social (IJA) condition than in the Nonsocial (IJAc) control condition. They are then required either to wait for eye contact from their partner (IJA) or to wait for the central fixation point to turn green (IJAc) before saccading back to the burglar location. We have found that participants make more premature saccades (i.e., failing to wait for the respective cue before looking back at the burglar) in the IJA condition compared to IJAc. Again, these findings can be interpreted in terms of the inferred mental states of the virtual partner. When participants think their partner is human, they assume that he will intuitively know that they are looking at a location to initiate joint attention, even when eye contact is not first established to signal their own communicative intent. In the Control condition, they know they are interacting with the computer and so approach the task quite differently, making the same robotic pattern of eye movements on each trial. Our prediction, therefore, is that both of these effects will be reduced when participants know that their virtual partner is computer-controlled. That is, they will approach the interaction with the virtual partner in a similar fashion to their "interaction" with the symbols on the screen. If these predictions were confirmed, they would provide the first direct evidence that beliefs about the human agency of […]

Participants in the Human group rated their partner as being significantly more cooperative than participants in the Computer group, W = 193.0, p = .039. They also found the task […] trust placed in the utility of the training, companionship or therapy provided.
Again, our subjective ratings provide some tentative supporting evidence, with participants in the Human group rating the task as more pleasant, and their partner as more cooperative, than those in the Computer group.
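The between-group rating comparison reported above (the W statistic) is a rank-sum test. The sketch below shows how the closely related Mann-Whitney U statistic is computed in pure Python, assuming independent groups of ordinal ratings; the example values are invented for illustration and do not reproduce the study's data:

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U (rank-sum) statistic for two independent groups.

    Minimal sketch of the kind of nonparametric test used to compare
    between-group ordinal ratings, with tied values receiving the
    average of the ranks they span.
    """
    combined = sorted(
        [(v, "a") for v in group_a] + [(v, "b") for v in group_b]
    )
    n = len(combined)
    rank_of = [0.0] * n
    i = 0
    while i < n:
        # Find the run of tied values and give each the average rank.
        j = i
        while j + 1 < n and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            rank_of[k] = avg_rank
        i = j + 1
    rank_sum_a = sum(r for r, (_, g) in zip(rank_of, combined) if g == "a")
    # U for group A = rank sum minus its minimum possible rank sum.
    return rank_sum_a - len(group_a) * (len(group_a) + 1) / 2


# Hypothetical 1-7 ratings, purely to show the call:
human_ratings = [6, 5, 7, 6, 5]
computer_ratings = [4, 5, 3, 4, 6]
u = mann_whitney_u(human_ratings, computer_ratings)
```

In practice one would use a library routine such as `scipy.stats.mannwhitneyu`, which also returns the p-value; the hand-rolled version is shown only to make the ranking logic explicit.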

It is also possible that, when a virtual interaction appears and feels sufficiently real, users may adopt an intentional stance, even when they know that their partner is not human. This is supported by previous studies of human-robot interaction which report an association between increased anthropomorphism and activation of brain regions implicated in mentalising processes.

Virtual reality is a burgeoning industry that promises many exciting applications for consumers, science and enterprise, particularly given its ability to realistically simulate social interactions between single users and virtual agents. In the current study, we investigated directly whether beliefs about a virtual partner's human agency can significantly influence the way in which users behave and feel, and we present compelling evidence that, at least in some interactive […]