Introduction

Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder characterised by impairments in social communication and interaction, repetitive patterns of behaviour, and restricted interests (APA, 2013). Although several interventions are available for this population, evidence regarding their effectiveness in healthcare and care practice remains limited, possibly due to the heterogeneity of ASD (Lai et al., 2020; Mazza et al., 2021). There has been strong interest in the use of technology for individuals with ASD (Grossard et al., 2018; Yuan & Ip, 2018). Among the technologies proposed, social robots have received particular attention, with a strong focus on intervention practices (Cabibihan et al., 2013). A substantial body of literature is available on the implementation of social robots for people with ASD. The review by Diehl et al. (2012) was the first critical assessment of the clinical use of robots for individuals with ASD. Their review investigated the responses of individuals with ASD to the robot, the behaviours elicited, and the robot's use in modelling, teaching skills, providing feedback, and encouraging participants. Scassellati and colleagues (2012) noted that most studies in the literature lacked quantitative rigour, with a dearth of large-scale longitudinal studies. Additionally, the systems developed for therapy applications mainly consisted of a 'Wizard of Oz' setup, where one person operates the robot remotely. Begum et al. (2016) conducted a review to assess the status of robot-mediated interventions as evidence-based practices. According to the authors, the use of well-defined inclusion criteria was uncommon, many studies utilised custom-made therapies, the variables considered often lacked direct social significance, and most studies did not conform to standard research designs or had limited sample sizes. Pennisi et al. (2016) reported that individuals with ASD exhibit social behaviours towards social robots, making them effective mediators in interactions between individuals with ASD and others. The authors also suggested that social robots are effective attractors for individuals with ASD, with one promising application being the reduction of repetitive and stereotyped behaviours. However, their review also highlighted that robots could sometimes act as distractions during task execution. Damianidou et al. (2020) investigated the impact of social robot interventions on enhancing social communication and interaction. In this context, they found that social robots were primarily utilised as agents to elicit behaviour, and most of the studies indicated a positive impact on the development of targeted skills. Specifically, the robot-mediated social interventions targeted behaviours such as joint attention, eye contact, gesture production, gesture recognition, verbal initiations, and positive affect. Saleh et al. (2021) performed a comprehensive review regarding the use of social robots with ASD participants. They searched for articles published between 2008 and 2017 that evaluated the use of robots in the diagnosis, rehabilitation, or education of people with ASD. Their results indicated that NAO was the most used robot and that most studies aimed to improve learning skills. They also proposed ten subcategories of studies according to their purposes (see Saleh et al., 2021 for further details). Raptopoulou et al. (2021) reviewed the literature regarding the use of robots to develop social skills and communication. 
According to the authors, most of the identified studies used anthropomorphic robots, had small sample sizes, did not include follow-up sessions, or did not have a control group. Salimi et al. (2021) conducted a systematic review of randomised controlled trials to determine whether social robots could outperform traditional methods. They indicated encouraging results regarding robot-mediated interventions to prepare individuals with ASD for job interviews and to teach gesture production and identification. The authors also reported that robots mainly increase engagement, a capacity that may diminish as participants become accustomed to the robot. Sani-Bozkurt and Bozkus-Genc (2023) performed a systematic review regarding the use of robots to improve joint attention skills in individuals with ASD, indicating positive responses from participants with ASD to social robots during the interaction. However, given the methodological limitations of the identified studies, they could not draw definitive conclusions.

In summary, recent reviews of the scientific literature concur on the positive impact of using social robots for individuals with ASD. While these reviews provide valuable insights, most of them primarily focus on describing objectives, types of variables, intervention outcomes, elicited behaviours, and study structure/design. We believe that while these aspects are important, there is still a need to offer critical considerations regarding the use of social robots in the ASD population. This would primarily serve to assist clinicians in selecting and implementing social robots for individuals with ASD. For example, it is essential to determine which behaviours should be assessed during interactions with the robot, whether there are relevant factors to consider, and what recommendations professionals make after using a robot with individuals with ASD. For instance, a brief review indicated that robot therapy could be effectively integrated into the care of low-functioning individuals with ASD (Conti et al., 2020), shifting the focus towards user characteristics. Given the extensive literature available, it is worth exploring additional aspects that may be useful for clinical purposes. Furthermore, since the last critical review by Diehl et al. (2012), the literature has expanded considerably, highlighting the need for further critical consideration.

The present study aims to critically review, from a clinical point of view, the recent literature regarding the use of social robots in the care of patients with ASD. The objectives are to systematically examine the articles, filtering out work that is solely technology-focused, and then to provide a comprehensive analysis of the clinical applications, highlighting the opportunities, advantages, and disadvantages of current solutions in order to support practitioners in adopting these promising technologies in daily clinical settings. This work specifically aims to address several key questions: 1) the dimensions related to the ASD user that can be analysed during the interaction with the robot, which can support clinical assessment and follow-up; 2) the advantages of using social robots in the interaction with ASD patients; 3) newly proposed uses in clinical or home settings for these robots; 4) suggestions and insights from clinical staff who have experience using these robots; 5) the various areas of intervention that have been considered to date.

Method

We conducted a systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA, Moher et al., 2009) guidelines. The search strategy covered the period up to 31 December 2022. We searched for papers published in Scopus, Web of Science, ScienceDirect, IEEE Xplore, Association for Computing Machinery (ACM), and PubMed using the keywords 'autism' AND 'robot'. No restrictions were placed on text availability, publication date, or article type. We considered peer-reviewed papers eligible for review. Conference papers were also eligible since they could introduce emerging technology and new implementations. Our search yielded a total of 2797 articles, and after removing 1262 duplicates, 1535 articles remained. Figure 1 shows the flow diagram.
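As an illustrative aid only, and not a description of the authors' actual procedure, the following Python sketch shows how records exported from the six databases could be merged and deduplicated by DOI or normalised title before screening; the file names and column labels are hypothetical.

```python
# Illustrative sketch only: merging database exports and removing duplicates
# before title/abstract screening. File names and column labels are hypothetical.
import pandas as pd

EXPORTS = [
    "scopus.csv", "web_of_science.csv", "sciencedirect.csv",
    "ieee_xplore.csv", "acm.csv", "pubmed.csv",
]

def normalise_title(title: str) -> str:
    """Lower-case a title and keep only alphanumeric characters for matching."""
    return "".join(ch for ch in str(title).lower() if ch.isalnum())

# Merge all exported records into a single table.
records = pd.concat([pd.read_csv(path) for path in EXPORTS], ignore_index=True)
records["title_key"] = records["title"].map(normalise_title)

# Deduplicate on DOI where available, otherwise fall back to the normalised title.
has_doi = records["doi"].notna() & (records["doi"] != "")
records["dedup_key"] = records["doi"].where(has_doi, records["title_key"])
unique_records = records.drop_duplicates(subset="dedup_key", keep="first")

print(f"{len(records)} records retrieved, "
      f"{len(records) - len(unique_records)} duplicates removed, "
      f"{len(unique_records)} records left for screening")
```

In practice, reference managers or dedicated screening tools perform the same matching step; the sketch merely makes the DOI-then-title matching logic explicit.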

Fig. 1

PRISMA flow diagram of study selection

Abstract Screening

One author screened the titles and abstracts of the identified studies. During this phase, studies were selected according to the following inclusion criteria: a) the study must focus on autism; b) the study must evaluate the effectiveness of one or more social robots in diagnosis, study, and/or rehabilitation; c) the study must include at least one experiment, quasi-experiment, pilot study, observational study, feasibility study, or clinical trial; moreover, we also considered studies that provided guidance regarding the design of the robot as adequate; d) the study must include at least one sample consisting solely of individuals with ASD; e) the robot must be physically present and actively used within the study; f) the paper should not primarily focus on robot capabilities and descriptions; g) the paper should be published in a peer-reviewed journal or conference; h) conference papers should be published between 2017 and 2022 (we chose a relatively wide time range of 6 years considering that the pandemic might have slowed down the publication process); i) the study should be in English; j) the paper should not be a review, a survey, a book chapter, a conference proceeding, or an editorial letter.

As a result, 1031 articles were excluded, and the remaining 504 were considered against the eligibility criteria.

Eligibility Criteria

After completing the screening of titles and abstracts, we considered articles according to five categories: Intervention Studies, where the use of robots aimed to improve skills; Experimental Studies, which tested a research hypothesis; Assessment Studies, where the interaction with the robot allowed researchers to measure a specific dimension of the participants; Design Studies, where experts in the autism field provided feedback aimed at improving the robot after its use; and Feasibility Studies, which preliminarily tested a new proposal in the clinical context.

Additionally, we excluded studies that did not comprehensively report results or setting descriptions, or whose limited results hindered the assessment of effectiveness, as well as those consisting primarily of project descriptions. To ensure result accuracy, we excluded studies that did not report participant age (either as a numerical range or as means and standard deviations), gender, and diagnosis. We also excluded studies that did not report effectiveness assessed through statistical tests, effect sizes, or accuracy outcomes. Furthermore, we excluded studies that primarily focused on programming classes (e.g., Wainer et al., 2010).

In the case of feasibility and design studies, the absence of statistical tests or effect sizes did not imply exclusion. However, to align with our goal of identifying new potential applications tested in patients' actual daily contexts, we excluded feasibility studies that did not specify the context (home or clinic). Moreover, we only included design studies in which experts in the autism field provided feedback after clinicians used the social robot with patients diagnosed with ASD.

As a result, we excluded 329 articles, leaving us with 175 articles for analysis.

Results

Potential Use of Social Robots in Detecting Characteristics and Behaviours of ASD

The systematic literature search yielded 24 articles that focused on the interaction between robots and individuals with ASD to measure behavioural variables and provide outcomes to support clinicians. Arent and colleagues (2022) recommended using interactions with a robot to assess turn-taking behaviours in children with ASD. In their study, children engaged in two 5-min games with the NAO robot after an initial familiarisation phase. Video recordings of these interactions were later analysed by professionals who assessed turn-taking behaviours using a predefined observational scale. The study's findings indicated that children with ASD exhibited lower levels of turn-taking behaviours compared to typically developing (TD) children. Alnajjar et al. (2021) employed a NAO robot (Aldebaran/SoftBank) equipped with a chest-mounted mobile phone to detect attention and various emotions (neutral, happy, surprised, angry, and sad) during a five-minute interactive dialogue with a sample of 11 male children with ASD. The robot operated autonomously, with the therapist intervening in potentially harmful situations. Alnajjar et al. reported that the system effectively measured attention, providing numerical attentional levels. Two studies conducted by Javed and colleagues (2020a, b) used video recordings of interactions between children with ASD and typically developing children and two robots, Robotis Mini and Romo. These interactions occurred during a 10-min session within a sensory maze, creating opportunities for engagement with the children. The authors utilised these interactions to assess the risk of ASD and the level of engagement among participants. The detection of the risk of ASD involved assessing facial expressions and movement patterns, while engagement was determined through eye gaze, vocalisations, and smiles. In their 2018 study, Moghadas and colleagues aimed to differentiate between individuals with ASD and those with TD by analysing the individual interactions of 8 children with ASD and 8 TD children with RobotParrot. These interactions were recorded using two cameras, and RobotParrot was remotely controlled to perform communication actions during a 190-s session with the children. Based on the child's spatial positioning during the session, the system achieved an accuracy rate of 81.3%. The study conducted by Kumazaki et al. (2019a) employed two robots to assess both the diagnosis of ASD and its severity. The protocol, implemented with two CommU robots, involved a script based on the ADOS birthday party scene (Lord et al., 2000) administered to 15 children with ASD and 19 TD children. The two robots autonomously executed the scripts while parents were allowed to stay in the room, and an external clinician observed the tasks, assigning scores based on the child's behaviour. Results indicated that, despite the procedure appearing suitable for assessing autism severity, it yielded poor outcomes as a diagnostic tool. Another approach to screening for autism was proposed by Dehkordi et al. (2015). They recorded video interactions, both individual and group, involving 35 individuals with ASD and RobotParrot. Individual sessions lasted 8–12 min, while group interactions extended to 20–30 min. An expert evaluated the recordings based on the criteria from the DSM-IV-TR (American Psychiatric Association, 2013), assessing aspects of social interactions, communication, and repetitive behaviours. 
The authors suggested that this method could effectively screen for high-severity ASD with a high level of confidence. In the last study aiming to assess ASD severity, 12 individuals with mild or minimal severity, as assessed by the Childhood Autism Rating Scale (CARS) (Schopler et al., 1980), interacted with two NAO robots (Ali et al., 2021). The participants were recorded by a Kinect, with the operator in a separate room. The two robots interacted with each participant, using signals to capture their attention. Additionally, if a participant focused their gaze on one robot for 5 s, the robot initiated an imitation task, as previously described in a study by Mehmood et al. (2019). The system detected autism severity (mild or minimal) with 76% accuracy using measures of attention and imitation accuracy. Baird et al. (2017) analysed audio data from 14 individuals with ASD from the DE-ENIGMA dataset (Shen et al., 2018) who underwent emotion recognition training led by a human or a robot. Guided by the Social Responsiveness Scale, Second Edition (Constantino & Gruber, 2012), Baird and colleagues classified most of the vocalisations made by the children. Another study (Amiriparian et al., 2018) recognised echolalic patterns in 15 participants with ASD using the DE-ENIGMA dataset, achieving an accuracy of 83.5%. Rudovic et al. (2018a, b, 2019) conducted a series of studies to measure the engagement of children with ASD. They recorded video and audio during various tasks with NAO and a set of emotion-representing cards. The therapist controlled the robot in the same room. Participants had to associate the cards with the emotions expressed by the robot, imitate emotions, and describe how they would feel after listening to a story. Each task was completed within a 25-min session. The authors estimated children's engagement using multimodal data, including face, body, autonomic physiology, and audio data, yielding encouraging results. One study (Zhang et al., 2019a) used a robot to assess theory of mind in 20 children with ASD and compared their performance with that of 20 TD children. A NAO robot and a therapist performed two false belief tasks with each child. Afterwards, the child was asked about the robot's beliefs in the presented context. The study revealed that participants with ASD exhibited lower accuracy compared to the TD children. Del Coco et al. (2018) measured the affective states of 8 participants with ASD while the robot Zeno displayed videos on a tablet. Petric et al. (2017) proposed an algorithm for recognising gestures during an autonomous imitation task. The algorithm produced many false positives, highlighting the need for improvements. Another study utilised a multicamera array to record the interactions of 29 children with ASD and 16 children with other conditions with the robot ONO during a 5-min joint attention task (Ramírez-Duque et al., 2020). This task could be conducted by the therapist or autonomously by the robot, with the therapist introducing the robot and remaining in the room. The study aimed to measure participants' joint attention, eye contact, and adult seeking, achieving agreement with an external judge of 67%, 76%, and 79%, respectively. Di Nuovo et al. (2018) studied the use of the robot to identify children's attention during therapy. They analysed recordings of a NAO robot embedded into the daily therapy of six children with ASD (Conti et al., 2021). 
In this context, each participant performed imitation tasks with the robot while the therapist was nearby. Using recordings from the frontal camera of the robot, the authors accurately detected children's visual attention during the imitation training in a clinical setting. Another study, proposed by Cai et al. (2018), aimed to extract data from multiple sources and provide outputs to the therapist. This system used a NAO robot, two Kinects, and three cameras. The system could recognise the child's actions, emotions (positive or negative), and gaze during joint attention, imitation, and turn-taking interventions, thus assessing their performance. The findings indicated that the system was suitable for measuring variables of interest and reporting intervention scores. In another identified study, the goal was to recognise actions and emotions performed by children with ASD during a therapy session (Marinoiu et al., 2018). The system used data from the DE-ENIGMA dataset and assessed positive or negative emotions, as well as various actions that could occur during therapy, such as pointing, grabbing a card, high-fiving, or waving. Finally, Fassina and colleagues (2022) introduced an algorithm to recognise various total-body gestures in real time during a therapy protocol. This algorithm was subsequently employed to recognise the gestures of participants with ASD online during a robot-mediated intervention (Ivani et al., 2022).

Evidence from Experimental Studies

Our review identified 69 experimental studies investigating the benefits and limitations of using social robots for individuals with ASD, offering valuable insights into their applications. These articles are categorised into subsections based on their outcomes.

Strengths of the Robot in the Interaction with Patients with ASD and Considerations Regarding its Features

The studies considered within the review have assessed the interaction of individuals with ASD with social robots, highlighting significant advantages and considerations. Mehmood et al. (2021a) proposed that children with ASD can become more engaged in active interactions, where the robot responds based on environmental data. Engaging children in familiar activities can enhance participation (Rakhymbayeva et al., 2021). Furthermore, tailoring sessions to align with their individual play preferences can increase engagement, as observed by Telisheva et al. (2022). It has also been suggested that there may be an optimal sequence of interaction objectives, as indicated by Baraka et al. (2022). Ensuring the robot's behaviours are predictable can benefit attention, as high variability in speech, motion, and responses might lead to reduced attention levels over time (Schadenberg et al., 2021). However, it is worth noting that a previous study found no significant differences between contingent and non-contingent robot actions (Peca et al., 2015). Children with ASD tend to follow instructions more than tasks, at least in an imitation paradigm (Arent et al., 2019). One study indicated that individuals with ASD can engage in activities involving multiple robots (Mehmood et al., 2019). Regarding the robot's appearance, one study suggested that a humanised appearance and congruent intonation tend to evoke more positive feelings. However, this aspect does not appear to influence task performance (van Straten et al., 2018). Modulating a robot's facial expressions may enhance a child's accuracy in recognising emotions (Askari et al., 2018). Additionally, stimuli from the robot have the potential to improve engagement (Li et al., 2020a), and one study reported that speech stimuli capture attention more quickly than visual stimuli (Mehmood et al., 2021b). However, this consideration should be taken with caution, as it has been reported that individuals with ASD could also be more responsive to visual or motion stimuli according to their severity (Ali et al., 2020a). The work of Jouaiti and colleagues (2022) preliminarily indicates a positive effect due to the sound emitted by the robot's actuators. Studies have indicated that interacting with social robots has the potential to reduce stress levels (Bharatharaj et al., 2017a, b). Furthermore, a separate study proposed the use of huggable robots, suggesting that tactile seeking may help reduce stress levels (Kumazaki et al., 2021a). In this context, it is worth noting that huggable robots do not appear to enhance the performance of children with ASD. However, they have shown the potential to elicit heightened emotional responses and foster increased social interaction (Pinto-Bernal et al., 2022). Social robots could foster interaction with others, promoting collaboration during triadic interactions and in subsequent dyadic interactions with others (Wainer et al., 2014a). For instance, during a free-play scenario, children with ASD successfully used the robot as a mediator to interact with others (Giannopulu, 2013). Another study compared triadic interactions of children with ASD, a confederate adult, and an interaction partner, who could be a human adult, a computer game, or a social robot. Results indicated that children with ASD directed more speech to the confederate when the interaction partner was the robot; they also tended to talk more (Kim et al., 2013).

Aspects Related to the ASD User Influencing the Interaction

Articles have indicated that characteristics related to individuals with ASD can have implications for their interactions with robots. Indeed, individuals with ASD may exhibit different reactions when facing a robot. Short et al. (2017) found that children might view the robot as an engaging object that provides an enjoyable sensory experience or as an agent that elicits social behaviours. However, the most significant characteristic to consider is the symptom severity of these individuals. Autism severity is associated with a preference for the type of stimuli used in a reinforcement paradigm, with preferences for visual versus physical stimuli differing by severity level (Ali et al., 2021). It also appears that different levels of severity are linked to varying levels of engagement during interactions with the robot, as children with more severe symptoms exhibit fewer instances of eye gaze and facial expressions (Ahmad et al., 2017). Individuals with higher severity levels also display lower levels of positive affect during these interactions (van den Berk-Smeekens et al., 2020), though they may experience improved engagement across sessions (Telisheva et al., 2022). Additionally, it seems that individuals with higher severity levels tend to prefer humanoid robots over those perceived as mechanical or mascots, as reported by Kumazaki et al. (2017a). However, as Kumazaki et al. noted, this preference might be due to the perceived high technical level of the robot. One study indicated that individuals with ASD exhibiting fewer symptoms and better language skills tend to initiate more interactions with both the robot and adults when prompted by the robot (Schadenberg et al., 2020). Indeed, when comparing individuals with ASD who are verbal to those who are non-verbal, the verbal group consistently tends to display higher engagement and lower levels of aggression toward the robot (Sandygulova et al., 2022). It is worth noting that Amirova et al. (2022) suggested that parental presence can significantly influence children's behaviour based on their verbal abilities. Specifically, Amirova et al. found that verbal children are more compliant with therapist and robot instructions when their parents are absent, while non-verbal children tend to be more engaged and less aggressive when their parents are present. In addition to studies examining symptom severity, other studies have explored sensory aspects. Children with ASD who exhibit hyporeactivity to visual stimuli and an overreliance on proprioceptive stimuli are likely to experience more difficulties during imitation tasks. They tend to look less at their partner and carry out fewer successful imitations, suggesting that robot-mediated interventions should consider this characteristic (Chevalier et al., 2017). Visual sensitivity is linked to joint attention outcomes in children with ASD during robot-assisted therapy, with those lower in visual sensitivity showing greater improvement. Conversely, higher sensitivity to hearing is associated with better outcomes when combining standard therapy with robot activities, potentially due to heightened responsiveness to the robot's motor sounds (Chevalier et al., 2022).

During interactions with robots in mock interviews, participants with ASD (aged between 13 and 35 years) with higher sensory sensitivity tended to find it easier to converse with an android robot with limited motion (Kumazaki et al., 2022a). Finally, one study reported effects related to children's age, indicating that preschool-aged children with ASD might rate the robot as less likable compared to school-aged children with ASD (van den Berk-Smeekens et al., 2020). Sandygulova et al. (2022) reported similar findings and indicated that comorbid conditions, such as ADHD, could also influence the interaction.

Cultural Differences

Several studies evaluated differences in interaction attributable to cultural factors. One study found no differences between Japanese and French children with ASD during the interaction with the robot, in contrast to what occurred during the interaction with a human (Giannopulu et al., 2020). Another study compared Asian and Serbian children's interactions with the robot, indicating differences in engagement displays (Rudovic et al., 2017). A further study involved Serbian and English teachers and reported that using the robot during activities with children with ASD affected their touch styles (Li et al., 2020b).

Differences Between ASD and TD During the Interaction with the Robot

Some studies evaluated differences between individuals with ASD and TD individuals related to the interaction with the social robot. In Barnes et al. (2021), participants with ASD showed higher levels of engagement and attention than the control group during a dancing activity with the robot. In contrast, another study indicated that, during an interaction task, children with ASD showed fewer responses to joint attention inductions, spent less time gazing toward the target, and presented greater spatial displacement (Anzalone et al., 2019). Moreover, children with ASD exhibited more maladaptive behaviours when interacting with the robot, understood as avoidance or distractions, socially inappropriate actions, and demands (Costescu et al., 2016). One study involved a sample of adolescents with ASD interviewed by a robot and indicated that, compared to a control group, they preferred interacting with the robot, as reflected in their reported comfort levels, and demonstrated lengthier disclosure (Kumazaki et al., 2018a).

Differences Between the Interactions with Social Robots and with Human Agents

Another category of studies investigated the differences in individuals with ASD when interacting with a robot versus a human. In Cao et al. (2020), participants with ASD showed better performance with a human partner than with a robot during a joint attention task, confirming a previous study (Anzalone et al., 2014). This finding aligns with another study in which the robot induced a lower fixation time on a target, and children with ASD exhibited more interest in the robot (Cao et al., 2019). Similarly, Ghorbandaei Pour et al. (2018) found that, compared to a robot, participants performed better with a human agent during a facial imitation task, while another study (Taheri et al., 2021a) showed similar results between the two conditions in an imitation task. These studies suggest that human agents yield better performances. However, other studies present contrasting results. For instance, one study indicated that children with ASD performed better during interactions with a robot in a joint attention task (Kumazaki et al., 2018b). Another study revealed that, during the presentation of social prompts, children responded more to a robot partner than to a human (Kumazaki et al., 2019b). Additionally, research conducted by Trombly and colleagues (2022) demonstrated that children with ASD exhibited similar learning behaviours when instructed by humans and robots in a group classroom setting. Thus, the results regarding better performance seem contradictory. Other studies evaluated behavioural differences in participants with ASD between the two types of agents. One study indicated that, during an imitation task, participants with ASD directed more attention to the robot and exhibited fewer stereotyped behaviours (Costa et al., 2018). Furthermore, three studies indicated that individuals with ASD tend to focus more on the robot than a human, showing a particular preference for the robot's eye area (Bekele et al., 2013, 2014; Yoshikawa et al., 2019). Other studies suggested that children with ASD showed higher attention levels (Warren et al., 2015a, b), as well as a visual and contact preference for the robot over the therapist (Ramírez-Duque et al., 2020). Another study indicated that children with ASD reported better feelings during interactions with a robot compared to interactions with a person (Giannopulu et al., 2018; Giannopulu & Watanabe, 2016). Despite discordant results regarding performance, these studies concur that children with ASD tend to be more entertained by and interested in the robot than in a human partner (Wainer et al., 2014b). In line with this, Šimleša and colleagues (2022) found that children with ASD exhibited performances comparable to TD children in an imitation task. However, it was noteworthy that children with ASD displayed heightened focus and interest when interacting with robots. Other evidence supports this consideration; indeed, introducing a robot to a child with ASD has a more positive effect than introducing a new person (Bharatharaj et al., 2018). Moreover, the social robot appears to facilitate verbal and emotional expressions (Giannopulu et al., 2016) and elicit more collaborative play when compared to a human agent (Pop et al., 2014).

Three studies specifically involved children's teachers. Among these, one study (Fachantidis et al., 2020) found that children with ASD made more eye contact and engaged in more interactions with the robot than with the teacher. Another study (Pliasa & Fachantidis, 2019) suggested that children with ASD demonstrated better performance and higher participation in activities when interacting with the robot. Finally, Huijnen et al. (2021) reported that participants with ASD displayed more non-verbal imitations, engaged in more physical contact, and paid greater attention to the robot than to the teacher during the sessions.

We identified only two studies that considered adolescents with ASD. These studies indicated that adolescents preferred interacting with a humanoid robot interviewer over a human (Kumazaki et al., 2018a) and followed the robot's gaze more closely (Yoshikawa et al., 2017). In a more recent study by Kumazaki and colleagues (2022b), individuals diagnosed with ASD (mean age 21.7 years, SD = 5.1) tended to engage in greater self-disclosure when exemplification was provided by an android rather than by humans, especially when discussing negative subjects.

Additional Considerations

In addition to the experimental studies already presented, the review identified other studies that provide meaningful insights but do not fall directly within the subsections presented above. Pop et al. (2013a) used a robot to deliver social stories to children with ASD, indicating better outcomes when compared to a computer-based presentation. Another study compared the calming effect of a live dog and a robot dog and suggested greater calming effects for the former (Silva et al., 2018a, 2019). Zantinge et al. (2019) used a social robot to induce fear in children with ASD and a control group by having the robot approach participants while emitting noise and moving its arms. Results revealed that both groups presented an increase in arousal levels, with no differences between the two groups. Lastly, it is worth noting that one study suggests that, following interactions with a robot, children tend to develop a greater appreciation for it, while parents' acceptance and overall user experience with robots may decrease (Zehnder et al., 2022).

Robot Design Suggestions from Trials in the Clinical Context

Within this category, we identified three articles where clinical professionals provided feedback and suggested improvements after testing the robot within a clinical setting. Elbeleidy (2021) and Elbeleidy et al. (2021) analysed therapists' dialogues and interactions performed through an interface used to teleoperate a social robot during interventions. Their results suggest that the interface should be adapted to allow the clinician to quickly switch between the many phases that characterise the intervention sessions. Moreover, they suggested providing a dedicated view for content to establish rapport with the child and easily provide feedback. Sochanski et al. (2021) recruited ABA therapists, trained them in teleoperating Pepper using a virtual reality interface or the software Choregraphe (Pot et al., 2009), and allowed them to independently design an intervention, which was then delivered to a child with ASD during a single session. Afterwards, they interviewed the therapists, and the resulting transcripts were analysed. On design aspects, therapists reported concerns about timing and responsiveness in prompt delivery, as well as the need to deliver physical prompts and to adapt during problematic behaviours. On the use of virtual reality, they reported a lack of awareness, preferring Choregraphe for more precise control of the robot's movements. Moreover, they indicated that the robot could reduce their workload by automating repetitive interventions.

Potential Uses Highlighted by Feasibility Studies in the Clinical Field

The systematic review identified 21 articles according to the criteria. The analysis of these studies indicates new perspectives on using social robots within clinical and home-based settings. Starting with the most recent study, Tobar et al. (2021) proposed a portable robotic kit for reinforcement therapy to teach gestures in a home setting. The kit is highly portable, and a dedicated smartphone app provides assembly instructions tailored to specific intervention purposes. The study was conducted in a rehabilitation structure, where participants with Level 1 ASD reported 100% satisfaction, while those with Level 2 and Level 3 ASD reported 50%. Beaudoin et al. (2021) proposed combining NAO with a wearable haptic device to facilitate transitions between activities for individuals with ASD. The haptic device sends vibration cues before the therapist's instructions, while the robot performs a verbal script to announce the transitions. Their preliminary results indicated good responses. Another study introduced a teleoperated robot in a Serbian autism centre and a school in the UK (Li et al., 2020c). In this study, 31 minimally verbal children with ASD controlled and responded to the robot using a tablet in activities aimed at teaching face features and emotions. The robot was also monitored and controlled by an adult in the same room as the child. Results indicated that the system was accessible to this specific clinical group. Lytridis et al. (2020) had two robots in a therapy room interact with each other to perform tasks such as greetings, following the rhythm of music, using idioms, and identifying basic emotions. Results obtained from one participant indicated that a multi-robot approach seems suitable and could improve the skills considered. An interesting study evaluated the long-term deployment of the social robot Kaspar (Dautenhahn et al., 2009) in a nursery for children with ASD, without the presence of the researchers (Syrdal et al., 2020). Staff members and volunteers were trained to use the robot in their daily activities across different scenarios. Participants were children ranging from 2 to 6 years old; they took part in the study for an average of 16.35 months. Results were encouraging, as Kaspar was used regularly and its presence resulted in positive outcomes. In the study by Lemaignan and colleagues (2022), Pepper was deployed in a Special Educational Needs school for children with ASD for three weeks, resulting in successful integration, consistent interactions with a significant group of children, and positive outcomes from most of the children and professionals. The study of Desideri et al. (2020) proposed a four-step model (Goal setting; Activity identification and development; Trial and implementation; and Follow-up) to co-develop robot activities in order to facilitate robot usage in mainstream contexts such as schools. A single-case study indicated positive outcomes in the use of the proposed model. Di Nuovo et al. (2020) used two robots, NAO and MiRO, to introduce and prepare patients with ASD for clinical procedures by simulating those procedures with the robots. According to their results, most participants enjoyed the interaction and showed compliance, suggesting that social robots could be supportive tools to prepare children with ASD for clinical situations. The study by Ishak et al. (2019) evaluated Rero (Cytron Technologies Sdn Bhd) in helping children with ASD imitate actions, follow instructions, name objects, and match colours. 
Their study involved two centres for autism, and the children with ASD showed positive levels of engagement. Sandygulova et al. (2019) conducted a study in a rehabilitation centre, performing a series of sessions with a robot over 21 days with hospitalised children with ASD and ADHD. They refined the robot across the sessions according to observations and the feedback of therapists and parents, and this approach led to positive outcomes. A series of studies (Clabaugh et al., 2018, 2019a; Pakkar et al., 2019) aimed to improve the math skills of children with ASD by deploying an autonomous robot in their home environment. The home-based intervention lasted for about a month, and the robot gave instructions and feedback tailored to the child's proficiency. Wood et al. (2017, 2018) evaluated Kaspar for improving visual perspective-taking in children with ASD. Silva et al. (2018b) proposed the social robot ZECA combined with an object-based playware technology, PlayCube, in an emotion recognition game. PlayCube is a haptic device that participants can manipulate to complete the task conducted by the robot. Participants with ASD responded positively to the proposed activity. Another study evaluated the usefulness of the robot KIBO for teaching children with ASD coding skills through a series of activities. Results indicated the robot was engaging, but children with severe symptoms tended to perform the activities individually (Albo-Canals et al., 2018). Zaraki et al. (2018) evaluated the use of a system called Sense-Think-Act in a school setting with four children with ASD. The system was developed to make the robot a semi-autonomous social agent, that is, able to autonomously interact with children during activities under the supervision of a human operator. The study indicated the potential usefulness of the proposed system. Jimenez et al. (2017) investigated the feasibility of using a robot in a collaborative activity with three children with ASD while the teacher observed. Specifically, to encourage collaborative learning, the robot was programmed to give incorrect answers to pre-programmed questions and then to ask the child to teach it. The study indicated the feasibility of the proposed method. Golestan et al. (2017) conducted a feasibility study in a treatment centre where children with ASD verbally controlled the robot Sphero. During a short session of 15 min, participants showed interest in its use. Simut et al. (2016) evaluated the use of Probogotchi to set up a play environment between a child with ASD and her sibling, suggesting its potential to encourage social interactions.

Interventions with Social Robots

Our review identified 58 articles that used social robots to improve the skills of individuals with ASD, and most of these studies indicated positive results. The literature has already reported that intervention studies present a wide heterogeneity of measures and methods, and a detailed description of the proposed interventions is beyond the scope of this review. Instead, we provide an updated overview of the areas considered by interventions, identifying the studies with high impact in the literature. Additionally, we focus our attention on studies that have attempted to introduce the robot into long-term ecological settings, representing a situation closer to systematic and widespread social robot use by professionals.

Robot-mediated interventions have considered several areas; Table 1 presents a summary of the areas of intervention and related studies. A graphical representation of interventions by date and impact was generated using Litmaps (https://www.litmaps.com) and is presented in Fig. 2. In this figure, each identified article is depicted as a node, positioned based on its publication year and a logarithmic function of its number of citations. Lines connecting the nodes indicate whether these articles have been cited in other publications. This allowed us to identify the most impactful articles for each year. An interactive version is also available at the following link (https://app.litmaps.com/shared/8fa32129-f271-4e31-834b-45b0fb15aec3). The identified studies cover interventions related to imitation, communicative skills, attention and engagement, social skills, theory of mind and perspective-taking, emotion recognition and expression, restricted interests and repetitive behaviours, body awareness and knowledge, cognitive flexibility, non-verbal communication, vocational skills, developmental and daily living skills, parental stress, and motor coordination.

Table 1 Skills considered in robot interventions
Fig. 2

This is a graphical map representing the identified articles related to interventions for people with ASD mediated by a robot. Each article is represented by a node, and they are arranged based on the date of publication (x-axis) and a logarithmic function of the number of citations (y-axis). Lines connect the articles to indicate references between them
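As a rough illustration only, and not the Litmaps implementation, the sketch below shows how such a citation map could be drawn in Python with networkx and matplotlib, placing each article at (publication year, log(citations + 1)) and linking citing pairs; the example records and citation links are hypothetical placeholders.

```python
# Illustrative sketch of a citation map in the style of Fig. 2:
# x = publication year, y = log-scaled citation count, edges = citation links.
# The article records and links below are hypothetical placeholders.
import math
import matplotlib.pyplot as plt
import networkx as nx

articles = {
    "ArticleA2012": {"year": 2012, "citations": 250},
    "ArticleB2018": {"year": 2018, "citations": 180},
    "ArticleC2019": {"year": 2019, "citations": 60},
    "ArticleD2020": {"year": 2020, "citations": 45},
}
citation_links = [
    ("ArticleB2018", "ArticleA2012"),
    ("ArticleC2019", "ArticleA2012"),
    ("ArticleD2020", "ArticleB2018"),
]

graph = nx.DiGraph()
graph.add_nodes_from(articles)
graph.add_edges_from(citation_links)

# Node position: publication year on x, log10(citations + 1) on y.
pos = {name: (meta["year"], math.log10(meta["citations"] + 1))
       for name, meta in articles.items()}

nx.draw_networkx(graph, pos=pos, with_labels=True, node_size=400,
                 node_color="lightsteelblue", font_size=8, arrows=True)
plt.xlabel("Publication year")
plt.ylabel("log10(citations + 1)")
plt.tight_layout()
plt.show()
```

The logarithmic y-axis compresses the gap between a handful of highly cited articles and the many recent, sparsely cited ones, which is what makes the most impactful article per year visually identifiable.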

Among these, Clabaugh et al. (2019b) performed a long-term intervention in children's homes lasting a mean of 41 days, with a minimum of 30 days. After the system setup, the research team instructed the family on how to use the robot, which was fully autonomous, encouraging them to conduct five weekly sessions. The robot could tailor the intervention based on the child's feedback. The authors reported better outcomes in math reasoning and numerical operations. Scassellati and colleagues (2018) performed a long-term home-based intervention to improve social skills. Twelve families with children with ASD participated in and completed the intervention. It lasted 30 days, with 30-min daily sessions in which the social robot Jibo (Jibo Inc.) autonomously conducted a daily social skills game. Games proposed by the robot consisted of a story game where the child needed to understand social situations and the characters' emotions, two virtual barrier games designed to improve perspective-taking, and ordering or sequencing games. Their results indicated that children maintained their engagement after one month of sessions, with improvements in joint attention and social behaviours directed toward others and themselves.

The most impactful article identified is the study of Tapus et al. (2012), which used a single-subject ABAC design with four children. Their NAO robot could mirror the movements of a person in front of it. This feature was used as a reward system to encourage motor initiation towards the robot. After a familiarisation phase, the experimenter showed this feature to the participant. Children interacted autonomously with the robot, while the experimenter supervised the interaction and could offer prompts. In the last phase, the experimenter showed a movement awaiting the child's imitation, which the robot mirrored in return. The same protocol was also performed by a human instead of the robot. Results indicated that two participants were more engaged with the NAO robot than with the human, and another child performed more motor initiations with the robot than during a baseline task. Among the most recent and impactful studies, Ali et al. (2019) performed an intervention targeting joint attention and imitation; their study is of particular interest as they implemented a multi-robot system to introduce children to multi-person communication. Zhang et al. (2019b) implemented a robot intervention to teach children with ASD complex social rules by having them perform tasks regarding distrust and deception. Their study indicated that children with ASD could present more difficulties in learning these social rules than a TD group. So et al. (2019a, b) proposed a robot-drama intervention to improve the social skills of children with ASD, reporting better outcomes regarding narrative skills. David et al. (2020) performed single-case experiments focused on children's ability to wait for their turn. Each child performed a collaborative activity with the robot in which they had to match expressions appearing on a touchscreen device to their respective categories when allowed to do so. The robot was remotely controlled and gave feedback during the task. Although David et al. found a similar effect between robot and human intervention, children seemed more interested in the robot. Marino et al. (2020) implemented NAO as a co-therapist to improve children's emotional comprehension. While performing the intervention, the therapist controlled NAO through an iPad, and the robot provided prompts using body movements and verbal scripts. In this way, the robot helped the therapist interact with children during the intervention by providing prompts and feedback and maintaining motivation and attention. Results indicated better outcomes after the proposed intervention. The study by Ghiglino and colleagues (2021) suggested that combining standard therapy and robot therapy could provide greater improvements in social skills than standard therapy alone. In their study, children faced the robot Cozmo and two cubes that could light up. During the intervention, the robot looked at one of the cubes, and the therapist then asked the participants to indicate where it had looked. The robot also gave feedback regarding the correctness of the answer. Taheri et al. (2021b) used a social robot to perform a series of music games related to imitation, joint attention, and turn-taking. Their results showed improvements, including a reduction in stereotyped behaviours.

Discussion

Potential Use of Social Robots in Detecting Characteristics and Behaviours of ASD

Although there is strong interest in using social robots to evaluate participants with ASD, this interest has largely involved the technical side, while the clinical perspective has been neglected. Indeed, previous reviews have not discussed this topic in relation to clinical aspects, even though the assessment of people with ASD represents a fundamental step for the healthcare surveillance system. Thus, we first provide an initial critical evaluation of these studies. The analysis of data recorded from the interaction between participants with ASD and the robot may support the assessment of several dimensions: attentional levels, engagement, turn-taking, joint attention, affective states, facial expressions, movement patterns, specific actions, imitation accuracy during a task, vocalisations, and the risk or severity of ASD. The accuracy reported in the studies is encouraging, indicating high rates of agreement with external evaluators. The proposed approaches could potentially support screening, performance evaluation, and follow-up evaluations. However, we should note some limitations. Most of the reported studies did not control for novelty effects and were carried out in single brief sessions, limiting generalisation to situations of increased exposure to the robot. Only two studies implemented tasks related to psychometric measures with the robot: Zhang et al. (2019a) performed false belief tasks with ASD and TD children, and Kumazaki et al. (2019a) performed the birthday party scene derived from the ADOS. Although this may indicate consideration of well-known standard clinical measurements implemented within the robot, a critical aspect is the lack of validation of assessments carried out through robots. Indeed, in our review, we did not identify valid and reliable robot-supported psychometric evaluations or any validation studies checking the psychometric properties of a robot-mediated assessment. This represents a major current limitation for using robots in clinical assessment. Even if the proposed assessments may monitor the individual with ASD, most of the outcomes measured are not related to the core symptomatology of the condition. Two studies evaluated the risk of ASD based on clinical measures (ADOS and DSM-IV-TR) according to the evaluation made by experts who observed the interactions between children and robots (Dehkordi et al., 2015; Kumazaki et al., 2019a, b, c, d). We should also note that some studies indicate that movement patterns are promising in identifying the risk of ASD (Moghadas & Moradi, 2018; Javed et al., 2020a, b). Beyond stereotyped behaviours (APA, 2013), individuals with ASD also present a high prevalence of motor impairments (Ming et al., 2007) and coordination deficits (Fournier et al., 2010). Many studies that aimed to identify behaviours utilised multi-camera installations, which could hinder their practical integration, as this would require a dedicated room and significant alterations to the facilities. Robotic assessments are generally not conducted online; instead, the data are transferred to an external computing unit, which requires time for processing. This delay is primarily due to the current limitations of the robot's onboard computing resources. Improvements may not be easily attainable, as they are closely related to costs and battery capacity, necessitating a trade-off.

Evidence from Experimental Studies

The evidence seems to indicate that one strength of the robot could be the predictability of its behaviour, as when this aspect is missing, participants with ASD are less interested in the robot. In addition, the robot is an effective medium for improving interactions with others, reducing stress, and capturing attention. We should note that the number of studies considering these aspects is still limited. We also identified studies that accounted for the characteristics of participants with ASD during the interaction. Among them, autism severity is the most studied. Autism severity influences the quality of the interaction with the robot and is related to different behavioural manifestations. However, we should note that higher severity was associated with less eye gaze, fewer facial expressions, and fewer interactions, in line with what one would expect (Madipakkam et al., 2017; Scheeren et al., 2012). Children with ASD also show more maladaptive behaviours than TD children when interacting with the robot, again as one would expect. The review revealed interesting findings regarding children's verbal abilities. Two studies suggested that children with weaker language skills may have less favourable interactions with the robot (Sandygulova et al., 2022; Schadenberg et al., 2020). However, one study suggested that the presence of parents could moderate this effect. Hence, future research should take into account the verbal skills of the ASD sample to better address this aspect. Based on these considerations, we might conclude that mere exposure to the social robot (treated as a panacea) cannot be expected to reduce the maladaptive behaviours of the individual with ASD. Thus, the knowledge and experience of clinical therapists in ASD are crucial for its proper use. We believe it is relevant to report that one study indicated an association between severity and stimulus preference (Ali et al., 2021). Stimulus preference has also been linked to perceptive sensitivity, as demonstrated by two studies involving children, showing that perceptive sensitivity influences the interaction with the robot (Chevalier et al., 2017) and the outcomes of the robot-mediated intervention (Chevalier et al., 2022). Kumazaki and colleagues (2022a) also indicated that sensory sensitivity affects the interaction with the robot, even in older participants and adults. Atypical sensory experience is included among the diagnostic criteria and occurs in 90% of individuals with ASD (Robertson & Baron-Cohen, 2017). Thus, this is another aspect that requires further consideration, as the robot typically relies on sensory stimuli to attract attention or to interact. Overall, these results indicate that assessing language skills and sensory sensitivity is recommended in studies utilising robots with the ASD population. Regarding culture, the most relevant outcome indicated differences in engagement displays (Rudovic et al., 2017), which we suppose could be unrelated to the robot itself but could suggest the importance of validating behaviour assessments and identification (discussed in the previous section) while accounting for cultural differences. The comparison between ASD and TD peers during interactions with robots reveals that children with ASD are more engaged with the robot, while adolescents with ASD tend to feel more comfortable. Notably, the latter group interacted with an android robot that resembled a human (Kumazaki et al., 2018a). 
Furthermore, two studies indicate that school-aged children with ASD exhibit a stronger preference for robots compared to preschoolers. Although a systematic review of prior literature did not identify an ideal age for using robots with individuals with ASD (Pennisi et al., 2016), these studies collectively suggest that utilising robots may be most suitable starting from school age. Nevertheless, they also emphasise the importance of employing age-appropriate robots. This is supported by van den Berk-Smeekens et al. (2020), who suggested that the tasks implemented with the robot should align with the age-related interests of the participants to enhance likability. Future studies should delve deeper into this aspect. The comparisons between interactions with human and robot agents during a task yielded mixed results in terms of performance. However, studies converge on the evidence that a social robot is more engaging and entertaining than a human agent for children and adolescents. Therefore, we can conclude that this aspect represents one of the main strengths of using social robots for individuals with ASD.

Robot Design Suggestions from Trials in the Clinical Context

Studies related to this topic are limited. The main concern expressed by professionals who use robots in clinical practice focuses on the robot's capacity to adapt to situations by giving timely responses (Elbeleidy, 2021; Elbeleidy et al., 2021; Sochanski et al., 2021; Pot et al., 2009). As mentioned previously, online processing capacity is still limited, and most studies present results analysed offline after transferring the data to an external computing unit. Therefore, we suggest that future studies should consider increasing the autonomy of the robots, especially considering that interaction with individuals with ASD requires flexibility and timely actions (Cooper et al., 2018; Leaf et al., 2014). Autonomy should include predefined behaviours discussed with clinicians.

Potential Uses Highlighted by Feasibility Studies in the Clinical Field

We found several recent feasibility studies, some of which introduced new proposals for social robots in clinical practice for ASD. These studies highlighted various benefits, including easy transportability and service implementation, the use of haptic feedback, adaptability for nonverbal users, multi-robot interventions, both home-based and clinical long-term deployments, models for integrating the robot into mainstream settings, post-deployment refinement, preparing patients with ASD for hospital procedures, employing semi-autonomous robots, and new intervention implementations. These studies offer new insights into the clinical applications of robots, and rigorous methods will help establish their effectiveness in clinical contexts. It is worth noting that social robots have been implemented within healthcare services and everyday settings associated with ASD, emerging as promising tools to enhance the person-environment fit in the daily lives of these individuals (Lai et al., 2020). The study by Syrdal et al. (2020) suggests the feasibility of utilising robots for extended periods in a clinical context without the presence of researchers. The work of Lemaignan and colleagues (2022) focuses on the special education context. Further research should aim to strengthen the evidence of the positive impact of long-term deployments in these two contexts. Long-term home-based deployment studies have moved beyond feasibility and are reported among the intervention studies.

Interventions with Social Robots

Articles reported interventions targeting a wide range of skills, with social skills being of greatest interest. We identified two studies that deployed the robot in home-based natural contexts for long-term intervention, both of which indicated improvements in the targeted skills. This evidence highlights the possibility of implementing the robot in a natural environment without the researchers' physical presence, pointing to a potential mainstream placement of robot-mediated interventions in the daily contexts of individuals with ASD. Since other reviews discuss specific areas of intervention in detail, and for summary purposes, we refer to some of these for further information (e.g. Damianidou et al., 2020; Raptopoulou et al., 2021; Salimi et al., 2021). A notable limitation in current intervention studies is the implementation of robots in highly structured settings. Therefore, studies could expand the available literature regarding robot-mediated interventions by considering more natural approaches. For instance, naturalistic developmental behavioural interventions structure the intervention in natural environments and implement natural activities and reinforcements; this approach has shown positive outcomes in clinical practice. Considering these aspects could improve the generalisation of targeted skills and promote a user-centred approach.

Limitations of this Systematic Review

Since we performed a systematic review, the results depend on the research questions, the search strategy, and the inclusion and exclusion criteria. As we aimed to answer the research questions, and given the number of studies identified, we did not fully address some aspects, such as sample sizes and study designs. However, we should note that these aspects have been discussed in the literature. We did not perform a meta-analysis; future studies should further quantify the effects of our findings accordingly. For instance, it would be interesting to measure whether the highlighted differences between participants with ASD and TD during the interaction with the robot also reach statistical significance, and likewise for the differences between interactions with robots and with a human agent.

Conclusion

This article presents a systematic review and critical analysis of the scientific literature regarding the use of social robots in the care of individuals with ASD. The studies provide a large body of evidence that social robots in clinical settings can serve as successful tools for improving the quality of services provided by clinicians. However, most of the studies focused on the development of the technology and lacked significant clinical evidence. Indeed, we found clinical relevance in only 18% of the articles in our literature search. Therefore, more clinical evidence is needed to validate and confirm the applicability of these results in daily practice. To this end, we encourage new interdisciplinary research with a clinical focus that emphasises service development over technology.

Furthermore, we would like to see more training programmes to support health and care professionals interested in learning and practising how to use robots in their profession. We suggest expanding the offer of Continuing Professional Development courses and introducing specific hands-on workshops into university health courses to enhance practical knowledge of robots in the clinical context.