Review article
Augmented reality applications for K-12 education: A systematic review from the usability and user experience perspective

https://doi.org/10.1016/j.ijcci.2021.100321

Abstract

In the past two decades, we have witnessed soaring efforts to apply Augmented Reality (AR) technology in education. Several systematic literature reviews (SLRs) have been conducted to study AR educational applications (AREAs) and the associated methodologies, primarily from the pedagogical rather than the human–computer interaction (HCI) perspective. These reviews vary in goal, scale, scope, technique, outcome and quality. To bridge the gaps identified in these SLRs, ours has four objectives: to ground the analysis more deeply in the core concepts and methods of usability and user experience (UX); to study the learning effect and usability/UX of AREAs, and their relations, by learner age; to reflect on the prevailing SLR process and propose improvements; and to draw implications for the future development of AREAs. Our searches in four databases returned 714 papers, of which 42, together with 7 from three existing SLRs, were included in the final analysis. Several intriguing findings have been identified: (i) the insufficient grounding in usability/UX frameworks indicates that there seems to be a disconnection between the HCI and technology-enhanced learning communities; (ii) a lack of innovative AR-specific usability/UX evaluation methods and the continuing reliance on questionnaires may hamper the advancement of AREAs; (iii) learner age does not seem to be a significant factor in determining either the perceived usability and UX or the learning effect of AREAs; (iv) the limited number of studies conducted at home suggests a missed opportunity to mobilize parents to support children in deploying AREAs in different settings; (v) the number of AREAs for children with special needs remains disappointingly low; (vi) the threat of predatory journals to the quality of bibliometric sources amplifies the need for a robust approach to quality assessment in SLRs and for transparency of interim results. Implications of these issues for future research and practice on AREAs are drawn.

Introduction

Augmented Reality (AR) is a form of technology that superimposes 3D virtual objects or content onto a real-world environment to create a sense of mixed reality (Azuma, 1997, Milgram et al., 1995). In the recent decade, AR technology has become increasingly sophisticated, advancing from conventional fiducial markers and location-based GPS (e.g. Pokémon GO) to depth-sensing cameras (e.g. Google Glass, HoloLens) that create richer interaction experiences. These technological advances have stimulated research efforts in various sectors, especially education, to harness the power of AR to transform prevailing practices.
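
To make the marker-based pipeline mentioned above concrete for readers less familiar with AR tooling, the minimal sketch below (an illustration of ours, not code from any reviewed AREA) shows how a fiducial marker might be detected in a camera frame using OpenCV's ArUco module; the marker dictionary, camera index and API version are assumptions.

```python
# Illustrative sketch of marker-based AR tracking with OpenCV's ArUco module
# (assumes opencv-contrib-python >= 4.7, whose ArucoDetector API is used here).
import cv2

def detect_markers(frame):
    """Return corner coordinates and IDs of any 6x6 ArUco fiducial markers in a frame."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(frame)
    return corners, ids

capture = cv2.VideoCapture(0)  # default webcam; the device index is an assumption
ok, frame = capture.read()
if ok:
    corners, ids = detect_markers(frame)
    if ids is not None:
        # A full AREA would estimate the marker pose here and anchor 3D learning content to it.
        print("Detected marker IDs:", ids.ravel().tolist())
capture.release()
```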

In the ever-growing number of research studies exploring how AR applications could help realize specific educational goals, a plethora of design and evaluation methodologies has been employed. Understandably, many of these studies frame their methodological approaches from the pedagogical perspective, such as applying constructivist learning theories to develop AR-based learning materials (e.g. Chang et al., 2016, Moreno-Guerrero et al., 2020, Wojciechowski and Cellary, 2013) and employing the traditional pretest–posttest method to evaluate AR-induced learning effects (e.g. Fokides and Mastrokoukou, 2018, Lu et al., 2020, Winarni and Purwandari, 2019). Although the uptake of AR technology in education started only about two decades ago, several systematic literature reviews (SLRs) or surveys on AR educational applications (AREAs) have already been conducted, albeit with varied quality. In a nutshell, an SLR aims to identify relevant research studies on a specific topic and to analyze and synthesize constructs of interest systematically, thereby producing a broad as well as deep understanding of that topic and drawing implications for future research and practice (Siddaway, Wood, & Hedges, 2019).
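
To illustrate the pretest–posttest design mentioned above, the minimal sketch below (our own illustration with hypothetical scores, not data or code from any reviewed study) compares each learner's score before and after an AR-supported lesson using a paired t-test.

```python
# Hypothetical pretest-posttest evaluation of an AR-supported lesson (illustration only).
from scipy import stats

pretest = [52, 60, 47, 71, 55, 63, 58, 49]    # hypothetical scores before using the AREA
posttest = [68, 74, 59, 80, 61, 77, 70, 66]   # hypothetical scores after using the AREA

mean_gain = sum(post - pre for pre, post in zip(pretest, posttest)) / len(pretest)

# Paired t-test on the per-learner score changes.
t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"Mean learning gain: {mean_gain:.1f} points (t = {t_stat:.2f}, p = {p_value:.3f})")
```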

The existing SLRs on AREAs address primarily their educational impacts rather than their usability and user experience (UX), which are critical qualities for determining the acceptance and adoption of AR as new teaching and learning tools. As usability and UX are the main concepts of this work, it is necessary to define them upfront here (cf. Section 2.2). Usability is a core notion in the field of Human–Computer Interaction (HCI) with a widely accepted definition documented in the standard ISO 9241-210:2019, 3.13: “The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use”. Accordingly, an interactive system is usable when it supports its users in achieving their goals by completing related tasks with few or no errors, using optimal resources in terms of time and mental effort, and feeling satisfied and comfortable. Otherwise, the design of the system is flawed with usability problems that undermine user acceptance. The notion of UX emerged when the HCI community became aware of the limitations of the traditional usability paradigm. Moving beyond the utilitarian aspect of human–technology interactions, UX puts emphasis on user affect and sensation, and on the meaningfulness of such interactions in everyday life (Hassenzahl, 2005). In ISO 9241-210:2019, 3.15, UX is defined as “user’s perceptions and responses that result from the use and/or anticipated use of a system, product or service”. This broad definition seems to imply a subsumptive relation between usability and UX, though this is not explicitly stated in the standard. Aligning with the view of some but not all HCI professionals, we adopt the stance that usability is part of UX. Nonetheless, to accommodate the range of research studies, some addressing only the usability aspect of AREAs (e.g. user performance) and some covering the UX aspect as well (e.g. user emotion), we use both terms throughout this paper.
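
As a purely illustrative operationalization of the three ISO facets, the sketch below uses hypothetical session data to measure effectiveness as task completion rate, efficiency as mean time on task, and satisfaction as a System Usability Scale (SUS) score of the kind referenced later in this review; none of the numbers come from the reviewed studies.

```python
# Hypothetical operationalization of the ISO 9241-210 usability facets (illustration only).

sessions = [            # hypothetical AREA test sessions: (task completed?, seconds on task)
    (True, 95), (True, 120), (False, 180), (True, 88), (True, 132),
]
sus_answers = [4, 2, 5, 1, 4, 2, 4, 2, 5, 1]  # one learner's ten SUS item ratings (1-5 scale)

# Effectiveness: proportion of tasks completed successfully.
effectiveness = sum(done for done, _ in sessions) / len(sessions)

# Efficiency: mean time on task for the completed tasks.
completed_times = [seconds for done, seconds in sessions if done]
efficiency = sum(completed_times) / len(completed_times)

# Satisfaction: standard SUS scoring (odd items: rating - 1; even items: 5 - rating; sum * 2.5).
satisfaction = 2.5 * sum((r - 1) if i % 2 == 0 else (5 - r) for i, r in enumerate(sus_answers))

print(f"Completion rate {effectiveness:.0%}, mean time on task {efficiency:.0f}s, SUS {satisfaction:.1f}")
```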

A handful of reviews studying the usability/UX of AR-based applications in education and other domains are available. The survey conducted by Santos and five colleagues (Santos et al., 2013) covers AREA research studies published in 2002–2012, with discovering usability issues being one of its review foci. The SLR carried out by Dey and colleagues (Dey, Billinghurst, Lindeman, & Swan, 2018) covers related work published in 2005–2014, with education being one of the AR application domains. In their review of AREA publications in 2011–2015, Akçayır and Akçayır (2017) briefly mentioned usability as a factor undermining the positive learning effect of AR. However, none of the three studies analyzed the usability issues systematically. Furthermore, two usability/UX-focused reviews on non-education-specific AR applications are available. The work of Swan and Gabbard (2005) was probably the first endeavor of this kind. They argued for the need for user-based studies to advance the development and uptake of AR applications. Building upon the work of Swan and Gabbard (2005), Bai and Blackwell (2012) conducted an analytic review to investigate usability/UX studies in the context of AR research in domains other than education.

Overall, while the aforementioned reviews did provide some useful information on the usage of AREAs in general and on their design and evaluation issues in particular, several questions remain to be answered: Which usability/UX core concepts are used to inform the design and evaluation of AREAs? How are established usability/UX methodologies employed to design and evaluate AREAs? Have any novel usability/UX methods and tools been created to address AR-specific features? What are the relations between the usability/UX and the learning effects evaluated in research studies on AREAs? Methodologically, these and other questions along this line of inquiry can viably be explored with an SLR.

To explore the aforementioned questions, we conducted an SLR on AREAs designed for learners in K-12 education (i.e. from kindergarten up to secondary school). A key rationale for this inclusion criterion is that such end-users of AREAs are sensitive to usability/UX issues, which can undermine their acceptance of new educational technologies (Munsinger and Quarles, 2019, Sim et al., 2006). This is particularly relevant as many of them have yet to develop the skills to circumvent interaction issues that arise, which their older counterparts in tertiary education are better equipped to handle. Furthermore, we examine whether and how the perceived usability/UX and learning efficacy of AREAs vary with learner age.

Our SLR followed the well-recognized Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines (Moher, Liberati, Tetzlaff, Altman, & The PRISMA Group, 2009) and involved searches in four databases and in existing SLRs. The process of identification, screening and filtering resulted in a batch of 49 included papers (Section 3). When planning and implementing the SLR process, we identified some limitations, such as the lack of explicit guidelines for the quality assessment of the retrieved articles (Section 2.3) and the insufficient transparency of reporting intermediate results. We have introduced alternative approaches, namely employing publicly available citation counts and the h-index as complementary quality criteria, and using tree and Venn diagrams (i.e. the figures in Section 3) as supplementary reporting tools. Without claiming that these are perfect solutions for the limitations found, we aim to invite feedback from the wider research community on these attempts to improve the SLR process.
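
For readers unfamiliar with the h-index mentioned above: a set of publications has h-index h when h of them each have at least h citations. The short sketch below (our own illustration with hypothetical citation counts) shows one straightforward way to compute it.

```python
# Hypothetical h-index computation from a list of per-paper citation counts (illustration only).

def h_index(citations):
    """Largest h such that at least h papers each have at least h citations."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

venue_citations = [48, 33, 20, 15, 9, 7, 4, 3, 1, 0]  # hypothetical counts for one venue
print(h_index(venue_citations))  # -> 6 (six papers with at least 6 citations each)
```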

Overall, the main research goal of our SLR is to gain data-driven insights into design and evaluation of augmented reality educational tools used by schools. This goal informs six research questions (RQs):

  • RQ1.

    Are there any discernible patterns of target groups, learning subjects and settings in deploying AREAs?

  • RQ2.

    What is the trend over time in the hardware and software tools used for developing AREAs?

  • RQ3.

    Which usability/UX frameworks, concepts, methods and tools have been used for the design and evaluation of AREAs?

  • RQ4.

    What usability/UX problems of AREAs have been identified, and whether and how have they been addressed?

  • RQ5.

    What are the relations between usability/UX qualities and learning efficacy of AREAs?

  • RQ6.

    How are usability/UX qualities and learning effect of AREAs related to age groups?

Answers to these RQs are based on a qualitative synthesis of the studies presented in the 49 papers included in our SLR; it is not a meta-analysis, as it does not rely on statistical approaches. A caveat is that our work does not aim to prescribe a set of usability/UX frameworks and methodologies that research studies on AREAs should use. Instead, we use an inductive approach to identify which concepts, models, methods and tools have been applied, so as to gain an in-depth understanding of their potential as well as their limitations, thereby drawing implications for improvement. Overall, the contributions of our SLR are to:

  • ground the analysis more deeply in the core concepts of usability and UX;

  • analyze the relation between the learning effect of AREAs and their usability/UX;

  • examine how usability/UX issues of AREAs vary with age groups;

  • reflect on the prevailing SLR process and introduce alternative approaches for improvement;

  • draw relevant implications for future work on AREAs.

The rest of the paper is structured as follows. In Section 2, three strands of related work are reviewed. In Section 3, the SLR methodology, which comprises three main stages (identification, screening/filtering, and synthesis), is described in detail. In Section 4, the results are reported, with Section 4.1 focusing on basic attributes and Section 4.2 on contextual attributes pertinent to AREAs, and Section 4.3 on usability and UX. In Section 5, the six research questions are answered with respect to the insights gained from the SLR outcomes, and implications and limitations are discussed. The paper is concluded in Section 6.

Section snippets

Related work

Three strands of work are relevant to our realization of the SLR: uses of AR in education (Section 2.1); core concepts of usability and UX (Section 2.2); and how a systematic review differs from a scoping review (Section 2.3). While the relevance of providing the background for the first two strands is self-explanatory, reviewing the arguments for the unique characteristics of an SLR, especially the notion of quality assessment, is pertinent as it informs our methodological decisions.

Method

In the following we report the three stages of our SLR: identification, screening/filtering, and synthesis. While the two authors are the core contributors to all three stages, they were supported in Stages 1 and 2 by a research assistant and eight trained postgraduate students to carry out the laborious process of the SLR. The first and second author have about twenty and ten years of research experience in HCI methodologies, respectively, and both are actively involved in exploring

Results

Given the focus of this SLR, we mainly present the synthesis results based on the final batch of 49 papers on usability/UX that passed the quality assessment (i.e. the innermost circle of Fig. 2). The detailed descriptions of the three-stage process above are intended to ensure the transparency and replicability of our SLR. In the following, we first present results on the basic attributes of the papers, including year/source of publication and application domain (Section 4.1), followed by the

Discussion

In this section, we revisit the six research questions posed in Introduction (Section 1) by referring to our analysis and synthesis results presented above (Section 4) and to the comparable findings of the existing SLRs (Section 2.1), where appropriate. Implications are drawn for the future work on AREAs, especially from the usability and UX perspective.

Conclusion

The recent growth of research interest and effort in Augmented Reality, especially in the education sector, has inspired as well as motivated us to conduct a review of the published literature systematically. While clearly we are not the first (or the last) research group taking up this challenge, we aimed to bring specific contributions to this burgeoning area. In contrast to the existing SLRs, we endeavored to investigate the issues pertaining to usability and UX of AREAs by grounding the

Selection and participation

There were no participants involved.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

The publication has been supported by European Union’s Horizon 2020 research and innovation program under grant agreement No. 856533, project ARETE. We would like to express our thanks to anonymous reviewers of the earlier drafts of this paper; their constructive comments have helped improve its quality.

All 49 papers included in the SLR sorted by ID number:

  • Ex001

    Di Serio, Á., Ibáñez, M. B., & Kloos, C. D. (2013). Impact of an augmented reality system on students’ motivation for a visual art course. Computers & Education, 68, 586–596.

  • Ex002

    Sin, A. K., &

References (94)

  • Martín-Martín, A., et al. (2018). Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories. Journal of Informetrics.

  • Pajić, D. (2015). On the stability of citation-based journal rankings. Journal of Informetrics.

  • Sim, G., et al. (2006). All work and no play: Measuring fun, usability, and learning in software for children. Computers & Education.

  • Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learning and Instruction.

  • Wojciechowski, R., et al. (2013). Evaluation of learners’ attitude toward learning in ARIES augmented reality environments. Computers & Education.

  • Anderson, F., et al. (2014). Augmented reality improves myoelectric prosthesis training. International Journal on Disability and Human Development.

  • Arksey, H., et al. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology.

  • Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators & Virtual Environments.

  • Bacca, J., et al. (2014). Augmented reality trends in education: A systematic review of research and applications. Educational Technology & Society.

  • Bekker, T., & Antle, A. N. (2011). Developmentally situated design (DSD) making theoretical knowledge accessible to...

  • Bennett, J. L. (1979). The commercial impact of usability in interactive systems. Man-Computer Communication, Infotech State-of-the-Art.

  • Berryman, D. R. (2012). Augmented reality: A review. Medical Reference Services Quarterly.

  • Billinghurst, M., et al. (2015). A survey of augmented reality. Foundations and Trends in Human-Computer Interaction.

  • Brooke, J. (1986). System Usability Scale (SUS): A quick-and-dirty method of system evaluation user information.

  • Chang, R.-C., et al. (2016). Developing an interactive augmented reality system as a complement to plant education and comparing its effectiveness with video learning. Interactive Learning Environments.

  • Cheng, K. H., et al. (2013). Affordances of augmented reality in science learning: Suggestions for future research. Journal of Science Education and Technology.

  • Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience.

  • Da Gama, A., et al. Guidance and movement correction based on therapeutics movements for motor rehabilitation support systems.

  • Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly.

  • Dey, A., et al. (2018). A systematic review of 10 years of augmented reality usability studies: 2005 to 2014. Frontiers in Robotics and AI.

  • Diegmann, P., et al. (2015). Benefits of augmented reality in educational environments: A systematic literature review. Benefits.

  • Duc, N. M., et al. (2020). Predatory open access journals are indexed in reputable databases: A revisiting issue or an unsolved problem. Medical Archives.

  • Dumas, J. S., et al. A practical guide to usability testing.

  • Durrani, U., et al. Integration of virtual reality and augmented reality: Are they worth the effort in education?

  • Endsley, T. C., et al. Augmented reality design heuristics: Designing for dynamic interactions.

  • Fan, M., et al. (2020). Augmented reality for early language learning: A systematic review of augmented reality application design, instructional strategies, and evaluation outcomes. Journal of Educational Computing Research.

  • Fokides, E., et al. (2018). Results from a study for teaching human body systems to primary school students using tablets. Contemporary Educational Technology.

  • Gabbard, J., et al. Usability engineering: Domain analysis activities for augmented-reality systems.

  • Garside, R. (2014). Should we appraise the quality of qualitative research reports for systematic reviews, and if so, how? Innovation.

  • Gómez-López, P., Simarro, F. M., & Bonal, M. T. L. (2019). Analysing the UX scope through its definitions. In...

  • Gonzalez, S. L., et al. (2019). Do gross and fine motor skills differentially contribute to language outcomes? A systematic review. Frontiers in Psychology.

  • Gould, J. D., et al. (1985). Designing for usability: Key principles and what designers think. Communications of the ACM.

  • Grudin, J. (2017). From tool to partner: The evolution of human–computer interaction. Synthesis Lectures on Human-centered Interaction.

  • Hassenzahl, M. The thing and I: Understanding the relationship between user and product.

  • Hassenzahl, M., et al. (2006). User experience: A research agenda. Behaviour & Information Technology.

  • Hodhod, R., et al. Adaptive augmented reality serious game to foster problem solving skills.

  • Hornbæk, K., et al. (2017). Technology acceptance and user experience: A review of the experiential component in HCI. ACM Transactions on Computer-Human Interaction.