Computers & Education

Volume 62, March 2013, Pages 130-148

Factors influencing beliefs for adoption of a learning analytics tool: An empirical study

https://doi.org/10.1016/j.compedu.2012.10.023

Abstract

Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate a Learning Analytics Acceptance Model (LAAM) of the factors influencing the beliefs of educators concerning the adoption of a learning analytics tool. In particular, our model explains how usage beliefs (i.e., ease-of-use and usefulness perceptions) about the learning analytics of a tool are associated with the intention to adopt the tool. In our study, we considered several factors that could potentially affect the adoption beliefs: i) pedagogical knowledge and information design skills of educators; ii) educators' perceived utility of a learning analytics tool; and iii) educators' perceived ease-of-use of a learning analytics tool. Following the principles of the Technology Acceptance Model, the study was conducted with a sample of educators who experimented with the LOCO-Analyst tool. Our study also identified the specific analytics types that are the primary antecedents of perceived usefulness (concept comprehension and social interaction) and ease-of-use (interactive visualization).

Highlights

► Gives empirically validated models of the factors influencing tool adoption beliefs.
► Those engaged in teaching are better placed to internalize perceived tool utilities.
► Users appreciate tools that provide some hints on how to improve online courses.
► No significant relationship between ease-of-use belief and intention is observed.

Introduction

Many well-acclaimed educational systems, such as Moodle or BlackBoard/WebCT, are available to educators for structuring their online courses, preparing learning content, and designing student activities according to their preferred teaching styles. Numerous online communication tools, such as Elluminate, are also available for educators to interact and collaborate on learning activities with students. However, when it comes to personalization of the student learning process, the support offered by learning systems is rather limited (Dabbagh & Reo, 2011; Dawson et al., 2008; Essalmi, Ayed, Jemni, Kinshuk, & Graf, 2010; Willging, 2005).

Educators who adopt online learning systems must constantly adapt and evolve their online courses to assure high performance and learning efficiency of their students (Gašević et al., 2007). Effective adaptation requires a comprehensive and insightful awareness of students' learning experiences, their comprehension, and their interactions within the learning systems. With access to learning analytics on students' lesson completion status and quiz scores, educators should have a better sense of: students' ability to follow and comprehend the course content; the topics students found difficult; students' social interactions and knowledge contributions; and the like. Educators thus require a learning system that provides learning analytics on their online courses that are both comprehensive and informative.

One way to attain comprehensibility is to semantically interlink all the major elements (i.e., data) of a learning process, including learning activities (e.g., reading and discussing), learning content, learning outcomes, and students (Jovanovic et al., 2007). Learning analytics derived from semantically interlinked data can provide a more comprehensive picture of students' learning experiences and outcomes, and thus support personalization of the student learning process. To be informative, learning analytics should allow an educator to quickly and easily gain an overall insight into a certain aspect of the learning process (this can be achieved, e.g., by effective visualization of users' interactions).
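To make the idea of semantic interlinking concrete, the sketch below models the major elements of a learning process as linked records (a hypothetical illustration in Python; the actual LOCO framework is ontology-based, and these ad hoc classes and names are ours, not the tool's):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Content:
        uri: str      # identifier of a lesson or learning object
        topic: str    # domain topic the content covers

    @dataclass
    class Activity:
        """One learning event, e.g. reading, discussing, or taking a quiz."""
        kind: str                        # "reading", "discussing", "quiz", ...
        student: str                     # student identifier
        content: Content                 # the content the event touched
        outcome: Optional[float] = None  # e.g. quiz score on the same topic

    # Because every activity links a student to a content item (and, for
    # quizzes, to an outcome), analytics reduce to traversals over the links:
    log = [
        Activity("reading", "s1", Content("lesson/3", "RDF")),
        Activity("quiz", "s1", Content("lesson/3", "RDF"), outcome=0.4),
        Activity("quiz", "s2", Content("lesson/3", "RDF"), outcome=0.9),
    ]
    hard_topics = {a.content.topic for a in log
                   if a.kind == "quiz" and a.outcome is not None and a.outcome < 0.5}
    print(hard_topics)  # {'RDF'} -> topics at least one student struggled with

With such links in place, questions like "which topics did students find difficult?" become simple joins over the data rather than ad hoc log parsing.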

The current breed of online learning systems, however, offers limited support for such insights. Many learning analytics are superficial and include, for instance, simple statistics such as when and how many times students logged in, or low-level data about students' interactions with learning content (e.g., pages viewed). In addition, traditional course evaluations done at the end of a semester come too late for adapting courses for current students and often lack reliable feedback. As has been shown, most students enrolled in online courses do not complete standard course evaluation forms, and their response rate is far lower than that of students attending conventional classroom-based courses.

Recognizing the above issues associated with online learning systems, there has recently been significant research interest in various aspects of learning analytics (Long, Siemens, Conole, & Gasevic, 2011). While many approaches and tools for learning analytics have been proposed, there is limited empirical insight into the factors influencing potential adoption of this new technology. To address this research gap, we propose and empirically validate a Learning Analytics Acceptance Model (LAAM), reported in this paper, to start building a research understanding of how the analytics provided in a learning analytics tool affect educators' adoption beliefs. In particular, we investigated how the provided learning analytics inform the usage beliefs of educators, i.e., their ease-of-use and usefulness perceptions, and how these beliefs are mutually associated with the intention to adopt the tool. In our study, we considered several factors.

First, the pedagogical knowledge and information design skills of individuals can influence their perception of the usefulness of learning systems (Bratt, 2009a, 2009b). Furthermore, McFarland and Hamilton's (2006) refined technology acceptance model recognizes prior experience as one of the context factors that could potentially impact the perceived usefulness of a system. These studies suggest that participants who evaluate online learning systems could have varied perceptions of the systems' utility based on their own pedagogical background and experience. In our study, we asked educators to list their academic role and years of experience. Based on their responses, we examined how the evaluators' role and years of experience influenced their perception of the usefulness of the analytics provided in a learning analytics tool. We present our results in this paper.

Second, studies have shown that decisions to adopt technology-based systems are influenced by the evaluators' perceived utility of such systems (Bratt, Coulson, & Kostasshuk, 2009). The Technology Acceptance Model (TAM) theory (Davis, 1989) – whose roots lie in Fishbein and Ajzen's (1975) Theory of Reasoned Action – posits that perceived usage beliefs determine individual behavioral intentions to use a specific technology or service. Studies in related domains have shown that the usefulness belief is one of the strongest drivers of users' intentions to adopt a software tool for practice (Davies, Green, Rosemann, Indulska, & Gallo, 2006; Recker, 2010). We found it relevant to examine how the usage belief perceptions of educators would influence their behavioral intention, or commitment, to adopt a learning analytics tool for their practice.

Third, the relation between the ease-of-use belief and the intention to adopt is less clear. Some studies suggest a direct association between ease-of-use and intention to adopt (Davis, 1989; Gefen, Pavlou, Rose, & Warkentin, 2002), while others fail to find such an association (Warkentin, Gefen, Pavlou, & Rose, 2002). Venkatesh, Morris, Davis, and Davis (2003), however, suggest that perceived ease-of-use influences intention to adopt indirectly, through perceived usefulness. Faced with these conflicting findings, we examined how the users evaluating a learning analytics tool related their ease-of-use beliefs to their perceived usefulness and intention to adopt the tool. In our survey, we measured the ease-of-use construct by asking questions about the tool's intuitiveness. Moreover, Malhotra and Galletta (1999) observed that the sustainability of one's perception or attitude is indicative of usage behaviors. In this study, we examined how an individual's use of the features of a tool influences perception, which can in turn be associated with later evaluations.
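Schematically, the competing hypotheses from these studies can be summarized as TAM-style linear relations (a notational sketch of our own; the coefficient symbols below are illustrative and not the specification used in the original analysis):

    BI = β0 + β1·PU + β2·PEOU + ε
    PU = γ0 + γ1·PEOU + γ2·EXP + ε′

where BI denotes the behavioral intention to adopt, PU perceived usefulness, PEOU perceived ease-of-use, and EXP the educator's pedagogical background and experience. The disagreement in the literature amounts to whether the direct effect β2 is non-zero, or whether ease-of-use operates only indirectly through γ1.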

Finally, by leveraging the data we collected during this study, we aimed to identify the variables that could be acted upon to improve the pedagogical utility (Bratt, 2007) of a learning analytics tool in general, and of its GUI (Graphical User Interface) features in particular. We analyzed the associations between the survey items to identify the best predictors of the perceived utility and usability of a learning analytics tool.

To create the LAAM model, conduct the empirical study, and investigate the impact of the abovementioned factors, we used LOCO-Analyst, a learning analytics tool, as the object of our study. LOCO-Analyst provides learning analytics on different objects of interest at varying levels of granularity (an overview of the tool is provided in the Appendix).

Section 2 provides an overview of the proposed LAAM model, including the definition of the research questions (which fed the development of our theoretical model). Section 3 covers the method, including our research design and the context-specific details of the empirical study. The study results are presented in Section 4 using descriptive and inferential statistics; discussion of the results is also covered in that section. After presenting the related work in Section 5, we conclude the paper in Section 6.

Section snippets

Learning analytics acceptance model – LAAM

Following Davis (1989), we can understand perceived usefulness as the degree to which an educator believes that using a specific online learning system will increase his/her task performance. Ease-of-use is the degree to which an educator expects the use of the learning system to be free of effort. We built a LAAM research model to investigate our research questions and hypotheses. Fig. 1 shows a high-level view of the model. This model is intended to provide a visual conceptualization of the

Design

The purpose of this study was to evaluate the proposed LAAM by using it for evaluation of a concrete learning analytics tool. In our case, we decided to use LOCO-Analyst (see Appendix for details), which offers the following types of learning analytics: single lesson analytics; composite lesson analytics; module analytics; quiz analytics; social interaction analytics; lesson interaction analytics; and topic comprehension analytics (see Table A1 in the Appendix). The participants' usage belief

Results and discussion

In this section, we present the results of our statistical analysis and their interpretation. The findings reported here are based on the analysis of 22 usable responses collected from the respondents. We used the JMP tool to perform all our statistical tests. A threshold of p < 0.05 was chosen to designate statistical significance. Cronbach's alpha coefficient for the internal consistency reliability of the survey items was α = 0.90, which is rated as good according to George and
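For reference, Cronbach's alpha over an n-respondents-by-k-items matrix of Likert responses can be computed as in the following minimal sketch (illustrative only; the analysis above was performed in JMP, and the data below are fabricated):

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        k = items.shape[1]                          # number of survey items
        item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
        total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 5-point Likert responses: 6 respondents x 4 items
    responses = np.array([[4, 5, 4, 4],
                          [3, 3, 4, 3],
                          [5, 5, 5, 4],
                          [2, 3, 2, 3],
                          [4, 4, 5, 5],
                          [3, 4, 3, 3]])
    print(f"alpha = {cronbach_alpha(responses):.2f}")

By convention, α values around 0.9 indicate high internal consistency among survey items, which is the interpretation applied to the α = 0.90 reported above.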

Empirical studies of educator-directed learning tools

Research efforts oriented towards educators and the learning analytics they require are rather scarce, and there are even fewer reports on comprehensive evaluation studies aimed at assessing the developed learning analytics tools. In particular, we have found only two papers reporting on the methodology and the results of such evaluation studies. One paper by Mazza and Dimitrova (2007) reports on the evaluation of the CourseVis tool, whereas the other, by Kosba, Dimitrova, and Boyle (2007)

Conclusion

In this empirical study, we proposed and validated a Learning Analytics Acceptance Model for adopting a learning analytics tool, based on educators' perceived usage beliefs about the learning analytics provided in a tool. To our knowledge, there have been no previous attempts to build and empirically validate theoretical models of the factors influencing the beliefs for adoption of learning analytics tools. We hope that the model contributed and empirically studied (Fig. 2, Fig. 3

References (50)

  • Bratt, S. (2009a). A framework for assessing the pedagogical utility of learning management systems. World Conference...
  • Bratt, S. (2009b). Development of an instrument to assess pedagogical utility in e-Learning systems.
  • Bratt, S., Coulson, I., & Kostasshuk, S. (2009). Utilizing a learning management system in a blended learning design to...
  • Bretzing, B. B., et al. (1981). Note taking and passage style. Journal of Educational Psychology.
  • Carifio, J. (1978). Measuring vocational preferences: ranking versus categorical rating procedures. Career Education Quarterly.
  • Carifio, J., et al. (2007). Ten common misunderstandings, misconceptions, persistent myths and urban legends about Likert scales and Likert response formats and their antidotes. Journal of the Social Sciences.
  • Carifio, J., et al. (2008). Resolving the 50-year debate around using and misusing Likert scales. Medical Education.
  • Chin, D. N. (2001). Empirical evaluation of user models and user-adapted systems. User Modeling and User-Adapted Interaction.
  • Dabbagh, N., et al. Impact of Web 2.0 on higher education.
  • Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly.
  • Dawson, S., McWilliam, E., & Tan, J. (2008). Teaching Smarter: how mining ICT data can inform and improve learning and...
  • Elliott, C. A., et al. (2007). Statistical analysis: Quick reference guidebook.
  • Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior: An introduction to theory and research.
  • Gašević, D., et al. (2007). Ontology-based annotation of learning object content. Interactive Learning Environments.