Factors influencing beliefs for adoption of a learning analytics tool: An empirical study
Highlights
► Gives empirically validated models of the factors influencing tool adoption beliefs.
► Those engaged in teaching are better placed to internalize perceived tool utilities.
► Users appreciate tools that provide some hints on how to improve online courses.
► No significant relationship between ease-of-use belief and intention is observed.
Introduction
Many well-acclaimed educational systems, such as Moodle or BlackBoard/WebCT, are available to educators for structuring their online courses, preparing learning content, and designing student activities according to their preferred teaching styles. Numerous online communication tools, such as Elluminate, are also available for educators to interact and collaborate on learning activities with students. However, when it comes to personalization of the student learning process, the support offered by learning systems is rather limited (Dabbagh & Reo, 2011; Dawson et al., 2008; Essalmi, Ayed, Jemni, Kinshuk, & Graf, 2010; Willging, 2005).
Educators who adopt online learning systems are often required to constantly adapt and evolve their online courses to ensure high performance and learning efficiency of their students (Gasevic et al., 2007). Effective adaptation requires a comprehensive and insightful awareness of students' learning experiences, comprehension, and their interactions in the learning systems. By having access to learning analytics on students' completion status of lessons and quiz scores, educators should have a better sense of: students' ability to follow and comprehend the course content; the topics students found difficult; students' social interactions and knowledge contributions; and the like. Educators thus require a learning system that provides comprehensive and informative learning analytics on their online courses.
One way to attain comprehensibility is to semantically interlink all the major elements (i.e., data) of a learning process, including learning activities (e.g., reading and discussing), learning content, learning outcomes, and students (Jovanovic et al., 2007). Learning analytics derived from semantically interlinked data can provide a more comprehensive picture of students' learning experiences and outcomes, and thus support personalization of the student learning process. To be informative, learning analytics should allow an educator to quickly and easily gain an overall insight into a certain aspect of the learning process (this can be achieved, e.g., by effective visualization of users' interactions).
The current breed of online learning systems, however, offers limited support for such insights. Many learning analytics are superficial, including, for instance, simple statistics such as when and how many times students logged in, or low-level data about students' interactions with learning content (e.g., pages viewed). In addition, traditional course evaluations done at the end of a semester come too late to adapt courses for current students and often lack reliable feedback. As has been shown, most students enrolled in online courses do not complete standard course evaluation forms, and their response rate is far lower than that of students attending conventional classroom-based courses.
Recognizing the above stated issues associated with online learning systems, there has recently been significant research interest in various aspects of learning analytics (Long, Siemens, Conole, & Gasevic, 2011). While many approaches and tools for learning analytics have been proposed, there is limited empirical insight into the factors influencing potential adoption of this new technology. To address this research gap, we propose and empirically validate a Learning Analytics Acceptance Model (LAAM), reported in this paper, to start building a research understanding of how the analytics provided in a learning analytics tool affect educators' adoption beliefs. In particular, we investigated how the provided learning analytics inform the usage beliefs of educators, i.e., ease-of-use and usefulness perceptions, and how these beliefs are mutually associated with the intention to adopt the tool. In our study, we considered several factors.
First, pedagogical knowledge and information design skills of individuals can influence their perception of the usefulness of learning systems (Bratt, 2009a, 2009b). Furthermore, McFarland and Hamilton's (2006) refined technology acceptance model recognizes prior experience as one of the context factors that could potentially impact the perceived usefulness of a system. These studies suggested that participants who evaluate online learning systems could have varied perceptions of the systems' utility based on their own pedagogical background and experience. In our study, we asked educators to report their academic role and years of experience. Based on their responses, we examined how the evaluator's role and years of experience influenced their perception of the usefulness of the analytics provided in a learning analytics tool. We present our results in this paper.
Second, studies have shown that decisions to adopt technology-based systems are influenced by the evaluators' perceived utility of such systems (Bratt, Coulson, & Kostasshuk, 2009). The Technology Acceptance Model (TAM) theory (Davis, 1989) – whose roots lie in Fishbein and Ajzen's (1975) Theory of Reasoned Action – posits that perceived usage beliefs determine individual behavioral intentions to use a specific technology or service. Studies in related domains have shown that perceived usage belief (usefulness) is one of the strong drivers influencing users' intentions to adopt a software tool for practice (Davies, Green, Rosemann, Indulska, & Gallo, 2006; Recker, 2010). We found it relevant to examine how the usage belief perceptions of educators would influence their behavioral intention or commitment to adopt a learning analytics tool for their practice.
Third, the relation between the ease-of-use belief and the intention to adopt is less clear. Some studies suggest a direct association between ease-of-use and intention to adopt (Davis, 1989; Gefen, Pavlou, Rose, & Warkentin, 2002). Others fail to report such an association (Warkentin, Gefen, Pavlou, & Rose, 2002). Venkatesh, Morris, Davis, and Davis (2003), however, suggest that perceived ease-of-use influences intention to adopt indirectly, through perceived usefulness. Faced with these conflicting findings, we examined how the users evaluating a learning analytics tool related their ease-of-use beliefs to their perceived usefulness of, and intention to adopt, the tool. In our survey, we measured the ease-of-use construct by asking questions about the tool's intuitiveness. Moreover, Malhotra and Galletta (1999) observed that the sustainability of one's perception or attitude is indicative of usage behaviors. In this study, we examined how an individual's use of the features of a tool influences perception, which can in turn be associated with later evaluations.
Finally, by leveraging the data we collected during this study, we aimed to identify the variables that could be acted upon to improve the pedagogical utility (Bratt, 2007) of a learning analytics tool in general, and of its GUI (Graphical User Interface) features in particular. We analyzed the associations between the survey items to identify the best predictors of the perceived utility and usability of a learning analytics tool.
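The kind of item-association analysis described above can be sketched with rank correlations, which suit ordinal Likert responses. The following is a minimal illustration in Python/NumPy; the item names and scores are hypothetical, not taken from the study:

```python
import numpy as np

def _ranks(v):
    """Average ranks (1-based), with tied values sharing their mean rank."""
    v = np.asarray(v)
    order = np.argsort(v, kind="stable")
    ranks = np.empty(len(v), dtype=float)
    sv = v[order]
    i = 0
    while i < len(v):
        j = i
        while j + 1 < len(v) and sv[j + 1] == sv[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2 + 1  # mean of the tied positions
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    return float(np.corrcoef(_ranks(x), _ranks(y))[0, 1])

# Hypothetical 5-point Likert responses (item names are illustrative only)
intention = [4, 3, 5, 2, 5, 5, 3, 4]
items = {
    "usefulness":  [4, 3, 5, 2, 4, 5, 3, 4],
    "ease_of_use": [3, 3, 4, 2, 5, 4, 3, 4],
}
for name, scores in items.items():
    print(f"{name}: rho = {spearman_rho(scores, intention):.2f}")
```

Items with the strongest rank correlation to the intention item would be candidate "best predictors" in this sense; a full analysis would also report significance levels.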
To create the LAAM model, to conduct an empirical study, and to investigate the impact of the abovementioned factors, we used LOCO-Analyst, a learning analytics tool, as the object of our study. LOCO-Analyst provides learning analytics on different objects of interest at varying levels of granularity (an overview of the tool is provided in the Appendix).
Section 2 provides an overview of the proposed LAAM model, including the definition of the research questions (which fed the development of our theoretical model). Section 3 covers the method, including our research design and context-specific details of the empirical study. The study results are presented in Section 4 using descriptive and inferential statistics; discussion of the results is also covered in that section. After presenting related work in Section 5, we conclude the paper in Section 6.
Section snippets
Learning analytics acceptance model – LAAM
Following Davis (1989), we can understand perceived usefulness as the degree to which an educator believes that using a specific online learning system will increase his/her task performance. Ease-of-use is the degree to which an educator expects the use of the learning system to be free of effort. We built a LAAM research model to investigate our research questions and hypotheses. Fig. 1 shows a high-level view of the model. This model is intended to provide a visual conceptualization of the
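The TAM-style paths underlying such a model (ease-of-use → usefulness → intention, plus a possible direct ease-of-use → intention path) can be illustrated with simple mediation-style regressions. The sketch below uses synthetic standardized construct scores; it shows the analysis idea only and is not the study's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic standardized construct scores (names follow TAM; data is made up)
n = 200
eou = rng.normal(size=n)                        # perceived ease-of-use
pu = 0.6 * eou + rng.normal(scale=0.8, size=n)  # perceived usefulness
intention = 0.7 * pu + 0.05 * eou + rng.normal(scale=0.7, size=n)

def ols_coefs(X, y):
    """Least-squares slopes (with intercept) for y ~ X."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

# Path a: EOU -> PU; paths b and c': PU and EOU -> intention, jointly
a = ols_coefs(eou, pu)[0]
b, c_direct = ols_coefs(np.column_stack([pu, eou]), intention)
print(f"indirect effect a*b = {a * b:.2f}, direct effect c' = {c_direct:.2f}")
```

With data generated this way, the indirect effect (a·b) dominates while the direct ease-of-use path (c') stays near zero, mirroring the indirect-influence pattern suggested by Venkatesh et al. (2003).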
Design
The purpose of this study was to evaluate the proposed LAAM by using it for evaluation of a concrete learning analytics tool. In our case, we decided to use LOCO-Analyst (see Appendix for details), which offers the following types of learning analytics: single lesson analytics; composite lesson analytics; module analytics; quiz analytics; social interaction analytics; lesson interaction analytics; and topic comprehension analytics (see Table A1 in the Appendix). The participants' usage belief
Results and discussion
In this section, we present results of our statistical analysis and their interpretation. The findings reported here are based on the analysis of 22 usable responses collected from the respondents. We used the JMP tool to perform all our statistical tests. The threshold of p < 0.05 was chosen to designate the statistically significant level. Cronbach's alpha coefficient for internal consistency reliability of the survey items was α = 0.90, which is rated as good according to George and
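An internal-consistency figure of the kind reported above can be reproduced with a short computation. Below is a minimal sketch in Python/NumPy, using hypothetical 5-point Likert responses (the score matrix is illustrative, not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical Likert responses (rows = respondents, columns = items)
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
])
print(round(cronbach_alpha(scores), 2))  # → 0.91
```

Values of alpha around 0.9 are conventionally rated as good, consistent with the reliability reported for the survey items.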
Empirical studies of educator-directed learning tools
Research efforts oriented towards educators and the learning analytics they require are rather scarce, and there are even fewer reports on comprehensive evaluation studies aimed at assessing the developed learning analytics tools. In particular, we have found only two papers reporting on the methodology and the results of such evaluation studies. One paper by Mazza and Dimitrova (2007) reports on the evaluation of the CourseViz tool, whereas the other, by Kosba, Dimitrova, and Boyle (2007)
Conclusion
In this empirical study we proposed and validated a Learning Analytics Acceptance Model for adopting a learning analytics tool, based on educators' perceived usage beliefs about the learning analytics provided in a tool. To our knowledge, there have been no previous attempts to build and empirically validate theoretical models of the factors influencing the beliefs for adoption of learning analytics tools. We hope that the model contributed and empirically studied (Fig. 2, Fig. 3
References (50)
- et al. (2012, January). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education.
- et al. (2006). How do practitioners use conceptual modeling in practice? Data & Knowledge Engineering.
- et al. (2010). A fully personalization strategy of E-learning scenarios. Computers in Human Behavior.
- et al. (2010). Mining LMS data to develop an "early warning system" for educators: a proof of concept. Computers & Education.
- et al. (2007). CourseVis: a graphical student monitoring tool for supporting instructors in web-based distance courses. International Journal of Human-Computer Studies.
- et al. (2006). Adding contextual specificity to the technology acceptance model. Computers in Human Behavior.
- et al. (1999). Review and process effects of spontaneous note-taking on text comprehension. Contemporary Educational Psychology.
- (2007). The personal learning environments – the future of eLearning? eLearning Papers.
- et al. (1981). What is learned in note taking? Journal of Educational Psychology.
- (2003). Analysing quantitative data.
- A framework for assessing the pedagogical utility of learning management systems.
- Development of an instrument to assess pedagogical utility in e-Learning systems.
- Note taking and passage style. Journal of Educational Psychology.
- Measuring vocational preferences: ranking versus categorical rating procedures. Career Education Quarterly.
- Ten common misunderstandings, misconceptions, persistent myths and urban legends about Likert scales and Likert response formats and their antidotes. Journal of the Social Sciences.
- Resolving the 50-year debate around using and misusing Likert scales. Medical Education.
- Empirical evaluation of user models and user-adapted systems. User Modeling and User-Adapted Interaction.
- Impact of Web 2.0 on higher education.
- Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly.
- Statistical analysis: Quick reference guidebook.
- Belief, attitude, intention, and behavior: An introduction to theory and research.
- Ontology-based annotation of learning object content. Interactive Learning Environments.