Computers in Human Behavior

Volume 78, January 2018, Pages 408-420

Student profiling in a dispositional learning analytics application using formative assessment

https://doi.org/10.1016/j.chb.2017.08.010

Highlights

  • Formative assessment data have high predictive power in generating learning feedback.

  • Learning disposition data are most actionable: triggering educational interventions.

  • Dispositional LA is instrumental in chaining dispositions, traces, performance.

  • Student profiling based on traces allows characterization in terms of dispositions.

Abstract

How learning disposition data can help us translate learning feedback from a learning analytics application into actionable learning interventions is the main focus of this empirical study. It extends previous work (Tempelaar, Rienties, & Giesbers, 2015), where the focus was on deriving timely prediction models in a data-rich context encompassing trace data from learning management systems, formative assessment data, e-tutorial trace data, and learning dispositions. In the same educational context, the current study investigates how the application of cluster analysis based on e-tutorial trace data allows student profiling into different at-risk groups, and how these at-risk groups can be characterized with the help of learning disposition data. It is our conjecture that establishing a chain of antecedent-consequence relationships, starting from learning dispositions, through student activity in e-tutorials and formative assessment performance, to course performance, adds a crucial dimension to current learning analytics studies: that of profiling students with descriptors that easily lend themselves to the design of educational interventions.
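To make the profiling step concrete, the following minimal Python sketch clusters students on standardized e-tutorial trace features with k-means. The feature names (attempts, hours online, mastery share), the synthetic data, and the choice of four clusters are illustrative assumptions; the paper does not publish its clustering procedure.

```python
# Illustrative sketch only: feature names and data are hypothetical
# stand-ins for the e-tutorial trace variables described in the study.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# Synthetic trace data for 240 students: exercise attempts, hours spent
# in the e-tutorial, and share of exercises mastered.
traces = np.column_stack([
    rng.poisson(30, 240),          # attempts
    rng.gamma(2.0, 1.5, 240),      # hours online
    rng.beta(5, 2, 240),           # mastery share
])

# Standardize so no single trace variable dominates the distance metric.
X = StandardScaler().fit_transform(traces)

# Partition students into k profiles; in practice k would be chosen by
# inspecting cluster stability or silhouette scores.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(X)
profiles = kmeans.labels_

for k in range(4):
    members = traces[profiles == k]
    print(f"profile {k}: n={len(members)}, "
          f"mean attempts={members[:, 0].mean():.1f}")
```

The resulting cluster memberships could then be cross-tabulated against disposition scores to characterize each at-risk group, as the abstract describes.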

Introduction

The challenge of designing “an optimal sequence of data collection and economic response times …” that includes “the minimum requirements for making valid predictions and creating meaningful interventions” (Ifenthaler, 2015) is the main topic of this empirical contribution to dispositional learning analytics. Learning analytics (LA) is defined as “the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Buckingham Shum and Ferguson, 2012, Gasevic et al., 2016, Siemens, 2013). In the early stages of LA, many scholars focused on building predictive models based on data extracted from institutional student information systems (SIS) and from digital platforms that organize and facilitate learning, such as learning management systems and e-tutorials (together: LMS). While these studies provide important markers of the potential of LA in education, their findings were largely limited to the descriptive functions of LA, which are mostly based on demographics, grades, and trace data. Given the rigidity of SIS and LMS data, educators may encounter difficulties in designing pedagogically informed interventions (Conde and Hernández-García, 2015, Tobarra et al., 2014, Xing et al., 2015).

To overcome this shortcoming, Buckingham Shum and Crick (2012) proposed a Dispositional Learning Analytics (DLA) infrastructure that combines learning data (i.e., those generated in learning activities through the LMS) with learner data (e.g., student dispositions, values, and attitudes measured through self-reported surveys). Learning dispositions represent individual difference characteristics that impact all learning processes and include affective, behavioral and cognitive facets (Rienties, Cross, & Zdrahal, 2017). Students' preferred learning approaches are examples of such dispositions of both cognitive and behavioral type; in research on their role in learning, they are often simply labeled as ‘self-report data’ (see e.g. Buckingham Shum and Ferguson, 2012, Gašević et al., 2017). Different from LA research, stakeholders of DLA applications are typically restricted to students and teachers/tutors, as these applications can be positioned at both the meso- and micro-level (Ifenthaler, 2015), rather than the mega- or macro-level. Our study is a follow-up of previous research by the authors on the application of LA in a ‘data-rich context’ (Tempelaar, Rienties, & Giesbers, 2015). The availability of formative assessment data constitutes a crucial aspect of that data richness, together with trace data of students practicing in e-tutorial systems in order to be optimally prepared for these formative assessments, and for later summative assessments. These data of cognitive type were complemented by learning disposition data to cover all “affective, behavioral and cognitive facets of the ABC framework of student learning” (Rienties et al., 2017).
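The core of the DLA idea, joining learning data with learner data at the student level, can be pictured with a short sketch; the column names, scales, and student_id key below are illustrative assumptions, not taken from the study's data.

```python
# Hypothetical sketch of joining learning data (e-tutorial traces) with
# learner data (self-reported disposition scales), the unit of analysis
# in a dispositional learning analytics application.
import pandas as pd

traces = pd.DataFrame({
    "student_id": [1, 2, 3],
    "attempts": [42, 15, 60],          # e-tutorial activity (learning data)
    "hours_online": [6.5, 2.0, 9.1],
})
dispositions = pd.DataFrame({
    "student_id": [1, 2, 3],
    "deep_approach": [4.2, 2.8, 3.9],  # survey scales (learner data)
    "test_anxiety": [1.5, 3.6, 2.1],
})

# One record per student combining both data types.
dla = traces.merge(dispositions, on="student_id", how="inner")
print(dla)
```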

Our previous research indicated a delicate balance between the timeliness and the predictive power of the several data sources in a data-rich context. Formative assessment data are typically the most informative, but the least timely. Given that formative assessment data are not available until several weeks into a course, trace data from e-tutorial systems are a good second best. However, the use of e-tutorial trace data is ill-advised at the very start of a course, when students' practicing activities have not yet settled into stable patterns. At that stage, learning disposition data are an informative data source alongside trace data in predicting student performance (Tempelaar et al., 2015).
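This trade-off can be made concrete by comparing prediction models restricted to the data available at successive points in the course. The weeks, feature groupings, and simulated effect sizes in the sketch below are assumptions for illustration, not the study's estimates.

```python
# Hedged illustration of the timing/predictive-power trade-off: fit the
# same model on the feature sets available at successive course weeks.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400
dispositions = rng.normal(size=(n, 3))   # surveys, available in week 1
traces = rng.normal(size=(n, 4))         # e-tutorial traces, stable later
formative = rng.normal(size=(n, 2))      # quiz scores, mid-course
performance = (dispositions @ [0.2, 0.1, 0.1]
               + traces @ [0.3, 0.2, 0.1, 0.1]
               + formative @ [0.5, 0.4]
               + rng.normal(size=n))

feature_sets = {
    "week 1: dispositions only": dispositions,
    "later: + e-tutorial traces": np.hstack([dispositions, traces]),
    "mid-course: + formative assessment":
        np.hstack([dispositions, traces, formative]),
}
for label, X in feature_sets.items():
    r2 = cross_val_score(LinearRegression(), X, performance,
                         cv=5, scoring="r2").mean()
    print(f"{label}: mean CV R^2 = {r2:.2f}")
```

On such simulated data, explained variance rises as later data sources are added, mirroring the reported tension between timeliness and predictive power.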

This follow-up study focuses on this very early stage of generating learning feedback at the start of a course, feedback that is “personalised, dynamic and timely” (Ifenthaler, 2015). The requirement that learning feedback be timely implies a crucial role for learning disposition data. The requirement that learning feedback also be actionable is likewise strongly linked to the availability of dispositions: learning interventions such as academic counselling are often based on the same social-cognitive frameworks as the instruments used to measure learning dispositions (such as improving one's learning style, or changing maladaptive into adaptive approaches to learning in the case of setbacks) (Tempelaar, Rienties, & Nguyen, 2017a).

Section snippets

Formative testing and feedback

The classic function of testing is that of summative assessment, or assessment of learning: students demonstrate their mastery of a particular subject to their teacher after completing the learning process. Formative assessment, or assessment for learning, takes place during learning rather than after it, and has an entirely different function: to provide ongoing feedback both to students, to improve their learning, and to teachers, to improve their teaching (Spector et al., 2016). Thus, beyond a …

Context of the empirical study

This study takes place in a large-scale introductory mathematics and statistics course for first-year undergraduate students in a business and economics program in the Netherlands. The educational system is best described as ‘blended’ or ‘hybrid’. The main component is face-to-face: Problem-Based Learning (PBL), in small groups (14 students), coached by a content-expert tutor (see Non & Tempelaar, 2016 and Williams et al., 2016 for further information on PBL and the course design).

Results

In this section, we will demonstrate the existence of the chain of three antecedent-consequence links: from learning dispositions to traces in learning systems; from these traces to the outcomes of formative assessment; and from the outcomes of formative assessment to course performance. Demonstrating the last two of these links is a replication of our Tempelaar et al. (2015) study, with a different class year of students and a different learning tool. In that study, we derived that the …
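The chain of links can be pictured with a hedged sketch that fits each antecedent-consequence link as a separate regression on simulated data. The variable names and effect sizes are invented for illustration and do not reproduce the study's estimates.

```python
# Sketch of the chain: dispositions -> e-tutorial traces -> formative
# assessment -> course performance, fit link by link on simulated data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
disposition = rng.normal(size=n)                 # e.g., adaptive approach
traces = 0.5 * disposition + rng.normal(size=n)  # e-tutorial activity
formative = 0.6 * traces + rng.normal(size=n)    # quiz outcomes
course = 0.7 * formative + rng.normal(size=n)    # exam performance

# Fit each link in the chain separately, mirroring the stepwise argument.
for name, x, y in [("dispositions -> traces", disposition, traces),
                   ("traces -> formative", traces, formative),
                   ("formative -> course", formative, course)]:
    model = LinearRegression().fit(x.reshape(-1, 1), y)
    print(f"{name}: R^2 = {model.score(x.reshape(-1, 1), y):.2f}")
```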

Discussion

The first outcome section confirms the results of previous research on the role of formative assessment in learning and in LA applications. Formative assessment outcomes constituted crucial feedback to learners about where they stand in their learning process (Spector et al., 2016), and were the most important predictors in LA-based prediction equations (Tempelaar et al., 2015). Next, formative assessment outcomes were well explained by trace variables of student activity in e-tutorials. In the …

Limitations and conclusions

The finding that self-reported disposition data are an important data source in this LA application does not imply that these data are true, unbiased accounts of dispositions that are not directly observable. The scientific debate on whether self-reports or trace data better approximate true levels of learning dispositions (Gašević et al., 2017) is not touched upon in this paper. The only criterion we have taken into consideration is that of predictive power, rather than …

Acknowledgements

The project reported here has been supported and co-financed by SURF-Foundation (20150707-5-001-DARI-MIHO) as part of the Learning Analytics Stimulus and the Testing and Test-Driven Learning programs.

References (49)

  • D.T. Tempelaar et al., In search for the most informative data for feedback generation: Learning analytics in a data-rich context, Computers in Human Behavior (2015)
  • L. Tobarra et al., Analyzing the students' behavior and relevant topics in virtual learning communities, Computers in Human Behavior (2014)
  • A. Wigfield et al., Expectancy-value theory of achievement motivation, Contemporary Educational Psychology (2000)
  • W. Xing et al., Participation-based student final performance prediction model through interpretable Genetic Programming: Integrating learning analytics, educational data mining and theory, Computers in Human Behavior (2015)
  • M. Zhou et al., Modeling academic achievement by self-reported versus traced goal orientation, Learning and Instruction (2012)
  • S. Buckingham Shum et al., Learning dispositions and transferable competencies: Pedagogy, modelling and learning analytics
  • S. Buckingham Shum et al., Social learning analytics, Journal of Educational Technology & Society (2012)
  • F. Coffield et al., Learning styles and pedagogy in post-16 learning: A systematic and critical review (2004)
  • R. Deakin Crick et al., Learner dispositions, self-theories and student engagement, British Journal of Educational Studies (2014)
  • A.J. Elliot et al., On the measurement of achievement goals: Critique, illustration, and application, Journal of Educational Psychology (2008)
  • A. Elliot et al., Potential-based achievement goals, British Journal of Educational Psychology (2015)
  • A. Elliot et al., A 3 × 2 achievement goal model, Journal of Educational Psychology (2011)
  • R. Ferguson et al., Research evidence on the use of learning analytics: Implications for education policy
  • D. Gašević et al., Detecting learning strategies with analytics: Links with self-reported measures and academic performance, Journal of Learning Analytics (2017)