Transaction-level learning analytics in online authentic assessments

Abstract

This paper presents a case for using transaction-level data when analyzing automated online assessment results to identify knowledge gaps and misconceptions for individual students. Transaction-level data, which record every step a student takes to complete an assessment item, are preferable to traditional assessment data that capture only the final answer, because they allow the system to detect persistent misconceptions. In this study we collected transaction-level data from 996 students enrolled in an online introductory spreadsheet class. Each student’s final answer and step-by-step attempts were coded for misconceptions or knowledge gaps regarding the use of absolute references across four assessment occasions. Overall, the level of error revealed in the step-by-step processes was significantly higher than in the final submitted answers. Further analysis suggests that students’ misconceptions most often involve non-critical errors. The data also suggest that misconceptions identified at the transaction level persist over time.
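The abstract’s central mechanism, coding every formula attempt rather than only the submitted answer, can be illustrated with a small sketch. The Python example below is not the authors’ instrument; the attempt log, the REQUIRED_ABSOLUTE set, and all names are hypothetical stand-ins. It codes a sequence of spreadsheet formula attempts for the absolute-reference misconception the study targets:

```python
import re

# Hypothetical transaction log for one assessment item: every formula the
# student typed, in order; the last entry is the submitted answer.
attempts = [
    "=B2*C1",    # relative reference to the rate cell C1 (misconception)
    "=B2*C$1",   # row anchored but column still relative (partial fix)
    "=B2*$C$1",  # fully absolute reference (correct)
]

# Assumed rubric: cells that must be referenced absolutely for the
# formula to fill down correctly.
REQUIRED_ABSOLUTE = {"C1"}

# Matches cell references such as A1, $A1, A$1, $A$1.
REF_PATTERN = re.compile(r"(\$?)([A-Z]+)(\$?)([0-9]+)")

def code_attempt(formula):
    """Return misconception codes found in a single formula attempt."""
    codes = []
    for col_anchor, col, row_anchor, row in REF_PATTERN.findall(formula):
        cell = col + row
        # Flag any required-absolute cell that is not fully anchored.
        if cell in REQUIRED_ABSOLUTE and not (col_anchor and row_anchor):
            codes.append("relative-ref:" + cell)
    return codes

# Transaction-level view: code every attempt, not just the final one.
for step, formula in enumerate(attempts, start=1):
    print(step, formula, code_attempt(formula) or "ok")
```

Coding only the final attempt here would report no error, while the step-by-step log exposes the earlier relative-reference misconception; that gap between final-answer and transaction-level analysis is the one the paper examines.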

Author information

Corresponding author

Correspondence to Rob Nyland.

About this article

Cite this article

Nyland, R., Davies, R.S., Chapman, J. et al. Transaction-level learning analytics in online authentic assessments. J Comput High Educ 29, 201–217 (2017). https://doi.org/10.1007/s12528-016-9122-0
