
Employing Authentic Analytics for More Authentic Tasks

Chapter in Big Data in Education: Pedagogy and Research

Part of the book series: Policy Implications of Research in Education (PIRE, volume 13)


Abstract

Although teaching systems and other infrastructure collect large amounts of information that can give insights into the learning behaviour of students, much of this data tends to concern secondary aspects – for example, when students accessed resources or how they performed in inauthentic tasks such as adaptive quizzes. In contrast, when students are given a design task, a higher-order thinking task, or a long-form writing task, they typically do their thinking and research outside the learning environment, and only the submitted product is available for analysis. In this chapter, we examine the opportunities for student-facing learning analytics in authentic tasks using authentic tools. By employing professional tools, we can design environments that allow students to work on realistic open-ended problems while gathering data on the strategies and practices they use in the creation process. In some fields, such as software engineering, professional and open-source projects already gather this sort of data, and the same tools can collect data from students, letting us explore whether students adopt strategies that experts find successful. We see this as the goal of developing cognitive apprenticeships, supported by smart technology, that use more authentic environments. We suggest this trend is the coming together of three strands of development in education: rich learning environments, learning analytics, and authentic tasks.
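
As a purely illustrative sketch (not code from the chapter), the Python snippet below shows one way this kind of process data might be harvested from an authentic tool. It assumes a local git repository for a student team project and uses `git log` to summarise who committed and when, a crude proxy for the working strategies the chapter argues are worth analysing.

```python
# Illustrative sketch only: summarise when each team member committed to a
# shared git repository, as a simple proxy for working patterns on an
# authentic software-engineering task. Assumes git is installed and that
# repo_path points at a local clone.
import subprocess
from collections import Counter, defaultdict
from datetime import datetime


def commit_times(repo_path):
    """Map each commit author to the timestamps of their commits."""
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:%an|%aI"],
        capture_output=True, text=True, check=True,
    ).stdout
    times = defaultdict(list)
    for line in log.splitlines():
        author, iso_date = line.split("|", 1)  # %an = author name, %aI = ISO 8601 date
        times[author].append(datetime.fromisoformat(iso_date))
    return times


def commits_per_hour(times):
    """Count commits by hour of day, e.g. to spot last-minute crunch work."""
    return Counter(t.hour for stamps in times.values() for t in stamps)


if __name__ == "__main__":
    times = commit_times(".")  # "." assumes the current directory is the repository
    print({author: len(stamps) for author, stamps in times.items()})
    print(sorted(commits_per_hour(times).items()))
```

Calling plain git through subprocess keeps the sketch dependency-free; a library such as GitPython could equally be used, and in practice such traces would likely be combined with other signals the same professional tools already produce, such as code review comments or continuous-integration results.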



Author information


Corresponding author

Correspondence to William Billingsley.


Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Billingsley, W., Fletcher, P. (2021). Employing Authentic Analytics for More Authentic Tasks. In: Prodromou, T. (eds) Big Data in Education: Pedagogy and Research. Policy Implications of Research in Education, vol 13. Springer, Cham. https://doi.org/10.1007/978-3-030-76841-6_7


  • DOI: https://doi.org/10.1007/978-3-030-76841-6_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-76840-9

  • Online ISBN: 978-3-030-76841-6

  • eBook Packages: Education, Education (R0)
