DOI: 10.1145/3573051.3596180
Short Paper · Open Access

Towards Automated Code Assessment with OpenJupyter in MOOCs

Published: 20 July 2023

ABSTRACT

The popularity of Massive Open Online Courses (MOOCs) as a means of delivering education to large numbers of students has grown steadily over the last decade, and as technology improves, more educational content is becoming readily available to the public. JupyterLab, an open-source web-based interactive development environment (IDE), is also becoming increasingly popular in education; however, it has so far been used primarily in small classroom settings. JupyterLab can provide a more interactive, hands-on, and collaborative learning experience for students in MOOCs; it is highly customizable and can be accessed from anywhere. To capitalize on these benefits, we have developed OpenJupyter, which integrates JupyterLab at scale with MOOCs, enhancing the student learning experience and adding hands-on exercises that make data science courses more interactive and engaging. While MOOCs provide access to education for a large number of students, one of their significant challenges is providing effective and timely feedback to learners. OpenJupyter addresses this problem with an auto-assessment capability that automates the evaluation process and delivers feedback to learners promptly. In this paper, we provide an overview of the architecture of OpenJupyter, its scalability in the context of MOOCs, and its effectiveness in addressing the auto-assessment challenge. We also discuss the advantages and limitations of using OpenJupyter in a MOOC context and provide a reference for educators and researchers who wish to implement similar tools. Our efforts aim to foster an open educational environment in the field of programming by providing learners with an interactive learning tool and a streamlined technical setup, allowing them to acquire and test their knowledge at their own pace.
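To make the auto-assessment idea concrete, the following is a minimal sketch of cell-level notebook grading in the style of nbgrader-like hidden tests. The function names (student_mean, grade_cell) and the result format are illustrative assumptions, not OpenJupyter's actual API.

# Minimal sketch of hidden-test auto-grading for a notebook exercise.
# Names and the score/feedback format are hypothetical examples.

def student_mean(values):
    # Solution cell the learner completes inside the notebook exercise.
    return sum(values) / len(values)

def grade_cell():
    # Hidden test cell executed after submission; a failed assertion
    # marks the exercise as incorrect, and the resulting score could be
    # reported back to the MOOC platform.
    assert student_mean([1, 2, 3, 4]) == 2.5
    assert student_mean([10]) == 10.0
    return {"score": 1.0, "feedback": "All tests passed."}

print(grade_cell())

In a notebook workflow, the solution and the test would live in separate cells, with the test cell hidden from learners until grading.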


Published in

L@S '23: Proceedings of the Tenth ACM Conference on Learning @ Scale
July 2023, 445 pages
ISBN: 9798400700255
DOI: 10.1145/3573051

Copyright © 2023 Owner/Author. This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher: Association for Computing Machinery, New York, NY, United States
