DOI: 10.1145/3287324.3287390
Research article · Public Access

Development of a Lean Computational Thinking Abilities Assessment for Middle Grades Students

Published: 22 February 2019

ABSTRACT

The recognition of the middle grades as a critical juncture in CS education has led to widespread development of CS curricula and integration efforts. The goal of many of these interventions is to develop a set of underlying abilities that has been termed computational thinking (CT). This goal presents a key challenge for assessing student learning: we must identify assessment items that tap an emergent understanding of the key cognitive abilities underlying CT without requiring specialized knowledge of any specific programming language. In this work we explore the psychometric properties of assessment items appropriate for use with middle grades students (US grades 6-8; ages 11-13). We also investigate whether these items measure a single ability dimension. Finally, we strive to recommend a "lean" set of items that can be completed in a single 50-minute class period and has high face validity. The paper makes two contributions: 1) it adds to the literature on the emerging construct of CT and its relationship to the existing CTt and Bebras instruments, and 2) it offers a research-based CT assessment instrument for use by both researchers and educators in the field.
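For context on the dimensionality claim above: item-set analyses of this kind are conventionally grounded in item response theory, and the paper cites Rasch-model foundations along with the Winsteps and mirt packages. As an illustrative sketch only (the abstract does not specify the parameterization; the full text does), the Rasch model expresses the probability that student j answers item i correctly in terms of a single latent ability \theta_j and an item difficulty b_i:

P(X_{ij} = 1 \mid \theta_j, b_i) = \frac{e^{\theta_j - b_i}}{1 + e^{\theta_j - b_i}}

If one \theta_j per student reproduces the observed response patterns (acceptable item fit, no meaningful secondary structure in the residuals), the item set can reasonably be treated as measuring a single CT ability dimension, and misfitting or language-dependent items become candidates for removal when trimming toward a "lean" instrument.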


Published in

SIGCSE '19: Proceedings of the 50th ACM Technical Symposium on Computer Science Education
February 2019, 1364 pages
ISBN: 9781450358903
DOI: 10.1145/3287324

          Copyright © 2019 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


          Acceptance Rates

SIGCSE '19 paper acceptance rate: 169 of 526 submissions (32%). Overall acceptance rate: 1,595 of 4,542 submissions (35%).

