
Application of the DINA Model Framework to Enhance Assessment and Learning

Chapter in: Self-directed Learning Oriented Assessments in the Asia-Pacific

Part of the book series: Education in the Asia-Pacific Region: Issues, Concerns and Prospects (EDAP, volume 18)


Abstract

This chapter introduces cognitive diagnosis models as a component of an alternative psychometric framework for diagnostic modeling and assessment, and contrasts them with traditional item response models. The focus is on the family of cognitive diagnosis models that represent various formulations and extensions of the deterministic input, noisy "and" gate (DINA) model. The chapter covers the assumptions of these models, related examples, and their appropriate uses. In addition, it discusses several issues pertaining to the DINA models in particular and diagnostic modeling in general.
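To make the abstract's subject concrete, the following sketch illustrates the standard DINA item response function. This is not code from the chapter; the function name and example values are illustrative. Under the DINA model, an examinee answers item j correctly with probability (1 − s_j) if the examinee has mastered every attribute the item's Q-matrix row requires, and with probability g_j otherwise, where s_j and g_j are the item's slip and guess parameters.

```python
import numpy as np

def dina_prob(alpha, q_row, slip, guess):
    """P(correct response) for one item under the DINA model.

    alpha : 0/1 array, attributes the examinee has mastered
    q_row : 0/1 array, the item's row of the Q-matrix (required attributes)
    slip  : probability of answering incorrectly despite full mastery
    guess : probability of answering correctly without full mastery
    """
    # eta = 1 only if every required attribute is mastered ("and" gate)
    eta = int(np.all(alpha >= q_row))
    # P(X = 1 | alpha) = (1 - slip)^eta * guess^(1 - eta)
    return (1 - slip) ** eta * guess ** (1 - eta)

# Examinee mastering both required attributes: success probability = 1 - slip
print(dina_prob(np.array([1, 1, 0]), np.array([1, 1, 0]), slip=0.1, guess=0.2))  # 0.9
# Examinee missing one required attribute: success probability = guess
print(dina_prob(np.array([1, 0, 0]), np.array([1, 1, 0]), slip=0.1, guess=0.2))  # 0.2
```

Note the conjunctive ("and" gate) structure: missing even one required attribute collapses the success probability to the guessing parameter, which is what distinguishes the DINA model from compensatory item response models.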




Author information

Correspondence to Jimmy de la Torre.


Copyright information

© 2012 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

de la Torre, J. (2012). Application of the DINA Model Framework to Enhance Assessment and Learning. In: Mok, M. (eds) Self-directed Learning Oriented Assessments in the Asia-Pacific. Education in the Asia-Pacific Region: Issues, Concerns and Prospects, vol 18. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-4507-0_5
