
The Effectiveness of Logical Distractors in an Online Module

  • Original Article
  • Eastern Economic Journal

Abstract

This article studies differences in student learning outcomes associated with changes in the format of answer distractors in online learning resources. Employing a pre- and posttest quasi-experimental design, we compare student achievement across three versions of the same economic education online module produced by the Federal Reserve Bank of St. Louis: two versions employing the “Both option X and option Y are correct” type of distractor across definitional and analytical questions, and a baseline version omitting it. The study documents no consistent gains in the assessment effectiveness of test items that add this particular type of logical distractor.
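As a concrete illustration of the design, the sketch below tabulates pre- and posttest scores by module version and compares average learning gains. It is a minimal sketch only: the version labels, the sample records, and the choice of normalized gain as the achievement metric are illustrative assumptions for exposition, not the article's data or estimation strategy.

    import statistics

    # Hypothetical records: (module_version, pretest_score, posttest_score),
    # with scores expressed as the fraction of items answered correctly.
    students = [
        ("baseline", 0.45, 0.70),
        ("both_x_and_y_v1", 0.50, 0.68),
        ("both_x_and_y_v2", 0.42, 0.66),
        # ... one tuple per student completing the module
    ]

    def average_gain(records, version):
        # Mean normalized learning gain, g = (post - pre) / (1 - pre),
        # over all students assigned to the given module version.
        gains = [(post - pre) / (1 - pre)
                 for v, pre, post in records
                 if v == version and pre < 1]
        return statistics.mean(gains)

    for version in ("baseline", "both_x_and_y_v1", "both_x_and_y_v2"):
        print(version, round(average_gain(students, version), 3))

Under this setup, roughly equal average gains across the two distractor versions and the baseline would be consistent with the article's finding of no consistent improvement.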



Author information

Correspondence to Diego Mendez-Carbajo.

Additional information

The views expressed in this article are those of the author and do not necessarily reflect the position of the Federal Reserve Bank of St. Louis or the Federal Reserve System.


About this article

Cite this article

Mendez-Carbajo, D. The Effectiveness of Logical Distractors in an Online Module. Eastern Economic Journal 49, 15–30 (2023). https://doi.org/10.1057/s41302-022-00232-z
