The Effects of Exam Setting on Students’ Test-Taking Behaviors and Performances: Proctored Versus Unproctored

Authors

Denizer Yıldırım, Hale Ilgaz, Alper Bayazıt, Gökhan Akçapınar

DOI:

https://doi.org/10.19173/irrodl.v24i4.7145

Keywords:

test-taking behaviors, proctored exam, unproctored exam, formative assessment

Abstract

One of the biggest challenges for online learning is upholding academic integrity in online assessments. In particular, institutions and faculty place great importance on exam security and the prevention of academic dishonesty in online learning. The aim of this study was to compare the test-taking behaviors and academic achievement of students in proctored and unproctored online exam environments. Students' log records from the two exam settings were compared using visualization and log analysis methods. The results showed significant differences in the time spent on the first question, the total time spent on the exam, and the mean and median times spent on each question, but no significant difference between the exam scores of students in the proctored and unproctored groups. In other words, reliable exams can be conducted without proctoring through an appropriate assessment design (e.g., using multiple low-stakes formative exams instead of a single high-stakes summative exam). The results will guide instructors in designing assessments for their online courses. They are also expected to help researchers analyze exam logs and extract insights about students' exam-taking behaviors from them.

Author Biographies

Denizer Yıldırım, Faculty of Open and Distance Education, Ankara University

Denizer Yıldırım is an instructor at the Faculty of Open and Distance Education, Ankara University, Turkey. He received his Ph.D. in Computer Education and Instructional Technology at Hacettepe University in December 2018. His research interests include educational data mining, learning analytics, learning design, e-assessment, AI-based learning management systems, and robotics integration in education.

Hale Ilgaz, Faculty of Open and Distance Education, Ankara University

Hale Ilgaz is an Associate Professor at the Faculty of Open and Distance Education, Ankara University. She received her Ph.D. in Computer Education and Instructional Technology from Hacettepe University. Her research interests include distance education, e-learning, instructional design, cognitive processes in e-learning environments, and human-computer interaction. She is designing and coordinating the online diploma and certificate programs at the university.

Alper Bayazıt, Department of Medical Education and Informatics, Ankara University

Alper Bayazıt is an Assistant Professor at the Department of Medical Education and Informatics, Ankara University, Faculty of Medicine, Turkey. He received a Ph.D. in Computer Education and Instructional Technology and an M.Sc. in Medical Education. His academic interest areas are learning analytics, educational data mining, clinical decision-making, and simulations.

Gökhan Akçapınar, Department of Computer Education and Instructional Technology, Hacettepe University

Gökhan Akçapınar is an Associate Professor at the Department of Computer Education and Instructional Technology, Hacettepe University, Turkey. His research interests are in the areas of Educational Data Mining (EDM) and Learning Analytics (LA). He has been conducting research on developing educational early-warning systems for at-risk students, modeling students' learning behaviors, designing data-driven interventions, and developing learning analytics dashboards.

References

Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2018). Interaction of proctoring and student major on online test performance. The International Review of Research in Open and Distributed Learning, 19(5). https://doi.org/10.19173/irrodl.v19i5.3698

Alexandron, G., Wiltrout, M. E., Berg, A., Gershon, S. K., & Ruipérez-Valiente, J. A. (2022). The effects of assessment design on academic dishonesty, learner engagement, and certification rates in MOOCs. Journal of Computer Assisted Learning, 39(1), 141–153. https://doi.org/10.1111/jcal.12733

Alexandron, G., Yoo, L. Y., Ruipérez-Valiente, J. A., Lee, S., & Pritchard, D. E. (2019). Are MOOC learning analytics results trustworthy? With fake learners, they might not be! International Journal of Artificial Intelligence in Education, 29(4), 484–506. https://doi.org/10.1007/s40593-019-00183-1

Alin, P., Arendt, A., & Gurell, S. (2022). Addressing cheating in virtual proctored examinations: Toward a framework of relevant mitigation strategies. Assessment & Evaluation in Higher Education, 48(3), 262–275. https://doi.org/10.1080/02602938.2022.2075317

Analytic Hacettepe. (n.d.). Learning analytics portal. Retrieved November 2022, from http://analitik.hacettepe.edu.tr/raporlar/1

Arnold, I. J. M. (2016). Cheating at online formative tests: Does it pay off? The Internet and Higher Education, 29, 98–106. https://doi.org/10.1016/j.iheduc.2016.02.001

Balash, D. G., Kim, D., Shaibekova, D., Fainchtein, R. A., Sherr, M., & Aviv, A. J. (2021). Examining the examiners: Students’ privacy and security perceptions of online proctoring services [Paper presentation]. 17th Symposium on Usable Privacy and Security.

Balderas, A., & Caballero-Hernández, J. A. (2021). Analysis of learning records to detect student cheating on online exams: Case study during COVID-19 pandemic [Paper presentation]. Eighth International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain.

Baleni, Z. (2015). Online formative assessment in higher education: Its pros and cons. The Electronic Journal of e-Learning, 13(4), 228–236.

Becker, D. A., Connolly, J., Lentz, P., & Morrison, J. (2006). Using the business fraud triangle to predict academic dishonesty among business students. The Academy of Educational Leadership Journal, 10, 37.

Bin Mubayrik, H. F. (2020). New trends in formative-summative evaluations for adult education. SAGE Open, 10(3). https://doi.org/10.1177/2158244020941006

Bolliger, D. U., & Martin, F. (2021). Critical design elements in online courses. Distance Education, 42(3), 352–372. https://doi.org/10.1080/01587919.2021.1956301

Burke, D. D., & Sanney, K. J. (2018). Applying the fraud triangle to higher education: Ethical implications. Journal of Legal Studies Education, 35(1), 5–43. https://doi.org/10.1111/jlse.12068

Cantabella, M., López, B., Caballero, A., & Muñoz, A. (2018). Analysis and evaluation of lecturers’ activity in learning management systems: Subjective and objective perceptions. Interactive Learning Environments, 26(7), 911–923. https://doi.org/10.1080/10494820.2017.1421561

Chirumamilla, A., Sindre, G., & Nguyen-Duc, A. (2020). Cheating in e-exams and paper exams: The perceptions of engineering students and teachers in Norway. Assessment & Evaluation in Higher Education, 45(7), 940–957. https://doi.org/10.1080/02602938.2020.1719975

Coghlan, S., Miller, T., & Paterson, J. (2021). Good proctor or “big brother”? Ethics of online exam supervision technologies. Philosophy & Technology, 34(4), 1581–1606. https://doi.org/10.1007/s13347-021-00476-1

Daffin, L. W., Jr., & Jones, A. A. (2018). Comparing student performance on proctored and non-proctored exams in online psychology courses. Online Learning, 22(1). https://doi.org/10.24059/olj.v22i1.1079

Dominguez, C., Garcia-Izquierdo, F. J., Jaime, A., Perez, B., Rubio, A. L., & Zapata, M. A. (2021). Using process mining to analyze time-distribution of self-assessment and formative assessment exercises on an online learning tool. IEEE Transactions on Learning Technologies, 14(5). https://doi.org/10.1109/TLT.2021.3119224

Draaijer, S., Jefferies, A., & Somers, G. (2018). Online proctoring for remote examination: A state of play in higher education in the EU. In E. Ras & A. Guerrero Roldán (Eds.), Technology enhanced assessment: TEA 2017. Communications in computer and information science (Vol. 829). Springer. https://doi.org/10.1007/978-3-319-97807-9_8

Epoch Converter. (n.d.). Epoch and Unix timestamp conversion tools. Retrieved June 1, 2022, from https://www.epochconverter.com/

Fernandes, S., Flores, M. A., & Lima, R. M. (2012). Students’ views of assessment in project-led engineering education: Findings from a case study in Portugal. Assessment & Evaluation in Higher Education, 37(2), 163–178. https://doi.org/10.1080/02602938.2010.515015

Fraenkel, J. R., Wallen, N. E., & Hyun, H. H. (2012). How to design and evaluate research in education. McGraw-Hill.

Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004

Goedl, P. A., & Malla, G. B. (2020). A study of grade equivalency between proctored and unproctored exams in distance education. American Journal of Distance Education, 34(4), 280–289. https://doi.org/10.1080/08923647.2020.1796376

Gurcan, F., Ozyurt, O., & Cagiltay, N. E. (2021). Investigation of emerging trends in the e-learning field using latent Dirichlet allocation. The International Review of Research in Open and Distributed Learning, 22(2), 1–18. https://doi.org/10.19173/irrodl.v22i2.5358

Harmon, O. R., & Lambrinos, J. (2008). Are online exams an invitation to cheat? The Journal of Economic Education, 39(2), 116–125. https://doi.org/10.3200/JECE.39.2.116-125

Hu, S., Jia, X., & Fu, Y. (2018). Research on abnormal behavior detection of online examination based on image information. In 10th International Conference on Intelligent Human-Machine Systems and Cybernetics (Vol. 2, pp. 88–91). Hangzhou, China. https://doi.org/10.1109/IHMSC.2018.10127

Hui, B. (2023). Are they learning or guessing? Investigating trial-and-error behavior with limited test attempts [Paper presentation]. LAK23: 13th International Learning Analytics and Knowledge Conference, Arlington, Texas. https://doi.org/10.1145/3576050.3576068

Jaramillo-Morillo, D., Ruipérez-Valiente, J., Sarasty, M. F., & Ramírez-Gonzalez, G. (2020). Identifying and characterizing students suspected of academic dishonesty in SPOCs for credit through learning analytics. International Journal of Educational Technology in Higher Education, 17(1), 45. https://doi.org/10.1186/s41239-020-00221-2

Joshi, A., Vinay, M., & Bhaskar, P. (2021). Impact of coronavirus pandemic on the Indian education sector: Perspectives of teachers on online teaching and assessments. Interactive Technology and Smart Education, 18(2), 205–226. https://doi.org/10.1108/ITSE-06-2020-0087

Lee, K., & Fanguy, M. (2022). Online exam proctoring technologies: Educational innovation or deterioration? British Journal of Educational Technology, 53(3), 475–490. https://doi.org/10.1111/bjet.13182

Man, K., Harring, J. R., Ouyang, Y., & Thomas, S. L. (2018). Response time based nonparametric Kullback-Leibler divergence measure for detecting aberrant test-taking behavior. International Journal of Testing, 18(2), 155–177. https://doi.org/10.1080/15305058.2018.1429446

Martin, F., Bolliger, D. U., & Flowers, C. (2021). Design matters: Development and validation of the online course design elements (OCDE) instrument. The International Review of Research in Open and Distributed Learning, 22(2), 46–71. https://doi.org/10.19173/irrodl.v22i2.5187

Nigam, A., Pasricha, R., Singh, T., & Churi, P. (2021). A systematic review on AI-based proctoring systems: Past, present and future. Education and Information Technologies, 26(5), 6421–6445. https://doi.org/10.1007/s10639-021-10597-x

Noorbehbahani, F., Mohammadi, A., & Aminazadeh, M. (2022). A systematic review of research on cheating in online exams from 2010 to 2021. Education and Information Technologies, 27(6), 8413–8460. https://doi.org/10.1007/s10639-022-10927-7

Oosterhof, A., Conrad, R. M., & Ely, D. P. (2008). Assessing learners online. Pearson.

Pelánek, R. (2021). Analyzing and visualizing learning data: A system designer’s perspective. Journal of Learning Analytics, 8(2), 93–104. https://doi.org/10.18608/jla.2021.7345

Rios, J. A., & Liu, O. L. (2017). Online proctored versus unproctored low-stakes Internet test administration: Is there differential test-taking behavior and performance? American Journal of Distance Education, 31(4), 226–241. https://doi.org/10.1080/08923647.2017.1258628

Shbail, M. O., Alshurafat, H., Ananzeh, H., & Al-Msiedeen, J. M. (2022). Dataset of factors affecting online cheating by accounting students: The relevance of social factors and the fraud triangle model factors. Data in Brief, 40, 107732. https://doi.org/10.1016/j.dib.2021.107732

Singh, U. G., & de Villiers, M. R. (2017). An evaluation framework and instrument for evaluating e-assessment tools. The International Review of Research in Open and Distributed Learning, 18(6). https://doi.org/10.19173/irrodl.v18i6.2804

Slack, H. R., & Priestley, M. (2022). Online learning and assessment during the COVID-19 pandemic: Exploring the impact on undergraduate student well-being. Assessment & Evaluation in Higher Education, 48(3), 1–17. https://doi.org/10.1080/02602938.2022.2076804

Snekalatha, S., Marzuk, S. M., Meshram, S. A., Maheswari, K. U., Sugapriya, G., & Sivasharan, K. (2021). Medical students’ perception of the reliability, usefulness and feasibility of unproctored online formative assessment tests. Advances in Physiology Education, 45(1), 84–88. https://doi.org/10.1152/advan.00178.2020

Soffer, T., Kahan, T., & Livne, E. (2017). E-assessment of online academic courses via students’ activities and perceptions. Studies in Educational Evaluation, 54, 83–93. https://doi.org/10.1016/j.stueduc.2016.10.001

Stowell, J. R., & Bennett, D. (2010). Effects of online testing on student exam performance and test anxiety. Journal of Educational Computing Research, 42(2), 161–171. https://doi.org/10.2190/EC.42.2.b

Thelwall, M. (2000). Computer-based assessment: A versatile educational tool. Computers & Education, 34(1), 37–49. https://doi.org/10.1016/S0360-1315(99)00037-8

Tiong, L. C. O., & Lee, H. J. (2021). e-Cheating prevention measures: Detection of cheating at online examinations using deep learning approach: A case study. arXiv. https://doi.org/10.48550/arXiv.2101.09841

Traoré, I., Nakkabi, Y., Saad, S., Sayed, B., Ardigo, J. D., & de Faria Quinan, P. M. (2017). Ensuring online exam integrity through continuous biometric authentication. In I. Traoré, A. Awad, & I. Woungang (Eds.), Information security practices: Emerging threats and perspectives (pp. 73–81). Springer.

Trezise, K., Ryan, T., de Barba, P., & Kennedy, G. (2019). Detecting academic misconduct using learning analytics. Journal of Learning Analytics, 6(3), 90–104. https://doi.org/10.18608/jla.2019.63.11

Tripathi, A. M., Kasana, R., Bhandari, R., & Vashishtha, N. (2022). Online examination system [Paper presentation]. Fifth International Conference on Smart Trends in Computing and Communications, SmartCom, Singapore.

Vegendla, A., & Sindre, G. (2019). Mitigation of cheating in online exams: Strengths and limitations of biometric authentication. In A. V. S. Kumar (Ed.), Biometric authentication in online learning environments (pp. 47–68). IGI Global.

Vlachopoulos, D. (2016). Assuring quality in e-learning course design: The roadmap. The International Review of Research in Open and Distributed Learning, 17(6). https://doi.org/10.19173/irrodl.v17i6.2784

Yazici, S., Yildiz Durak, H., Aksu Dünya, B., & Şentürk, B. (2023). Online versus face-to-face cheating: The prevalence of cheating behaviours during the pandemic compared to the pre-pandemic among Turkish University students. Journal of Computer Assisted Learning, 39(1), 231–254. https://doi.org/10.1111/jcal.12743

Published

2023-12-05

How to Cite

Yıldırım, D., Ilgaz, H., Bayazıt, A., & Akçapınar, G. (2023). The Effects of Exam Setting on Students’ Test-Taking Behaviors and Performances: Proctored Versus Unproctored. The International Review of Research in Open and Distributed Learning, 24(4), 174–193. https://doi.org/10.19173/irrodl.v24i4.7145

Issue

Section

Research Articles