Students designing assessment for future practice

Background: Multiple choice questions (MCQs) are used at all stages of medical training. However, their focus on testing students' recall of facts rather than actively facilitating learning remains an ongoing concern for educators. Having students develop MCQ items is a possible strategy to enhance the learning potential of MCQs.
Methods: Medical students wrote MCQs as part of a course on the medical care of vulnerable populations. Student perceptions of learning and assessment through MCQ writing were explored via surveys and focus group interviews. Survey responses were analysed using descriptive statistics and transcribed interviews were analysed thematically.
Results: Students reported that writing MCQs enhanced their learning and exam preparation and reduced their exam-related anxiety. It encouraged students to research what they did not know and to benchmark their learning against that of their peers. Students described using deep learning strategies, were motivated to write high quality MCQ items for their peers and prioritised vocational learning in the development of their questions.
Conclusions: The study suggests student-developed MCQs can enhance the learning value of MCQs as a form of assessment. It also highlighted that students can be capable designers of assessment and that learning processes can be improved if students are provided agency over their learning and assessment.

However, criticisms of MCQ assessment include the privileging of instructor perspectives, failure to acknowledge plurality of knowledge, and difficulty addressing relativities and uncertainties (Epstein, 2007). MCQs may not support the robust clinical reasoning required for future practice, instead encouraging early closure in problem-solving processes and reducing opportunities to consider all diagnostic possibilities (Epstein, 2007; Schuwirth, Van Der Vleuten & Donkers, 1996). Designing MCQs that promote synthesis of information and use of constructive thinking skills, at the summit of Bloom's taxonomy, is more challenging (Albanese & Dast, 2014; Bloom et al., 1956). MCQ assessment has been associated with inferior retention of knowledge compared to assessment that requires the learner to construct an answer (Larsen, Butler & Roediger, 2008; Yu Tsao Pan et al., 2016). A further criticism is that testing via MCQs without providing feedback increases the possibility that students will acquire false knowledge (Roediger & Marsh, 2005; Schooler, Foster & Loftus, 1988).
These critiques sit uneasily with higher education goals underpinned by adult learning theory, where the focus is on providing learners with opportunities to be self-directed, to diagnose their own learning needs and to think autonomously (Knowles, Holton & Swanson, 2015). They also challenge the role of academics in designing assessments which achieve the pedagogical goals of lifelong and self-directed learning (Boud & Falchikov, 2007; Sadler, 2010).
In this study, we combined the pedagogical concepts that inform adult learning theory with the idea that assessment should drive and sustain learning, by incorporating the requirement for medical students to write their own MCQs as a component of their assessment (Knowles, Holton & Swanson, 2015; Taylor & Hamdy, 2013). There is evidence that medical students can write high quality MCQs (Bottomley & Denny, 2011; Chamberlain et al., 2009; Galloway & Burns, 2015; Harris et al., 2015; Palmer & Devitt, 2007); that they find the process engaging and beneficial for learning and exam preparation (Craft et al., 2017; Fellenz, 2004; Gonzales-Cabezas et al., 2015; Gooi & Sommerfeld, 2015; Jobs et al., 2013); and that they have previously reported increased confidence and more reflective learning (Baerheim & Meland, 2003). These positive effects also appear to transfer to other forms of assessment (Yu Tsao Pan et al., 2016). Potential disadvantages include students adopting a minimalist approach if they perceive no reward for their effort (Jobs et al., 2013), the potential for poor retention of knowledge (Hoogerheide et al., 2018) and a lack of student favour (Palmer & Devitt, 2006). These studies highlight the need to provide guidance and feedback about the quality of MCQ items, to require students to justify the correct answer, and to enable students to identify meaning and relevance in the task.
In this study, we describe a process of integrating student-developed MCQs into a medical course which included feedback and incentives. We report students' experiences and interpretations of this task with a specific emphasis on the type of learning they engaged in and how they made sense of the learning and assessment experience.

Methods
This investigation was conducted within the primary care clinical unit of an Australian university. Participants were third and fourth year medical students completing an 8-week course focusing on the medical care of vulnerable people, as part of a 4-year Doctor of Medicine (MD) program. Data collection occurred from February to September 2017 and included 106 students from three consecutive course cohorts.
Ethics approval was obtained from The University of Queensland Human Research Ethics Committee B (approval number 2017000192).

Figure 1 Student-developed MCQ Model
The model of student-developed MCQs designed for this study is summarised in Figure 1. Each student was asked to develop and submit one multiple-choice question (MCQ) per week over 5 weeks of the 8-week course. Students were provided with an MCQ writing guide and checklist to ensure they were aware of the expected standards for question quality (Sadler, 2010). One mark was awarded per completed question and contributed to the final written exam mark, providing motivation for students to engage in the process (Taylor & Hamdy, 2013).
Students were asked to (1) write single-best-answer style questions with a clinical scenario stem based on the topics and learning resources provided, (2) include a justification of the answer and (3) identify the learning resource(s) used. Questions were reviewed by an academic, and 10 of the submitted questions each week were selected to form a quiz for the course cohort, resulting in a total of 50 formative questions over the duration of the course. Once students had completed the quizzes, they could access the correct answers with justifications. From the cumulative pool of 50 formative questions, 20 were selected for inclusion on the final written exam. This meant that 25% of the written exam mark (a 5-mark credit for submitting 5 questions and 20 marks derived from the formative pool) was student-owned, with the remaining unseen written paper worth 75 marks. This aspect was included to ensure students saw the relevance of their contribution and to provide them with an opportunity to potentially shape the learning for themselves and for the broader cohort (Taylor & Hamdy, 2013). The written paper constituted the knowledge assessment for this course and represented a third of the assessment weighting, with work-based and reflective practice skills representing the remaining two thirds.
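As an illustration only (not part of the study's methods), the exam-mark structure described above can be verified with a few lines of arithmetic; all figures mirror those reported in the text, and the variable names are our own:

```python
# Illustrative check of the exam-mark arithmetic described in the text.
weeks = 5                         # one MCQ submitted per week
submission_marks = 1 * weeks      # 1 mark per completed question -> 5 marks
quiz_items_per_week = 10
formative_pool = quiz_items_per_week * weeks   # 50 formative questions
exam_items_from_pool = 20         # selected from the pool for the final exam
unseen_marks = 75                 # remaining unseen written paper

exam_total = submission_marks + exam_items_from_pool + unseen_marks
student_owned = submission_marks + exam_items_from_pool

print(exam_total)                        # 100
print(student_owned / exam_total)        # 0.25, the 25% student-owned share
```

The 25% student-owned share thus combines a small completion incentive (5 marks) with a larger stake in the summative paper (20 marks), which is what gave students a reason to take the formative quizzes seriously.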

Evaluation methods
A mixed-methods approach was used to obtain both quantitative and qualitative data. Following their written exam, students were invited to complete a survey and participate in focus groups about the student-developed MCQs. The paper survey, developed for this study, involved Likert scale responses to statements about student learning, assessment and personal experiences. Focus groups, facilitated by an independent medical educator, explored the associated learning and assessment experiences of students.

Analysis
Survey responses were analysed with descriptive statistics using IBM SPSS statistical software. Focus group transcripts were examined using inductive content analysis, allowing insights about students' perspectives and views to emerge from the data (Hsieh, 2005). These were developed as codes and scrutinised for patterns of ideas and themes extending across interviews.

Results
Survey
84% (89 of 106) of students completed the survey; results are summarised in Table 1. Students found writing MCQs a positive experience: 86% agreed that writing MCQs enhanced their learning and 89% that it encouraged them to engage more with the learning resources. 75% agreed that writing MCQs increased their pre-exam confidence, 68% that it reduced their exam-related anxiety and 53% that it improved their skills in answering MCQs. 72% agreed that writing MCQs was a valuable use of their time, and 58% found it enjoyable.
91% of students agreed that the student-developed formative quizzes assisted their learning, 89% were encouraged to research what they did not know and 69% believed quizzes helped benchmark their learning compared to peers.
90% of students agreed that writing MCQs was a useful form of assessment, 83% that it provided them with ownership of their assessment and 63% that it encouraged collaborative learning.

Focus group discussions
Ten students participated in three focus groups (4, 3 and 4 participants respectively), with two overarching themes emerging:
1. Students as designers of assessment
2. Linking assessment to vocational purpose
(A supplementary table of representative quotes is available for interest)

Theme 1: Students as designers of assessment
Students, as consumers of learning and assessment processes, felt their prior extensive experience of answering MCQs, in exams and via commercial question-banks, enabled them to critically appraise the learning value of MCQ items.

At this stage of our career, in theory we've done thousands of questions. Every day we do some kind of question (Quote1FG1S2)

Students questioned the purpose and value of some MCQ exams. They preferred questions where an answer was clearly correct rather than contentious, and criticised questions which seemed designed to 'trick' them. Obscure or 'left-field' questions were considered unhelpful; students believed assessment should focus on understanding important concepts rather than assessing test-taking (gaming) skills.

I don't feel like [faculty MCQ exams] assesses my knowledge of the material; it assesses my test-taking skills and whether or not I can work around that MCQ to figure out what is most correct. (Quote6FG1S1)
Students described superficial approaches to learning when preparing for multiple-choice exams. They identified 'deep learning' as related to reasoning, connecting ideas and applying knowledge; and 'superficial learning' as memorising, accepting without thinking and reproducing facts. They described having to know 'facts' and scanning resources for likely multiple-choice-question fodder to prepare for MCQ exams. They spoke of preferring a deep learning approach and believed it was more valuable to their future career.
In the end, we are trying to be doctors. What's the point of memorising facts when we're trying to help our patients? (Quote7FG2S1)

Students described how the opportunity to devise questions for themselves and their peers enhanced their engagement with learning materials, requiring higher cognitive skills and a more disciplined approach.
It was a good way to study in a very different way. Where you're not just reading the material but you're really synthesising it and trying to apply it to a different scenario (Quote17FG2S3)

Students associated this agency over their assessment with a reduction in assessment-related anxiety.
It changed my whole outlook on MCQs. Normally I really stress out about school of medicine MCQs because sometimes they're really out of left field or you don't understand their wording (Quote31FG1S2)

They described synergistic learning benefits from engaging with writing MCQs and completing the student-developed MCQ quizzes.

So, you learn from writing the question and learn from doing the formative quizzes (QuoteFG2S3)
The student-developed quizzes were identified as the most useful formative questions they had encountered.
I liked how when you reviewed the quiz you are given the right answer and feedback. (Quote25FG1S4)
Students described frustration with assessments where correct answers were not revealed or explanations were inadequate. The strategic nature of their learning was evident in descriptions of mindless navigation of formative quizzes without feedback.

It is really frustrating when you do formatives and you don't know the answer and you have to do the quiz 15 times to get it right (Quote10FG1S4)
Students acknowledged the time and cognitive effort required to write good quality questions. They approached the task as 'academic designers' engaged in practices normally ascribed to the academic. Some students chose to write questions about concepts that lecturers or clinical preceptors had signposted as difficult with the intent of guaranteeing they understood that concept. Some wrote questions on topics that were new or interesting and others gravitated to material they found challenging to promote a better understanding.

[I wrote questions about] mainly things that I just had no idea about. So, it'd be like I'm learning it for the first time and then I'd feel like it was applicable and beneficial (Quote14FG3S3)
Reflecting on their experience, students identified that a greater understanding of the material was required to write a multiple-choice question (mandating a deeper learning style), and that having to justify the answer was analogous to teaching someone: in order to teach, one needed to understand the material well. They described using a different approach to their learning.

It's kind of like in a way teaching other people which I think is a good way to learn. By writing a question you have to know it enough to teach it to someone else (Quote16FG1S3)
Finally, students felt strongly that the approach to assessment in general should be improved. They argued that a 50% pass rate devalued the knowledge assessed.

Shouldn't the aim of exams be that students get most questions right? Like, if there is a low pass mark, why aren't people concerned by what students get wrong? (Quote9FG1S2)
Students critically appraised the design and purpose of assessment through the lens of expert exam-takers and articulated clear preferences for assessment content. This contributed to the second theme centred on assessment for vocational purpose.

Theme 2: Linking assessment to vocational purpose
This theme emerged from students' discussions of how their experience of preparing for typical MCQ exams could handicap the real learning required for future practice as doctors.

More than half the questions don't actually help us in the clinical setting, but it's something that we have to learn because we have to do the MCQ exam to be able to pass (Quote8FG2S2)
A tension was evident between the need to be strategic learners for MCQ exams and the desire to focus on vocational learning.

I felt like these [student developed MCQs] were all legitimate questions that everybody who finishes this course should be expected to know. That's what I appreciated about it (Quote4FG1S2)
At the forefront of student deliberations was the priority of vocational learning. Students expressed a desire to engage deeply with knowledge relevant for future practice and to be provided the opportunity to explain their answers and weigh up the pros and cons of diagnosis and treatment. They believed MCQ exams did not allow them to make this reasoning visible.

MCQ exams make you want to rote learn all the different steps rather than maybe sit and think about what we are trying to achieve in treatment and work it that way (Quote2FG1S2)
Students identified that writing a clinical scenario stem helped them focus on the patient and many based the scenario on patients they had encountered on clinical placement. Students described this process as enhancing their clinical reasoning skills.
It made you really think about the patient ... I was like-oh this'll be easy, I'll just write this out and then I was like-wait, does that patient actually present like that or would they actually say that or will they feel this way.
Students favoured questions that required them to make decisions about clinical problems and appreciated a well-written clinical stem (Quote20).

More useful types of questions for learning were 'what would you do in this situation', 'what's next in the management of this patient' or 'what should be your first step' (Quote20FG1S2)
This model provided impetus for students to engage in a community of practice with the formative quiz acting as a portal. Students were scattered across hospital campuses and the formative quizzes connected them as learners. Observing questions written by their peers sparked a level of competitiveness and motivation amongst students. This created a peer-learning quality-improvement process where students attempted to improve their MCQ items to match the quality of items produced by peers.

Some of the questions were very in depth and I was like 'oh, I need to step it up' like everyone's really putting in effort (Quote32FG1S2)
Comparing content and understanding was helpful, with students keen to share in peers' learnings from different clinical placements.

The fact that we were all on different clinical placements, it was helpful to see whether or not you understand the material but also what others found important (Quote34FG3S2)
On a negative note, one student indicated that she thought one of the student-developed questions was sourced from a commercial question bank, an idea met with disapproval from the group.
Students believed they had the capacity to contribute to item development and that this had learning benefit. They appreciated the opportunity to collaborate with peers to develop formative quizzes, and to be members of a community of assessment and learning practice. Students appreciated being given the responsibility to contribute to their own assessment and to engage in meaningful learning. In designing their MCQs, they particularly focused on ensuring the questions were relevant to their future clinical work.
Kelly M, Ryan A, Henderson M, Hegerty H, Delany C. MedEdPublish. https://doi.org/10.15694/mep.2018.0000121.1
When asked whether the intervention should continue, all students agreed that it should, acknowledging that it fostered deeper learning.

Discussion
In this paper we describe and evaluate a model of integrating student-developed MCQs into a clinical course within a medical program.
A key finding was that students identified a shift in their learning approaches from superficial to deep when given the opportunity to devise MCQs. They also experienced reduced exam-related anxiety. Consistent with adult learning theory (Knowles et al., 2015), providing students with agency over their assessment facilitated an engaged peer-learning environment in which vocational learning was privileged.
Writing assessment items provided students with a learning platform to engage with, test and then reflect on their knowledge, enhancing their assessment literacy (Deeley, 2017; Smith, 2013). Students showed they could be effective and competent designers of assessment items and understood the limitations associated with MCQs. Students' frustration with the nature of MCQ assessment was compounded by a lack of feedback.
By evaluating students' responses, this research provides insight into how students understand, interpret and critique the learning and evaluative components of MCQs. In particular, students prioritised vocational learning, as previously described by Mattick & Knight (2009), and expressed a desire for assessment to support this. Their recognition of the need for assessment to drive learning relevant to future work is well established in higher education assessment discourse (Foos & Fisher, 1988; McDaniel et al., 2007). Writing a clinical scenario question-stem and having to justify the correct option was identified by students as enhancing their clinical reasoning and encouraging them to be patient focused. This is supported by studies showing that questions incorporating rich descriptions of the clinical context require more complex cognitive processes to answer and are therefore more representative of clinical practice (Schuwirth et al., 2001; Schuwirth & Van Der Vleuten, 2004).
In this study students not only created the context, developed the question and justified the answer for their own contributions; they also answered the same style of questions constructed by their peers. The formative quizzes connected their learning experiences and allowed them to learn from each other despite being geographically dispersed. Such advantages of peer learning opportunities have been previously described (Boud, Cohen & Sampson, 2001; Secomb, 2008).

Table 2 Tips for incorporating student-developed MCQs into the curriculum

A strength of this study was the use of a mixed-methods approach. The survey provided an overall measurement of the experience of the cohort and the focus groups allowed a deeper exploration of the student experience of learning and assessment. However, there are several limitations. Data collection relied on the participation of suitably motivated students, and it is therefore possible that the data may be skewed in a favourable direction. The number of students who participated in the focus-group discussions was small; however, the themes concerning the student experience of learning and assessment were consistently expressed across all groups. The survey and focus groups were conducted immediately following the written exam, which may have provided students with a better opportunity to reflect on the effectiveness of their learning and experience of assessment; however, students who felt that the exam was difficult may have been more negative about the process. The potential for plagiarism of questions has been noted as a problem in previous studies (Harris et al., 2015); this study did not include safeguards against it and, although not overt, it may have occurred.
Several avenues for further investigation of student-developed MCQs emerged including: measures of retention of learning, standard setting methods for student-developed questions, and whether students engage clinical preceptors in the process.
This intervention invited students to be designers and collaborators in their assessment. Each student had the opportunity to partner in assessment and the process appeared to harness their creativity and perspective, demanding a high level of participation. The model weakened the power-differential between academic and student and reframed it as mutually beneficial, acknowledging that students are key protagonists in the learning process and sophisticated in their understanding of their learning needs. Learning and assessment became a more democratic process as students became willing and proficient co-creators. By providing students with this agency it enhanced the meaning and value of learning and as a result, students privileged assessment targeted at their future vocational needs.

Take Home Messages
Involving students in MCQ writing promotes deeper learning and models the important lifelong learning skills of independence and agency for learning.
Given the opportunity and appropriate feedback, students have expertise in developing high quality assessment questions for MCQ examinations.
Poorly designed MCQ exams may discourage or distract students from important learning.
Feedback on formative and summative MCQs enhances learning.
Students want assessment to have vocational relevance.