The rOSCE: A remote clinical examination during COVID lockdown and beyond

The COVID-19 pandemic has created a challenge for all medical educators. There is a clear need to train the next generation of doctors whilst ensuring that patient safety is preserved. The OSCE has long been used as the gold standard for assessing clinical competency in undergraduates (Khan et al., 2013a). However, social distancing rules have meant that we have had to reconsider our traditional assessment methods. We held a remote eight-station summative OSCE (rOSCE) for three final year resit students using Microsoft Teams. Apart from clinical examinations and practical procedures, which are assessed elsewhere in our programme, the content was similar to our standard OSCE. Staff and student training ensured familiarity with the assessment modality. The rOSCE was found to be a feasible tool with high face validity. It is a remote assessment tool that can offer an alternative to traditional face to face OSCEs for use in high stakes examinations. Although further research is needed, we believe that the rOSCE is scalable to larger cohorts of students and is adaptable to the needs of most undergraduate clinical competency assessments.


Introduction
The 2020 global pandemic has forced medical educators to reflect on the viability of face to face, experiential medical education (Sabzwari, 2013). Nevertheless, it remains imperative that we facilitate the progression of medical students, ensuring a sustainable future workforce that is adequately trained and assessed to preserve patient safety.
The OSCE has been the mainstay of clinical competency assessment for years (Khan et al., 2013a). Whilst its value is disputed, it remains the gold standard assessment of practical skills in medical education (Khan, 2017). But there are limitations to the OSCE; it requires close working, detailed organisation and considerable resource in terms of clinicians, patients, physical space (Khan et al., 2013b) and costs (Brown et al., 2015). Adherence to infection control measures makes a traditional OSCE challenging during the COVID-19 pandemic. Whilst some schools may rely on Entrustable Professional Activities (EPAs) (Cate et al., 2015; O'Dowd et al., 2019) or workplace-based assessments (Figure 1) to assess clinical competence, these have not been routinely applied in UK undergraduate assessment. Consequently, we could not replace the OSCE with workplace-based assessments at short notice.
There is limited evidence on remote summative practical assessments; one study highlighted issues with the accuracy of marks (Novack et al., 2012). There was therefore a need to develop a valid, feasible, remote assessment of clinical competency. We developed a new style of remote OSCE (rOSCE) using Microsoft Teams.

The rOSCE
Although we held our finals OSCE just before the lockdown, a small number of students were required to undertake the resit rOSCE on 28th May 2020. For this examination, following an addendum to the medical school's Examination Regulations (Supplementary File 1), we reformatted the planned 20 stations, involving a mix of simulated and actual patients, into an 8-station rOSCE with simulated patients.
Conventional stations were modified to assess skills that could reliably be tested online. Most stations retained a simulated patient and one used a simulated doctor to allow for a handover situation. Practical procedures were removed from the rOSCE. Stations involving emergency care scenarios were modified to sequential simulations, where patient outcomes were progressively revealed through a PowerPoint presentation depending on the students' actions. Several stations had elements of a structured interview added to overcome potential gaps in competency assessment, for instance, the patient safety aspects of prescribing. The rest of the stations remained unchanged. In order to allow for technological issues, an additional 5 minutes was allocated to each station.
Microsoft Teams (Microsoft Corporation, Redmond, Washington) was chosen as the online platform due to its versatility, familiarity and availability. A Team workspace was created with a channel for each station in the rOSCE; however, as we had fewer students than stations, we reduced the number of channels to correspond to the number of students. Each channel was managed by an invigilator and recorded; the recordings were kept within the corresponding channel, enabling moderation and external review without disrupting the flow of the station. Despite regular use of Microsoft Teams, all stakeholders undertook a total of three hours' training. A mock examination was held to familiarise the students with the process. All training, exam-day briefings and individual station calibrations were conducted via Teams.
Unlike traditional OSCEs, which require students to rotate around a circuit of stations each testing different clinical competencies, the students remained in the same Teams channel throughout their entire exam. The assessors and simulated patients joined the channel intermittently during the rOSCE. When the stations began, the assessors shared a PowerPoint presentation with the students, consisting of a holding slide (Figure 2), student instructions and any information for the station. This process removed the need to share confidential material or paperwork outside the examination. The students were blocked from changing the slides using the Teams presenter functionality.
The simulated patients joined the channel at the start of the station, with their cameras off and microphones muted until invited in. This process was repeated until the students had completed all the stations (Figure 3).
The marking strategy did not change; we continued to use domain scoring. Marks were recorded using a Microsoft Excel file located in the MS Teams document library; however, a digital OSCE marking tool could be used if all assessors have access to the appropriate IT hardware.
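For readers unfamiliar with domain scoring, the arithmetic behind such a marking spreadsheet can be sketched as follows. This is a minimal illustration only; the domain names, mark values and weighting are invented for this sketch and are not the school's actual marking scheme.

```python
# Hypothetical sketch of domain scoring: each station awards marks across
# several assessed domains, and totals are aggregated per student.
# Domain names and marks below are invented for illustration.

DOMAINS = ["communication", "clinical_reasoning", "professionalism"]

def station_score(domain_marks: dict) -> int:
    """Sum the marks awarded in each assessed domain for one station."""
    return sum(domain_marks[d] for d in DOMAINS)

def exam_total(stations: list) -> int:
    """Total score across all rOSCE stations for one student."""
    return sum(station_score(s) for s in stations)

# Example: one student's marks for two stations.
marks = [
    {"communication": 4, "clinical_reasoning": 3, "professionalism": 5},
    {"communication": 5, "clinical_reasoning": 4, "professionalism": 4},
]
print(exam_total(marks))  # 25
```

In practice these sums were computed in the shared Excel file, which keeps the arithmetic visible to moderators reviewing the recordings in the same channel.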
Several contingencies were planned. These included an emergency contact number for the Chief Invigilator and 'buddy' assessors in case of assessor illness, technology failure or network failure.

Discussion
The rOSCE was completed as part of the final year resit examinations with three students and eight stations. Despite a large investment of time to plan the logistics of the assessment and training, the process was simple and easy to administer. It ran effectively and was a good test of concept. Assessors and invigilators were well prepared, and the feedback was positive. One examiner commented: "On exam day, I was able to forget that I was in a virtual setting and concentrate on assessing the student, as in a 'normal' OSCE". A student felt that "OSCE or rOSCE are good formats because you are also interacting with a patient". All examiners agreed that the rOSCE was a good strategy for assessing clinical competence.
When designing the rOSCE, we were mindful that it is unlikely that the environmental situation secondary to COVID-19 will change in the near future. Therefore, we needed to develop a system that could be scaled up for our whole cohort rOSCEs, which are planned for late 2020. Whilst further research is needed, we suspect that the rOSCE is scalable to demand, with the ability to run multiple circuits concurrently, removing the cost and space requirements of a traditional OSCE and the issues of standard-setting a multiple-site OSCE.
Although developed for undergraduate medical students, the system could be adapted for other clinical competence assessments in both undergraduate and postgraduate settings. Removing the need for the physical presence of the student and assessor on one site also assists the development of clinical competency in remote geographical areas.
The rOSCE cannot assess practical skills or a student's ability to elicit clinical signs, but there are other reliable tools to assess these skills, and we have therefore reflected on the purpose of basic practical skills stations within OSCEs. Rather than assessing a student's ability to spot clinical findings, the rOSCE should be used to assess the meta-competencies of a clinician: the ability to integrate and interpret information and analyse a situation in order to manage the patient competently.
The station design for an rOSCE requires careful planning and peer review. There is a risk that, without being appropriately structured with clear goals and outcomes, rOSCE stations could easily turn into mini vivas, with all the known limitations of the viva as an assessment tool. There is also a considerable staffing requirement for the rOSCE, comparable to a conventional OSCE. It also relies on a good internet connection, though we have not found this to be a limitation despite our students being based nationally and internationally.

Conclusion
The COVID-19 pandemic continues to challenge all facets of medical education, particularly assessment. It is vital that we continue to innovate and find ways to fill the gaps left by the lack of patient-facing assessment options during the pandemic. The rOSCE is an option to remotely assess clinical competence in undergraduate medical students, providing a feasible assessment tool which is adaptable to the needs of the clinical assessment. Further research is needed to explore the potential breadth of skills examined and the scalability of this tool.

Take Home Messages
The challenges of the COVID-19 pandemic have caused everyone involved in medical education to think differently.
The rOSCE was designed as a remote assessment of clinical competence run over Microsoft Teams.
The rOSCE is a feasible tool with high face validity.
The rOSCE can offer an alternative to traditional face to face OSCEs for use in high stakes examinations. Although further research is needed, we believe that the rOSCE is scalable to larger cohorts of students and is adaptable to the needs of most undergraduate clinical competency assessments.

Professor Joanne Harris is Dean of the Medical School at the University of Buckingham. She is a practising GP, specialising in medical education for the past 20 years, and has a particular interest in the teaching and assessment of professionalism. ORCID ID: https://orcid.org/0000-0002-1544-5585

Declarations
The authors have declared that there are no conflicts of interest.

Ethics Statement
This paper is a service evaluation which did not meet the university threshold for needing ethical approval. All students consent to their data, including feedback data, being used for research purposes at the start of each academic year. Whilst formal ethical approval was not required, the approval of the Medical School Research Ethics Committee was sought and granted for both this project and publication. They state that this study presents data collected as part of our medical school evaluation improvement processes, with all students consenting to their data being used in medical education research through an Annual Student Agreement.

External Funding
This article has not had any external funding.

Supplementary File 1: The University of Buckingham has granted permission for the publication of this document.

University of Otago
The reviewer awarded 1 star out of 5.

Thank you for the opportunity to review your article about the use of the rOSCE as an examination during lockdown. This article was of great interest to our group, as it is a challenge familiar to many educators during COVID-19. The topic has merit; however, more detail is required. We offer the following recommendations. To set the scene, a definition or description of an rOSCE and of clinical competence is needed, as these terms mean different things to different people. This definition is important, as one of the findings was that all examiners agreed that the rOSCE was a good strategy for assessing clinical competence and that the system could be adapted for other clinical competence assessments in both undergraduate and postgraduate settings. The statements about Entrustable Professional Activities (EPAs) and workplace-based assessments to assess clinical competence could be further described and related back to the appropriateness of these assessments during COVID-19.
Figure 1 does not really explain these assessments and, as reviewers, we were unsure as to its relevance to COVID-19. We suggest you consider deleting this figure. The comment about accuracy of marks in remote summative practical assessments sounds important and has relevance to your work (Novack et al., 2012). It would be helpful if you could provide the reader with more context about this study. Your article raised a few questions for us about how you reformatted 20 stations into an 8-station rOSCE. If practical skills were removed, please state where they were assessed. Please describe the practical skills. If skills were removed, explain how removing them changed the purpose of your OSCE. Including educational theory related to OSCEs might help answer these questions and strengthen this section. Adding theory would also enhance your discussion point about reflecting on the purpose of basic practical skills stations within OSCEs and assessing the meta-competencies of a clinician. Without this context, it is difficult to determine how you reached the conclusion that the "rOSCE is an option to remotely assess clinical competence in undergraduate medical students providing a feasible assessment tool which is adaptable to the needs of the clinical assessment". We also suggest that you include citations to support your points about rOSCE stations turning into mini vivas, the known limitations of the viva as an assessment tool (what are these?), the considerable staffing requirement for the rOSCE, comparable to a conventional OSCE, and the reliance on a good internet connection. Finally, we ask you to reflect on whether you have provided the reader with enough evidence to support your take home message that the "rOSCE is a feasible tool with high face validity, that offers an alternative to the traditional face to face OSCEs for use in high stakes examinations". We would be very happy to review a revised version of this article.

Lincoln Memorial University
The reviewer awarded 4 stars out of 5.

This article describing a remote OSCE will be of interest to those who are teaching and assessing clinical skills during the pandemic. The authors do a good job of describing how they used Microsoft Teams to run a resit OSCE, though, having run only 3 students through the exam, this can only be considered pilot data. I would like to have seen the authors discuss the decision to go from the standard 20-station OSCE to the 8-station rOSCE, as decreasing the number of stations reduces the reliability of the exam. For transparency, I am an Associate Editor of MedEdPublish. However, I have posted this review as a member of the review panel with relevant expertise, and so this review represents a personal, not institutional, opinion.

Figure 1. Model for the competency-based assessment of performance and examples of available assessment tools. Source: Khan, K. and Ramachandran, S. (2012) 'Conceptual Framework for Performance Assessment: Competency, Competence and Performance in the Context of Assessments in Healthcare; Deciphering the Terminology', Medical Teacher, 34, pp.920-928. Reprinted by permission of Informa UK Limited, trading as Taylor & Francis Group, www.tandfonline.com. https://doi.org/10.3109/0142159X.2012.722707

Figure 3. A diagram to illustrate the OSCE cycle

Dr Claire Stewart MBBS BMedSci DRCOG DFRSH FRCGP MScMedEd FAcadMEd is a Senior Clinical Lecturer and Assessment Lead at the University of Buckingham Medical School. She is a GP and has an interest in new methodologies for assessment in medical education. ORCID ID: https://orcid.org/0000-0002-7338-9809

Dr Bharathy Kumaravel MBBS MHSc FFPH FAcadMEd is a Senior Clinical Lecturer and Public Health Theme Lead at the University of Buckingham Medical School. She is the chair of the Medical School Research Committee and the chief editor of the Journal of Medical Education Research. ORCID ID: https://orcid.org/0000-0003-4801-3081

Dr Jacqueline O'Dowd is the Strategic Development Lead at the University of Buckingham Medical School. Originally a biomedical research scientist, her current research interests lie in the development and evaluation of educational interventions, change management and stakeholder engagement in evolving medical educational pedagogy and practice.

Dr Andy McKeown is a GP and Deputy Director of Medical Education at the University of Buckingham, focussing on the development of their Crewe Campus. His interests lie in educational authenticity, longitudinal integrated clerkships and interprofessional learning.

References

Brown et al. (2015) Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE). Medical Teacher, 37(7), pp.653-659.

OSCEs are outdated: clinical skills assessment should be centred around workplace-based assessments (WPBAs) to put the 'art' back into medicine. MedEdPublish, 6(4), p.20.

Khan, K. and Ramachandran, S. (2012) Conceptual Framework for Performance Assessment: Competency, Competence and Performance in the Context of Assessments in Healthcare - Deciphering the Terminology. Medical Teacher, 34, pp.920-928.

This is an open access peer review report distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.