An independent appraisal of an electronic OSCE management software system in a Psychiatry examination

Background The Objective Structured Clinical Exam (OSCE) is a method of assessment in psychiatry but is high in expenditure, requiring large amounts of staff time and an extensive paper trail. The Online Management Information System (OMIS) for OSCEs is a software system that proposes to reduce these drawbacks. The aim of this study was to independently evaluate the use of OMIS through feedback from all key stakeholders involved in developing, administering and implementing an OSCE in a medical school. The independent appraisal of the system was done by an anonymous survey of four major participants in an OSCE: examiners, students, the academic team in charge of preparing and implementing the OSCE, and an independent Information Technology (IT) team from the university who appraised the software. Results Feedback from examiners, the academic team and the IT experts was positive, indicating a preference for OMIS over the typical paper format. Students mostly indicated a neutral response, stating that OMIS had neither a positive nor a negative effect on them during the exam. A few participants expressed concerns over data security, difficulty with some software features and the system not being optimised for hand-held devices. Overall, OMIS presents as a promising tool to enhance the implementation and delivery of an OSCE over the typical paper format of the exam.


Introduction
The Objective Structured Clinical Exam (OSCE) was introduced in 1975 as a method of medical examination to overcome many of the disadvantages of traditional examination methods that were open to higher levels of subjectivity (Harden & Gleeson, 1975). It is a popular method of examination offering better validity and reliability in assessing clinical competencies when the appropriate standards are adhered to (Wass, et al., 2001) (Pugh, et al., 2014) (Hodges, et al., 2014). The OSCE offers superior reliability and feasibility as an exam format over other methods due to its flexibility in accommodating varying numbers of students and examiners, different types of patient presentations and varying exam lengths and durations (M.F, et al., 2013). In psychiatry education, the OSCE is considered a valid assessment of clinical competencies with high levels of construct and concurrent validity (Hodges, et al., 1998). It has grown steadily as the method of assessment of choice in psychiatry over the last 20 years, with several national certification organisations adopting it as a method of examination (Hodges, et al., 2014). As medical education heads towards competency-based assessments with the evaluation of learning outcomes, the OSCE in combination with standardised board examinations and direct observation of skills in a clinical setting is considered by many as the 'gold standard' for measuring physician competence (Carracio & Englander, 2000). It is further touted as having a beneficial impact on medical students' learning and future performance (Gormley, 2011). Its value as a valid and reliable assessment method in allied health specialties is also being explored with positive results in nursing (Brosnan, et al., 2006) (Selim, et al., 2012) (Meskell, et al., 2015), midwifery (Smith, et al., 2012), dietetics (Farahat, et al., 2015) and physiotherapy (Ladyshewsky, et al., 2000) (Palekar, et al., 2015) amongst others.
Despite its many advantages, the OSCE has come under criticism as being high in expenditure, requiring large amounts of staff time and often requiring an extensive paper trail (Cusimano, et al., 1994) (Frye, et al., 1989). For a six-station OSCE, it is estimated that 327.5 hours of staff and faculty time are spent on administering and developing the OSCE for each rotation of students (Cusimano, et al., 1994). The cost of this was estimated at US$6.90 per student per station (Cusimano, et al., 1994). Here in Ireland, Kropmans et al estimated that running an OSCE of 7-12 stations approximately 11 times for 670 students in an Irish medical school would produce 9,380 assessment forms and cost on average €29,500 (€2.80 per form) (Kropmans, et al., 2012). Despite it being an expensive exam, cost-effectiveness analyses suggest that it is still an effective and feasible method of examination, provided due consideration is given to the resources it requires (Frye, et al., 1989) (Brown, et al., 2015) (Hasle, et al., 1994). In a climate of ever-escalating costs and demands on resources and staff time, we decided to explore potential solutions to minimise these drawbacks and help preserve the OSCE as a viable method of examination in medicine. In our search, we came across the Online Management Information System (OMIS) for Objective Structured Clinical Examinations: an online OSCE management system designed to capture real-time data online without the need for physical paper assessment forms as a record (Kropmans, et al., 2012). With this system, assessment forms are created online, assessment data are captured in real time and results are available instantly (Kropmans, et al., 2012). The system can be used on tablet devices, laptops and desktop personal computers (Kropmans, et al., 2012).
OMIS purports to reduce administration costs per transaction by 70% compared with paper-based solutions, increase the accuracy of results by eliminating human errors and providing instant student feedback, improve the validity of results through advanced psychometric analysis available for pre- and post-exam standard setting, and capture data in real time to allow online reporting and analysis (Qpercom, 2016). The creators of the system report high levels of satisfaction among assessors who use it, time savings and avoidance of errors, and promote it as possibly a more effective method of identifying incompetent students than the traditional paper trail format of an OSCE (Meskell, et al., 2015) (Kropmans, et al., 2012) (Kropmans, et al., 2015). We set out to perform our own independent evaluation of the system as a pilot project in an Irish medical school.

Objective
The aim of this study was to evaluate the use of an online management information system for OSCEs (OMIS) through feedback from a survey of all key stakeholders involved in developing, administering and implementing an OSCE in a medical school.

Setting of the study
The study was conducted during an OSCE for the Psychiatry module of an undergraduate medical degree course in the School of Medicine in University College Dublin (UCD), Ireland. The School of Medicine in UCD has a class of approximately 240 students in a year, and the class completes a 6-week Psychiatry module in the 5th year of a 6-year medical programme. During the 5th year, the class is divided into 4 groups of approximately 60 students each, and each group attends one of four 6-week modules (Psychiatry, Medicine in the Community, Obstetrics & Gynaecology and Paediatrics) at a time during the course of the year. The Psychiatry module is therefore conducted four times in an academic year, and at the end of each 6-week module the students are assessed with an OSCE. In total, 4 OSCEs are run for Psychiatry in an academic year, for approximately 60 students each time.

Format of the OSCE
The OSCE in Psychiatry consists of a circuit of 6 stations, each marked by one examiner. The OSCE is conducted in two parallel circuits of the 6 stations to accommodate the student numbers, with 12 students completing a sitting of the OSCE at a time. Each exam runs 4 sittings of the OSCE. Some OSCE stations assess the student on their interaction with and assessment of a simulated patient; other stations, known as 'object stations', require the student to answer questions based on a particular clinical scenario presented to them by an 'object' such as a video or a clinical vignette rather than a patient. In the simulated patient stations, the examiners record their marks on a marking sheet checklist. In the object stations, the students record their answers on a written answer sheet that is later marked by an examiner.

OMIS software
The OMIS software was made available to the examiners on a tablet device (an iPad). Examiners received an individual training session on how to use OMIS and the iPad before the OSCE. The paper assessment form containing the marking criteria was uploaded to OMIS, and each student was then marked using OMIS on the iPad, replacing the old paper assessment method. OMIS instantly calculates and stores total marks as each student completes a station. Prior to the OSCE, the psychiatry tutors and administration staff created the OSCE stations and input all the necessary data required by OMIS, such as student information and the order of students during the exam.

Independent appraisal of the software
The independent appraisal of the software was done with anonymous survey questionnaires completed by the four major stakeholders in the OSCE: examiners, students, the academic team in charge of preparing and implementing the OSCE, and an independent Information Technology (IT) team from the university who also appraised the software. The questionnaires consisted of questions with responses recorded on a Likert scale from 1 to 5 (1 = Strongly disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly agree). A few questions required 'yes', 'no' or 'don't know' answers, and some questions were left as a free text box for responses. All survey results were recorded and analysed using IBM SPSS version 20. The number of responses (n) per item of the questionnaire was recorded and tabulated in the tables of results.

Student feedback
Table 1 shows the feedback from 64 students surveyed during the pilot project of introducing OMIS during an OSCE. In the survey, students were asked if they were aware of examiners using an iPad device during the exam rather than the traditional paper assessment forms; of the 64, 58 (90.6%) were aware of this and 6 (9.4%) were not. The majority of students indicated that the examiners using iPads had neither a positive nor a negative effect on them during the exam. The majority were also not concerned about data protection or confidentiality. It should be noted that although the majority indicated relative indifference to the use of iPads, small numbers did report that this affected them negatively during the exam (4 students, 6.3%) or that they had some concerns about data protection issues (3 students, 4.7%).

Examiner feedback
In Table 2, the feedback from a total of 16 examiners involved in the OSCE was overall positive: they found OMIS enjoyable to use and better than the paper forms, and both the software (OMIS) and hardware (iPads) were easy to use. Almost all of the examiners did not find OMIS distracting from the task of examining and felt it made the task of examining an OSCE station easier than with the paper assessment forms. Regarding some of the unique features of OMIS, such as the ability to check a summary of one's marks and the trend of other examiners' marks, the examiners were overall positive, although a few indicated a neutral response as they were unaware that this feature existed. Of note, 2 examiners registered a negative response to the ability to record comments with OMIS on the iPads, and 1 examiner found marking with the paper assessment form easier.

Academic team feedback
In Table 3, the feedback from 5 members of the academic team who were involved in preparing, administering and implementing the OSCE was overall very positive. All five members agreed that OMIS was enjoyable to use and showed a preference for OMIS over the typical paper method. All also agreed that OMIS and the iPads were easy to use. All members agreed that OMIS saved time both in preparing the OSCE and in tabulating results, with a reduction in the chance of errors occurring. In a similar trend to the results from other stakeholders, a slightly more varied response was obtained for Statement 7, concerning the summary marks feature in OMIS, where one academic indicated a neutral response. In further feedback, the academic team concluded that OMIS reduced the approximate costs of, and time spent on, preparing and implementing the OSCE by approximately 50%.

IT Team Feedback
We asked two independent IT experts attached to the university to appraise the software. Both experts agreed that OMIS was user-friendly, well designed and fulfilled its function as OSCE management software well. The IT experts were both generally unaware of any similar software to OMIS. Regarding the degree of security of data and confidentiality of results with the use of OMIS, the experts registered one neutral and one negative response. One expert also gave a neutral response regarding the ability of OMIS to provide backup of stored information.
We also included free text responses on what the experts thought were the best and worst features of OMIS. One expert cited the ease of use, the efficiency and ability to instantly calculate results as being the best features of OMIS. The second expert stated that the multiplatform format of OMIS and the flexibility of it was its strength.
Amongst the worst aspects of OMIS, according to the experts, was that more information on, and examination of, its security aspects such as encryption was needed. To a lesser degree, one expert felt that OMIS could benefit from more styling in its appearance to make it more appealing to the user, and that a greater variety of OSCE forms was needed to design a wider variety of OSCE stations. They also agreed that the current version of OMIS was not optimised for tablet devices such as iPads, which could lead to difficulties for users trying to navigate the software on a tablet rather than on a desktop or laptop device.

Main Findings
The main finding of this study was that the OMIS software was overall a positive enhancement to the current OSCE format of examination in psychiatry. Feedback from examiners was positive, and they preferred the OMIS method of examination to the typical paper assessment form. Examiners found the software easy to use and felt it made the task of examining easier and more enjoyable. The examiners gave slightly lower scores relating to specific features of OMIS, largely because they were unaware of these functions of the system; several indicated they would have found them useful had they been aware of them. The ability to record comments was also somewhat difficult for some examiners, possibly because the OMIS system was not optimised for use on tablet devices.
In terms of feedback given by students, students were generally neutral towards the use of OMIS during an OSCE, indicating that it did not affect them either positively or negatively. A small minority reported that it had a negative effect on them during the OSCE and had concerns about the confidentiality and security of the software. The appraisal from the two independent IT experts was favourable in that they thought OMIS was an easy to use and well-designed OSCE management tool. They shared some concerns over the software's security and confidentiality features but were otherwise unaware of any major faults with the system. Its multiplatform format and ease of use were particular strengths, but the current version was not optimised for tablet devices, which made tasks such as recording comments without a separate keyboard slightly more cumbersome for examiners.

Strengths and limitations of the study
To our knowledge, this study is the first independent appraisal of the OMIS software for an OSCE in psychiatry. The study offers a rounded, unbiased appraisal from the four different perspectives involved in an OSCE (students, examiners, the academic team and IT experts). As highlighted in the introduction, an OSCE is often a laborious undertaking involving many different parties, with high demands on cost, time and effort in many schools. We hope that this study provides some positive findings regarding the introduction of technology such as OMIS that could help mitigate some of these factors and support the ongoing use of the OSCE as an effective method of assessing competencies in medical education.
A relative weakness of the study is that its population size is small and limited to an OSCE in psychiatry. While the format of a psychiatry OSCE is not vastly different from that of other subjects, the results are perhaps not entirely generalisable to OSCEs in other subjects. In addition, the lack of a control group (one where the examiners used the typical paper method of assessment) limits the authors from concluding more definitively that OMIS was superior to the typical paper method. However, the inclusion of statements comparing OMIS with the paper method in the survey questionnaire mitigated this somewhat. In this study, we did not include a cost-benefit analysis of the implementation of OMIS as this was a pilot project, and therefore the authors cannot comment on this aspect of the addition of OMIS to the OSCE.

Conclusion
In conclusion, OMIS presents as a promising technological enhancement to the psychiatry OSCE, offering many advantages over the traditional paper-based method.

Take Home Messages
OMIS presents as a promising technological enhancement to the psychiatry OSCE, offering many advantages over the traditional paper-based method.

Notes On Contributors
Dr Fernandez is a Consultant Psychiatrist working in St Vincent's University Hospital, Dublin, Ireland. He has an interest in medical education and is actively involved in teaching and examining Psychiatry for medical students in University College Dublin.