SHORT COMMUNICATION

Evaluating the impact of a faculty development programme initiative: Are we really improving skills in MCQ writing?

A series of seven workshops was conducted in 2018 at the National University of Medical Sciences and its affiliated institutes to evaluate the effectiveness of a three-hour workshop in improving faculty competence in developing high-quality test items. Participants' satisfaction was evaluated with a post-workshop feedback questionnaire, and a structured, self-developed questionnaire was administered as a pre-test and post-test assessment. A paired t-test was applied to evaluate the difference in mean scores of responses. A total of 141 faculty members were trained. The training sessions produced high satisfaction with all elements of the workshop and significant improvements in confidence in item writing skills (p=0.001), recognising the parts of an MCQ (p=0.001), identifying item writing flaws (p=0.001), and identifying the levels of Miller's pyramid and Bloom's taxonomy (p=0.001). Training sessions of short duration are effective in improving faculty competence in writing high-quality test items, provided hands-on experience is built in and effective feedback is given.


Introduction
For the faculty of medical colleges to experience self-transformation and avoid obsolescence, it is imperative to run training programmes for continuing medical education. This increases the quality of learning and assessment, thereby benefitting the major workforce of nascent physicians trained by them. 1 The challenge lies in going beyond attendance and satisfaction: knowing how a well-organised training session can modify the cognition and behaviour of the faculty, which in turn will affect the future physician workforce. 2 Faculty developers, by conducting immediate programme assessment, open the doors to reflection, self-evaluation and, thereby, improvement. 3 These steps are vital for the faculty to steer clear of professional isolation or attrition. 4 With the move towards Competency Based Medical Education (CBME), institutions are spending time, money, and resources on faculty training at an accelerating pace. 5 Though the literature is replete with studies of faculty development initiatives, evidence of the impact of these programmes on the faculty, which should be evident through observable change in their behaviour and practices, remains scarce. 6 The focus of these studies is mainly on immediate satisfaction and improvement in knowledge, with no reporting on gains in participants' skills. These training programmes lack proper structure, mostly occur in private medical colleges, and do not practise programme evaluation. 7 Additionally, while planning coaching sessions, the foreground for most trainers is skill transfer or enhancement, often ignoring participants' motivations and values. 3 The rate-limiting step in the evolution towards CBME is a model programme for faculty development offering hands-on experience and prompt feedback to improve skills. 8,9 The most popular model for evaluating the effectiveness of faculty training programmes is Kirkpatrick's model, which delineates four levels of training outcomes, namely, reaction, learning, behavioural change, and results via organisational performance. 10
In CBME, learning and assessment must be congruent. 5 This steers toward incorporating quality assurance into the assessment process to provide validity evidence. 11 A study in a public sector college in Pakistan, using MCQs as an assessment tool, revealed that our faculty is not trained to write high-quality MCQs and that their item writing skills are poor, as many flaws were identified when items were reviewed. 12 In addition, test development is costly, requiring considerable time, energy, and effort in item authoring, item moderation, test administration, and post hoc analysis. It has been estimated that constructing one good multiple-choice item costs approximately $1,000. 13 It follows that faculty development programmes must place more emphasis on improving the quality of item writing (MCQs), which provides content validity evidence. 11 Results of another study showed that effective feedback given to MCQ authors improved the quality of test items and reduced item writing flaws. 8 Faculty training in assessment must be a longitudinal programme and not a random, one-time activity, as effective assessment is not an innate skill; rather, it requires ongoing training, practice, and feedback. 14 Most faculty members acknowledged the importance of training, but training sessions should be designed keeping in mind their feasibility for the faculty: part-time sessions, rather than full-day engaging ones, are more acceptable to them. 15
There are many medical and dental colleges operating under the umbrella of our institution. Review of MCQs for item bank generation and for exams is done on a regular basis. It was observed that the questions submitted by most faculty members were below par, even though training sessions were conducted frequently within these institutions. Also, faculty members were doing the tedious job of evaluating their MCQs themselves. As mentioned earlier, the majority of training models are ineffective, as they do not go beyond Kirkpatrick level 1 and lack the 'success factors' for faculty development, namely, incorporation of active learning and feedback, effective relationships with colleagues, and diverse teaching approaches. 7,16 All the above-mentioned informal and unplanned needs assessment drove us to plan a comprehensive faculty development programme consisting of a continuous series of three-hour training sessions on writing high-quality MCQs. The objectives were to evaluate faculty competence in developing high-quality test items, to report on their degree of satisfaction, and to explore the immediate impact on the cognitive and affective domains of the participants.

Methods and Results
After taking permission from the institutional ethics committee, a descriptive, cross-sectional study was conducted at the NUMS constituent and affiliated medical and dental colleges in 2018, over a five-month period. The learning effect of the workshops was evaluated on the same cohort, comprising clinicians, bench scientists, consultants, and supervisors of postgraduate trainees. Participants' satisfaction was evaluated with a post-workshop feedback questionnaire.
Initially, we provided relevant feedback on pre-workshop MCQs. A pre-test was then conducted, followed by a group activity on writing MCQs. This hands-on activity was, in turn, reinforced by immediate feedback on each constructed MCQ. A post-test was taken at the end of the session.
Pre- and post-test questionnaires were structured and consisted of 13 items. The first part (Q1-9) was meant to evaluate the current perceptions of the faculty, SOPs, and current practices within their institution regarding the quality assurance of test items. All the questions reflected quality assurance procedures and the attitudes of the participants and their institution towards assessment. The second part (Q 10-11) assessed key concepts about the basic anatomy of MCQs (01 mark per part; total 05 marks), Bloom's taxonomy levels (01 mark per level; total 06 marks) and Miller's competency pyramid (01 mark per level; total 04 marks). The third part (Q13) was reserved for measuring participants' knowledge and skills regarding item writing flaws (01 mark per flaw; total 03 marks). The participants were scored out of a total of 18 marks on Q10-Q13. Data were analysed using SPSS 23. A paired t-test was applied to compare the mean scores of the same cohort in their pre-test and post-test.
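The pre/post comparison described above can be sketched as follows. This is a minimal illustration only, using hypothetical scores rather than the study's data (the study used SPSS 23): each participant has a pre-test and a post-test score out of 18 on Q10-Q13, and the paired t-test is computed on the per-participant differences.

```python
# A minimal sketch (hypothetical scores, NOT the study's data) of the
# paired t-test used to compare the same cohort's pre- and post-test
# scores out of 18 on Q10-Q13.
import math
import statistics

pre  = [7, 9, 6, 10, 8, 5, 11, 9, 7, 8]         # hypothetical pre-test scores
post = [12, 14, 10, 15, 13, 9, 16, 13, 11, 12]  # hypothetical post-test scores

# The paired t-test operates on per-participant differences.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
# With df = n - 1 = 9, a two-tailed |t| > 3.25 corresponds to p < 0.01.
print(f"t = {t_stat:.2f}, df = {n - 1}")
```

With these hypothetical numbers the statistic is t = 27.0 on 9 degrees of freedom, far beyond the p < 0.01 critical value, mirroring in miniature the significant pre/post differences the study reports.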
A total of 141 faculty members attended the workshop in seven cohorts and completed the pre-test; 119 participants completed the satisfaction form, whereas 123 participants completed the post-test.
The participants acknowledged that the workshop promoted collaborative learning and declared that feedback helped them learn. They hoped that the knowledge and skills gained would help them improve their future practice of writing MCQs, and they suggested that the workshop be offered to all faculty members. Their average ratings on hands-on learning, collaborative tasks, key concepts of assessment, quality and relevance of the PowerPoint presentation, and future practice were high (1-2) on a scale of 1-5 (Table-1). Significant (p=0.001) positive changes were noticed in the faculty members' knowledge of the basic five parts of an MCQ and the levels of Miller's pyramid and Bloom's taxonomy, and in their confidence in making high-quality MCQs and identifying item writing flaws. Faculty members made a little more than 5 MCQs per month, but only 30 (21%) wrote items that assessed higher cognitive levels. Only 66 (47%) declared that their items were checked for quality; of these, only 10 (7%) affirmed that their test items were reviewed by educationalists, 38 (27%) relied on senior guidance and peer evaluation to ensure quality, and 4 (3%) did this tedious job themselves (Table-2). Anecdotally, the lack of a proper assessment unit and the unavailability of full-time educationalists within these institutions might be the cause of these practices.
Our study results compared favourably with studies by AlFaris, 17 Abdulghani 18 and Naeem, 19 all of whom reported an evident improvement in participants' knowledge scores (learning) after the training intervention. This was congruent with other studies in which positive changes in teachers' knowledge, attitudes, and skills following participation in a faculty development activity were observed. 21 The results showed a post-workshop increase in the participants' confidence in their MCQ writing skills, suggesting that even a shorter, well-designed, rigorous training session can be successful. This was in harmony with the studies by Pandachuck and Dellinges, 22,23 in which the participants affirmed that brief training sessions were helpful in improving their skills. Our results also closely related to a study by Abdulghani et al, 17 who conducted a two-day workshop in which the faculty analysed MCQ items for Bloom's cognitive levels and item writing. This reinforced the viewpoint of Notzer that hands-on training sessions with a learner-centred approach and effective feedback, given promptly at the appropriate time, lead to a successful faculty development programme. 24

Limitations
This was the first in a series of studies that will follow the cohort and determine the impact of reinforcement and repeated feedback. Forthcoming studies will examine the impact of serial training opportunities on improving faculty members' skills in writing quality MCQs, and the role of appropriate feedback.

Conclusions
Our study showed that well-designed, shorter faculty development sessions that include feedback as a key element of training have a positive impact on the knowledge and behaviour of the participants. It can be deduced that shorter sessions are less resource-intensive, more convenient, encourage faculty involvement, and improve faculty competence. They can be repeated more often to reinforce knowledge retention and skill refinement, and a large number of people can be trained in a short span of time.