Piloting an Extracurricular Quality Improvement and Patient Safety Training Program for Interprofessional Learners

Introduction: Experts recommend that health professions students acquire knowledge and skills in quality improvement and patient safety. Educational initiatives exist, but involve minimal interprofessional contact and experiential learning. We piloted an extracurricular program combining didactic elements and projects to address these issues.
Methods: We collected demographic information and administered a post-program survey to assess the pilot's reach and impact. We analyzed responses using simple descriptive statistics and thematically analyzed unstructured feedback.
Results: Fifty-one students participated, including twenty-one (41%) undergraduate students, sixteen (31%) graduate students, and fourteen (27%) medical students. Nineteen (37%) participants responded to the survey. Qualitatively, themes around workshop effectiveness, program administration, project-student mismatch, and engagement and accountability emerged.
Discussion: Despite a limited response rate, our training program appeared to be well received. However, key issues of engagement and impact remain. Future efforts will focus on improvement in these areas and more rigorous evaluation of learning outcomes.


Introduction
The Health Professions Education: A Bridge to Quality report advocated that all students and working health professionals develop and maintain proficiency in delivering patient-centred care, working in interdisciplinary teams, practicing evidence-based medicine, focusing on quality improvement, and using information technology (G. Armstrong et al., 2012). To this end, experts recommend that patient safety (PS) and quality improvement (QI) be taught to healthcare professionals, and that competencies in these areas complement clinical knowledge and skills in the delivery of safe and high-quality patient care (Wong et al., 2010).
A number of reviews describing various PS and QI curricula and their impacts on health professions students have emerged over the last decade (Wong et al., 2010; L. Armstrong, Shepherd and Harris, 2017). These reviews have found that most curricula combine didactic and experiential learning to address topics in QI, root cause analysis and systems thinking, and examine changes in student experience and knowledge (Wong et al., 2010; L. Armstrong, Shepherd and Harris, 2017). The majority of studies involved nursing or medical students, and, of the curricula focused exclusively on medical students, the majority involved fewer than 10 contact hours and did not include learners in PS or QI projects (Wong et al., 2010; L. Armstrong, Shepherd and Harris, 2017).
Other published initiatives have involved additional health professions, including nutrition, pharmacy, and occupational and physical therapy students (Galt et al., 2006; Dobson et al., 2009). These programs tended to use lectures, readings and assignments to convey key concepts in QI theory, patient safety, and the disclosure and prevention of errors through system improvement (Galt et al., 2006; Dobson et al., 2009). However, these curricula were limited in duration, offered minimal interprofessional contact, and infrequently offered experiential opportunities to learn and practice QI (Galt et al., 2006; Dobson et al., 2009).
Given these limitations, more recent work has focused on identifying barriers to curricular integration of QI and patient safety. Faculty highlighted competing demands in already overcrowded undergraduate curricula, insufficient numbers of faculty members and clinical preceptors with adequate expertise to teach and mentor, and the limited utility of the classroom setting to provide an authentic learning experience (Tregunno et al., 2014). Other barriers include insufficient time or budget, stakeholder resistance and hierarchy (de Vries-Erich et al., 2017).

Objectives
To address these gaps and overcome these barriers, we developed an extracurricular initiative, the Quality Improvement Practical Experience Program (QIPEP), using the general principles of educational experiences in healthcare as a blueprint: combining didactic learning and project-based work, linking projects with health system improvement efforts, and assessing outcomes (G. Armstrong et al., 2012). We describe the structure of the pilot program, provide an overview of the demographics of the first cohort, and report on participants' attitudes and satisfaction. In this way, we draw key lessons to help others create and launch their own similar educational initiatives.

Methods
This study was approved by the Kingston Health Sciences Centre Research Ethics Board (PAED-440-18).

Participants, Eligibility, and Selection
QIPEP spanned the academic year (September to April). All full-time Queen's University undergraduate and graduate students were eligible to apply for the academic year. Formal recruitment ran from April until June, with students submitting an electronic application form. Applications were scored by QIPEP leadership using a rubric (we would be pleased to share both with interested readers), and admission offers were made in May. Figure 1 provides an overview of the program and its components.

Didactic Learning
Participants completed the Institute for Healthcare Improvement (IHI) Basic Certificate in Quality and Safety in preparation for their projects (Certificates and Continuing Education, 2018). To complement the IHI modules, QIPEP offered a series of workshops focused on the practical skills students needed to apply the theoretical information. Five workshops were piloted during the year: Basics of Project Management, Process Mapping and Analysis, Data Collection and Analysis in Excel, Slide Deck Development, and Poster Development and Writing a Final Report.

Projects
Projects aligned with or built on existing healthcare improvement efforts at university-affiliated teaching hospitals. During participant recruitment, the QIPEP leadership team identified clinicians with expertise in PS and/or QI to serve as "faculty supervisors." Prior to the official start of the program in September, participants were matched to projects and connected with their supervisors to gain an understanding of the nature and scope of the problem.
Students began their work in September with a focus on the "Plan" stage of the "Plan-Do-Study-Act" (PDSA) cycle. The "Plan" stage ran until December and gave teams time to refine the identified problem and proposed intervention, submit an ethics application, develop a formal project charter, review the relevant literature, and collect baseline data. Between January and February, the teams aimed to conduct two PDSA cycles. During this time, teams implemented their tests of change, continued to collect, analyze and interpret data, and made recommendations based on their findings. By the end of March, teams concluded any outstanding PDSA work and wrote their final reports. Teams were also encouraged to submit an abstract to relevant conferences to share their work and results with the larger PS and QI community.

Data Collection
We collected demographic information, including gender, primary degree and program, and year of study, from participants' applications to the program. A structured survey requiring Likert-based and free-text responses was emailed to all participants following the conclusion of the program (Appendix 1).

Analysis
We analyzed responses to Likert-based questions using simple descriptive statistics. We compiled unstructured feedback and analyzed responses using a constant comparative approach (Glaser, 2006). AR and MLD independently coded responses and then worked iteratively to develop a final set of themes.
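The descriptive analysis of Likert items amounts to tabulating responses and reporting the proportion who agreed or strongly agreed. A minimal sketch of that calculation is below; the response values are hypothetical placeholders (the real QIPEP data are not reproduced here), chosen only to illustrate how a figure like "fifteen of nineteen (79%)" is derived.

```python
from collections import Counter

# Hypothetical Likert responses for one survey item
# (1 = strongly disagree ... 5 = strongly agree); illustrative only.
responses = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4, 3, 5, 4, 5, 4, 4, 5, 3, 4]

n = len(responses)
counts = Counter(responses)  # frequency of each rating, e.g. counts[5]

# Proportion who agreed or strongly agreed (ratings of 4 or 5),
# the summary statistic reported throughout the Results section.
agree = sum(1 for r in responses if r >= 4)
agree_pct = round(100 * agree / n)

print(f"n = {n}; agreed or strongly agreed: {agree} ({agree_pct}%)")
```

With these placeholder values the script reports 15 of 19 respondents (79%) in agreement, mirroring the format of the survey results reported below.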

Demographics
The program received a total of 80 applications. Fifty-one students were accepted and matched to seven projects. Thirty-eight (75%) participants were female and thirteen (25%) were male. Twenty-one (41%) participants were undergraduate students, sixteen (31%) were graduate students, and fourteen (27%) were medical students. Table 1 provides a breakdown of participants' academic programs. Five (14%) undergraduate students were in their first year, fifteen (43%) in their second year, six (17%) in their third year, and eight (23%) in their fourth year. Table 2 describes the projects and their outcomes at the end of the year. Three projects concluded at the end of the 2016-17 year and four projects continued into the following academic year (2017-18).

Survey Responses
Nineteen (37%) participants responded to the structured survey. Fifteen (79%) respondents agreed or strongly agreed that they had changed their way of thinking about roles and responsibilities within the healthcare system. Seventeen (89%) participants agreed or strongly agreed that they identified gaps in the healthcare system through completion of their QIPEP project. Eighteen (95%) respondents agreed or strongly agreed that QI has the capability to improve overall healthcare, and seventeen (89%) agreed that they wanted to learn more about QI.
Thirteen (68%) agreed or strongly agreed that they would be able to apply the skills and knowledge learned from the program in their career. Six (32%) agreed or strongly agreed that they often felt disengaged from the project and two (11%) agreed or strongly agreed that the program was too time-consuming. Six (32%) agreed or strongly agreed that they were satisfied with the impact of their project. Fourteen (74%) agreed or strongly agreed that they would recommend participation in the program to friends and colleagues.

Survey Themes
Five themes emerged from the unstructured feedback. We grouped these into strengths (two themes) and opportunities for improvement (three themes). Strengths included the effectiveness and utility of the workshops as tools for facilitating learning and ensuring accountability, and robust program administration and communication. Opportunities for improvement included a mismatch between project needs and assigned students, difficulty maintaining participant engagement and accountability, and a protracted research ethics board (REB) approval process.

Effectiveness and utility of workshops
Respondents commented that, as a learning tool, the workshops were informative and well organized, presented in a logical manner, and complemented their projects. Multiple participants also noted that the workshops provided a dedicated time and place for teams to convene and "work together in a more accountable way."

Robust program administration and communication
Participants highlighted that there was strong communication between teams and site managers about projects and workshops. They also commented that the site managers were quick to address all of the administrative tasks at the very start of the program.

Project-student mismatch
According to respondents, the mismatch between project needs and assigned individuals took many forms. Some students commented that projects were overstaffed (the average project had seven students) and that there was not enough work to go around, while others cited a discordance between project needs and participants' schedules or skills. Students also commented on the mismatch between project topics and their interests.

Maintaining participant engagement and accountability
Participant disengagement and a lack of accountability within certain teams formed another theme in the responses. One graduate student in particular commented, "I found it difficult to lead a team of undergraduate students who were not as committed to the project as I had originally hoped, and found myself completing a lot of the project myself." Others discussed the need for improved communication within teams as well as greater faculty input and supervision as suggestions for improvement in this area.

Protracted REB Approval Process
In the context of an already compressed timeline, many participants highlighted that the long REB approval process detracted from their ability to carry out key aspects of their projects, including PDSA cycles. After raising these issues, students suggested solutions: either accepting only projects with prior ethics approval, or securing approval before the start of the program.

Discussion
We developed and launched a yearlong extracurricular initiative, QIPEP, using a combination of didactic and experiential elements to provide interprofessional students with fundamental knowledge and skills in QI and PS. Our pilot involved over 50 students representing more than 15 different academic programs. Generally, the program was well received with participants reporting a desire to learn more about QI and a willingness to recommend the program to peers.
However, the quantitative and qualitative responses revealed two key areas of focus for the future. The first is engagement. We infer that the high level of participant disengagement during the pilot was likely the result of a mismatch between project needs and the number of assigned students or their skills, a lack of communication between team members, or some combination of the two. Although the workshops were designed to fill gaps in practical skills (e.g., data collection and analysis) and provide a regular forum for teams to meet, we recognize they are insufficient to address these issues. With this in mind, we initiated a plan for the following year to consult faculty supervisors early in the summer, assess the human resource needs of their projects, and align them with the availability and schedules of incoming participants.
The second is impact. Given the compressed timeline of the program, achieving milestones is critical. At our institution, QI projects must receive ethics approval, which can sometimes take weeks. Such lengthy approval processes diminished teams' time to move forward with their projects, including collecting and analyzing data, proposing changes, and testing them. In contrast, some teams were unaffected by ethics-related delays, creating disparities in what groups were able to accomplish during the program. Recognizing that projects should share a common start date to give teams enough time to complete PDSA cycles, we are working with faculty supervisors to have REB approval secured the summer before the program officially begins.

Limitations & Future Directions
There were a number of limitations to our approach. First, we did not employ a formal program evaluation to identify changes in commonly assessed educational outcomes, including knowledge and skill acquisition. Our assessment of attitudes and satisfaction at the end of the program was also informal and leveraged a homegrown questionnaire. Linking participants' application submissions with their post-program responses may highlight potential differences of interest. Future work should also incorporate mixed-methods approaches, including testing of knowledge and skill acquisition using validated instruments such as the QI Knowledge Application Tool-Revised (Singh et al., 2014), and examine the perspectives of faculty supervisors using semi-structured interviews.
Second, our findings may not be readily applicable to other settings. Although nursing students have been the target of many curricular efforts (L. Armstrong, Shepherd and Harris, 2017), only 5% of participants in our pilot were nursing students. We hypothesize that the reasons for this low level of nursing engagement are multifactorial, with barriers including busy academic schedules and a limited number of nursing faculty with PS and QI expertise available to mentor students and promote this work.
Third, the low response rate limits the internal validity of our findings; however, the responses we received provided important insights. Our post-program survey was sent to participants a few weeks after the final workshop and was not mandatory. We also acknowledge that such surveys are prone to selection and recall bias. Given the size of the overall program, future program evaluation may mitigate these issues by mandating in-person survey completion at key milestones during the year or at the program's last workshop or event.

Conclusion
We piloted the QI Practical Experience Program, an extracurricular, interprofessional initiative designed to equip students with basic PS and QI knowledge and skills. Although the program was well received by students, key issues of engagement and impact remain. Future efforts will focus on improvement in these areas and more rigorous evaluation of learning outcomes.

Take Home Messages
We piloted an interprofessional training program in quality improvement and patient safety that was well received by participating undergraduate and graduate students. Maintaining participant engagement is critical and future work will focus on ensuring congruency between project needs and the number of assigned students and better intra-project communication. Students want their work to have an impact; to this end, future work will ensure that projects start at the beginning of the academic year to give participants sufficient time to evaluate the problem and test solutions.

Notes On Contributors
Akshay Rajaram MMI, MD is a first-year resident physician in Family Medicine at Queen's University (Kingston, ON). Prior to medical school (Queen's University), he completed a Master's degree in Management of Innovation (University of Toronto) and has strong interests in informatics, analytics, the social determinants of health, and quality improvement.
Mialynn Lee-Daigle RN, BNSc is a Queen's University alumna. She is currently working at Hotel Dieu Grace Hospital as a registered nurse, and at Audacia Bioscience as a clinical project manager in Windsor, Ontario. She has worked in clinical, research, and management capacities in the areas of hospital and community care, academia, and business.
Anna Curry PhD is a third-year medical student at Queen's University with clinical interests in Family Medicine. She completed her PhD in cancer immunotherapy at the University of Toronto prior to beginning her medical studies. Her current research interests include quality improvement and assurance in healthcare settings.
Malcolm Eade BSc (Life Sciences, Queen's University) served as the President of the local chapter of the Institute for Healthcare Improvement (IHI). He is now a co-founder of a technology start-up focused on developing improved chemical detection technology to help combat the Opioid Crisis.
Rajaram A, Lee-Daigle M, Eade M, Curry A, Connelly R, Ilan R. MedEdPublish. https://doi.org/10.15694/mep.2019.000209.1
Robert Connelly MD, MBA is Associate Professor of Pediatrics at Queen's University and Head of the Department of Pediatrics at Kingston Health Sciences Centre. He works as a neonatologist in the Neonatal Intensive Care Unit and in the Neonatal Follow-up Clinic. He has interests in the use of handoff tools to improve patient safety.
Roy Ilan MD, MSc is an Assistant Professor of Medicine at the Technion-Israel Institute of Technology. He practices critical care medicine at the Rambam Health Care Campus in Haifa, Israel. His academic activities focus on patient safety and healthcare quality through research, education, and improvement activities.