  • Methodology article
  • Open access

Development of an in situ simulation-based continuing professional development curriculum in pediatric emergency medicine

Abstract

Background

Continuing professional development (CPD) activities delivered by simulation to independently practicing physicians are becoming increasingly popular. At present, the educational potential of such simulations is limited by the inability to create effective curricula for the CPD audience. In contrast to curricula for medical trainees, CPD activities lack pre-defined learning expectations and instead emphasize self-directed learning, which may not encompass true learning needs. We hypothesized that we could generate an interprofessional CPD simulation curriculum for practicing pediatric emergency medicine (PEM) physicians in a single-center tertiary care hospital using a deliberative approach combined with Kern’s six-step method of curriculum development.

Methods

From a comprehensive core list of 94 possible PEM clinical presentations and procedures, we generated an 18-scenario CPD simulation curriculum. We conducted a comprehensive perceived and unperceived needs assessment on topics to include, incorporating opinions of faculty PEM physicians, hospital leadership, interprofessional colleagues, and expert opinion on patient benefit, simulation feasibility, and value of simulating the case for learning. To systematically rank items while balancing the needs of all stakeholders, we used a prioritization matrix to generate objective “priority scores.” These scores were used by CPD planners to deliberately determine the simulation curriculum contents.

Results

We describe a novel three-phase CPD simulation curriculum design method involving (1) systematic and deliberate needs assessment, (2) systematic prioritization, and (3) curriculum synthesis. Of practicing PEM physicians, 17/20 responded to the perceived learning needs survey, while 6/6 leaders responded to the unperceived needs assessment. These ranked data were input into a five-variable prioritization matrix to generate priority scores. Based on local needs, the 18 highest-scoring clinical presentations and procedures were selected for final inclusion in a PEM CPD simulation curriculum. An interim survey of PEM physician opinions (21/24 respondents) was collected, with 90% finding educational value in the curriculum. The curriculum includes items not identified by self-directed learning that PEM physicians thought should be included.

Conclusions

We highlight a novel methodology for PEM physicians that can be adapted by other specialities when designing their own CPD simulation curriculum. This methodology objectively considers and prioritizes the needs of practicing physicians and stakeholders involved in CPD.

Background

Independently practicing physicians have a moral imperative to commit to continuing professional development (CPD) in order to provide high-quality patient care and maintain ongoing public trust as a self-regulated profession [1]. The current conceptualization of CPD emphasizes updating and acquiring all of the broad competencies required for practicing high-quality medicine, including continuing medical education (CME), whereby technical knowledge and skills are refined and acquired [2], as well as the physician’s capacity as a communicator, collaborator, leader, health advocate, and professional [3]. In recent years, the increasing emphasis on improving healthcare quality and patient safety has transformed the needs and expectations of medical education provided to physicians in training and in practice [4, 5].

There is mounting public pressure for maintenance of certification (MOC) programs to support activities demonstrating change in physician behavior and improving patient outcomes rather than focusing on individual learning outcomes [6]. Concurrently, there has been mounting pressure from physicians for MOC programs to demonstrate value, encouraging CPD delivery in novel, impactful methods with high-quality learning outcomes. In Canada, specialists are mandated to participate in the Royal College of Physicians and Surgeons of Canada (RCPSC) MOC program. A 2016 cross-sectional survey of the RCPSC MOC program found a perceived lack of impact of the program on physician learning and a perception from physicians that the MOC program served as a monitoring/regulatory body rather than fulfilling its intended purpose as a mechanism to enhance lifelong learning and reflection [7]. A similar US survey in 2016 revealed that physicians desire more practice-relevant learning that is time-efficient and low-cost, with topics of their choosing [8].

Rationale for simulation in continuing professional development

Evidence from several studies [7,8,9,10,11,12] suggests three factors consistently influence CPD learning success: accurate needs assessments prior to the learning activity, interaction amongst physician learners with opportunities to practice, and multifaceted educational activities [13]. Simulation-based medical education (SBME) addresses all three of these factors and has a greater influence on physician learning outcomes and practice compared to traditional CPD methods such as lectures and conferences [12, 13]. SBME offers a number of potential advantages over traditional educational methods, including simultaneously addressing CME needs and advanced knowledge domains such as communication and teamwork, direct observation of clinical performance with feedback through debriefing, and the capability to practice medical procedures and methods with minimal patient risk. SBME is also unique in its ability to train physicians to operate within interprofessional healthcare teams and the larger systems in which they function.

For these reasons, MOC programs are increasingly embracing SBME for CPD. In Canada, a 2009 RCPSC MOC program evaluation led to simulation being recognized as a method of practice assessment. In the US, SBME is required for primary licensure of anesthesiologists and surgeons and is utilized for CPD in anesthesia, surgery, internal medicine, family medicine, emergency medicine, pediatrics, and radiology [14]. There is strong physician demand for SBME, with 46% of practicing US physicians wanting more simulation activities for CPD in a 2016 national cross-sectional survey [8].

There is less understanding regarding the effectiveness of SBME for CPD, as most published literature focuses upon trainee physicians. However, evidence is emerging to support SBME effectiveness for board-certified physicians. A 2015 systematic review of 39 available studies revealed benefits of SBME for CPD in acute care physicians’ self-reported skills and attitudes, as well as immediate and sustained improvements in physician educational outcomes [5]. As an educational method rooted in constructivism, SBME theoretically has a greater potential educational benefit for practicing physicians due to the presence of pre-existing clinical experiences to build upon. With this potential educational impact, understanding methods to optimize SBME for CPD is imperative.

Rationale for developing a curriculum for simulation CPD

In contrast to medical trainee programs, CPD programs are distinguished by a lack of specific, pre-defined curricula and an emphasis on self-identified learning interests. However, evidence has shown that despite being a key motivator for ongoing engagement [8, 15], self-directed learning (SDL) is insufficient for a robust CPD program, as physicians are unable to accurately self-assess the learning needs required to meet professional and societal demands [16]. A curriculum including necessary topics not identified by SDL would create a more robust and effective CPD program.

At present, much of our experience with SBME and CPD consists of ad hoc case creation suiting the SDL needs of physicians or reacting to systems needs (e.g., critical safety events). This approach limits the educational potential of SBME for CPD. In contrast, a simulation curriculum, planned in advance from perceived and unperceived learning needs, is more robust, sustainable, and proactive for healthcare system needs. We hypothesize that such a curriculum would be non-inferior in educational effectiveness compared to the traditional approach.

Curriculum integration prospectively distributes CPD learning over several sessions, broadening the scope of SBME and allowing deliberate practice and spaced repetition of targeted objectives to achieve mastery learning [17]. In particular, we feel spaced repetition is powerful for equally important learning objectives such as communication, which can be repeated educational aims across a curriculum of varied clinical presentations. From the perspective of a MOC program, a curriculum vastly improves simulation logistics and operations. Although seemingly trivial, logistical factors have been identified in prior CPD literature as a leading facilitator of and barrier to successful and sustainable CPD [10]. A curriculum serves to assist course facilitators with more predictable planning, known costs, lead time for quality simulation scenario development and testing, and standardization of local practice. These changes potentially translate to an overall higher quality educational experience.

Conceptually, we envision a curriculum whereby a group of board-certified physicians shares in completing a comprehensive series of longitudinally delivered simulation scenarios, supported by appropriate knowledge/experience dissemination tools. This approach is rooted in the educational theories of experiential learning, social learning, and communities of practice [18,19,20,21]. This co-produced curriculum approach is a pragmatic solution that allows a greater variety of cases to be covered in a feasible timeframe than a single individual completing the curriculum alone.

Our aim: development of a simulation CPD curriculum for PEM physicians

In pediatric emergency medicine (PEM), simulation is vital for practicing physicians as the frequency of critically ill children is low relative to the total volume of children presenting to pediatric emergency departments [22]. Simulation allows PEM physicians to refine and retain resuscitative skills to remain prepared to manage rare, high-stakes events. The tremendous breadth of clinical presentations in PEM creates both a challenge and a rationale for a well-designed CPD simulation curriculum. An objective and systematic methodology using input from vital stakeholders to select the important clinical presentations and procedures to include in such a comprehensive curriculum is critical.

To design our simulation CPD curriculum, we formed an interprofessional PEM Simulation Oversight Committee (PEMSOC) with membership consisting of local physicians, a registered nurse (RN), a pharmacist, and a simulation operations specialist with a professional background as a registered respiratory therapist (RRT). The overarching aim of our simulation CPD program was to utilize simulation to develop and maintain the clinical skills of health care providers with the goal of providing world-class quality care to children. The primary objective of this paper is to describe our methods for the development of an SBME curriculum for CPD of practicing PEM physicians.

Methods

Literature review

To develop our curriculum, we first conducted a literature review on curriculum design methodology for simulation and CPD. A formal search by a research librarian conducted on February 14, 2019, revealed 172 results when including articles (Epub Ahead of Print, In-Process & Other Non-Indexed Citations) from OVID Medline between 1946 and February 14, 2019, with the following MeSH terms: Professional Competence (MeSH 108801) AND Simulation Training/ or exp Computer Simulation/ or exp Patient Simulation (MeSH 222646) AND faculty, medical/ed or “Education, Medical, Continuing” (MeSH 24165). A manual review of these 172 results revealed no articles referencing a curriculum design methodology for physician CPD using simulation.

Curriculum design process

With no identified pre-existing literature, we endeavored to develop our own methodology for CPD simulation curriculum design. Although our methodology could be considered for general CPD curriculum design, our focus was to develop a curriculum optimized for simulation as the chief educational strategy.

To develop our methodology, we adapted Kern’s [23] established approach to curriculum design for a novel application of simulation curriculum design with a CPD audience. We completed the following six steps: (1) problem identification by a general needs assessment, (2) targeted needs assessment, (3) goals and specific measurable objectives, (4) educational strategies (SBME in our case), (5) implementation, and (6) evaluation and feedback [24]. This process was followed by a deliberative curriculum approach, with experts determining the final curriculum content. Specific deliberative considerations included the lack of standardized core learning objectives, greater variations in learner clinician experience and practice patterns, non-static learner composition, the need for voluntary learner participation, medico-legal ramifications of simulations, and psychological safety considerations of participants such as reputation.

We consolidated the above processes into our novel systematic three-phase methodology described below (Fig. 1: Continuing professional development simulation curriculum design process).

Fig. 1 Continuing professional development simulation curriculum design process. A three-phase curriculum development process adapted from Kern’s curriculum development approach used to generate a simulation curriculum for continuing professional development (CPD) for pediatric emergency medicine physicians. Phase 1 begins with a detailed targeted needs assessment involving all relevant stakeholders of the physician’s continuing professional development. Phase 2 follows with systematic prioritization of learning topics to include in the curriculum using data collected from the targeted needs assessment. A prioritization matrix is used to rank items for curriculum inclusion. Finally, in phase 3, selected learning topics are organized by educational experts into a curriculum to be implemented and evaluated. These three phases can be repeated in a cyclical manner as the curriculum is refined and reimagined over time

Phase 1: systematic and deliberative needs assessment

We conducted a detailed general and targeted needs assessment using a systematic and deliberative approach, including perceived and unperceived learning needs from our target learner audience.

A perceived needs assessment was completed using a 20-item electronically distributed survey (SurveyMonkey Inc., San Mateo, CA, USA, www.surveymonkey.com) sent to all 20 practicing PEM physicians at our institution from March 28, 2018, to April 11, 2018. The primary objective of this survey was to determine topics our audience wished to cover in simulations. The secondary objective focused on local simulation etiquette and tailoring learning processes for our target audience. Participants were presented with a comprehensive list of 65 clinical presentations and 29 critical procedures from the 2013 RCPSC Objectives of Training in pediatric emergency medicine [25] and asked whether each represented an educational need that could be addressed by an educational simulation session. This document, originating from our national board-certifying organization, was selected as the foundational list as we felt it best represented the competencies expected of PEM physicians upon entry into practice. The survey was anonymously completed by 17 physicians (85% response rate), with results found in Additional file 1: Needs assessment survey.

Considering the aforementioned limitations of SDL in identifying learning needs, we conducted an unperceived needs assessment by asking our hospital leadership to rank items from the same expansive list of clinical presentations and procedures presented to PEM physicians, on a 5-point Likert scale (1 = low priority, 3 = moderate priority [default starting score], 5 = high priority), from the perspective of their positions of leadership. The six leadership positions polled were multidisciplinary, comprising our PEM medical director and division head, deputy division head, PEM division safety lead, general emergency department site lead at our local tertiary care general hospital, PEM pharmacist lead, and PEM RN lead and trauma coordinator. All six leaders (100% response rate) completed the survey. In addition to leadership needs, PEMSOC accounted for patient-level needs by reviewing cases discussed over the previous 2 years of PEM divisional quality improvement and patient safety (QI/PS) rounds. PEMSOC reviewed 21 clinical presentations but did not conduct an in-depth chart review of each case at this stage, as our focus was to determine topics to cover in the simulation curriculum.

Phase 2: systematic prioritization

In order to objectively synthesize all of our collected needs assessment data, we utilized a prioritization matrix to objectively determine the clinical presentations and critical procedures to include in our PEM simulation CPD curriculum. While less commonly utilized in medical education and simulation, prioritization matrices have been applied in healthcare quality improvement [26, 27] and are commonly utilized in business and military goal setting where resources are limited and multiple objectives need to be addressed [28]. As there are several published matrices underpinned by different mathematical algorithms, we adapted an available prioritization matrix utilized by members in other local educational projects.

Our prioritization matrix utilized five data categories determined by PEMSOC a priori: perceived needs assessment results, unperceived needs assessment results, feasibility of conducting the simulation, benefit to patients if practiced, and educational value of simulating the case for learning (balanced with frequency of encountering the case in real clinical practice, in which case there would be less educational need). Ranked data ranging from scores of 1 (low rank) to 5 (high rank) were input into the matrix for each clinical presentation and procedure and each data category, generating an overall priority score which was utilized by PEMSOC in consideration for curriculum inclusion (Table 1).
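The text above does not specify the algebra used to combine the five category ranks into a single priority score. One simple reading, assuming an additive matrix with optional per-category weights (all weights equal to 1 in the unweighted case), is:

$$P_j = \sum_{i=1}^{5} w_i \, r_{ij}, \qquad r_{ij} \in [1, 5],$$

where $r_{ij}$ is the rank assigned to clinical presentation or procedure $j$ in data category $i$ (perceived need, unperceived need, feasibility, patient benefit, educational value) and $w_i$ is the weight given to category $i$. A multiplicative combination would be an equally valid prioritization-matrix design; the choice determines how strongly a single low-ranked category penalizes an item.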

Table 1 Pediatric emergency medicine continuing professional development simulation curriculum systematic prioritization matrix

To complete our perceived needs assessment data category, we recorded the frequency of “yes” responses from all 17 survey participants, converting individual qualitative responses (nominal/binary data) into quantitative discrete frequency data. These frequency data were categorically assigned a ranking score from 1 to 5 (1 = 0–20%; 2 = 21–40%; 3 = 41–60%; 4 = 61–80%; 5 = 81–100%) and inserted into the prioritization matrix. For the unperceived needs assessment category, we input the mean ranking score (from 1 to 5) of all six leaders into the priority matrix. For the remaining three data categories, PEMSOC members independently assigned a ranking score from 1 to 5 (1 = low priority, 3 = moderate priority [default starting score], 5 = high priority) for each data category, with the mean score input into each respective column in the prioritization matrix. PEMSOC committee members also included the 21 QI/PS rounds cases by factoring their scores into the “benefit to patients” data category of the matrix. As PEMSOC is comprised of interprofessional education and simulation specialists with formal training (simulation, quality improvement, Master of Education, curriculum design), these data represent the expert opinion providing the pragmatic grounding we felt necessary for eventually creating a feasible, high-quality curriculum.
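As a concrete illustration of the scoring just described, the sketch below converts the perceived-needs “yes” frequencies into the 1–5 categorical ranks and combines them with the mean scores from the other categories into an overall priority score. The unweighted sum, the function names, and the example numbers are illustrative assumptions rather than the committee’s actual spreadsheet.

```python
# Minimal sketch of the prioritization-matrix scoring described above.
# Assumption: category ranks are combined as an unweighted sum; the actual
# matrix may weight categories differently.
from statistics import mean

def perceived_rank(yes_count: int, respondents: int = 17) -> int:
    """Convert the frequency of 'yes' survey responses into a 1-5 categorical rank
    using the bins in the text (1 = 0-20%, 2 = 21-40%, ..., 5 = 81-100%)."""
    pct = 100 * yes_count / respondents
    if pct <= 20:
        return 1
    if pct <= 40:
        return 2
    if pct <= 60:
        return 3
    if pct <= 80:
        return 4
    return 5

def priority_score(perceived: float, unperceived: float, feasibility: float,
                   patient_benefit: float, educational_value: float) -> float:
    """Combine the five data categories into an overall priority score (unweighted sum assumed)."""
    return perceived + unperceived + feasibility + patient_benefit + educational_value

# Illustrative example for a single hypothetical clinical presentation:
perceived = perceived_rank(yes_count=15)          # 15/17 physicians said "yes" -> rank 5
unperceived = mean([5, 4, 5, 4, 5, 4])            # mean of the six leaders' Likert ratings
score = priority_score(perceived, unperceived,
                       feasibility=4.5, patient_benefit=5.0, educational_value=4.0)
print(round(score, 1))                            # 23.0 in this example
```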

Phase 3: curriculum synthesis, implementation, and continuous evaluation

With these data, the matrix generated priority scores for PEM clinical presentations and critical procedures. These scores were organized into two ranked lists based on priority score: a clinical presentation list and a critical procedure list (Table 2). Each list was manually reviewed by PEMSOC to ensure overlapping or duplicate items were combined into a single item.

Table 2 Ranked scores for pediatric emergency medicine continuing professional development simulation curriculum generated by priority matrix

Based on expert opinion, factoring in the large number of items to cover and balancing the need for regular curriculum refresh and review, PEMSOC decided on a 24-month curriculum block containing an 18-scenario simulation curriculum, delivered at one scenario per month. We deliberately planned 6 months of “flex time” to maintain flexibility to repeat high-yield scenarios, conduct urgent simulation cases (e.g., to address urgent patient safety and quality improvement needs), and reschedule simulation sessions cancelled due to unanticipated events.

With the number of cases for our single center’s needs decided, PEMSOC selected the 18 highest-scoring clinical presentations and 18 highest-scoring critical procedures, as objectively determined by the matrix, for inclusion in the CPD curriculum. PEMSOC mapped individual clinical presentations to critical procedures to synthesize our 18-scenario simulation curriculum (Table 3). We centered our scenarios around clinical presentations, as our intention was to cover all procedures at least once in the curriculum, with repetition of high-yield procedures for mastery learning. Learning objectives for each case were set by PEMSOC based on expert opinion, including learning points from the 21 QI/PS rounds cases when appropriate. For consistent formatting, all cases were drafted using an internally designed standard simulation template (Additional file 2: Sample simulation scenario).
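To make the synthesis step concrete, the sketch below takes the highest-scoring clinical presentations and critical procedures from the ranked lists and pairs them into scenarios. In the actual process, PEMSOC mapped procedures to clinically appropriate presentations by expert judgment; the one-to-one pairing, item names, and scores below are illustrative assumptions only.

```python
# Illustrative sketch of curriculum synthesis: select the top-N presentations and
# procedures by priority score, then pair them into simulation scenarios.
def synthesize_curriculum(presentation_scores: dict[str, float],
                          procedure_scores: dict[str, float],
                          n_scenarios: int = 18) -> list[tuple[str, str]]:
    top_presentations = sorted(presentation_scores,
                               key=presentation_scores.get, reverse=True)[:n_scenarios]
    top_procedures = sorted(procedure_scores,
                            key=procedure_scores.get, reverse=True)[:n_scenarios]
    # Naive pairing: i-th ranked presentation with i-th ranked procedure.
    # PEMSOC instead mapped procedures to clinically appropriate presentations by hand.
    return list(zip(top_presentations, top_procedures))

# Hypothetical example with three items per list:
scenarios = synthesize_curriculum(
    {"cardiopulmonary arrest": 23.0, "toxicology": 22.5, "status epilepticus": 21.0},
    {"defibrillation": 22.0, "intraosseous access": 21.5, "rapid sequence intubation": 20.0},
    n_scenarios=3,
)
print(scenarios)
```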

Table 3 Pediatric emergency medicine continuing professional development simulation curriculum

Our curriculum was incorporated into a monthly pre-existing interprofessional in situ simulation program within our pediatric emergency department starting in April 2019. In our single-center PEM division, all practicing PEM physicians participated in the simulation program, with leadership expecting physicians to participate in one to two simulation sessions within a 24-month period. Prior to our curriculum, cases were selected at random from a database, or created on an ad hoc basis by request of the participating PEM physician, 1 week in advance of the scheduled simulation session. After curriculum implementation, practicing PEM physicians were instead given an option of three simulation scenarios selected from our 18-case curriculum, 1 week in advance of the simulation session. The selected scenario would be marked as “completed” by PEMSOC and removed from circulation until all cases were covered in the 24-month curriculum block. We allowed PEM physicians to know the clinical presentation in advance to provide context, reduce “performance anxiety” [29], encourage psychological safety [30], and encourage learning in advance of the simulation. We also shared a limited version of the simulation curriculum in Table 3 with all PEM physicians (only detailing “clinical presentation,” “sample case stem,” and “unique procedures covered in case”). This sharing was intended to facilitate learning through community of practice and further promote psychological safety. Other interprofessional participants were also provided the case topic in advance, if asked.
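The rotation logic just described (offer three not-yet-completed scenarios a week ahead of the session, then remove the chosen one from circulation until the block is exhausted) can be summarized in a few lines. The class name and the random sampling of the three options are illustrative assumptions; any scheduling tool or spreadsheet would serve equally well.

```python
# Illustrative sketch of the scenario rotation; not the committee's actual tooling.
import random

class CurriculumRotation:
    def __init__(self, scenarios: list[str]):
        self.remaining = list(scenarios)   # cases still in circulation this block
        self.completed: list[str] = []     # cases already run

    def offer_options(self, n: int = 3) -> list[str]:
        # One week before the session, offer up to n scenarios not yet completed.
        return random.sample(self.remaining, k=min(n, len(self.remaining)))

    def mark_completed(self, scenario: str) -> None:
        # Remove the chosen scenario from circulation until the block is exhausted.
        self.remaining.remove(scenario)
        self.completed.append(scenario)

    def block_complete(self) -> bool:
        # All cases covered: the curriculum block is done and due for redesign.
        return not self.remaining

rotation = CurriculumRotation(["cardiac arrhythmia", "toxicology", "respiratory failure"])
choice = rotation.offer_options()[0]
rotation.mark_completed(choice)
```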

In the simulation session, the PEM physician completed a 1-h in situ interprofessional simulation involving practicing RNs, pharmacists, and RRTs in our pediatric emergency department resuscitation room. Following each simulation, immediate post-event debriefing of the entire interprofessional team was completed by a simulation facilitator with formal debriefing training (PEMSOC member). Key learning points relating to team-level knowledge and performance gaps were addressed during debriefing. Within a month of the simulation session, an information package highlighting key knowledge and practice tips (called “SimBITS” by our team) was created and electronically disseminated by PEMSOC to all PEM, RN, and RRT practitioners in our division (Additional file 3: Sample SimBITS). To protect the psychological safety of simulation participants, the newsletter omitted all identifying information and performance gaps of participants.

As our learner group is expected to be non-static, with evolving clinical experiences and evolving learning needs, we planned to continuously evaluate our simulation program, with repeated curriculum redesign following the above process every 20 months.

Results

Interim curriculum feedback

Between November 10 and 15, 2019, PEMSOC distributed an electronic survey (SurveyMonkey Inc., San Mateo, CA, USA, www.surveymonkey.com) to all 24 practicing PEM physicians (our division had grown by four members in the intervening time). The primary objective of this survey was to elicit feedback on the curriculum contents, with a secondary objective of determining learner attitudes toward our curriculum approach (Additional file 4: Interim feedback survey).

Twenty-one PEM physicians responded to the survey (88% response rate), with 20 (95%) respondents having previously participated in PEMSOC CPD simulation activities. Four (19%) respondents felt the curriculum included scenarios they did not feel were necessary, three (14%) were unsure, and the remaining 14 (67%) felt all included scenarios were appropriate. Participant opinions on curriculum contents are detailed in Fig. 2: PEMSOC interim curriculum survey results regarding curriculum contents. Presentations the PEM physician group felt should be included (but were not) were multi-system trauma (trauma being outside the scope of this curriculum), pericardial tamponade (in fact included in the curriculum), increased intracranial pressure (also included in the curriculum), traumatic arrest (outside the scope of this curriculum), the violent/agitated patient, and the critical procedure of burr hole.

Fig. 2 PEMSOC interim curriculum survey results regarding curriculum contents. Pediatric emergency medicine physicians (n = 21/24) completed an online survey indicating personal opinions on whether clinical presentations should be included in a simulation curriculum designed for continuing professional development. Clinical presentations are listed in ascending order of priority score from the curriculum design process. Clinical presentations with high priority scores (toxicology, cardiopulmonary arrest, cardiac arrhythmia, respiratory failure, congenital heart disease, drowning/submersion) were felt to be high yield. Items with lower priority scores (toxic syndrome, meningitis/encephalitis) were felt by some physicians to not be necessary. Some clinical presentations (inborn error of metabolism, disseminated intravascular coagulation, adrenal disorders) were not expected to be included in the curriculum yet were felt to be high yield by physicians

Regarding attitudes towards the PEM CPD simulation curriculum, 19 (90%) felt they could learn from a shared curriculum approach, and 19 (90%) felt they could learn from our post-simulation SimBITS knowledge dissemination package. Seventeen (81%) physicians felt our curriculum approach enhanced their learning compared to the previous approach, with the remainder (19%) being unsure. No respondents felt unable to learn with our curriculum approach, and 11 (52%) respondents felt the curriculum would make it more likely that they would participate in CPD simulation activities, with 8 (38%) unsure and two (9.5%) responding no. PEM physician opinions on the CPD simulation curriculum are detailed in Fig. 3: Physician opinions regarding added value of CPD simulation over traditional ad hoc approach.

Fig. 3 Physician opinions regarding the added value of CPD simulation over traditional ad hoc approach. 20/24 pediatric emergency medicine physicians responded to an online survey on personal opinions regarding simulations for continuing professional development adhering to a curriculum model, in comparison with simulations devised shortly in advance of a simulation session based on personal request (ad hoc approach). The majority of physicians felt the curriculum allowed them to experience more diverse clinical presentations outside of their comfort zone. The curriculum approach also encouraged mastery learning and created a more psychologically safe and predictable simulation experience

Discussion

We describe a three-phase methodology for designing a simulation curriculum for CPD, whereby ranked data obtained from a systematic and deliberative approach are subsequently input into a project prioritization matrix to generate objective prioritization scores for use by program coordinators to plan, synthesize, and implement a final curriculum (Fig. 1: Continuing professional development simulation curriculum design process). To our knowledge, this is the first published methodology for curriculum design for CPD simulation activities.

Preliminary feedback on our curriculum approach was positive. The majority of PEM physicians found our curriculum approach educationally effective (potentially more so than traditional ad hoc simulation). In addition, the vast majority of physicians found our curriculum comprehensive for their needs, with no participants disagreeing with the clinical presentations included. Reassuringly, the clinical presentations our PEM physicians found to be of highest yield in the interim feedback survey were also the clinical presentations with the highest matrix prioritization scores (toxicology, inborn error of metabolism, cardiopulmonary arrest, cardiac arrhythmia, severe electrolyte abnormalities). Similarly, clinical presentations that PEM physicians felt were not required had lower matrix prioritization scores. There were also clinical presentations that PEM physicians felt would be high yield but had not identified as important for curriculum inclusion, such as inborn error of metabolism, severe sepsis, and electrolyte abnormalities. These findings highlight and reaffirm a critical function of our curriculum approach: the capability to consider learning needs beyond those identified through the traditional self-directed approach.

Although our curriculum design process was derived for PEM subspecialty physicians, we believe our methodology is generalizable to other healthcare specialities interested in using simulation for CPD. Our phase 2 objectively incorporates the identified needs of multiple stakeholders involved with CPD, including participants, the healthcare system, and patients. Other healthcare specialities can adapt our prioritization matrix to suit their needs, incorporating different clinical presentations and procedures, different mathematical algorithms for calculating priority scores, and different data categories. Alternative data categories for use in priority matrices by other groups are shown in Table 4. We also intentionally designed our phase 3 of curriculum synthesis, implementation, and continuous review to rely on the expert opinion of local education leaders. Our approach was applied to an acute care medical speciality in a fashion we felt best suited our local needs. However, our approach allows other healthcare specialities to select different numbers of clinical presentations for curriculum inclusion, interpret prioritization score data to map different simulation curricula, implement their own CPD simulation programs compatible with their learning audience, and utilize variable knowledge dissemination techniques. While relying on local expert opinion creates more difficulty in consistently replicating curricula, we felt our approach was strengthened by the intrinsic conceptual flexibility of our methodology to accommodate the countless local needs, barriers, and variations of CPD simulation practice.

Table 4 Alternative data categories to consider with priority matrices in other CPD curriculum design processes

Limitations

Unfortunately, due to the limitations of our single-center design, we determined a priori that we would be unable to address the above-proposed hypothesis comparing the educational effectiveness of our curricular approach to ad hoc SBME for CPD. It was not feasible in our single academic center to have some members of our pediatric emergency department in the curriculum group and others in a traditional stream. This is a major limitation of our current protocol, requiring further investigation in future, multi-center, comparative studies. We do believe there is benefit to curricular integration with CPD simulation on learning outcomes, although additional studies regarding the effect on learner knowledge, skills, and performance are required. Additional unique learning outcomes to examine from a CPD context include voluntary participation rates for CPD, economic analysis of an integrated simulation curriculum for CPD compared to ad hoc design, correlations with MOC certification performance and, of course, impact on real patient outcomes. Nonetheless, the design process described here is a critical first step toward these future studies, as SBME is increasingly utilized for CPD.

Conclusions

We describe a novel three-phase process for curriculum design in simulation activities targeting independently practicing physicians for continuing professional development (CPD). The highlights of our approach are (1) an adaptable prioritization matrix capable of objectively ranking and identifying high-priority subjects to include using data collected from an in-depth perceived and unperceived needs analysis and (2) a knowledge gap sharing tool to facilitate group learning and community of practice amongst physicians. This methodology is valuable as simulation activities are increasingly embraced and required as CPD activities.

Availability of data and materials

All the datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

References

  1. Health and Public Committee and Office of Health. Position statement: the art and science of high-quality health care: executive summary. Royal College of Physicians and Surgeons of Canada; 2012.

  2. Peck C, McCall M, McLaren B, Rotem T. Continuing medical education and continuing professional development: international comparisons. BMJ. 2000;320:432–5. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1117549/pdf/432.pdf.

  3. Frank J, Snell L, Sherbino J. CanMEDS 2015 physician competency framework. Ottawa: Royal College of Physicians and Surgeons of Canada; 2015. p. 1–30. Available from: http://www.royalcollege.ca/portal/page/portal/rc/canmeds/resources/publications.

  4. Leape LL, Brennan TA, Laird N, Lawthers AG, et al. The nature of adverse events in hospitalized patients: results of the Harvard Medical Practice Study II. N Engl J Med. 1991;324(6):377–84.

  5. Khanduja PK, Bould MD, Naik VN, Hladkowicz E, Boet S. The role of simulation in continuing medical education for acute care physicians: a systematic review. Crit Care Med. 2015;43(1):186–93. Available from: https://www.ncbi.nlm.nih.gov/pubmed/25343571.

  6. Campbell CM, Parboosingh J. The Royal College experience and plans for the maintenance of certification program. J Contin Educ Health Prof. 2013 [cited 2019 Feb 12];33(Suppl. 1):S36–47. Available from: http://www.ncbi.nlm.nih.gov/pubmed/24347151.

  7. Horsley T, Moreau K, Lockyer J, Zeiter J, Varpio L, Campbell C. More than reducing complexity: Canadian specialists’ views of the Royal College’s maintenance of certification framework and program. J Contin Educ Health Prof. 2016;36(3):157–63.

  8. Cook DA, Blachman MJ, Price DW, West CP, Berger RA, Wittich CM. Professional development perceptions and practices among U.S. physicians. Acad Med. 2017 Sep [cited 2019 Feb 12];92(9):1335–45. Available from: http://insights.ovid.com/crossref?an=00001888-201709000-00036.

  9. Steinemann S, Berg B, Skinner A, DiTulio A, Anzelon K, Terada K, et al. In situ, multidisciplinary, simulation-based teamwork training improves early trauma care. J Surg Educ. 2011 Nov [cited 2019 Feb 8];68(6):472–7. Available from: https://linkinghub.elsevier.com/retrieve/pii/S1931720411001267.

  10. Jeong D, Presseau J, ElChamaa R, Naumann DN, Mascaro C, Luconi F, et al. Barriers and facilitators to self-directed learning in continuing professional development for physicians in Canada: a scoping review. Acad Med. 2018 Aug [cited 2019 Feb 12];93(8):1245–54. Available from: http://www.ncbi.nlm.nih.gov/pubmed/29642101.

  11. Cheng A, Hunt EA, Donoghue A, Nelson-McMillan K, Nishisaki A, LeFlore J, et al. Examining pediatric resuscitation education using simulation and scripted debriefing: a multicenter randomized trial. JAMA Pediatr. 2013 Jun 1 [cited 2019 Feb 8];167(6):528–36. Available from: http://www.ncbi.nlm.nih.gov/pubmed/23608924.

  12. Marinopoulos SS, Dorman T, Ratanawongsa N, Wilson LM, Ashar BH, Magaziner JL, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep). 2007 Jan [cited 2019 Feb 12];(149):1–69. Available from: http://www.ncbi.nlm.nih.gov/pubmed/17764217.

  13. Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner. JAMA. 2002 Sep 4 [cited 2019 Feb 15];288(9):1057. Available from: http://jama.jamanetwork.com/article.aspx?doi=10.1001/jama.288.9.1057.

  14. Levine AI, Schwartz AD, Bryson E, DeMaria S Jr. Role of simulation in US physician licensure and certification. 2012;79:140–53.

  15. Chan TM, Gottlieb M. Education theory made practical: volume 1. Wikipedia, The Free Encyclopedia; 2012. p. 3–7.

  16. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared: a systematic review. J Am Med Assoc. 2006;296(9):1094–102 Available from: http://jama.jamanetwork.com/article.aspx?articleid=203258.

  17. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. Available from: https://doi.org/10.1080/01421590500046924.

  18. Haigh J. Expansive learning in the university setting: the case for simulated clinical experience. Nurse Educ Pract. 2007;7(2):95–102.

  19. Bandura A. Social learning theory of aggression. J Commun. 1978;28:12–29.

  20. Blackmore C, editor. Social learning systems and communities of practice. 1st ed. London: The Open University; 2010.

  21. Cruess RL, Cruess SR, Steinert Y. Medicine as a community of practice: implications for medical education. Acad Med. 2018;93(2):185–91.

  22. Mittiga MR, Geis GL, Kerrey BT, Rinderknecht AS. The spectrum and frequency of critical procedures performed in a pediatric emergency department: implications of a provider-level view. Ann Emerg Med. 2012 [cited 2019 Mar 9];61:263–70. Available from: www.annemergmed.com. http://dx.doi.org/10.1016/j.annemergmed.2012.06.

  23. Thomas PA, Kern DE, Hughes MT, Chen BY, editors. Curriculum development for medical education: a six-step approach. 3rd ed. Baltimore, MD: Johns Hopkins University Press; 2016.

  24. Kern DE, Thomas PA, Hughes MT. Curriculum development for medical education: a six-step approach [Internet]. Baltimore, MD: Johns Hopkins University Press; 2009 [cited 2019 Mar 14]. 253 p. Available from: https://discovery.mcmaster.ca/iii/encore/record/C__Rb2154485__SCurriculum%20Development%20for%20Medical%20Education:%20A%20%20Six-Step%20Approach__Orightresult__U__X4?lang=eng&suite=def.

  25. Royal College of Physicians and Surgeons of Canada. Objectives of training in the subspecialty of pediatric emergency medicine. 2018 [cited 2019 Aug 15]. p. 1–31. Available from: www.royalcollege.ca/rcsite/.../ibd/pediatric-emergency-medicine-otr-e.pdf.

  26. Pelletier LR, Beaudin CL, Van Leeuwen D. The use of a prioritization matrix to preserve quality resources. [cited 2019 Mar 11]. Available from: https://journals-scholarsportal-info.libaccess.lib.mcmaster.ca/pdf/10622551/v21i0005/36_tuoapmtpqr.xml.

  27. Van Leeuwen D. Developing a prioritization matrix breast care-a complex project. J Healthc Qual. 2002 [cited 2019 Mar 11];24(2):42–4. Available from: http://www.achieveglobal.com.

  28. Bennett BT. The CARVER Assessment Tool. In: Understanding, assessing, and responding to terrorism: protecting critical infrastructure and personnel. 1st ed. New Jersey: Wiley-Interscience; 2007. p. 244.

  29. Patterson MD, Blike GT, Nadkarni VM. In situ simulation: challenges and results. In: Henriksen K, Battles JB, Keyes MA, et al., editors. Advances in Patient Safety: New Directions and Alternative Approaches [Internet]. Rockville (MD): Agency for Healthcare Research and Quality (US); 2008. Available from: https://www.ncbi.nlm.nih.gov/books/NBK43682/.

  30. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006;1(1):49–55.

Acknowledgements

The authors would like to acknowledge Ms. Elizabeth Czanyo, MLIS, who conducted our literature search on behalf of Joule Inc., a Canadian Medical Association company. We also acknowledge Dr. Anthony Crocco, Division Chief and Medical Director of McMaster PEM, and Ms. Christine Chaston, Clinical Manager of McMaster PED for their administrative support with the McMaster PEMSOC program.

Funding

There are no funding sources to declare in this publication.

Author information

Authors and Affiliations

Authors

Contributions

JL made substantial contributions in the conceptualization of this study, acquisition of data (survey creation and dissemination), and primary data analysis (matrix creation, data entry, matrix score generation, curriculum synthesis creation and process, case scenario generation) and is the primary author of this manuscript. Revisions after peer and editorial review were primarily made by JL. MB made contributions in the conceptualization of this study (prioritization matrix methodology, matrix creation) and data analysis (curriculum synthesis, matrix score generation, case scenario generation) and revised the manuscript. ME made contributions in the conceptualization of this study, data acquisition (survey creation), and data analysis (curriculum synthesis, matrix score generation, case scenario generation) and provided substantial revisions to the manuscript. KM contributed to the conceptualization of this study and data analysis (matrix score generation, curriculum synthesis). LP contributed substantially to data analysis (matrix score generation and case scenario generation) and revised the manuscript. MD contributed substantially with data analysis (curriculum synthesis and case scenario generation) and revised the manuscript. QN contributed substantially to the conceptualization (Kern’s curriculum design methodology) and data analysis (curriculum synthesis, case scenario generation) and provided substantial revisions to the manuscript. All authors have reviewed the attached manuscript and approve of this version being submitted for publication. All authors have agreed to be personally accountable for their own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even parts in which an author was not personally involved, are appropriately investigated and resolved, with the resolution documented in the literature.

Corresponding author

Correspondence to James S. Leung.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed by our regional ethics board (Hamilton Integrated Research Ethics Board) and deemed to qualify as a quality improvement study. Formal ethics approval was deferred.

Consent for publication

Not applicable—no individual personal data is presented.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Needs assessment survey.

Additional file 2.

Sample simulation scenario.

Additional file 3.

Sample SimBITS.

Additional file 4.

Interim feedback survey.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Leung, J.S., Brar, M., Eltorki, M. et al. Development of an in situ simulation-based continuing professional development curriculum in pediatric emergency medicine. Adv Simul 5, 12 (2020). https://doi.org/10.1186/s41077-020-00129-x
