The integration of a new simulation center within a Competency-Based Curriculum: an opportunity for holistic undergraduate medical education curriculum redesign

Weis J, Wagner J, Farr D, Ginsburg C, Guttman O, Krumwiede K, Kho K, Martinez J, Reed G, Rege R, Sulistio M, Scott D. MedEdPublish. https://doi.org/10.15694/mep.2018.0000137.1

Problem: Medical education is shifting to competency-based training focused on the 13 Core Entrustable Professional Activities (EPAs) for Entering Residency. In response, UME leaders are reforming curricula around Competency-Based Education (CBE), and many institutions are choosing to incorporate Simulation-Based Education (SBE) into these efforts. Guidance for institutions planning comprehensive reform and simulation integration is limited. The purpose of this paper is to describe the experience of one medical school attempting to align a new UME curriculum and a new simulation center using CBE principles. Approach: As part of a UME curriculum redesign, the University of Texas Southwestern took two major actions. First, it secured funding to build a campus-wide simulation center to host large-scale, high-quality simulation activities. Second, it formed a simulation planning committee to coordinate existing simulation activities and develop new activities for integration into the new curriculum. This committee chose to use EPAs as an organizing framework. Outcome: The simulation planning committee identified 25 simulation activities that would effectively target core EPAs while also complementing existing UME courses. The committee identified a director and co-director for each activity and established standard elements common to all simulation activities. Learners' progress through each activity is tracked and verified in a comprehensive portfolio. Next Steps: Throughout the academic year, data will be collected for each simulation activity according to uniform metrics. These data will inform the committee's decisions to continue, modify, or discontinue activities for future cohorts of students.

In 2014, CBE thought leaders came to consensus on a list of outcomes by which graduates of undergraduate medical education (UME) curricula should be judged (Englander et al., 2014; Englander et al., 2013). Since then, great progress has been made in defining the details of assessing these outcomes and making competency decisions about UME learners (Lomis et al., 2017). The culmination of this work is a list of 13 Core Entrustable Professional Activities (EPAs) for Entering Residency (Englander et al., 2014) that is now undergoing pilot testing at ten institutions.
Surgical skills educators have been thought leaders in SBE curriculum design (Acton, 2015; Ahmed et al., 2011; Friedell, 2010; Wiener, Haddock, Shichman, & Dorin, 2015). They recognize the importance of mapping educational missions to curricular outcomes and competencies. However, there are few descriptions of a holistic design of institutional-level curricula incorporating both CBE and SBE.
Driven by the SBE and CBE movements, the number of simulation facilities around the world is rapidly growing. Guidance for institutions planning to build these facilities is limited, and primarily oriented toward the design of the physical space (Seropian & Lavey, 2010). However, the importance of aligning institutional educational mission, resources, and outcomes is widely recognized (Gardner et al., 2015). The purpose of this paper is to describe one medical school's attempt to align a new UME curriculum and a new simulation center using CBE principles.

Approach
The University of Texas Southwestern (UTSW) Medical Center in Dallas has carefully designed a new UME curriculum. At the outset of the curriculum redesign, university leadership realized that SBE would be a major pillar in the new curriculum; however, simulation efforts on campus were too fragmented and under-resourced to adequately serve the nearly 1000 UME learners at UTSW.
To bolster simulation efforts on campus, university leaders made two major decisions. First, they secured funding to build a campus-wide simulation center that would offer the necessary infrastructure to host large-scale, high-quality simulation modules. Second, they formed a simulation planning committee tasked with coordinating existing simulation modules and systematically developing new modules to be integrated into the new UME curriculum. This committee chose to use EPAs as an organizing framework.
The UTSW Simulation Center is a 28,000 sq. ft. facility occupying two floors of a new building centrally located on our main campus. It contains 20 mock exam rooms for standardized patient encounters, 11 high-fidelity training environments with adjacent control rooms, seven debriefing rooms, two suturing labs, three classrooms, and a large wet/dry lab with 140-seat capacity. The initial capital cost is approximately $50M, and annual costs are expected to be approximately $2.8M, covering equipment and personnel (17 FTEs).
The twelve-member simulation planning committee was formed in October 2015, consisting of seven associate deans of the Medical School and the School of Health Professions and five simulation experts. Committee members represented a breadth of clinical training backgrounds including anesthesiology, emergency medicine, internal medicine, OB/GYN, pediatrics, and surgery.
The simulation planning committee took three initial steps to guide the construction of the new simulation curriculum. First, they reviewed the 13 EPAs and identified EPAs 1, 2, 4, 5, 9, 10, and 12 as the domains most amenable to SBE. Next, the committee reviewed all SBE modules already being offered to UME learners and mapped these modules to the most relevant EPAs. Finally, the committee generated a list of new SBE modules that could realistically be added to the UME curriculum. The resulting list included 46 individual SBE modules. Through an iterative process of committee review, the list was narrowed to the 25 modules considered most relevant and feasible for pre-clerkship and clerkship level learners. In order to accommodate new modules, some existing ones were eliminated and duplicative ones were consolidated.
As the list of 25 SBE modules was being finalized, the committee worked to fit the modules into the UME curriculum so that SBE content would align with the other didactic and clinical material to which learners would be exposed during the new curriculum. To facilitate this process, the committee identified appropriate stakeholders in UME education and formed working groups within existing pre-clerkship and clerkship curriculum committees to discuss the content and logistics of incorporating new SBE modules. With input from these working groups and several additional stakeholders, the committee identified a module director and a specific time slot for each of the 25 SBE modules. The modules were scheduled on a timeline spanning the entire UME curriculum: one module during MS1 orientation, 15 during pre-clerkship blocks, four during a transitional week between the pre-clerkship and clerkship periods, and five during clerkships. Additional modules are in the planning phase for the post-clerkship period.
Once the committee had integrated these core simulation modules, they shifted their focus to establishing standardized expectations for SBE module structure and data collection, based on the principles of CBE. They established a standardized format for modules and distributed a module description template to module directors.

Outcomes
The committee mandated six key elements for all SBE modules. First, each module director was required to identify a co-director to ensure availability of appropriate leadership. Second, all modules were required to have written learning objectives. Third, a summary was required of the plans for each module, including equipment, space, and staffing needs. Fourth, modules had to be proficiency-based and include an assessment protocol with a minimum performance expectation for every learner, in keeping with simulation-based mastery learning concepts (Griswold-Theodorson et al., 2015). Fifth, all modules were required to have a written remediation protocol for any learners unable to meet the minimum performance requirements. Lastly, all modules were expected to gather data on learner performance and experience: performance data needed to align with the EPAs, and experience data needed to use standardized pre- and post-course evaluations to support continuous quality improvement of SBE modules. These six elements were captured in a written description of each module.
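As an illustration, the six mandated elements map naturally onto a simple structured record. The sketch below is hypothetical: the field names and completeness check are ours, not the committee's actual template.

```python
from dataclasses import dataclass, field

@dataclass
class SBEModule:
    """Hypothetical record for one SBE module description.

    Fields mirror the six mandated elements; names are illustrative
    and do not reflect UTSW's actual curriculum book template.
    """
    name: str
    director: str
    co_director: str = ""            # element 1: designated backup leadership
    learning_objectives: list = field(default_factory=list)  # element 2: written objectives
    logistics_summary: str = ""      # element 3: equipment, space, and staffing plan
    min_performance: str = ""        # element 4: proficiency threshold for every learner
    remediation_protocol: str = ""   # element 5: plan for learners below threshold
    target_epas: list = field(default_factory=list)          # element 6: EPAs for data alignment

    def is_complete(self) -> bool:
        """True only when all six mandated elements are filled in."""
        return all([self.co_director, self.learning_objectives,
                    self.logistics_summary, self.min_performance,
                    self.remediation_protocol, self.target_epas])
```

A record like this makes it easy to audit a set of module descriptions for missing elements before committee review.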
With the above requirements established, the committee met with module directors to ensure uniform quality of the module descriptions. The final result was an 87-page UME Simulation Curriculum Book containing the descriptions of the 25 SBE modules. The data collection plan outlined in this book will facilitate creation of a simulation portfolio, which will allow tracking of each individual learner's progress through the entire simulation curriculum during medical school. It is anticipated that documentation of learner performance for each simulation activity will be useful for application and placement within residency programs. Furthermore, these data will be used in conjunction with other forms of assessment to construct an overall competency portfolio for each learner covering all 13 EPAs.
The 25 modules include ten for training in various procedural skills (EPA 12). Additionally, six modules involve organ-system-based, high-fidelity simulations. These team-based modules are designed to instruct learners in the diagnosis and management of important acute scenarios (EPAs 2 and 10) while also teaching principles of interprofessional teamwork (EPA 9). Four modules focus on individual training in diagnosis and management of core acute care medicine scenarios (EPA 10) such as cardiac and pulmonary codes. Two modules provide discrete training on physical exam maneuvers (EPA 1), while the remaining three modules train students in a variety of skills including history taking (EPA 1), gynecologic exam (EPA 1), vaginal delivery (EPA 10), prescription writing (EPA 4), and documenting encounters in the EMR (EPA 5). A matrix of courses and the EPAs that they fulfill is shown in Table 1.

Upon completion of all materials, the SBE curriculum was launched on September 1, 2017, a full year before the new simulation center's planned opening date (September 2018), with plans to host the SBE modules at various sites around campus during the first year. Although each SBE module is conceptually robust, the committee recognizes that unforeseen challenges will likely arise when the modules are implemented. Similarly, there are sure to be growing pains associated with opening a new simulation center and transitioning staff and equipment into the new space. Thus, the committee intentionally launched the SBE curriculum far in advance of the center's opening in order to have a full year of data on the success of the SBE modules before they are moved into the new center. This will allow for modification or removal of unsuccessful modules, as well as the addition of new modules when the new center opens.

Next Steps
The success of individual SBE modules will be judged on four criteria. First, a module will be deemed successful if it is hosted as planned with adequate teaching staff, sufficient materials, and satisfactory attendance by learners. Second, successful modules will receive positive ratings from learners on post-module evaluations. Third, learners will report improved comfort with the relevant EPAs after completing the module. Fourth, learners will consistently meet objective milestones for the featured EPAs after completing the module. These criteria are summarized in Table 2.
For the entire set of 25 modules, these data will be useful for identifying system issues that may need improvement.
For individual modules, this information will help the planning committee determine which SBE modules should be continued, modified, or discontinued for future cohorts of students. Currently, simulation support staff are compiling data on the SBE modules that have already been hosted this academic year. We envision that data will be reported according to a template that addresses each of our key criteria for success listed in Table 2, for example:

Question: After completing the module, did learners feel comfortable with the skills outlined in the learning objectives?
Data source: Post-module evaluations completed by learners
Success threshold: >75% of learners report that they are at least moderately comfortable with the specified skills (score ≥3 on a 5-point Likert scale)

Question: Did learners achieve the predetermined threshold(s) for proficiency in the skills outlined in the learning objectives?
Data source: Proficiency-based score sheets completed by course instructors
Success threshold: >95% of learners demonstrate proficiency by the end of the module

For this academic year, data are being collected on paper and coded electronically at a later date. In the future, however, the University plans to use a centralized learning management system (LMS) to collect and store all data electronically. The LMS will be used to generate individual portfolios for every UME learner so that successful progression through milestones of the EPAs is documented in a permanent record that follows learners into their respective GME programs. Furthermore, aggregate course data will be stored in an enterprise data warehouse (EDW) that will allow continuous monitoring of all aspects of the UME curriculum, including pre-clerkship courses, clerkships, electives, and SBE modules.

Notes On Contributors
Joshua J Weis is a General Surgery resident at UT Southwestern who is completing a two-year research fellowship in medical and surgical education.