Utilization of marketing automation tools for delivery of a faculty development curriculum

Background Physician clinical educators play important roles in teaching, providing feedback, and evaluating trainees, but they often have variable preparation and competing demands on their time that make universal participation in workshops, seminars, or short courses designed to foster these skill sets inefficient or impossible. Methods We designed and implemented a 52-week synchronous curriculum addressing faculty opportunities to improve teaching skills, feedback for residents and medical students, and evaluation skills, delivered using marketing automation tools, including text messaging and email. We evaluated programmatic impact and feasibility using an implementation science framework. Results Over a 104-week evaluation period, there were at least 10,499 total content impressions and 4558 unique recipients, indicating the significant reach of this program to approximately 120 faculty members. Faculty engagement with continuing education materials remained stable or increased over the 2-year evaluation period, indicating that programs like ours can have sustainable impacts. Resident evaluations of faculty across six key domains also improved after implementation of the program. Conclusions Our experience with digital marketing tools reflects that they can be used to deliver impactful curricular content to faculty for continuing education purposes and that faculty can use these resources in a sustainable way. However, because no single communication reaches the full audience, this delivery method should not be used in isolation for content of critical importance. More research is needed to identify best practices and additional education-related uses of this technology.


Introduction
Physician clinical faculty have important responsibilities for the provision of teaching, feedback, and evaluation to trainees, yet often have competing demands on their time and limited faculty development within these education domains. The Accreditation Council for Graduate Medical Education (ACGME), which oversees United States-based physician training, requires that residency programs offer faculty development but proffers little further guidance [1]. In the absence of standardization, the preparedness of individual faculty members for their roles as clinical educators is multifactorial, often depending on factors such as whether they had access to residents-as-teachers programming during training [2], the availability of opportunities at their own institution, their intrinsic motivation, schedule flexibility, and the impact of competing demands on their time [3,4]. Therefore, training programs are challenged to identify common opportunities for faculty improvement or to offer resources supporting faculty development [4-6].
We performed a needs assessment for a curriculum to address variation in emergency medicine (EM) faculty preparation for clinical education among the approximately 120 faculty members who work clinically at one of two emergency departments (EDs) affiliated with our residency program. A qualitative analysis of annual rotation evaluations completed by residents in two prior academic years identified three priorities for faculty development: clinical teaching, feedback, and evaluation. After these three content areas were identified, we performed a literature search for educational strategies to deploy to our faculty. Interventions designed to ensure that faculty groups like ours have access to clinical education resources frequently take the form of workshops, seminar series, and short courses, but these formats have limitations that make their application to such large groups a challenge. First, the implementation of workshops and short courses carries significant opportunity cost, especially for the developers, whose time preparing for and delivering the content must be accounted for, as well as for faculty participants, who generally have significant competing demands on their time [7]. Additionally, workshops, short courses, and seminar series are often held infrequently with variable faculty attendance due to competing time demands, making it difficult to include all faculty members. This infrequency may also contribute to accelerated decay of skill and knowledge [8] and risks creating an institutional culture in which emphasis on teaching feels cyclical rather than continuous and compliance-driven rather than part of the physician faculty professional identity. With an eye toward engagement, we sought content delivery modalities that were used regularly and already accepted by the faculty. Thus, we developed a robust faculty development curriculum targeting our needs assessment and coupled it with an email and text message content delivery strategy to provide high-quality, relevant, just-in-time resources and create multi-institutional alignment around clinical teaching priorities and expectations. The resulting curriculum comprised just-in-time, synchronized content that allowed us to cover many relevant topics with relative depth while avoiding redundancy and creating a sense of perpetuity for the educational mission. A just-in-time strategy [9], focused on delivering content via email and text message immediately before clinicians started their clinical shifts, allowed for cognitive priming, encouragement, and reinforcement immediately before teaching and assessment strategies were applied to learners. Each faculty development installment included brief, evidence-based content relevant to clinical teaching, feedback, and/or evaluation; links to additional resources; and a call to action. Emails were designed to be read in less than three minutes. Text messages were designed to reinforce the emailed content.

Methods
The implementation and evaluation of this curriculum were granted exempt status with a waiver of consent by the Colorado Multiple Institutional Review Board, as is typical for educational program implementation and evaluation at our institution.
Emails were scheduled and sent to the faculty every Sunday evening using a marketing automation platform (MAP) (Mailchimp; Atlanta, GA, USA; available as web-based software with a "freemium" subscription model). MAPs allow content developers to easily create engaging materials that are viewable as a web page, actively manage audience contacts, and estimate engagement among recipients. All faculty members were automatically enrolled to receive emails but were permitted to disenroll at any time without penalty. During the seven days following each content email, faculty members who were scheduled to work clinically with learners received a text message one hour before their shift start time. These messages included a brief reminder of the week's content and a clickable link to the MAP-generated webpage for just-in-time review to prime faculty members for teaching. Text messages were scheduled by a program administrator using third-party software (TextMagic; San Francisco, CA, USA). All faculty members practicing clinically at the affiliated institutions were automatically enrolled in text messages but were notified in advance that they could opt out at any time without penalty. Email and text message content is archived and linked in the Data availability section [10].
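As a concrete sketch of the pre-shift timing logic described above, the following Python snippet computes send times from a clinical schedule. The schedule format, phone numbers, and `build_reminders` helper are hypothetical illustrations only; the actual program scheduled messages through a third-party tool, and this sketch does not depict that product's API.

```python
from datetime import datetime, timedelta

# Hypothetical weekly schedule: (faculty mobile number, shift start time).
schedule = [
    ("+15555550100", datetime(2018, 1, 8, 7, 0)),   # day shift
    ("+15555550101", datetime(2018, 1, 8, 15, 0)),  # evening shift
]

REMINDER_LEAD = timedelta(hours=1)  # texts go out one hour before shift start


def build_reminders(schedule, content_url):
    """Return (phone, send_time, message) tuples for a scheduling tool."""
    reminders = []
    for phone, shift_start in schedule:
        send_at = shift_start - REMINDER_LEAD
        message = f"This week's teaching point, before your shift: {content_url}"
        reminders.append((phone, send_at, message))
    return reminders


for phone, send_at, message in build_reminders(schedule, "https://example.org/week-1"):
    print(phone, send_at.isoformat(), message)
```

Precomputing the full week of send times in one pass like this is what lets a single administrator batch-schedule messages rather than sending each one manually.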
After 24 months, we downloaded a campaign report from the MAP, which includes the number of recipients, open rate, click-through rate, number of unique views, unsubscribe rate, and bounce rate for each email. The reports were downloaded to a spreadsheet platform (Excel for Mac v. 16.58; Microsoft Corporation; Redmond, WA, USA), and data intended for business purposes or deemed otherwise not useful, including revenue and number of orders, were excluded for ease of analysis. A similar report was downloaded to determine the number of text messages sent. Engagement data are not available for text messages, except for opt-out rates. We then graphed the weekly open rate and analyzed the data and program using an implementation science framework.
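To illustrate the analysis step, the sketch below pools weekly campaign rows into 6-month blocks and computes an open rate for each. The row layout and field names are assumptions for illustration; real MAP exports differ and include the extra business columns (e.g., revenue) that were discarded.

```python
# Hypothetical excerpt of a MAP campaign report: one row per weekly email.
campaigns = [
    {"week": 1, "recipients": 106, "opens": 41},
    {"week": 2, "recipients": 106, "opens": 39},
    # ... one row per week, 104 in the full report ...
    {"week": 27, "recipients": 113, "opens": 44},
]


def pooled_open_rate(rows):
    """Pooled open rate: total opens divided by total emails delivered."""
    delivered = sum(r["recipients"] for r in rows)
    opened = sum(r["opens"] for r in rows)
    return opened / delivered


def semiannual_open_rates(rows, weeks_per_period=26):
    """Group weekly campaigns into 6-month blocks and pool each block,
    mirroring the months 1-6, 7-12, ... breakdown reported in Results."""
    periods = {}
    for r in rows:
        periods.setdefault((r["week"] - 1) // weeks_per_period, []).append(r)
    return {p: pooled_open_rate(block) for p, block in sorted(periods.items())}


print(semiannual_open_rates(campaigns))  # period index -> pooled open rate
```

Pooling counts (e.g., 1059/2756) rather than averaging weekly percentages weights each week by the number of emails actually delivered, which matters when the audience size drifts over time.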
The RE-AIM implementation science framework was developed by Glasgow et al. [11] for the National Institutes of Health and is among the most validated and widely used models for assessing implementation. It is freely available for public use. The acronym stands for Reach, Effectiveness, Adoption, Implementation fidelity, and Maintenance. Although this framework was originally developed for public health interventions, where it remains ubiquitous, it can also be an effective tool for examining the impact and sustainability of longitudinal educational programs.
Reach was defined as the number of individuals willing to participate in an intervention. To describe this impact, we report the number of individuals enrolled over time, opt-out rates, and the number of unique content impressions generated by each dissemination modality.
Effectiveness is a measure of the effects of the intervention, which we assessed using faculty evaluation data generated by emergency medicine residents working with the participants.
Adoption reflects saturation of the intervention within the institution(s). We report the institutional support for continuing the program and the expansion of the program to include additional participants.
Implementation fidelity is a measure of how closely the implemented intervention resembles the planned intervention. We describe the tools and resources used to deliver the curriculum, as well as barriers and enablers.
Maintenance refers to the sustainability of the intervention. We describe ongoing programs and adaptations to improve sustainability.

Reach
Over the first 24 months of the program (January 1, 2018, to December 31, 2019), a total of 120 academic physician faculty members at the primary sites received 11,560 weekly content emails and 9810 text messages [10]. The primary sites were the emergency departments of a county-supported urban hospital with an emergency medicine residency program and of an emergency medicine residency-affiliated university hospital. The MAP reported that 4558 (39.4%) emails had been opened. The MAP considers an email "opened" when an invisible, single-pixel image loads, so users who do not fully load images when reading email, a common option for the participants' institutionally supported email client, may inadvertently deflate the estimated rate. As such, it is difficult to ascertain the true impact in terms of content views, but the MAP-determined open rate is likely an underestimate. Owing to technical limitations of the text messaging software, it is not possible to know the text message read or click-through rates. The percentage of faculty members who accessed the content through this modality alone or in conjunction with email is therefore unknown.
Despite the challenge of characterizing a true utilization rate, the technology and process the MAP uses to calculate it have not changed since the implementation of this program, and the rate itself has remained stable over time, averaging 38.5% (1059/2756) during months 1-6, 37.4% (1089/2932) during months 7-12, 40.5% (1135/2801) during months 13-18, and 41.6% (1275/3071) during months 19-24 (Figure 1). One faculty member disenrolled from email and another 11 opted out of text messaging, but there was no overlap between the two groups, so all faculty members received the content through at least one modality during this time [10].

Effectiveness
Resident evaluations of faculty in several clinically relevant domains, including clinical teaching, faculty feedback, and faculty supervision, improved annually from 2017 to 2019 on the annually required ACGME resident survey following curriculum implementation in 2017 (Table 1). These data are provided by the ACGME in aggregate to preserve resident confidentiality, which limits further analysis.

Adoption
The program remains an important departmental faculty development initiative, with administrative support and expansion of the program over time. In month 13, we adapted the curricular content for Advanced Practice Providers (APPs) (n=32), who worked in an affiliated ED and occasionally taught APP undergraduate and postgraduate trainees. In their first 12 months, the APP faculty had a MAP-calculated open rate of 40.9% (569/1456) [10]. We were subsequently approached by clinical faculty at an affiliated community hospital where EM residents rotate in the ED. Because these faculty work less frequent shifts with residents, we used an opt-in procedure for this group, forwarding weekly emails for one month and then offering the opportunity to enroll. Of the 59 faculty members who could interact with residents at this institution, 33 (56%) opted into the program. In the first 6 months, the enrollees at the affiliate site had a MAP-calculated open rate of 58.2% (494/848) [10]. Neither the community physicians nor the APPs received text messages because of the administrative time that would be required to facilitate that process.

Implementation fidelity
Using text messages and a MAP, we were able to implement a program with high fidelity to the curriculum we designed and sustained delivery of content through both modalities for 24 months. However, the costs of these tools vary. The use of a marketing automation platform ranges from free to more than USD 300/month depending on the size of the intended audience. Text messages can be scheduled and delivered for a small fee. Additional resources are needed to synchronize texts with a clinical or educational schedule, either through integration of another automated tool or through manual administrative time. The greatest enabler for a curriculum such as ours is high-yield content that is relevant to the audience. Based on participant feedback, changes to the program after the initial 24-month implementation period included the development of an online content repository that allows faculty to quickly access relevant content and a decreased text message frequency with an emphasis on the immediate relevancy of the just-in-time content.

Discussion
Our experience with digital marketing tools such as email and text messaging reflects that they can be used to deliver curricular content to faculty for continuing education purposes and that faculty can use these resources in a sustainable way. At first glance, it is tempting to consider engagement with the content a relative failure because of open rates of 30.4%-52.4%. In their greater context, these rates reflect 10,499 total content impressions and 4558 unique recipient impressions, which, as noted above, likely underestimate the true impact. Organizing traditional faculty development programs or workshops to achieve content delivery of this magnitude, without marketing automation tools, would not have been feasible for our group.
While the cost of implementing such programs is not zero, curriculum developers may find that they offer outsized returns on investment when strategically employed. More recently, we found that the digital, synchronous content delivery we designed also offers important infection-prevention advantages in the context of the coronavirus pandemic. These strategies may supplement continuing education when opportunities to gather in groups are limited or undesirable. One limitation of using digital marketing tools to supplement other continuing education curricula is that observed changes in the target audience's performance are confounded by the presence of the other curricula. This could be mitigated scientifically with a randomized study design, which does not always align with institutional goals related to content dissemination. While not addressed in this study, it may be feasible to apply these tools even more broadly across the continuing education landscape. Theoretically, they can also be used to create institutional and organizational alignment around a variety of topics, including updated clinical recommendations and communication of upcoming or summarized educational programming. Additional studies exploring the potential roles of digital marketing tools in continuing education and faculty development would be welcome.

Data availability
Underlying data: Data underlying the results of resident evaluations of faculty teaching are available as part of the article, and no additional source data are required. They were presented as provided to the residency program by the ACGME.

This enhances attainment of the DoCTRINE criteria. Thank you. However, I find the authors' use of the descriptors "synchronous" and "just-in-time" in the manuscript misleading. Synchronous, in my understanding of the term as applied to health professions education, commonly implies the implementation of an event "contemporaneous" with other learners (within a social learning context, whether the learning environment is a physical space or a virtual milieu). A longitudinal series of posts to a discussion board, for example, is not usually referred to as "synchronous". The authors appear to use this term to indicate that content pushes (by email and text) occur simultaneously for all recipients; "synchronized", a term also used by the authors in the manuscript, seems more appropriate and provides the intended effect. I believe the distinction I am making is important: a simultaneously received automated push does not guarantee or equate with the simultaneous opening of content, nor does it represent a synchronous, in-the-moment collaboration and social interaction with, by, and involving the content by multiple learners (whether co-located or distributed). Recommendation: I would not refer to this curriculum or its delivery method as "synchronous" but rather "synchronized", "coordinated", or "timed". Likewise, I find the phrase "just-in-time" has many meanings across education in general and in health professions education specifically. I had initially believed the authors' use of the term aligned more with the thoughts discussed by Yilman et al. (2021). Later in the manuscript (Methods), the authors also incorporate a previously described and recognized framework to assess their technology-enhanced curricular implementation (RE-AIM). These examples support attainment of the DoCTRINE criterion. I acknowledge the limitations imposed by the brief report format. However, I believe there is room for the authors to provide 1-2 sentences on the theory/framework/model that informs the Novak model of "just-in-time" and supports the authors' selection of pre-shift educator priming or knowledge dissemination via content pushes in favor of content pulls (see comments above and below this one). Recommendation: To even further enhance fulfillment of the DoCTRINE criterion, the authors should better explain the theory that underlies the Novak framework/model to support why they chose it as part of their curricular design and implementation.

Unique contribution:
The authors adequately describe how this curricular innovation (utilization of marketing automation tools to provide synchronized delivery of faculty development content) is unique from conventional methods of content dissemination for faculty development. They miss the opportunity to explain briefly how content pushes differ from content pulls. Recommendation: To enhance fulfillment of the DoCTRINE criterion, the authors could compare and contrast their innovative method, which involves scheduled pushes to its audience, with others that leverage on-demand pulls from the audience (consider Orner et al. [2022]).

Curriculum development
4. The authors clearly state the purpose/goals of the educational innovation. Thank you.
5. The authors imply that the objective of this curriculum is to enhance teaching, feedback, and learner assessment (presumably in the patient care environment). The authors do not provide specific detail for readers to understand exactly what learning outcomes (related to the faculty) are expected through engagement with this curriculum. Does the content build in a scaffolded way so that its intended recipients (the faculty as learners) grow developmentally as the longitudinal curriculum unfolds? Readers would love to see a curricular map of how the content was scheduled to unfold (or was it random?). Recommendation: While the authors provide supplemental digital access to their text messages/emails and the data, they do not provide a curriculum map that lists or describes the goals and objectives related to each faculty development content push by week. I believe this is feasible and would enhance attainment of this DoCTRINE criterion.
6. The authors clearly state the target audience of the educational innovation, supporting attainment of the DoCTRINE criterion. Thank you.

Curriculum implementation and Results
7-8. The authors adequately describe the key features of the software and the resources needed to implement the curriculum, supporting attainment of the DoCTRINE criterion. Thank you.
9. The authors adequately describe the instructional method. In the supplemental materials they provide through their registration with the Open Science Foundation (see Reference 10, Michael et al. [2023] below), the authors provide readers with access to the actual curricular tools, supporting attainment of the DoCTRINE criterion. Thank you.
10. While the authors provide outcomes data on their implementation, most of their evaluation data center on program- and process-centered outcomes. The authors indeed provide adequate proof of concept to answer the question: does the marketing automation tool work to effectively disseminate information? Regarding the more important question of curricular efficacy, I do not feel the authors gathered sufficient evidence (as described in RE-AIM) related to the learners. The responses to the ACGME resident survey are not specific enough. While the data are clearly depicted in the table and figure, the level of statistical analysis is minimal and descriptive. We are not truly able to infer correlation between this curriculum and any improvement in scores; clearly, causation is impossible. Even if the intent were to provide descriptive data, I believe the authors could have gathered more and provided a slightly more detailed level of analysis to support their story. Engagement with text messages could not be determined. The authors suggest faculty teaching evaluation scores improved during the period of this intervention. Descriptive statistics are provided for faculty teaching evaluations over a two-year period; statistical significance is not reported. The authors conclude that digital marketing tools (email/text) can be used to effectively deliver faculty development content.
Feedback: 1) Faculty development is critical to support effective teaching, feedback, and assessment in health professions education. The team should be commended on developing an intervention that provides ongoing, just-in-time faculty development for clinical supervisors and addresses commonly known barriers in this educational sphere.
2) Limitations of the study include: i) The introduction could be strengthened with more recent literature on the use of digital technology in the domain of faculty development. In addition, clarification of the 'needs assessment' would be helpful. It appears this was not a true needs assessment of faculty, but rather feedback on areas for improvement from the resident perspective. Changing the 'needs assessment' terminology would help clarify the data source that prompted the intervention.
ii) The reach of the emails/texts reported does not equate with engagement with, review of, or implementation of the content by the recipients. The study would be strengthened with quantitative and qualitative feedback from the clinical faculty on their engagement with the materials and how they applied them during supervision. This is an important piece of data that is missing.
iii) The data presented on teaching evaluations include descriptive statistics on percentage change over the study period. However, the team does not address whether other faculty development interventions during the study period may have impacted these results. This section would be strengthened by addressing other factors that may have affected scores and by applying appropriate statistical tests to establish whether the difference in scores is significant.

Is the work clearly and accurately presented and does it cite the current literature? Partly
Is the study design appropriate and does the work have academic merit? Partly
Are sufficient details of methods and analysis provided to allow replication by others? Yes
If applicable, is the statistical analysis and its interpretation appropriate? No
Have any limitations of the research been acknowledged? Partly
Are all the source data underlying the results available to ensure full reproducibility? Yes
Are the conclusions drawn adequately supported by the results? No
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Faculty development, competency-based medical education
I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard; however, I have significant reservations, as outlined above.

Figure 1. Estimated percentage of content emails viewed by faculty. The marketing automation platform-estimated email open rate for each week of content demonstrated stable engagement (average 39.5%) during the 104-week evaluation period, ranging from 30.4% to 52.4%. The estimate does not include emails opened as text without loading images.

© 2024 Sirianni G. This is an open access peer review report distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Giovanna Sirianni, 1 Clinical Public Health Division, University of Toronto, Toronto, Ontario, Canada; 2 Department of Family and Community Medicine, University of Toronto, Toronto, Ontario, Canada
Summary: This manuscript outlines a just-in-time faculty development intervention. The intervention included automated email and text messages pushed to clinical supervisors weekly prior to working with learners. The messages included reminders of weekly content and links to faculty development materials. Outcomes included the number of emails opened.

Table 1. Faculty teaching evaluation scores. Resident evaluations of the faculty for the year prior to implementation (2017) and the following two years showed improvement across six domains relevant to the curriculum delivered. All scores are on a 5-point scale.
After reading the Novak article and examining the authors' content (especially the email content; see Reference 10, Michael et al. [2023] below), I better understood the alignment between this manuscript's design and the concepts described by Novak (2011) (for readers of this review unfamiliar with Novak's description, I would summarize by stating it embraces a "flipped classroom" approach to content dissemination that cognitively primes and prepares recipients in advance of clinical shifts to enhance student-student and student-teacher interaction and time on task when students and teachers are present together in the learning environment).

Recommendation: As the alignment of Novak's "just-in-time" framework was not immediately understandable to me as a reader, I propose the authors consider reviewing the Novak framework in better detail and comparing it with other constructs of "just-in-time" (e.g., Yilman et al. [2021]); consider comparing and contrasting the two.

2. The authors provide an adequate review of the literature within the Introduction, including a recent Best Evidence in Medical Education report (a systematic review on faculty development curricula by Steinert et al. [2016]), which is in keeping with curricular best practices (e.g., Thomas et al. 2022).

Not sure if the authors are able to address this concern. If the authors have learner-specific outcomes (faculty outcomes, or indirectly, learner outcomes), they should include them to enhance attainment of this DoCTRINE criterion.
11. The authors describe RE-AIM, one of their primary evaluation frameworks, supporting attainment of the DoCTRINE criterion. Thank you.
12-14. The authors describe the number of participants. The authors provide raw data on the process- and program-centered outcomes (see Reference 10, Michael et al. [2023] below). See comment 10. The authors provide an adequate summary of findings, meeting the minimum expectations of the DoCTRINE criteria. However, I do believe they could have enhanced attainment of the DoCTRINE criteria for the Discussion section by expanding the discussion. I would have liked more reflective critique on the successes and opportunities they encountered over the period of implementation and observation. What happened during 2020-2022 with the experience of the pandemic? Finally, the authors could have provided more reflection and commentary, based on their experience, on the types of content (static v. dynamic, infographics v. video or audio media) that worked best using the Novak model and the marketing automation tool platform. They also could have provided more commentary comparing the push-pull aspects (as mentioned earlier).

I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard; however, I have significant reservations, as outlined above.
This is an open access peer review report distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
https://doi.org/10.21956/mep.21507.r36837
© 2024 Merritt R. Rory Merritt, Brown University, Providence, Rhode Island, USA
This article highlights challenges and potential solutions to several common problems with faculty development (FD). Primary challenges in FD include lack of participant time, scheduling challenges, variability in content, and unclear participation metrics. A potential solution proposed by the authors is to leverage marketing tools to deliver bite-sized teaching information to EM faculty using weekly emails and pre-shift text messages. Results are somewhat limited but nonetheless important: the number of total emails/texts sent and estimated email engagement. Results also include data related to teaching effectiveness. I am not a statistician but would like to know whether the improvements in teaching effectiveness (resident ratings) are statistically significant. The use of an implementation science framework to support novel faculty development tools is important, and this is a well-done paper.

Is the work clearly and accurately presented and does it cite the current literature? Yes
Is the study design appropriate and does the work have academic merit? Yes
Are sufficient details of methods and analysis provided to allow replication by others? Yes
If applicable, is the statistical analysis and its interpretation appropriate? Yes
Have any limitations of the research been acknowledged? Yes
Are all the source data underlying the results available to ensure full reproducibility? Yes
Are the conclusions drawn adequately supported by the results? Yes
Competing Interests:
No competing interests were disclosed.

I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.
Reviewer Report 16 May 2024 https://doi.org/10.21956/mep.21507.r36841