INTRODUCTION

In 2013, the Alliance for Academic Internal Medicine (AAIM) developed guidelines for the Department of Medicine (DOM) letter with the goal of providing IM program directors (PDs) with a summary of a student’s overall IM performance for residency selection.1 Guideline adherence was variable but increased over time, and letters that used the guidelines were more credible.2

In 2015, the Association of American Medical Colleges called for revision of the Medical Student Performance Evaluation (MSPE). Revised guidelines recommended inclusion of information on the components of each clerkship grade, component weight, comparison graphs, grade distributions, and unedited performance narratives by evaluators.3

In response to the COVID-19 pandemic, AAIM recently offered new guidance for the 2020–2021 residency application season, including recommendations to change the DOM letter by adopting a standardized template.4

Little is known about how IM PDs view DOM letters that follow the guidelines or the individual components of the DOM letter.5 To better understand PDs’ familiarity with and assessment of the guidelines and corresponding DOM letters, we authored five survey questions included in a voluntary survey of IM PDs who completed the 2019 American College of Physicians (ACP) IM In-Training Examination® (2019-IM-ITE®).

METHODS

PDs from 577 programs who completed the 2019-IM-ITE were invited to complete a voluntary web survey through a link in their Score Report or email notification from ACP from October 2019 to February 2020. Surveys were completed by 315 PDs. After excluding 45 respondents who opted out and 19 respondents from non-US programs, the dataset included 251 respondents. The survey consisted of multiple-choice and 5-point Likert scale questions.

To assess the statistical representativeness of the responses, self-reported program size and US Census region were compared to the US/US territory-based population of IM programs using data from the Accreditation Council for Graduate Medical Education.6 Responses to the familiarity items were dichotomized to allow a secondary analysis that included PDs who had heard of the guidelines even if not entirely familiar with them. Responses were compared for goodness of fit to the population distributions of program size and region using the adjusted Wald (Pearson) chi-square statistic. Data analysis was conducted in Stata 16.2. The study was deemed exempt by the University of Chicago IRB (IRB20–1108).
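As an illustration of the goodness-of-fit comparison described above, the following minimal Python sketch tests whether a respondent distribution by program size matches the national distribution of IM programs. The analysis itself was conducted in Stata; the counts and population proportions below are hypothetical placeholders, not study data.

# Illustrative sketch of a chi-square goodness-of-fit test comparing survey
# respondents to the national population of IM programs by program size.
# All numbers are hypothetical placeholders.
from scipy.stats import chisquare

# Hypothetical respondent counts by program size (small, medium, large); sums to 251
observed = [90, 110, 51]

# Hypothetical population shares of IM programs by program size (sum to 1.0)
population_props = [0.40, 0.42, 0.18]

# Expected counts if respondents mirrored the national distribution
n = sum(observed)
expected = [p * n for p in population_props]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")

A small p-value would suggest that respondents differ from the national population on that characteristic, whereas a large p-value is consistent with a representative sample.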

RESULTS

PDs were asked how familiar they were with the AAIM DOM letter guidelines. Most (84%; 211/251) indicated some degree of familiarity, whereas 16% were not familiar. Of PDs familiar with the guidelines (Table 1), most (82%) reported receiving DOM letters that follow the guidelines “very often/often” (37%) or “sometimes” (49%). Larger programs (> 100 residents) reported receiving letters that “often” followed guidelines at a higher rate than medium-sized or smaller programs (p = 0.032).

Table 1 Representativeness of the Survey Responses, Familiarity with CDIM-APDIM Guidelines for DOM Letters, and Frequency of Receiving DOM Letters that Follow Guidelines

PDs familiar with the guidelines were asked whether letters following them were more useful in discriminating among candidates. Thirty-six percent reported these letters were more useful; 25% believed they were not; and 39% were undecided. Nearly all (92%) reported some degree of redundancy between DOM letters and the MSPE.

PDs familiar with the guidelines were asked what additional information might be useful to include in a DOM letter. More than half indicated that adding a distinguishing feature that makes the student more likely to be successful in an IM residency (56%) and sub-internship performance (55%) would be useful; 45% selected shelf-exam scores (Table 2). There were no statistically significant associations for these items by program size or region.

Table 2 Useful Additional Information, If Any, Identified by Program Directors for Inclusion on a Department of Medicine Letter

DISCUSSION

Most PDs reported some degree of familiarity with the DOM letter guidelines and receiving DOM letters that follow them. Yet, even among letters that follow the guidelines, most PDs found DOM letters redundant with the MSPE to some extent. Our study is a cross-sectional survey and is not necessarily generalizable over time or to all programs.

Writing DOM letters is time-consuming, and reading letters that do not add value wastes time. Most importantly, providing PDs with information that does not enhance their ability to select residency applicants is a missed opportunity. This research provides data about what additional information IM PDs might prefer as they review applications. Providing more text from faculty evaluations not included in the MSPE, along with an applicant’s distinguishing feature, may further help programs identify the best “fit” between applicant and program.

As we move into a unique application season during COVID-19, it is more important than ever that we provide value in our DOM evaluations and consider adopting the new AAIM 2020–2021 DOM letter recommendations.4