Educating Medical Residents to Improve the Quality of their Continuity Practice

Background
Providing direct feedback on the quality of residents' continuity clinic practice is a particular challenge for large multisite residencies whose parent institutions are at different stages of quality reporting and use different electronic health records. Without this level of feedback on the quality of their outpatient practice, residents may be ill-equipped to succeed in the value-based reimbursement environments they will likely encounter after graduation.

Objective
The authors report on a self-assessment tool designed to provide value-based feedback and direct individualized quality improvement in resident continuity clinic practices across four sites of a large internal medicine residency program.

Method
Residents were introduced to a quality reporting tool that they used to self-report patient-care metrics throughout their residency. For three years, spanning the academic years ending 2014, 2015, and 2016, a "report card" was provided to each resident, and each resident completed a self-assessment of their individual quality metrics. Residents used this information to develop team-based quality improvement (QI) projects to implement in the upcoming year.

Results
Over the three-year reporting period, 257 of 365 eligible residents completed at least one survey, an overall response rate of 70.4%. The number of surveys completed per resident per year varied from 0 to 65. Resident clinic data suggested that quality metrics exceeded national standards in several domains. Opportunities for improvement were identified and used to direct quality improvement projects.

Conclusion
Assessing the quality of outpatient medicine practice is an important and easily attainable goal in a resident continuity clinic practice setting.

Levine S, Andrews R. MedEdPublish. https://doi.org/10.15694/mep.2017.000135


Introduction
Models of care such as the patient-centered medical home (PCMH) or the Accountable Care Organization (ACO) offer hope of lower cost and improved quality of patient care. [D'Aunno (2016), Strange (2010), Jackson (2013)] While uncertainty presently exists about the security of the Affordable Care Act (ACA) and of MACRA (the Medicare Access and CHIP Reauthorization Act of 2015), it is likely that most practices will continue to enter reimbursement agreements subject to quality reporting under MIPS (the Merit-Based Incentive Payment System) or an APM (an Alternative Payment Model, including either an ACO or a PCMH).
One of the goals of residency education is to prepare residents for the environment in which they will practice. Educating residents to function effectively in settings that are at different stages of quality reporting and that use different electronic medical records is a particular challenge for large multi-site training programs. Institutions typically exclude resident clinic data from their own quality reporting, and medical school curricula provide minimal if any exposure to evolving primary care models of care. [Joo (2011)] Although some medical schools have developed novel population health curriculum models in which students serve as key care coordinators and health coaches, the data extracted in these models are primarily from the faculty practice. [O'Neill (2013)] Many residencies have incorporated quality improvement opportunities into their curricula. Some have assessed resident attitudes about medical home models of practice. [Colbert (2012)] Few have provided hands-on opportunities for residents to assess the quality of their own continuity practice. The ability to extract this kind of quality data on one's own practice is a critical skill for residents' future practice success.
The University of Connecticut internal medicine residency is a training program spanning three area hospitals (Hartford Hospital, St Francis Medical Center, and the University of Connecticut Health Center) and the Veterans Administration. The residency program set out to create a practical instrument for tracking quality, the clinical effectiveness tool, that residents could use across all four sites, primarily to foster self-reflection and ultimately to encourage residents to improve the quality of their own practice. Reported here is work on the former goal, self-reflection; work on the second goal, practice improvement, is ongoing. Items selected for inclusion in the clinical effectiveness tool are a simplified version of typical PCMH or ACO primary care measures. The tool was designed, implemented, and applied across the four continuity clinic sites. The purpose of this exercise was to better position trainees to succeed in a post-graduation, value-based market.

Methods
Residents are first introduced to the clinical effectiveness tool at orientation. In continuity clinic they are encouraged to complete 20 or more surveys by the end of each year, a number selected to represent a significant sample of a resident's panel. Most resident panels comprise between 60 and 80 patients; a sample of 20 chart reviews would therefore cover approximately one quarter to one third of a given resident's panel. Residents are not evaluated on their actual clinical effectiveness data outcomes. The tool is web based and accessed through a desktop icon available at all clinic-site workstations or on residents' smartphones. Data entered into the tool contain no individual patient identifiers. According to institutional policy, the study was deemed exempt by the University of Connecticut Institutional Review Board.
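The sampling arithmetic above can be sketched in a few lines; the panel sizes (60-80 patients) and the 20-survey target are the figures reported in the text.

```python
# Fraction of a resident's panel covered by the suggested 20 chart reviews.
SURVEYS_PER_YEAR = 20

def panel_coverage(panel_size: int, surveys: int = SURVEYS_PER_YEAR) -> float:
    """Fraction of the panel sampled by the requested number of surveys."""
    return surveys / panel_size

for panel in (60, 80):
    print(f"panel of {panel}: {panel_coverage(panel):.0%} sampled")
# 20 surveys cover one third of a 60-patient panel and one quarter of an
# 80-patient panel, matching the range stated in the text.
```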
Residents are given a two-hour introductory lecture on population health and healthcare reform, in which an overview of the Affordable Care Act is provided and principles of value-based reimbursement are discussed. Residents receive this lecture at different points in their training depending on when they are scheduled for an ambulatory block. To help interns become comfortable and efficient in clinic sooner, all interns are scheduled for an ambulatory block within the first quarter of the intern year. The population health lecture is given to first-, second-, and third-year residents on an ambulatory elective and is meant to contextualize the clinical effectiveness tool. A composite, cross-sectional list of common ACO and/or PCMH quality reporting measures was chosen.
[https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/Core-Measures.html] The list was not meant to be exhaustive but rather representative of the types of measures residents will encounter in a primary care practice environment upon completion of their training. Table 1 delineates the items included in the clinical effectiveness tool and the corresponding response options.
The online system prompted residents to complete any partial entries. At the end of the year each resident was given a "report card" with their own data alongside composite data by site and across the program, so they could compare their results with those of their peers. Attached was a form on which residents wrote a reflection on the quality of their care as well as a plan for improving one metric in the coming academic year. Ambulatory site directors reviewed individual self-reflections and practice goals with residents. In many instances, self-reflection on practice improvement opportunities led to the design and implementation of quality improvement projects. At yearly "town hall" meetings, program-wide practice data were shared with the residents as a whole.
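The report-card step amounts to a simple aggregation: each resident's metric rate set beside site and program composites. The sketch below illustrates that aggregation only; the record fields, metric name, and resident/site labels are hypothetical, not the actual tool's schema.

```python
# Hypothetical de-identified survey records; the real tool's fields differ.
surveys = [
    {"resident": "R1", "site": "A", "flu_vaccine": True},
    {"resident": "R1", "site": "A", "flu_vaccine": False},
    {"resident": "R2", "site": "A", "flu_vaccine": True},
    {"resident": "R3", "site": "B", "flu_vaccine": True},
]

def rate(records, metric):
    """Percentage of surveyed patients meeting the metric."""
    return 100 * sum(r[metric] for r in records) / len(records)

def report_card(surveys, resident, metric):
    """A resident's rate beside site and program composites."""
    own = [r for r in surveys if r["resident"] == resident]
    site = [r for r in surveys if r["site"] == own[0]["site"]]
    return {
        "resident": rate(own, metric),
        "site": rate(site, metric),
        "program": rate(surveys, metric),
    }

print(report_card(surveys, "R1", "flu_vaccine"))
```

In practice the same comparison would be run for every metric in Table 1 and every resident, then rendered as the year-end "report card."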

Results
Over the three-year reporting period, 257 of 365 eligible residents completed at least one survey, a response rate of 70.4%. Residents were encouraged, through frequent reminders, to complete at least 20 surveys in each year of training; at present there is no consequence for non-participation. In total, 4,569 discrete surveys were collected and entered into a de-identified database. The number of surveys completed per resident varied from 0 to 65 in a given reporting year, and 26% of eligible residents completed more than 20 surveys in a given reporting year. The results reported are based upon the total number of surveys completed.
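The reported response rate follows directly from the counts in the text:

```python
# Response-rate arithmetic from the figures reported above.
completed_at_least_one = 257
eligible_residents = 365
response_rate = 100 * completed_at_least_one / eligible_residents
print(f"{response_rate:.1f}%")  # 70.4%
```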
Graph 1 illustrates the screening rates for breast, cervical, and colon cancer as well as for HIV and domestic violence. In all cases the percentages screened are among those eligible for screening. Also reported are vaccination rates for influenza, pneumococcus, and tetanus. Resident self-reported rates for cancer screening potentially exceeded national benchmarks by 20-30%. [Morbidity and Mortality Weekly Report (November 22, 2013), http://www.cdc.gov/cancer] Caution is advised in interpreting the actual size of that difference, because our data are based on the totality of surveys completed, with a variable number per resident. Immunization rates potentially surpassed national benchmarks by similar margins.
[www.cdc.gov/mmwr/preview/mmwrhtml/mm6104a2.htm] Differences between resident and national screening rates are not reported as statistically significant or not, because the number of surveys completed per resident in many cases failed to meet the 20-survey minimum. Rather than include only those resident data sets that met the suggested 20 or more surveys per year, the program provided feedback to all residents on the totality of their data. HIV and domestic violence screening were identified by residents as two areas to target in quality improvement projects; two such projects are currently underway.
Graph 2 describes the efficacy of managing blood sugar as well as the comorbidities of hypertension and hyperlipidemia in diabetic patients. Several residents identified these as areas on which to focus their quality improvement projects. One such project involves group visits for diabetic patients with HbA1c > 9.
Graph 3 describes the prevalence of microalbumin values triggering angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) therapy. Microalbumin is recorded as N, AB, or ND, corresponding to normal, abnormal, or not documented, respectively. Also described are the rates of funduscopic and foot exams within the past year in diabetic patients. Credit was given for having seen an ophthalmologist whether or not the funduscopic exam was scanned into the chart. Residents were frustrated that diabetic performance rates were not higher and have reflected that disparate EHRs will necessitate the development of site-specific triggers to remind them to monitor and respond to microalbumin results, to verify ACE inhibitor or ARB therapy, and to record funduscopic and foot exams.
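The site-specific trigger logic residents called for could be as simple as the rule set sketched below. The microalbumin codes (N, AB, ND) mirror the text; the patient-record structure and field names are hypothetical illustrations, since each EHR would supply these differently.

```python
# Sketch of a site-agnostic reminder rule for the diabetic measures described
# above; field names are hypothetical, not drawn from any specific EHR.
def diabetic_reminders(patient: dict) -> list[str]:
    """Return outstanding tasks for a diabetic patient's chart."""
    tasks = []
    micro = patient.get("microalbumin", "ND")  # N, AB, or ND, as in the text
    if micro == "ND":
        tasks.append("order microalbumin")
    elif micro == "AB" and not patient.get("on_ace_or_arb", False):
        tasks.append("consider ACE inhibitor or ARB therapy")
    if not patient.get("funduscopic_exam_past_year", False):
        tasks.append("document funduscopic exam or ophthalmology visit")
    if not patient.get("foot_exam_past_year", False):
        tasks.append("document foot exam")
    return tasks

print(diabetic_reminders({"microalbumin": "AB", "on_ace_or_arb": False}))
```

Embedding rules like these as EHR alerts at each site is one way to close the documentation gaps the residents identified.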
Graph 4 illustrates the frequency of smoking cessation counseling among smokers.
While some patients may previously have received smoking cessation counseling, the data included here are for current smokers; the goal was for 100% to have received counseling, both now and previously. No inquiry was made into the quality or extent of the counseling provided. Most residents were surprised by their diminished performance on this measure and have contemplated strategies to document and improve smoking cessation counseling.

Discussion
Residencies across the nation vary in size, and the majority contain multiple clinical sites. The University of Connecticut categorical internal medicine program has four sites with somewhat comparable patient populations and resources. However, each of the four sites is at a different stage of value-based reporting: two, Hartford and Saint Francis Hospitals, are Accountable Care Organizations; one is the VA; and the fourth, the University hospital, is in the process of becoming a Patient-Centered Medical Home. Like many institutions across the country, ours unfortunately excludes resident clinic populations from institutional reporting to CMS. Recognizing this void, we set out to develop a curriculum in population health management. The results presented here indicate the feasibility of involving residents in self-assessment of health care quality using a web-based abstraction instrument for patients cared for in a continuity clinic. Providing a "low stakes" report card is useful in identifying care deficiencies and improvement opportunities, and it may be the "low stakes" factor or the team approach that makes residents willing to complete the forms. Additionally, the data above indicate that care provided in a resident clinic can actually outperform national benchmarks. Sharing this information may spark interest at additional clinical sites in involving residents, thereby improving quality of care at a wider range of primary care offices.
The educational benefit of this exercise lies not only in directing clinical care improvement but in gaining an understanding of the complexities involved in value-based reporting, skills the residents will need in their future careers. As such, this program provides a means of filling a traditional educational void for residents, both in value-based reporting and in skills for utilizing an EHR. For example, one of the improvement projects, on documenting advance directives, focused on utilizing an alert system within the EHR. With disparate electronic medical records, and therefore variations in embedded notification capabilities, there were differences in the extent to which resident providers were reminded to attend to the items in the clinical effectiveness tool. Recognizing this provided an educational opportunity to teach residents about the importance of bridging EMR gaps by developing individualized reminders, including timed tasks, disease-specific templates, and the like.
The limitations of this study are twofold: first, it relies on a self-reported tool, and second, residents varied in their survey completion rates. The fact that the data collectors rated their own performance may affect the quality of the data collected; because all the data were self-reported throughout the study period, however, this bias is at least uniform. There may also be an unintended benefit to that level of transparency. Residents are often reluctant to reveal a lack of knowledge or a need for improvement, yet success in the post-MACRA regulation era will require providers not only to report quality metrics but also to report on their success at quality improvement and practice self-assessment. As to the second limitation, variable survey completion rates, seeing individual quality improvement gains in practice will, we hope, incentivize more consistent resident involvement.

Conclusion
Exercises like the clinical effectiveness tool give residents in large multi-site programs the opportunity to develop population health management skills while the stakes are still relatively small.