Correlation Between An Email Based Board Review Program and American Board of Pediatrics General Pediatrics Certifying Examination Scores

Objective: To investigate the impact of a weekly email-based board review course on individual resident performance on the American Board of Pediatrics (ABP) General Pediatrics Certifying Examination for pediatric residents and, specifically, for residents with low ABP In-training Examination (ITE) scores. Methods: Weekly board-type questions were emailed to all pediatric residents from 2004–2007. Responses to board-type questions were tracked, recorded, and correlated with ITE scores and ABP General Pediatrics Certifying Examination scores. Results: Across all residents, only the total number of questions answered correctly had a significant positive correlation with standard board scores (n = 71, r = 0.24, p = 0.047). For "at-risk" residents with ITE scores ≤ 200 (n = 21), number of questions answered in PL3 year (r = 0.51, p = 0.018) and number of questions answered correctly for all PL years (r = 0.59, p = 0.005) had significant positive correlations with standard board scores. Conclusions: Participating regularly in the email-based board review course, answering board-style questions, and answering board-style questions correctly were associated with higher standard board scores. This benefit existed for all residents but was especially prominent among those with poor in-training examination scores.

For pediatric board certification, graduates of ACGME-sponsored pediatric residency programs are required to pass the American Board of Pediatrics (ABP) General Pediatrics Certifying Examination. 1 Individual and aggregate board examination performance is used by many stakeholders: physicians, state licensing boards, third-party payers, employers, hospital credentialing committees, and the Pediatric Residency Review Committee (RRC). ABP Certifying Examination performance has been positively correlated with USMLE Step 1 scores 2 and ABP In-training Examination (ITE) scores. 3 Although clinical experience and residency program-specific educational activities provide the knowledge base for residents as they prepare for board certification examinations, residents also use additional tools for board preparation. Among others, these methods may include board review books, courses, and study groups. In general, the educational value of commercial test preparation courses in clinical medicine has not been demonstrated. 4 Specific educational interventions have been directed toward residents identified as "at risk" for poor performance on certification examinations in family medicine 5 and surgery 6–9 residency programs. Examination performance improvement strategies have included individualized learning programs, 6,7 weekly reading assignments, 5,6,8,9 test-taking strategy review, 5,7 content review, 5–7 and practice questions. 5–9 The purpose of this study is to investigate the effectiveness of the "eBoard Review" course, an innovative course that combines board-type review questions, email delivery of questions to individual residents, and weekly responses with feedback and teaching points. We hypothesize that participation will be positively correlated with ABP General Pediatrics Certifying Examination performance. In this weekly "eBoard Review" program, residents received sample board questions by email.
These questions were taken from the Self-Assessment Program, a component of the Pediatrics Review and Education Program Curriculum (PREP® The Curriculum). 10 As a residency benefit for pediatric residents at Maimonides Infants and Children's Hospital of Brooklyn, each resident receives a subscription to PREP® The Curriculum, allowing access to the program's educational materials (such as sample board review questions from the Self-Assessment Program). Before the questions were distributed via email, each question was reviewed for content, accuracy, and applicability. Questions were emailed weekly to each pediatric resident. Each resident then replied via email with his or her answer within the week, and responses were recorded. During the subsequent week, additional questions were emailed to each resident along with the preceding week's correct answers, discussions, and teaching points. Essentially, the program functioned as a weekly board review course administered throughout the academic year. A summary of the "eBoard Review" program is shown in Figure 1. Table 1 summarizes the data collection methodology. The hospital's Institutional Review Board approved the analysis of these data.

Study Design
Statistical Analysis

Descriptive statistics were used to describe the demographic characteristics. For the continuous variables, Spearman correlations were used for all correlation analyses because of the small sample size. The Mann–Whitney test for skewed variables was used to analyze the continuous outcomes (number of questions answered, number of questions answered correctly) against the categorical independent variable of pass/fail on board scores. All p-values are two-sided. SPSS Version 16.02 (SPSS, 2008) was used for all analyses.

Results

For the outcome evaluation of certification exam performance, a flowchart of inclusion and exclusion criteria is shown in Figure 2. Of the original 90 residents, 15 left the program prematurely due to a variety of circumstances, such as transfer to other programs, failure to meet program requirements for promotion, or change in career; consequently, these residents either were not eligible to take the board certification examination or their results were not reported back to the program director by the American Board of Pediatrics. In addition, 4 residents who were board eligible and completed their full training had not yet taken the board certification examination for a variety of personal reasons. Therefore, 71 residents completed the ABP certifying examination and were eligible for study analysis. As shown in Table 2, the average age of participants was almost 33 years. One-third attended a US medical school, one-quarter were US citizens who attended medical schools outside of the US, and a little more than one-fifth held a DO degree. Table 3 shows the descriptive statistics for questions answered and questions answered correctly. Although all residents (n = 90) elected to participate in the "eBoard Review" program, individual response rates varied significantly. Of the 90 residents, 16 (7 who were eligible for the study and 9 who were ineligible for the study) did not respond at all during the study period.
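The Spearman correlation and Mann–Whitney analyses described in the Statistical Analysis section can be sketched in a few lines. The sketch below uses invented per-resident values purely for illustration; the actual study data are not reproduced here, and the variable names are assumptions.

```python
# Illustrative sketch of the study's two analyses, on synthetic data
# (the numbers below are invented; they are not the study data).
from scipy.stats import spearmanr, mannwhitneyu

# Hypothetical per-resident counts of questions answered correctly and
# their standard board scores.
questions_correct = [12, 40, 7, 55, 23, 31, 0, 64, 18, 47]
board_scores = [380, 460, 350, 500, 430, 410, 360, 540, 400, 480]

# Spearman rank correlation, chosen (as in the study) because the sample
# is small and the variables are not assumed to be normally distributed.
r, p = spearmanr(questions_correct, board_scores)
print(f"Spearman r = {r:.2f}, p = {p:.4f}")

# Mann-Whitney test comparing a skewed continuous outcome (questions
# answered) between the pass and fail groups; p-values are two-sided.
passed = [40, 55, 31, 64, 47]
failed = [12, 7, 23, 0, 18]
u, p_mw = mannwhitneyu(passed, failed, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p_mw:.4f}")
```

Because both tests are rank-based, they make no assumption about the distribution of board scores or response counts, which suits the small, skewed samples reported in Tables 3–5.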
With regard to questions answered by postgraduate level year (PL year), the greatest range was for PL3 year, with a high of 324 questions, and the greatest mean value was for PL2 year, at 69.73 questions. This pattern also existed for questions answered correctly.

Figure 1. Summary of the "eBoard Review" program: questions were emailed weekly to each resident; residents replied voluntarily via email with their answers before the Monday of the following week; correct and incorrect responses were recorded for each resident; the following Monday, additional questions were emailed, along with the preceding week's correct answers and discussions. Residents' response rates and performances were compared with performances on the annual ABP-sponsored In-training Examination (ITE) and the ABP Board Certification Examination. Note: For the majority of the 2004–2005 academic year, questions were emailed and recorded daily; at the residents' request, the program was then changed from a daily program to a weekly program.

Number of questions answered in PL3 year and total number of questions approached significance for a positive correlation with standard board scores. With regard to questions answered correctly by PL year, PL3 year approached significance for a positive correlation, and total number of questions answered correctly had a significant positive correlation with standard board scores.
Only number of questions answered for PL3 year (n = 71, r = 0.22, p = 0.067) and number of questions answered correctly for PL3 year (n = 71, r = 0.22, p = 0.060) had positive correlations approaching statistical significance with percentile board scores. No significant correlations were found between percentile board scores and questions answered or answered correctly based on total questions, questions in PL1 year, or questions in PL2 year (data not shown).
Based on the Mann–Whitney test, pass/fail determination approached statistical significance for number of questions answered for PL3 year (Fail: n = 24, M = 45.17, SD = 81.44; Pass: n = 47, M = 65.13, SD = 75.51; p = 0.075) and for number of questions answered correctly for PL3 year (Fail: n = 24, M = 34.75, SD = 68.07; Pass: n = 47, M = 48.83, SD = 59.22; p = 0.065), where those who passed had higher values for both questions answered and questions answered correctly. There were no other statistically significant differences for PL years, academic years, and totals for PL years with regard to pass/fail determination on board scores (data not shown).
For this particular residency training program, moving average ITE standard scores were 247.5 in 2004, 287 in 2005, and 318 in 2006. National moving average ITE standard scores for these years were 376, 368, and 362, respectively. Based on ITE performance, residents were stratified into risk groups; those with low ITE scores were identified as "at risk" for poor performance on the board certification examination. Based on performance on the most recent ITE, risk levels were assigned to residents with ITE scores of ≤ 300, ≤ 250, and ≤ 200. As shown in Table 5, Spearman correlation analyses were conducted for "at-risk" individuals for the board-style question variables for PL years that were either significant or approached significance with standard board scores. For those with ITE scores ≤ 300, there were no significant correlations. For those with ITE scores ≤ 250, number of questions answered in PL3 year, number of questions answered correctly in PL3 year, and total number of questions answered correctly for all PL years had significant positive correlations with standard board scores. Also, total number of questions answered for all PL years approached significance for a positive correlation with standard board scores. For those with ITE scores ≤ 200, number of questions answered in PL3 year, number of questions answered for all PL years, total number of questions answered correctly in PL3 year, and total number of questions answered correctly for all PL years all had significant positive correlations with standard board scores. The highest correlations were seen within this subset of individuals, with r values as high as 0.59.
As stated above, 19 of the 90 residents were excluded from the study for leaving the residency program prematurely or for not having taken the ABP Certifying Exam at the time of analysis. These 19 did not differ significantly from the 71 included in the analyses with regard to gender (p = 0.56), US medical school attendance (p = 0.60), offshore medical

Discussion
Our study suggests that answering board-style questions, and answering them correctly, are associated with higher standard board scores. This benefit exists for all residents but is especially prominent among those with poor in-training examination scores. Our results also suggest that answering correctly, as opposed to simply answering, confers slight additional benefit, given the greater significance levels and greater correlation values for answering correctly.
Two specific patterns were noted. First, participation in the electronic board review ("eBoard Review") course had medium to large significant correlations for "at-risk" residents with ITE scores of either ≤ 250 or ≤ 200. It is possible that either the knowledge acquired by completing these questions or participation in the program itself motivated these residents to study more on their own. Second, specifically for at-risk and PL3 residents, we noted significant patterns for a number of board score outcomes. Besides the motivational explanation mentioned above, either the increased knowledge level by PL3 year confers the most benefit from answering board-style questions, or increased interest by PL3 year, due to the upcoming board exams, confers benefit. Therefore, residents who were nearing program completion (PL3) and who had low ITE scores might have been particularly motivated to prepare for the certifying examination, thereby taking advantage of every educational opportunity, including the "eBoard Review" program. Because we did not have qualitative resident satisfaction survey data or formal measures of motivation, we can only speculate about the increased motivation of PL3 residents at risk for poor performance on the ABP Certifying Examination. Also, because the number of questions offered varied slightly between academic years, comparing the number of questions answered between PL years may not necessarily reflect motivational differences between the groups. A more formal study involving motivational measurement could be conducted in the future for all residents, PL3 residents, and "at-risk" residents.
Board-type review questions were disseminated by email so that all residents would be able to participate: residents located at different geographic clinical sites and residents with different on-call schedules. With three hospital-based sites, more than ten ambulatory sites, and more than twenty clinical rotations, scheduling educational activities for residents is particularly challenging. By using email, this educational program allowed wide dissemination of material and access for all residents, regardless of location and availability. Anecdotally, we found that residents often responded during nighttime hours while on call, thereby maximizing their educational opportunities during nontraditional working hours. Previous studies have demonstrated that traditional and Internet-based CME programs are equally effective. 11 Although we did not use an Internet-based program, we did use email to structure the "eBoard Review" program. This method of dissemination might have contributed to the impact of the board review program. Future studies could address resident satisfaction and compare performance outcomes across methods of board-type question dissemination.
Because the predictive validity of the ABP ITE for future ABP General Pediatrics Certifying Examination performance has been established, 3 pediatric residency program directors can use ITE scores to identify residents who are "at risk" for poor performance on the ABP Certifying Examination. Althouse and McGinnis presented five-year average passing rates for the 2001–2005 ABP certification exams by ITE score group; extrapolating from their data, third-year residents who score less than 200 on the ITE would have a 36.6% chance of passing the ABP Certification Examination. 3 To reflect the ITE reporting style used by the ABP, and to reflect how ITE scores are likely to be used by program directors, we also used overlapping risk group analysis. Developing educational programs specifically designed for these "at-risk" residents is very important both for the residents themselves and for their residency program directors. Residents with low ITE scores may be encouraged to participate in a board review course, such as the "eBoard Review" program, in an effort to improve their knowledge and exam performance.
Our study has a few limitations. First, the study was conducted at a single institution. To further evaluate the "eBoard Review" program's effectiveness and generalizability, the study should be replicated at other training institutions. Second, residents with high ITE scores were likely to perform well on the ABP certification examination regardless of whether they answered "eBoard Review" questions. This might account for the inability of this study to demonstrate a significant overall correlation between answering board-type questions and certification exam scores. Third, although all residents enrolled in the study, ongoing participation in the "eBoard Review" course was voluntary and self-directed. Consequently, consistency of participation and individual response rates varied widely, and a significant number of residents did not respond at all during a given academic year. Fourth, although all residents enrolled in the study and received the questions whether they responded electronically or not, a few residents reported anecdotally that they read the questions and critiques on a regular basis but did not respond via email. This form of participation was not accounted for in our study and could potentially alter the study's overall results. Fifth, four residents had not yet taken the ABP Certification Examination at the completion of the study; this selection bias could potentially alter our findings. Sixth, a significant number of "at-risk" residents with ITE scores less than 200 or 250 were excluded from the study. This can be explained by the fact that most of the 19 excluded residents left the training program prematurely due to academic difficulties and were therefore more likely to have lower ITE scores. This selection bias could also potentially affect our findings.
As of September 2009, the "eBoard Review" course continues and is now coordinated by chief residents. Because of positive resident feedback and faculty recognition of the program as an effective tool for ABP board exam preparation, participation in the course is now mandatory. Data continue to be collected, and the analysis is likely to be repeated in the future.
In conclusion, the "eBoard Review" approach of sending board-style questions by email appears to be useful to pediatric residents as preparation for the ABP General Pediatrics Certifying Examination. It is especially useful for pediatric residents with poor in-training examination scores, as they may achieve improved certification examination scores after completing such a program.
Shelov and Lita Aeder for their support and contributions to this project and the ''eBoard Review'' course since its inception in 2004. Thank you also to William Roberts, Ed.D and David Langenau, PhD for their review and contribution to the manuscript.