Ensuring competence of graduate trainees remains a paramount goal of all US APRN programs, including NAPs awarding Doctor of Nursing Practice (DNP or DrNP), Doctor of Nurse Anesthesia Practice (DNAP), and Doctor of Management Practice in Nurse Anesthesia (DMPNA) degrees [1]. The need for such competence in an increasingly complex health care system resulting from the “burgeoning growth” [4] of science and technology provided the foundation for the original AACN Position Statement on the Practice Doctorate in Nursing, and it was the perception that additional training would enhance patient outcomes that first defined the potential benefits of such programs [4]. Because oral examination assesses domains that correlate with clinical performance [13, 18], this form of evaluation may be useful not only as a benchmark prior to student immersion in clinical rotations but also as a test of whether the additional educational opportunities available in a practice doctorate education enhance student competence and patient safety. This concept is applicable to all APRN specialties, not just NAPs. As such, oral examination provides a means to test the hypothesis that such programs truly enhance patient-centric nursing practice.
DNP MOBE versus MN MOBE performance results
The most noteworthy finding of this study was that DNP students, after completing targeted additions to their curriculum, performed significantly better than recent MN students on their MOBEs in nearly all areas testing clinical analysis and fund of anesthesia knowledge. This observation is important because, although both MN and DNP cohorts achieved passing ratings, cognitive competence (like clinical competence in general) represents a continuum, and improved performance beyond a “pass” threshold has tangible value. In contrast with these areas of testing, both cohorts performed equally well in the domain of Communication. This evaluative section was included because of the vital role of communication by anesthesia providers in ensuring perioperative patient safety [19].
Because the major distinction between these cohorts related to differing didactic curricula (cohorts had similar professional backgrounds; they were separated chronologically by one year in the same institution with identical instructors), outcome differences most likely related to differences in their curricular preparation. Furthermore, relative increases in the mean test scores of DNP students were most notable for the three areas of maximum under-performance by the MN cohort – domain subsets IB, IC, and IIC (Table 3) – and improvements in these cognitive domains represented the expressed focus of changes implemented in the DNP curriculum. These results were significant despite the small size of the study groups and suggest that cognitive competence issues following completion of a didactic and simulation MN course of study (as identified by oral examination) can be effectively addressed by curricular modifications instituted as part of a robust DNP program.
Targeted changes in curriculum were made possible by an expanded DNP program (36 months vs. 27 months) that allowed two new courses to be introduced for improving clinical analysis and fund of knowledge in specific areas. Time constraints in the MN program did not allow for these courses, and there were fewer opportunities for repetition of concepts throughout that course of study. The Selected Topics in Pathophysiology course was designed to enhance students’ understanding of how disease processes relate to perioperative clinical considerations. During the Anesthesia and Co-Existing Disease course, students repeatedly applied abstract reasoning to common adverse perioperative events, and presented this information in an organized, oral format – a skill that requires practice and is critical to professional development.
Teaching students in this manner to employ metacognitive approaches (directing students to think about their own thinking, including recognizing when they do not understand something) can be a powerful tool for learners and may play an important role in preventing errors in formulating differential diagnoses by monitoring and regulating reasoning [20, 21, 22]. A longer DNP program of study permitted effective curricular expansion, with the development of skills necessary for superior performance on the examination, including both critical thinking abilities and mental processing related to effective articulation of answers. Such cognitive competence represents an essential component of clinical competence [23], and similar expansion of the course of study in other APRN DNP programs has been suggested to improve safe patient care compared with MN-prepared graduates [2].
In interpreting the implications of this comparison, it is important to consider that the scores for the DNP cohort were linked to course grades, whereas student performances on the MN MOBE were used for feedback purposes only and carried no institutional consequences. Hence, it is possible that the improved ratings of the DNP cohort relate in part to this additional academic incentive [24]. On the other hand, the most notable improvements in the DNP cohort occurred in those cognitive domains targeted by changes in their modified course of study. This finding suggests that a significant contribution to their superior oral examination performance relates to those curricular modifications.
MOBE as a benchmark evaluation in a NAP
Another finding of this study concerns the ability of MOBE to function as a benchmark evaluation at a critical juncture in nurse anesthesia training, namely just before the transition from classroom teaching into clinical practice. The MOBE in this study was designed to meet the specifications of a good benchmark evaluation. These included selection of performance indicators that were (a) essential to professional success, (b) both qualitative and quantitative in nature, and (c) reproducible, to enable comparison with new performance occurring after initiatives arising from benchmarking had been implemented [25]. The scoring rubric in MOBE evaluated cognitive domains critical to competent professional conduct: clinical analysis, fund of anesthesia knowledge, and communication skills. Many of the domain subsets related to generation of precompiled responses and abstract reasoning associated with perioperative adverse event management, critical elements in dynamic decision making that are essential to safe anesthesia [19]. The scoring rubric contained a mix of parameters that were amenable to quantitative scoring (e.g., choice of appropriate monitors) and qualitative scoring (e.g., communication skills). Furthermore, the performance indicators employed by MOBE could reproducibly be re-evaluated to enable comparison between different cohorts, as demonstrated by the present study’s comparison of DNP and MN student groups.
Benchmark examinations should be both summative and formative [25] – they not only provide data on performance but are also designed for quality enhancement. By highlighting areas needing improvement, benchmark examinations are useful to define educational targets and objectives and allow discovery of approaches to ensure future excellence [25]. The MOBE used in this study originally was designed for precisely these purposes and successfully identified three areas of underperformance involving critical thinking in MN NAP students [17]. As a result, modifications were implemented in the new DNP NAP curriculum to address these areas of performance, and the significantly improved performance of the current DNP cohort on MOBE likely represents a validation of its formative function.
Limitations
An important limitation of this investigation relates to the small sizes of the cohorts. Nevertheless, significant differences in performance between the two groups were clear despite these small numbers – and the differences occurred most notably in “targeted” cognitive domains. This latter finding suggests that our conclusions likely remain valid despite this limitation.
Confounding variables in this study included (a) minor differences between the MOBE processes (Table 1), and (b) somewhat differing teaching experiences for the two study groups (in addition to the modifications in the DNP curriculum, which amounted to 6 months of additional preclinical education): because of the COVID-19 pandemic, the DNP cohort received its final year of didactic instruction almost entirely online and, at the time of the oral examination, had completed less high-fidelity simulation training and less clinical training (550 versus 880 hours) than the MN group. These factors, however, do not explain the performance improvements of the DNP cohort relative to the MN cohort: COVID-19 restrictions on in-person teaching likely reduced the quality of instruction (as faculty and students transitioned rapidly online without preparation) and thereby disadvantaged this group, as did their reduced hours in simulation training and in patient-care roles. Likewise, it is doubtful that the minor differences in study methodology biased outcomes significantly.