Using the Novel Musculoskeletal Emergency Medicine Assessment Tool: A Feasibility Study

Background and objective: While musculoskeletal (MSK) disorders account for a significant number of primary care and emergency department (ED) visits, there are widely recognized shortcomings and gaps in MSK education throughout medical training. Undergraduate medical education (UME) frequently fails to impart clinically relevant MSK knowledge, while many emergency medicine (EM) residency graduates report feeling unprepared to manage MSK complaints. Existing MSK assessments are not tailored to EM and may inaccurately assess specialty-specific MSK knowledge. The novel validated Musculoskeletal Emergency Medicine Assessment Tool (MEAT) holds great promise in standardizing EM MSK knowledge assessment. This feasibility study was conducted to assess the viability and practicality of using MEAT to evaluate MSK knowledge among incoming resident physicians in EM programs.

Methods: This feasibility study involved 21 incoming EM resident physicians from two programs at a single institution. MEAT was administered online during orientation, and demographic data and survey metadata were collected. UME MSK education details were obtained, and MEAT scores were analyzed.

Results: Participants reported no difficulties in accessing or understanding the 50-question online MEAT, resulting in a 100% response rate. The average pretest score for all interns was 29.9 (out of a possible 50), with a median of 30. Most participants had documented UME MSK education, but curricular content varied widely. The participants took an average of 32 minutes to complete the assessment.

Conclusions: MEAT demonstrated successful implementation and high response rates, suggesting a high level of feasibility. The tool can be used to assess baseline MSK knowledge and ultimately track progression during residency, with the potential for evaluating educational interventions once further validation studies have been performed. Further adoption of MEAT across multiple EM residency programs will help to enhance the tool's generalizability.


Introduction
Musculoskeletal (MSK) disorders account for nearly 20% of primary care and emergency department (ED) visits annually [1]. Despite this prevalence, there are several documented lacunae in MSK education across all levels of medical training and practice [2][3][4][5]. Several studies have shown that undergraduate medical education (UME) often falls short in providing clinically relevant MSK knowledge, including diagnosis and management of common MSK conditions [2,3,6]. A survey revealed that only 54% of medical students felt their MSK education was adequate [7]. Furthermore, only 83% of medical schools require pre-clinical MSK education, and a mere 15% mandate an MSK clerkship [8]. This variability in UME MSK curricula underscores the challenge of defining essential foundational MSK education.
Moreover, emergency medicine (EM) residency graduates have reported feeling inadequately prepared to manage MSK complaints [9]. A national needs assessment of EM residencies in 2021 highlighted that most respondents felt their curriculum could be improved and expressed a desire to have a standardized MSK assessment specific to EM training [10]. Existing validated MSK examinations, such as the Freedman and Bernstein assessment (FB-MSK), were developed for medical students [2] and are potentially inadequate for EM residents due to their non-specialized nature and the subjective grading involved. Scoring on these assessments varies significantly based on the specialty of the test taker and who is grading the test [11][12][13].
Significant variability is noted in scores depending on the rotations a student participated in as well; e.g., those with an orthopedic rotation scored 60.3% compared with 45.1% for those without [11]. When testing residents, primary care trainees scored an average of 56.3%, EM residents 77.5%, and orthopedic residents 90%. This raises the concern that this tool may not be adequately relevant and effective for assessing MSK knowledge in non-orthopedic residents and other graduating medical students [3]. The MSK30 is another validated MSK assessment tool that uses multiple choice questions (MCQs) to mitigate the element of subjective grading, but it is recommended primarily for medical schools and primary care residencies [3]. Additionally, neither the FB-MSK nor the MSK30 has been described as being administered in an online format.
We recently developed and validated the Musculoskeletal Emergency Medicine Assessment Tool (MEAT), an MCQ-based tool focusing on MSK knowledge relevant to EM and designed for online administration and distribution [14]. To our knowledge, MEAT is the first assessment tool specifically validated for EM resident physicians. MCQ assessments like MEAT and MSK30 offer advantages such as immediate scoring and reduced grading subjectivity [2,3,14]. MEAT can be distributed through a variety of methods, including email, text messaging, or via a QR code. It can be formatted to be taken on a personal computer or handheld device. Although MCQ assessments such as MEAT have certain limitations, they are widely used in medical knowledge evaluations [3,15-17]. In addition, reports show that MCQ assessments are frequently administered online without any compliance or administration issues [16,17].
Using MEAT at various training stages enables EM residency programs to track the progression of EM-relevant MSK knowledge and evaluate the effectiveness of educational interventions, such as MSK rotations or workshops. However, there are no published trials of this assessment in EM residencies. In light of this, our feasibility study aims to assess incoming resident physicians at the onset of their graduate medical education (GME) using in-person electronic administration of MEAT.

Materials And Methods
This feasibility study was conducted in 2022 and involved all 21 incoming resident physicians from two categorical EM programs at a single institution. One program is affiliated with a level-one academic teaching hospital, and the other program is associated with a level-four community hospital.
Using the Qualtrics platform, the MEAT (see Appendices) was incorporated into a survey format and distributed to participants via an anonymous link during intern orientation. This assessment was completed before any formal GME MSK training, ensuring scores reflected baseline knowledge. To ensure the integrity and confidentiality of the data, each participant was assigned a randomized identification number (rID). This identifier was used for tracking scores and making future comparisons without compromising anonymity. Upon completion of the assessment, individual scores were automatically transmitted to the administrator. One point was assigned to each question, for a total possible score of 50.
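The scoring and anonymization scheme described above (one point per question out of 50, tracked by a randomized identifier) can be sketched in Python. This is an illustrative sketch only: the question labels, answer key, and functions below are hypothetical and do not reflect the actual Qualtrics export or the MEAT answer key.

```python
import secrets

TOTAL_QUESTIONS = 50


def assign_rid() -> str:
    """Generate a randomized identification number (rID) so scores can be
    tracked across administrations without compromising anonymity."""
    return secrets.token_hex(4)


def score_assessment(responses: dict[str, str], answer_key: dict[str, str]) -> int:
    """One point per correct answer; maximum possible score is 50."""
    return sum(1 for q, ans in answer_key.items() if responses.get(q) == ans)


# Hypothetical example: a key where every answer is "A", and a participant
# who answered correctly on the first 30 of 50 questions.
key = {f"Q{i}": "A" for i in range(1, TOTAL_QUESTIONS + 1)}
responses = {f"Q{i}": ("A" if i <= 30 else "B") for i in range(1, TOTAL_QUESTIONS + 1)}
print(assign_rid(), score_assessment(responses, key))  # score of 30 out of 50
```
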
Demographic data, details of the current GME program, UME location, degree type, and UME MSK curriculum information were collected during assessment administration (Table 1). Additionally, survey metadata, including the duration of each assessment, were captured for analysis. Summary statistics were computed using Microsoft Excel.
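The summary statistics reported in this study (mean and median scores) were computed in Excel; an equivalent computation is shown below as a minimal Python sketch. The score list is purely illustrative and is not the study data.

```python
import statistics


def summarize(scores: list[int]) -> dict[str, float]:
    """Summary statistics of the kind reported for MEAT pretest scores."""
    return {
        "n": len(scores),
        "mean": statistics.mean(scores),
        "median": statistics.median(scores),
    }


# Illustrative scores only (not the study data).
print(summarize([28, 30, 32, 29, 31]))  # n=5, mean=30, median=30
```
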

Results
The demographic data of the cohort are presented in Table 1. The 50-question MEAT was administered to all incoming resident physicians during intern boot camp with 100% compliance in the pretest. The average pretest score for all participants was 29.9, with a median of 30. On average, participants took 32 minutes to complete the assessment.

Discussion
The feasibility study of MEAT demonstrated a 100% response rate, indicative of successful implementation.
Previous online MSK studies have also shown reasonable response rates, though never reaching 100% [18]. The high response rate was likely attributable to in-person administration during orientation rather than remote administration. The absence of any reported difficulties in accessing, completing, or understanding MEAT suggests that participants can feasibly complete subsequent assessments. The nearly identical mean (59.8%) and median (60%) scores indicated minimal variability in responses. While no passing score has yet been established for MEAT, the obtained scores are below conventional passing thresholds, which aligns with the expectation for participants who had not yet undergone any GME activities. Furthermore, the participants took an average of 32 minutes to complete the test, and hence sufficient time should be allocated for its administration. Overall, these findings support the feasibility and potential utility of MEAT as an efficient and accessible tool for assessing MSK knowledge among incoming resident physicians in EM programs.
All participants had documented UME MSK education, with a median of 15% of the total curriculum being dedicated to MSK education. This closely aligns with the estimated 20% prevalence of MSK complaints seen annually in primary care and ED visits. However, the content and clinical relevance of UME MSK education varied significantly, ranging from 0 to 35% of the total curriculum, reflecting variability in MSK knowledge domains and clinical rotations. Although the UME curriculum must fit within the general competencies framework required by accrediting bodies such as the Liaison Committee on Medical Education (LCME) [19] or the American Osteopathic Association Commission on Osteopathic College Accreditation (COCA) [20], there are no specific guidelines on the MSK education content or the phase of UME in which it should be taught. Without such guidance, there is no standardization. Without standardization, tests such as the FB-MSK, MSK30, and MEAT may have to remain specialty-specific rather than generalizable to any medical student.
Following the successful feasibility study, our next step involves leveraging MEAT to establish multiple benchmark data points throughout EM residency training. Through a multi-institutional study involving various EM programs, we aim to generate comprehensive benchmarks for incoming and outgoing EM resident physicians. Educators can use these benchmarks to evaluate the efficacy of their programs on a broader scale. Additionally, they can assess the impact of a specific intervention, such as an MSK rotation or workshop, on resident MSK knowledge. The availability of these data points will enable educators to implement targeted interventions tailored to individual residents or entire programs and drive continuous improvement in MSK education and patient care outcomes within the field of EM.
This feasibility study has several limitations. The small sample size, comprising 21 participants from a single institution, limits the generalizability of the findings. The small sample size also reduces the statistical power to detect significant differences and may not accurately represent the broader population of EM residents across different institutions. Additionally, the study was conducted at a single institution, which may introduce site-specific biases and limit the applicability of the results to other settings with different educational environments and resources. Furthermore, the study did not establish a passing score for the MEAT, meaning that the current scores cannot be definitively interpreted as indicating competency or deficiency in MSK knowledge. We also did not assess the residents' perceived ease of use or satisfaction with the MEAT after completion. Feedback from participants regarding the usability and relevance of the assessment could provide valuable insights for future iterations and broader implementation. As a feasibility study, the primary goal was to assess the practicality and initial response to the MEAT rather than to evaluate its effectiveness comprehensively. Future studies with larger, more diverse cohorts and multi-institutional participation are necessary to validate the MEAT and establish its efficacy in assessing and improving MSK knowledge among EM residents.

Conclusions
This feasibility study demonstrated that MEAT can be implemented successfully, achieving a 100% response rate among incoming EM resident physicians. The tool can be used to assess baseline MSK knowledge and ultimately track progression during residency, with the potential for evaluating educational interventions once further validation studies have been performed. Further adoption of MEAT across multiple EM residency programs will help to enhance the tool's generalizability.

The figures mentioned in the table are presented below (Figures 1-8).

FIGURE 2: Anteroposterior Radiograph of the Wrist
Image courtesy of Ian Bickle, Radiopaedia.org

FIGURE 3: Longitudinal Ultrasound View of the Left Hindfoot
Image courtesy of Matthew Negaard

FIGURE 4: Lateral Radiograph of the Wrist
Image courtesy of Will Denq

FIGURE 6: Lateral Radiograph of the Digit
Case courtesy of Andrew Taylor, Radiopaedia.org

FIGURE 7: Photo of the Affected Digit
Image courtesy of Adam Rosh, Rosh Review

FIGURE 8: Anteroposterior Radiograph of the Shoulder
Case courtesy of Henry Knipe, Radiopaedia.org

TABLE 1: Demographic Data
MSK: musculoskeletal; PGY: postgraduate year

This project was reviewed and approved by the University of Arizona Institutional Review Board (IRB 1611010724, dated November 4, 2022). All participants provided informed consent, and data confidentiality was maintained throughout the study.

TABLE 2:
All 21 participants completed the assessment.