Impact of an Electronic App on Resident Responses to Simulated In-Flight Medical Emergencies: Randomized Controlled Trial

Background: Health care providers are often called to respond to in-flight medical emergencies but lack familiarity with the expected supplies, interventions, and ground medical control support.
Objective: The objective of this study was to determine whether a mobile phone app (airRx) improves responses to simulated in-flight medical emergencies.
Methods: This was a randomized study of volunteer, nonemergency resident physician participants who managed simulated in-flight medical emergencies with or without the app. Simulations took place in a mock-up cabin in the simulation center. Standardized participants played the patient, family member, and flight attendant roles. Live, nonblinded rating was used, with occasional video review for data clarification. Each participant managed two simulated in-flight medical emergencies (shortness of breath and syncope) and was evaluated with checklists and global rating scales (GRS). Checklist item success rates, key critical action times, GRS scores, and pre-post simulation confidence in managing in-flight medical emergencies were compared.
Results: There were 29 participants in each arm (app vs control; N=58). Mean percentages of completed checklist items for the app versus control groups were 56.1 (SD 10.3) versus 49.4 (SD 7.4) for shortness of breath (P=.001) and 58 (SD 8.1) versus 49.8 (SD 7.0) for syncope (P<.001). The GRS improved with the app for the syncope case (mean 3.14, SD 0.89 versus control mean 2.6, SD 0.97; P=.003) but not for the shortness of breath case (mean 2.90, SD 0.97 versus control mean 2.81, SD 0.80; P=.43). For timed checklist items, the app group contacted ground support faster in both cases, but the control group was faster to complete vital signs and a basic exam. Both groups indicated higher confidence on their postsimulation surveys, but the app group demonstrated a greater increase in this measure.
Conclusions: Use of the airRx app prompted some actions but delayed others. Simulated performance and feedback suggest the app is a useful adjunct for managing in-flight medical emergencies.


Does your paper address CONSORT subitem 2b?
Yes: "Epidemiologic evidence for in-flight medical emergencies (IFMEs) from a ground-based medical support system estimated that medical emergencies occur in 1 of every 604 flights [1]. This is likely an underestimate, because no mandatory reporting system exists and uncomplicated issues often go unreported [2]. Air travel is increasing, with 895.5 million passengers flying in 2015 [3], leading to an increased frequency of IFMEs. In one study, 42% of 418 healthcare providers surveyed reported being called upon to give aid in an IFME [4]. The Federal Aviation Administration (FAA) mandates that United States-based airlines carry basic first-aid kits stocked with bandages and splints, and at least one automated external defibrillator (AED) must be available [5]. Beyond the basic kit, no national or international standards exist, though there have been recent calls for consistency [6,7]. Healthcare personnel are also unlikely to be familiar with medical kit contents, flight crew communication, and medical emergency protocols [4]. Clinicians' expertise typically consists of their specialty training and life support courses. Emergency response training is often limited, as emergency medicine is not a mandatory rotation in medical education [8]. Though helpful, ground-based medical consultation support services (ground medical control) still depend on volunteers to be their "eyes and ears" [9,10]. The assumption is that volunteers will find and report clinical information relevant to the presenting medical emergency [10]. Comfort attending to an IFME is likely to vary substantially across provider backgrounds. Thus, there is a need for education about the environment and scenario-based basic IFME response training. In recent months, the aviation and healthcare industries have recognized this and called for education in emergency stabilization and flight medicine at both graduate and undergraduate levels [12,13].
Although several authors have discussed the management of in-flight emergencies [14-19], little real-time decision support exists outside of ground medical control. Normal emergency response smartphone applications (apps) or cognitive aids may not take the environment into account. In response to this perceived need, a smartphone app was designed by emergency, aerospace medicine, and radiology physicians (airRx) [20] to assist licensed healthcare personnel in dealing with the most common IFMEs. The app offers complaint-specific recommended actions, care algorithms, and in-the-moment information regarding the likely available medications. While serving as a real-time decision support reference, the app also provides a method of just-in-time training (JITT) [21]. Pertinently, the JITT approach has been successful in on-the-job training for first responders in unfamiliar situations [22]. Studies have also shown that smartphone-based cognitive aids promote adherence to protocols in both real and simulated clinical scenarios [23-25]. A JITT-based smartphone cognitive aid/application is therefore a reasonable approach to delivering focused learning during an IFME. The objective of this study was to determine the usefulness of the airRx smartphone application in responding to simulated IFMEs. Our secondary objective was to examine whether access to the airRx app would increase confidence to respond to an IFME."

METHODS

3a) CONSORT: Description of trial design (such as parallel, factorial) including allocation ratio
Yes: "This was a prospective randomized control trial. Fifty-eight subjects were block randomized by post-graduate year and specialty area to simulated IFMEs with and without access to a smart device application."

3b) CONSORT: Important changes to methods after trial commencement (such as eligibility criteria), with reasons
No changes to methods were made.

3b-i) Bug fixes, Downtimes, Content Changes
Not applicable

4a) CONSORT: Eligibility criteria for participants
Yes: "Subjects were solicited from non-emergency medicine residency programs including Diagnostic Radiology, Family Medicine, Internal Medicine, Pediatrics, Psychiatry, Combined Medicine-Pediatrics, and Obstetrics and Gynecology. Emergency medicine residents were excluded given their expertise and training in management of emergencies. Subjects' performances were kept confidential. They were compensated through a $25 gift card and a copy of the airRx application at no cost to them. Subjects were instructed to keep the scenarios confidential to minimize the relay of scenario information to future participants."

No, this was not deemed relevant for this study.

4a-ii) Open vs. closed, web-based vs. face-to-face assessments:
Yes: "Subjects were solicited from non-emergency medicine residency programs including Diagnostic Radiology, Family Medicine, Internal Medicine, Pediatrics, Psychiatry, Combined Medicine-Pediatrics, and Obstetrics and Gynecology. Emergency medicine residents were excluded given their expertise and training in management of emergencies. Subjects' performances were kept confidential. They were compensated through a $25 gift card and a copy of the airRx application at no cost to them. Subjects were instructed to keep the scenarios confidential to minimize the relay of scenario information to future participants."

4a-iii) Information giving during recruitment
Yes: "All subjects were pre-briefed via a standardized script. Both the control and intervention groups were allowed to use any other phone apps they had on their personal smart device that would be accessible during airplane mode. The application group had up to 15 minutes to familiarize themselves with the app. Both groups were aware that the simulation topic was IFME. Scenarios began with the subjects sitting in the simulated cabin with a brief pause before flight attendants announced the IFME and called for assistance."

4b) CONSORT: Settings and locations where the data were collected
Yes: "The study took place at a university hospital affiliated simulation center. Space and movement limits that mimicked the floor distances of a Boeing 737 aircraft were created within a simulation lab with audiovisual recording capability."

4b-i) Report if outcomes were (self-)assessed through online questionnaires
Yes: "We also created pre-post simulation surveys for participants to self-assess their readiness for IFMEs, knowledge of resources, medico-legal concerns, crew integration, IFME communications process, and willingness to respond. Surveys were pilot tested for clarity, and usability questions (app group only) were derived from a previously developed technology usability survey"

4b-ii) Report how institutional affiliations are displayed
Not relevant

5) CONSORT: Describe the interventions for each group with sufficient details to allow replication, including how and when they were actually administered
5-i) Mention names, credentials, affiliations of the developers, sponsors, and owners
Yes.

5-ii) Describe the history/development process
This was not relevant to this study.

5-iii) Revisions and updating
Not applicable

5-iv) Quality assurance methods
Not applicable to this study

5-v) Ensure replicability by publishing the source code, and/or providing screenshots/screen-capture video, and/or providing flowcharts of the algorithms used
Not applicable

5-vi) Digital preservation
Not applicable

5-vii) Access
Yes: "All subjects were pre-briefed via a standardized script. Both the control and intervention groups were allowed to use any other phone apps they had on their personal smart device that would be accessible during airplane mode. The application group had up to 15 minutes to familiarize themselves with the app. Both groups were aware that the simulation topic was IFME"

5-viii) Mode of delivery, features/functionalities/components of the intervention and comparator, and the theoretical framework
This item was not deemed relevant. The app was not created by the authors; they merely tested its utility.

5-ix) Describe use parameters
Not important

5-x) Clarify the level of human involvement
The app is meant to be used for in-flight medical emergencies. "The simulation center has a cadre of standardized participants (SPs) who undergo general and scenario-specific orientation. The SPs went through dry runs of each scenario, received feedback on their performance, and were given ear buds for prompts in real-time. In each scenario there was one SP passenger who became ill and one SP passenger bystander who had relevant information if asked. Stable actor cohorts played these roles. Pathologic physical exam findings were given on cue through pre-written cards from the bystander SP, as healthy patient SPs could not mimic symptoms such as wheezing. For each case there were also two SP flight attendants who communicated with the investigators in the simulation control room ("pilot" and "ground medical support"), and relayed responses to the participants. Real flight attendants trained SPs to portray flight attendant roles through direct observations of their performance in pilot simulations, video review, and discussion of planned responses to questions. In order to isolate subject performances, we instructed the flight attendants to be helpful and follow directions, but wait to inform ground medical control until instructed. Thus, we controlled for variable airline protocols, flight attendant training, and individual responses expected in real life."

5-xi) Report any prompts/reminders used
No prompts were used to elicit subjects to use the app.

5-xii) Describe any co-interventions (incl. training/support)
Not applicable to this study.

6a) CONSORT: Completely defined pre-specified primary and secondary outcome measures, including how and when they were assessed
Yes: "The main measures assessed were subject CL completion rates, GRS, time to critical actions, and pre-post simulation confidence surveys."

6a-i) Online questionnaires: describe if they were validated for online use and apply CHERRIES items to describe how the questionnaires were designed/deployed
Not applicable to this study.

6a-ii) Describe whether and how "use" (including intensity of use/dosage) was defined/measured/monitored
Not applicable

6a-iii) Describe whether, how, and when qualitative feedback from participants was obtained
Not applicable

6b) CONSORT: Any changes to trial outcomes after the trial commenced, with reasons
Yes: "The study took place at a university hospital affiliated simulation center. Space and movement limits that mimicked the floor distances of a Boeing 737 aircraft were created within a simulation lab with audiovisual recording capability."

7a) CONSORT: How sample size was determined
7a-i) Describe whether and how expected attrition was taken into account when calculating the sample size
"Sample size estimation was difficult due to unknown performance expectations, standard deviations, and effect sizes. However, we prospectively estimated our sample size to be 74 total, or 37 per group, to have an 80% chance (power = 0.80) of detecting a 20% improved performance overall in the CL, with an assumed standard deviation of 30%."

7b) CONSORT: When applicable, explanation of any interim analyses and stopping guidelines
Yes: "The main measures assessed were subject CL completion rates, GRS, time to critical actions, and pre-post simulation confidence surveys."

8a) CONSORT: Method used to generate the random allocation sequence
"Fifty-eight subjects were block randomized by post-graduate year and specialty area to simulated IFMEs with and without access to a smart device application."

8b) CONSORT: Type of randomisation; details of any restriction (such as blocking and block size)
"Fifty-eight subjects were block randomized by post-graduate year and specialty area to simulated IFMEs with and without access to a smart device application."
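The sample size stated under item 7a (37 per group, 74 total, for 80% power to detect a 20-point checklist improvement against an assumed SD of 30%) is consistent with the standard two-sample calculation. The sketch below is illustrative, not the authors' actual code: it uses the normal-approximation formula with hard-coded standard normal quantiles and a conventional +1 per-group adjustment toward the exact t-test answer; the function name and parameters are our own.

```python
import math

def n_per_group(delta, sigma, ):
    """Approximate per-group n for a two-sided two-sample t-test
    at alpha = .05 and power = .80, via the normal approximation."""
    z_alpha = 1.959964  # two-sided 5% critical value of N(0, 1)
    z_beta = 0.841621   # N(0, 1) quantile corresponding to 80% power
    n = 2 * (z_alpha + z_beta) ** 2 * (sigma / delta) ** 2
    # Round up, then add 1 as a common small-sample t correction
    return math.ceil(n) + 1

# Paper's assumptions: detect a 20-point improvement, SD 30
print(n_per_group(delta=20, sigma=30))  # 37 per group (74 total)
```

Here delta/sigma = 0.667 is a moderate-to-large standardized effect; the same formula with a medium effect (d = 0.5, e.g. delta=15, sigma=30) gives the familiar 64 per group.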