Research article | Open access

Implementation of a billable transitional care model for stroke patients: the COMPASS study

Abstract

Background

The COMprehensive Post-Acute Stroke Services (COMPASS) pragmatic trial compared the effectiveness of comprehensive transitional care (COMPASS-TC) versus usual care among stroke and transient ischemic attack (TIA) patients discharged home from North Carolina hospitals. We evaluated implementation of COMPASS-TC in 20 hospitals randomized to the intervention using the RE-AIM framework.

Methods

We evaluated hospital-level Adoption of COMPASS-TC; patient Reach (meeting transitional care management requirements of timely telephone and face-to-face follow-up); Implementation using hospital quality measures (concurrent enrollment, two-day telephone follow-up, 14-day clinic visit scheduling); and hospital-level sustainability (Maintenance). Effectiveness compared 90-day physical function (Stroke Impact Scale-16) between patients who received COMPASS-TC and those who did not. Associations of hospital and patient characteristics with Implementation and Reach measures were estimated with mixed logistic regression models.

Results

Adoption: Of 95 eligible hospitals, 41 (43%) participated in the trial. Of the 20 hospitals randomized to the intervention, 19 (95%) initiated COMPASS-TC.

Reach: A total of 24% (656/2751) of patients enrolled received a billable TC intervention, ranging from 6 to 66% across hospitals.

Implementation: Of eligible patients enrolled, 75.9% received two-day calls (or two attempts) and 77.5% were scheduled/offered clinic visits. Most completed visits (78% of 975) occurred within 14 days.

Effectiveness: Physical function was better among patients who attended a 14-day visit versus those who did not (adjusted mean difference: 3.84, 95% CI 1.42–6.27, p = 0.002).

Maintenance: Of the 19 adopting hospitals, 14 (74%) sustained COMPASS-TC.

Conclusions

COMPASS-TC implementation varied widely. The greatest challenge was reaching patients because of system difficulties maintaining consistent delivery of follow-up visits and patient preferences to pursue alternate post-acute care. Receiving COMPASS-TC was associated with better functional status.

Trial registration

ClinicalTrials.gov number: NCT02588664. Registered 28 October 2015.


Background

Stroke is the fifth leading cause of death in the United States (US) and the leading preventable cause of disability [1]. Significant improvements in acute stroke care over the past two decades have reduced mortality and increased the number of patients who need assistance transitioning back to their communities in the presence of complex comorbidity and residual deficits. Substantial evidence supports the use of stroke rehabilitation and secondary prevention after stroke [2], and growing evidence supports transitional care (TC) models, which have since become the standard of care in other countries [3, 4]. However, in the US, there is still no standard of care for stroke patients discharged home [5, 6]. Instead, post-acute care for stroke patients discharged home is fragmented, poorly coordinated, and lacking in continuity. To encourage providers to offer transitional care management (TCM) services to patients discharged home, the Centers for Medicare and Medicaid Services (CMS) created billing codes for reimbursement. However, TCM billing remains underutilized (7% in 2015) [7].

The large, pragmatic, cluster-randomized COMprehensive Post-Acute Stroke Services (COMPASS) Study examined the comparative effectiveness of a comprehensive TC model (COMPASS-TC) versus usual care for individuals discharged home following a stroke or transient ischemic attack (TIA). COMPASS-TC included evidence-based components of early supported discharge (ESD) [4, 8] and met CMS TCM reimbursement requirements. The primary results of the COMPASS trial have been described [9]. Importantly, we observed considerable heterogeneity in delivery of COMPASS-TC, which suggests that there are hospital-level factors that drive implementation and warrant further investigation.

Given the importance of healthcare redesign for more effective and sustainable chronic disease management, it is critical to understand which hospitals were most likely to deliver an intervention that meets TCM billing requirements and which patients were most likely to receive it. This will inform future efforts to implement TC and use TCM billing codes. The aim of this implementation study was to leverage the experiences of the COMPASS Study to evaluate the first phase of implementation of a billable TCM model for patients discharged home with stroke or TIA in the context of real-world clinical practice across the state of North Carolina (NC).

Methods

Study design and sites

The COMPASS study design and selection of hospitals are described in detail elsewhere [10, 11]. Briefly, hospitals were randomized to either the intervention (COMPASS-TC) or control (usual care) arm in Phase 1. In Phase 2, usual care hospitals crossed over to implement the intervention, and the original intervention hospitals attempted to sustain COMPASS-TC with limited study support.

In this paper we analyzed implementation of COMPASS-TC in the 20 hospitals randomized to the intervention arm in Phase 1, including 2751 patients enrolled July 2016 through March 2018 [12]. We evaluated patient and hospital characteristics associated with successful implementation. Our reporting follows the Standards for Reporting Implementation Studies (StaRI) guidelines for transparent and accurate reporting of implementation studies [13] and adheres to CONSORT guidelines.

Context: hospital engagement

As required by the Patient-Centered Outcomes Research Institute (PCORI), hospitals used their existing infrastructure, budget, and staffing to deliver the intervention. The study paid hospitals $50 per enrolled patient (plus $105 per patient who returned to the COMPASS follow-up clinic and received a care plan within 18 days of discharge) but did not pay for staff time to deliver the intervention. Hospitals also received $14,000 prior to implementation to offset costs of training and building infrastructure. These payments were not meant to cover costs. In line with pragmatic trial design and PCORI funding guidelines, financial assistance to hospitals was minimal [14]: the study did not cover intervention costs, including compensation for personnel delivering the intervention or the equipment and materials needed to deliver it. COMPASS-TC was integrated into patient care without additional personnel resources provided to hospitals [10].

Study sample

This implementation analysis included all data from patients enrolled in the Phase 1 intervention arm (n = 2751 events among n = 2689 patients). Patients were eligible for enrollment if they were aged 18 years or older, spoke English or Spanish, were diagnosed with ischemic stroke, hemorrhagic stroke (excluding subdural or aneurysmal hemorrhage), or TIA, and were discharged directly home [10].

COMPASS-TC as a billable TCM intervention

The COMPASS-TC intervention was designed to be consistent with CMS TCM reimbursement requirements [15,16,17], which require that the patient (a minimal sketch of a timing check against these criteria follows the list):

  • Transition from an inpatient setting (e.g., acute care hospital) to home;

  • Be of moderate- or high-complexity medical decision-making;

  • Receive communication (direct contact, telephone, or electronic) within two business days of discharge or two or more documented attempts within two business days; and

  • Have a face-to-face clinical visit within seven calendar days (for CPT Code 99496) or within 14 calendar days (for CPT Code 99495) of discharge from the inpatient setting [17].
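Because these timing rules drive both the Reach definition and the quality measures described later, it can help to see them written out concretely. The following is a minimal sketch, not study code: the field names and business-day convention are assumptions for illustration, and the complexity-of-medical-decision-making requirement is not modeled.

```python
from datetime import date, timedelta
from typing import Optional

def business_days_between(start: date, end: date) -> int:
    """Count business days (Mon-Fri) after `start`, up to and including `end`."""
    days, d = 0, start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            days += 1
    return days

def tcm_timing_met(discharge: date,
                   contact_or_attempts: Optional[date],
                   visit: Optional[date]) -> dict:
    """Check the timing portion of the TCM requirements listed above.

    `contact_or_attempts` is the date the interactive contact was completed
    (or the date by which two documented attempts were made); `visit` is the
    date of the face-to-face visit. These are hypothetical fields, not names
    from the COMPASS data dictionary.
    """
    contact_ok = (contact_or_attempts is not None and
                  business_days_between(discharge, contact_or_attempts) <= 2)
    days_to_visit = (visit - discharge).days if visit is not None else None
    return {
        "two_business_day_contact": contact_ok,
        "visit_within_7_days_cpt_99496": days_to_visit is not None and days_to_visit <= 7,
        "visit_within_14_days_cpt_99495": days_to_visit is not None and days_to_visit <= 14,
    }

# Example: discharged on a Friday, contacted the following Monday
# (2 business days), and seen in clinic 10 calendar days after discharge.
print(tcm_timing_met(date(2017, 3, 3), date(2017, 3, 6), date(2017, 3, 13)))
```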

The foundational component of the COMPASS-TC intervention was post-acute care coordination and management by a registered nurse (RN) post-acute care coordinator (PAC) and an advanced practice provider (APP), defined as a nurse practitioner (NP), physician assistant (PA), or physician. The PAC and/or APP contacted the patient in person before hospital discharge to home, by phone two days post-discharge (or made two attempts), and saw the patient in person at the COMPASS-TC clinic visit within 14 days of discharge. During the pre-discharge visit, the PAC/APP described COMPASS-TC to the patient and family. The hospital-based team utilized an electronic TC planning tool developed by the study team to systematically evaluate patients [15] during both telephone and face-to-face follow-up. During the two-day call, the PAC/APP asked whether any cognitive or physical deficits had become apparent at home, completed medication review and reconciliation, referred the patient to home health or outpatient care if indicated, provided patient education on stroke symptoms, and reminded the patient of the upcoming clinic visit. Hospitals had flexibility in their COMPASS clinic setting (neurology clinic, hospital-based non-specialty clinic, or primary care provider (PCP) clinic). During the clinic visit, the PAC/APP performed a standardized assessment of the patient’s functional status, medical and neurological care needs, social determinants of health, and the caregiver’s capacity for assisting the patient during recovery [15, 16]. The results of the comprehensive assessment were captured electronically and generated an individualized electronic care plan (eCare Plan) that was shared with the patient, caregiver, PCP, and home health and outpatient therapy providers where applicable. The eCare Plan identified areas of patient/caregiver need and directed the PAC/APP to appropriate referrals to relevant community-based services (e.g., caregiver support services, medication management). Each hospital, in partnership with the study team, assembled a community resource directory of available services in their county to populate the eCare Plan, and a network of service providers in their area (community resource network) to support these referrals.

Implementation strategies

An implementation strategy is defined as an activity that facilitates adoption, implementation, and sustainability [18]. We utilized a number of implementation strategies [19] prior to hospital implementation and during active implementation of COMPASS-TC (Additional file 2: Table S1). After hospitals were randomized, training for intervention hospitals consisted of: a two-day intensive training “boot camp” (to explain the care model), a six-hour site visit (to tailor implementation, identify available community resources, and build community resource networks (CRNs)), bi-monthly peer problem-solving calls, monthly data feedback on performance, and one-on-one same-day consulting as requested. Training and ongoing consultation were provided to all intervention hospitals by the study’s Director of Implementation, who had both clinical and administrative experience, and a team of multidisciplinary providers with experience in stroke care. All training materials were approved by stakeholders, including patients and caregivers. Educational and training modules were made available online, and monthly educational webinars were provided on a variety of stroke/TIA topics relevant to clinical practice.

RE-AIM framework & measures

To assess implementation, we used the RE-AIM framework, the most widely used framework in implementation science for evaluation [20]. RE-AIM evaluates the Reach, Effectiveness, Adoption, Implementation, and Maintenance of health promoting interventions in real-world, complex settings, attending to both individual and organizational levels of impact, to address questions of translation into practice and generalizability. This framework focuses on features of the settings and participants, features of the implementers, and the frequency and intensity of intervention activities [13] and thus is suitable for process evaluations.

Reach assesses who received the intervention [21] and is a patient-level measure of participation [12]. Randomization occurred at the hospital level. The study met criteria for a waiver of consent and HIPAA Authorization; therefore, all patients were enrolled automatically and given opportunities to decline participation in the outcomes survey or to withdraw from the research study entirely [22]. Hospital staff screened, identified, and enrolled eligible patients and initiated the intervention. All patients in the study were theoretically eligible for TCM at discharge, as they all transitioned from the hospital directly home with a diagnosis of stroke or TIA, which meets the criteria for moderate- or high-complexity medical decision-making. Thus, successfully “reached” patients were those whose care met all TCM billing requirements, including receipt of a call (or two attempts) within two business days of discharge and attendance at a clinic visit (which includes an eCare Plan) within 14 calendar days of discharge. Reach was calculated as the proportion of reached patients out of the total enrolled by hospital staff.
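As a concrete illustration of this calculation, the sketch below computes hospital-level Reach from a toy table; the column names are hypothetical and not taken from the study database.

```python
import pandas as pd

# Toy enrollment records; column names are illustrative only.
enrolled = pd.DataFrame({
    "hospital_id":          ["A", "A", "A", "B", "B"],
    "two_day_call_met":     [True, True, False, True, False],
    "visit_within_14_days": [True, False, False, True, False],
})

# A patient is "reached" only if every TCM billing component was met.
enrolled["reached"] = enrolled["two_day_call_met"] & enrolled["visit_within_14_days"]

# Reach = reached patients / all patients enrolled by hospital staff, per hospital.
reach_by_hospital = enrolled.groupby("hospital_id")["reached"].mean()
print(reach_by_hospital)  # A: 0.33, B: 0.50 in this toy example
```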

Effectiveness assesses what effect the intervention had on important outcomes [21]. The primary outcome for the COMPASS Study was physical function measured with the Stroke Impact Scale-16 (SIS-16) [23]. The Effectiveness metric was defined as within-hospital mean differences in SIS-16 between patients receiving a 14-day clinic visit versus not. A comparison of COMPASS-TC versus usual care has been presented elsewhere [9].

Adoption assesses where the intervention was implemented [21]. The Adoption metrics were: the number and proportion of hospitals that initiated the intervention, number and characteristics of intervention agents (clinical team delivering the intervention: PAC, PAC back-up, APP, and APP back-up) that were trained, time it took to launch after training, and number of intervention agents that delivered the intervention.

Implementation assesses how each component of the intervention was delivered [21] as intended, including time needed for the implementation [12] of the clinic visit, which is the core of the COMPASS-TC model. The Implementation metrics were defined as:

  1. Fidelity to each of the components of the intervention, measured with quality measures created as part of the COMPASS Study to provide real-time feedback to hospitals (see Appendix for details):

  • Proportion of patients identified and enrolled within 2 days,

  • Proportion of patients who had a two-day call or documentation of two attempts (i.e., delivered per protocol),

  • Proportion of patients scheduled/offered a clinic visit within 14 days, even if the patient did not attend (i.e., delivered per protocol),

  • Proportion of all completed visits occurring within 14 days, and

  • Proportion of all completed visits during which an eCare Plan was generated.

  2. Clinic visit duration (minutes).

  3. Number of days from discharge to clinic visit among participants who attended a visit at any time.

Maintenance assesses how long the intervention was sustained. The maintenance metric was defined as the absolute number, proportion, and characteristics of hospitals that continued to enroll patients and deliver components of COMPASS-TC for a minimum of 6 months after the end of Phase 1.

Data collection

Process data, such as dates of completion and reasons for not completing a study-related task, were used to compute quality measures. These were collected during the study by hospital staff using forms embedded in a web-based application.

Hospital characteristics were obtained from public data files (e.g., Rural-Urban Commuting Area (RUCA) code website [24], Joint Commission website [25], NC Stroke Care Collaborative hospital characteristics, Medicare Provider of Services files) [26], and baseline hospital surveys [10, 27], which ascertained staff turnover, dates of training, and clinic locations. Additional surveys administered approximately 6 months into the study captured clinic visit duration, organizational readiness, and partnership synergy. Organizational readiness was measured with the Organizational Readiness to Implement Change (ORIC) Scale [28], which measures change commitment and change efficacy. ORIC is a validated instrument that was administered to the intervention agents (the clinical team delivering the intervention: PAC, PAC back-up, APP, and APP back-up). The Partnership Synergy Scale measured the level of engagement between the clinical team and their CRN, as a proxy for how much the community network assisted PACs with linking patients to community-based social services [29]. This is a validated instrument and was administered to intervention agents, the site principal investigator, hospital leadership, and community network members (e.g., engaged community-based pharmacists, social service providers, and rehabilitation providers supporting the PAC in linking patients to needed services and resources outside the hospital).

Patient demographic and clinical characteristics were obtained from electronic health records and recorded by hospital staff. Patient addresses were geocoded using ArcMap 10.5.1 and the World Geocoding Service. Three (0.1%) patients did not have home addresses and were not geocoded. Shortest distance to the COMPASS clinic visit location was computed using OpenStreetMap and ArcGIS Network Analyst.

Patient outcomes were obtained at 90 days by trained (blinded) interviewers administering a phone survey that included self-reported post-stroke physical function, measured with the Stroke Impact Scale-16 (SIS-16) [23]. The SIS-16 was selected as the primary patient outcome for the COMPASS study because of its strong psychometric properties, including validation for proxy, phone, and mail administration [30, 31]. It is a patient-centric measure, developed with input from patients, caregivers, and providers [23]. It was designed to capture significant residual deficits in mild and moderate stroke and is superior to other measures in capturing residual deficits that matter to patients. Even among those with the mildest strokes (NIH Stroke Scale score 0 to 5), only 10% report full function on the SIS-16. The 16 items measure ADLs, IADLs, and physical activities on a scale of 0–100, with higher scores indicating higher function. Patient outcomes were collected by interviewers at the Carolina Survey Research Laboratory. Interviewers were trained on study-specific protocols that incorporated patient feedback obtained during pilot testing. Interviewers were blinded to randomization arm and administered all interviews using standardized computer-assisted telephone interviewing software and scripts [10]. Interviewers were monitored biweekly by their supervisors for quality control.

Hospital audits: Hospitals participated in two unannounced case ascertainment audits, each covering a 2-month period, to evaluate the proportion of eligible cases that hospitals correctly identified and enrolled.

Consent: Institutional review board (IRB) approval was received through Wake Forest University Health Sciences (central IRB) or through local hospital IRBs. At 90 days, patients provided informed consent [22] for collection of outcomes. Additionally, participants consented to collection of process measures, including the ORIC and Partnership Synergy Scale, via email survey.

Statistical analyses

Descriptive statistics were used to summarize each RE-AIM domain. Characteristics of enrolled patients who were reached versus not were compared using Fisher’s exact and Wilcoxon rank sum tests. Generalized linear mixed models (GLMMs) were used to evaluate patient characteristics associated with clinic visit attendance (a primary component of patient Reach) and were adjusted for clinic setting, distance to the clinic, and organizational readiness. Analyses were performed conditional on successful implementation of the 14-day visit (i.e., a visit being scheduled or offered to the patient).
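The statistical software used for these models is not described in this section; as one possible rendering, the sketch below fits a hospital random-intercept logistic model for clinic visit attendance using statsmodels' Bayesian mixed GLM. The file name, variable names, and use of a variational Bayes approximation are all assumptions for illustration, not the study's actual code.

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical patient-level analysis table, restricted to patients who were
# scheduled or offered a visit (file and column names are assumed).
df = pd.read_csv("compass_patients.csv")

model = BinomialBayesMixedGLM.from_formula(
    "attended_14d ~ age + C(race) + C(gender) + C(diagnosis) + stroke_severity"
    " + prior_stroke_tia + cvd_comorbidity + distance_to_clinic"
    " + C(clinic_setting) + org_readiness",
    vc_formulas={"hospital": "0 + C(hospital_id)"},  # random intercept per hospital
    data=df,
)
result = model.fit_vb()  # variational Bayes approximation to the mixed model
print(result.summary())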

Associations between hospital characteristics and both Reach and Implementation quality measures were estimated with GLMMs, adjusted for patient characteristics including age, race, gender, diagnosis, stroke severity, history of stroke or TIA, presence of at least one cardiovascular comorbidity (i.e., cardiovascular disease, heart failure, atrial fibrillation, or diabetes), and distance to the COMPASS-TC follow-up clinic.

Linear mixed models were used to estimate overall and hospital-specific mean differences in SIS-16 [23] between patients who received a visit and eCare plan within 14 days compared with those who did not receive a visit (Effectiveness). These models were adjusted for age, race, stroke severity, diagnosis, and a log-transformed patient-specific propensity score. Propensity scores were constructed to account for differences in patient characteristics of those who did and did not attend the clinic visit. Propensity scores were estimated with conditional logistic regression and incorporated information such as medical history and comorbidity [9].
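A minimal sketch of this kind of model, assuming a pandas table with hypothetical column names, is shown below; the paper's propensity-score construction and hospital-specific estimates are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical 90-day outcomes table; 'visit_14d' flags receipt of the 14-day
# visit and eCare plan, 'ps' is the patient-specific propensity score.
df = pd.read_csv("compass_outcomes.csv")
df["log_ps"] = np.log(df["ps"])

model = smf.mixedlm(
    "sis16 ~ visit_14d + age + C(race) + stroke_severity + C(diagnosis) + log_ps",
    data=df,
    groups=df["hospital_id"],   # random intercept for each hospital
    re_formula="~visit_14d",    # hospital-specific deviations in the visit effect
)
fit = model.fit()
print(fit.summary())  # coefficient on visit_14d: adjusted mean difference in SIS-16
```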

Finally, p-values for associations between hospital characteristics and Maintenance were obtained using Fisher’s exact and Wilcoxon rank sum tests.
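Both tests are readily available in scipy; the sketch below uses invented counts and scores purely to show the calls, not study data.

```python
from scipy import stats

# Illustrative 2x2 table: a binary hospital characteristic (yes/no) among
# sustaining vs. non-sustaining hospitals (counts are invented).
table = [[10, 4],
         [2, 3]]
odds_ratio, p_fisher = stats.fisher_exact(table)

# Illustrative continuous characteristic (e.g., partnership synergy scores).
sustaining = [4.5, 4.1, 3.9, 4.3]
non_sustaining = [3.0, 2.8, 3.4]
stat, p_wilcoxon = stats.ranksums(sustaining, non_sustaining)

print(p_fisher, p_wilcoxon)
```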

Missing data: Some covariate data were incompletely ascertained (e.g., 2% missing stroke severity). Multiple imputation with chained equations [32] was used to construct 100 complete datasets that were analyzed with GLMMs as described above, and estimates were combined using standard techniques [33].
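One way to set up chained-equation imputation in Python is sketched below; it simplifies the study's approach by pooling a single-level logistic model rather than a GLMM, and all file and variable names are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation.mice import MICE, MICEData

# Hypothetical analysis table with some missing covariate values
# (e.g., ~2% missing stroke severity).
df = pd.read_csv("compass_patients.csv")
imp = MICEData(df[["attended_14d", "age", "stroke_severity", "distance_to_clinic"]])

# Fit the analysis model on each imputed dataset and pool with Rubin's rules.
mice = MICE("attended_14d ~ age + stroke_severity + distance_to_clinic",
            sm.GLM, imp, init_kwds={"family": sm.families.Binomial()})
results = mice.fit(n_burnin=10, n_imputations=100)
print(results.summary())
```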

Results

The extended CONSORT diagram [34] provides summary information about each RE-AIM dimension in temporal order, beginning with adoption (Fig. 1). Taken together, hospital-level adoption of the intervention was moderate among all eligible hospitals, and high among enrolled hospitals. Implementation and Maintenance were high, and the effectiveness of treatment was largely consistent across hospitals; however, Reach was low (Fig. 2). Results for each RE-AIM dimension are described in detail below.

Fig. 1 Extended CONSORT Flow Diagram

Fig. 2 At-A-Glance Summary of Implementation of COMPASS-TC. Effectiveness is shown as the unadjusted mean Stroke Impact Scale (SIS)-16 score (on a scale of 0–100) among patients who received COMPASS-TC within 14 days. These patients had an average adjusted score approximately 4 points higher than patients with no visit

Adoption

Ninety-five hospitals were eligible and invited to participate, of which 41 (43%) signed a letter of agreement to participate, received IRB approval, and were randomized in 40 units. Details of hospital recruitment have been published [11]. Participating hospitals, compared to non-participating hospitals, were more likely to be Primary Stroke Centers (59% vs. 41%), located in metropolitan areas (54% vs. 45%), and have high stroke volume (29% vs. 17% having ≥300 stroke discharges per year) [11].

Of the 20 hospitals randomized to the intervention arm, 19 initiated the intervention (95%). The non-adopting site was a rural, critical access hospital. Reasons for non-adoption included having a small number of stroke patients (reported average of 7 per year) and a change in executive leadership after the letter of agreement was signed.

All hospitals (20/20; 100%) completed both required trainings: an intensive two-day boot camp for intervention agents and a six-hour site visit for intervention agents and hospital and community stakeholders. The PAC role was filled by RNs except for one PA. The APP role was filled by NPs (N = 10), PAs (N = 7), and Neurologists (N = 2). Only 10 hospitals had a designated back-up PAC, and 5 had a designated back-up APP. At the on-site six-hour training, hospitals had a median of 7 hospital staff attendees (IQR 5–8).

Median time from training to launch was 60 days (range 25–190; IQR [39–74]). Mean full time equivalent positions to deliver the intervention was 2.15 (1.64 for low-volume [< 300 patients per year] hospitals and 2.83 for high-volume [300+ patients per year] hospitals).

Implementation

Clinic setting for in-person follow-up varied according to the availability of staffing in each hospital and changed over time for 21% (4/19) of hospitals. Neurology-based clinics (both in-hospital and ambulatory settings) were the most common (12 hospitals, 69% of patients). Five hospitals utilized hospital-based non-specialty clinics, and 7 utilized PCP-based clinics. The median hospital organizational readiness score was 4.3 and ranged from 1.95 to 5.0, with 5 indicating most ready. The median partnership synergy score was 3.8 and ranged from 2.25 to 5.0, with 5 indicating that the hospital staff and their CRN partners collaborated to a large extent to respond to patients’ needs when transitioning home.

Hospital performance on case ascertainment and enrollment: Within the 4-month audit period, 58% (796/1376) of eligible patients were enrolled [9], with wide variability (5 to 100%) across intervention hospitals (Fig. 3). Enrolled patients were typical of those discharged directly home after a stroke [9]. The most common reason hospital staff reported for not ascertaining and enrolling cases was COMPASS staffing challenges (e.g., shortages, turnover). Other reasons included patient discharge over the weekend or while the PAC was out of the office, or having been missed in screening (e.g., no presumptive stroke diagnosis, discharged directly from the emergency department). Among enrolled patients, 47% were provided an introduction to the study in person before discharge, while the remainder were contacted post-discharge by phone or mail. Fifty-eight percent of patients were scheduled for a follow-up visit before discharge.

Fig. 3 Hospital-Specific Case Ascertainment. Bars represent the proportion of eligible patients enrolled at individual hospitals over the 4 months of case ascertainment audits. The numbers of patients enrolled out of all eligible patients during the audit period are indicated above each bar

Hospital performance on quality measures: On average, 41.7% (n = 2751) of patients were screened, identified, and enrolled concurrent with care (within 2 days of discharge), and 75.9% received a two-day call (or two attempts). Further, 77.5% of eligible patients were scheduled for or offered a clinic visit. Among 975 completed clinic visits occurring at any time after discharge, 98.5% included an eCare Plan, 78% occurred within 14 days as required for CPT Code 99495, and 28% occurred within 7 days as required for CPT Code 99496 (Additional file 1: Figure S1). Shortly after implementation began, PACs estimated that, on average, they spent 43 min (SD 21 min) with patients at the clinic visit. Six months later, they reported an average of 29 min (SD 17 min).

In adjusted analysis, higher organizational readiness was associated with higher odds of scheduling a clinic visit (OR = 1.60; 95% CI 1.00 to 2.58; Table 1). High volume and urban location were associated with lower odds of scheduling a clinic visit. Neurology clinics had lower odds of scheduling than hospital-based non-specialty clinics or PCP-based clinics (Table 1).

Table 1 Associations Between Hospital Characteristics and Implementation and Reach

Reach

A total of 656 of 2751 (24%) patients enrolled at intervention hospitals received care under the COMPASS-TC model that met TCM billing code requirements, ranging from 6 to 66% across hospitals (Fig. 4). Not receiving a 14-day clinic visit, whether or not one was scheduled, was the primary reason patients were not reached (see the CONSORT diagram in Fig. 1 for reasons). Of those not reached, 977 (69%) received the two-day call but did not attend the clinic visit, 353 (25%) received neither the call nor the visit, and 93 (7%) had the visit but no call. Reach was lower at both PCP-based clinics and hospital-based non-specialty clinics compared with neurology-based clinics (Table 1). Urban hospitals had lower Reach compared with non-urban hospitals. Patients introduced to the intervention before discharge were more frequently reached than those notified post-discharge by mail or phone (34% vs. 15%, p < 0.0001), as were those with a clinic visit appointment scheduled prior to discharge versus not (32% vs. 12%, p < 0.0001).

Fig. 4 Proportion of Patients Meeting Transitional Care Management (TCM) Criteria by Hospital. Circles represent the 19 hospitals that adopted the intervention and are scaled to represent the total number of enrolled participants. Values on the y-axis represent the proportion of patients that met TCM billing criteria

Patient characteristics independently associated with clinic visit attendance when one was offered or scheduled included diagnosis with stroke versus TIA (OR = 1.64, 95% CI 1.29 to 2.08) and mild stroke severity (Table 2). Having a history of previous stroke or TIA, lack of insurance, living farther from the clinic, and urban residence were associated with lower odds of clinic visit attendance (Table 2). Age, gender, race, and presence of a cardiovascular-related comorbidity were not strongly associated with odds of clinic visit attendance in this model, which also adjusted for lack of insurance, urban versus rural residence, and other characteristics.

Table 2 Patient Characteristics Associated with Clinic Visit Attendance within 14 Days

Effectiveness

Mean SIS-16 among patients attending a visit within 14 days was 83.0 (SD 19.1) compared with 78.7 (SD 22.1) among those with no visit (adjusted mean difference 3.84 (95% CI 1.42 to 6.27, p = 0.002)). Hospital-specific estimates are shown in Fig. 5. The estimated difference in SIS-16 for patients who attended a clinic visit after 14 days (15–30 days) compared with non-attendees (5.35; 95% CI 0.76 to 9.94) was similar; however, we had limited sample size to evaluate visit timing.

Fig. 5 Within-Hospital Differences in Stroke Impact Scale (SIS)-16 between Patients Receiving a 14-day Visit and Those Who Did Not. Forest plot of hospital-specific estimates and 95% confidence intervals (CI). Linear mixed models included propensity scores to account for differences between patients receiving the intervention and those not receiving it. The dotted line indicates the overall estimate in treated versus non-treated patients. CI values beyond ±20 are indicated with arrows

Maintenance

Fourteen of 19 (74%) hospitals sustained COMPASS-TC beyond Phase 1 for at least 6 months. Sustaining hospitals had higher partnership synergy than non-sustaining hospitals (median 4.1 vs. 3.0, p = 0.0067). No other measured hospital characteristics (Primary Stroke Center status, annual stroke volume, geographic region, academic affiliation, adequate backup for PAC and APP, PAC turnover, APP turnover, clinic setting, organizational readiness) significantly differed by maintenance status.

Discussion

In the cluster-randomized pragmatic trial of COMPASS-TC, our analysis suggests that implementation of this comprehensive, evidence-based model of post-acute care that met TCM billing requirements was: (1) challenging; (2) multi-dimensional; and (3) impacted by both patient- and system-level factors. There was significant heterogeneity in hospital infrastructure, staffing, site performance, and the settings in which the intervention was delivered. Delivery of the care model in hospitals in rural areas and in hospitals with higher organizational readiness was associated with successful implementation. Within-hospital analysis revealed a clinically meaningful difference in physical function when COMPASS-TC was received according to CMS standards for TCM billing compared with non-receipt of COMPASS-TC. The treatment effect did not diminish when visits occurred beyond the 14-day TCM window.

We employed several unique strategies to enhance COMPASS-TC use. First, it was designed, implemented, and continuously refined in collaboration with a broad range of stakeholder groups (stroke survivors, family caregivers, clinicians, advocacy organizations, community-based services, hospitals and health systems, industry partners, payers, and policy makers), which encouraged patient-centeredness and system-level buy-in for adoption [35]. Second, unlike many standard research studies, hospitals in this pragmatic trial used their own infrastructure, budget, and staffing to deliver the intervention, with a small per-participant stipend from the coordinating center. Third, we monitored quality metrics and provided monthly reports on these metrics to aid hospitals in their implementation-related continuous quality improvement (QI) efforts. Performance indicators are a commonly used method for implementing improvements in stroke quality of care [36, 37]. By using performance indicators, we evaluated implementation at the same time as we evaluated the impact of the care model on patient outcomes. These measures are the first to be established for post-stroke community-based care and could be an important foundation for future post-acute care QI efforts.

The RE-AIM framework is a widely used and validated approach for assessing implementation [12]. Using RE-AIM, we found that, while hospitals overall had success with Adoption, Implementation, and Maintenance in COMPASS, Reach was low. System-level characteristics associated with successful delivery of COMPASS-TC included meeting the patient in the hospital and scheduling the clinic visit before discharge. Another critical system-level characteristic was staff turnover, which impeded consistent delivery of the intervention. Organizational readiness, a modifiable system-level characteristic [38, 39], was positively associated with both Implementation and Reach. These findings suggest that effective Implementation will most likely occur at hospitals that already score high on readiness for organizational change or that create readiness before implementation. We also found that greater partnership synergy was associated with better Reach. Neurology-based clinics had greater Reach than other types of clinics, possibly because patients were more inclined to seek a stroke-trained specialist after discharge. Future research should investigate the effect of clinic type on patient decision-making and whether these findings generalize outside of this study.

Patient factors also influenced whether patients were reached by the intervention. Patient preference to seek follow-up care with providers with whom they already had a relationship, often in primary care, was the most prevalent reason for clinic visit non-attendance. Stroke (versus TIA) diagnosis and having insurance were positively associated with clinic visit attendance, and history of stroke or TIA was negatively associated with attendance. Because the majority of patients in this study were seen in specialty rather than primary care clinics for follow-up, future implementation studies should identify strategies to address staffing and resource issues and to engage patients in specialty follow-up. Studies are also needed to identify the financial and staffing resources required to fully engage stroke patients in follow-up and to determine whether TCM reimbursement can support this level of care.

Other cluster-randomized trials of stroke interventions have been conducted, exhibiting different degrees of success with implementation and effectiveness. A trial of a stroke caregiver training strategy at discharge from the stroke unit in the United Kingdom (UK) showed no improvement in the primary outcome in the intention-to-treat analysis, similar to COMPASS, and the investigators noted variable uptake of the training strategy by the participating hospitals, with an average compliance of 44% [40]. A complex, highly effective intervention, ESD has been incorporated into standard care practices in the UK and Canada and has effectively reduced mortality and dependence among patients after mild to moderate stroke [41], but has not been adapted for the US healthcare system. In a qualitative assessment of implementation of ESD, there was clear buy-in and perceived benefits reported by inpatient and outpatient providers in both an urban and a semi-rural practice setting; however, challenges remained with integrating and streamlining care, specifically social care referrals [42]. An implementation analysis of a complex stroke rehabilitation trial, AVERT (A Very Early Rehabilitation for stroke Trial), also reported that successful implementation was due to interdisciplinary teamwork, education and stroke leadership to achieve buy-in, and developing different ways of working [43].

Our study has limitations. Small sample sizes limited our ability to precisely estimate associations between hospital and patient characteristics with Reach and Implementation of individual components of the intervention. Estimates of Effectiveness within intervention hospitals could be subject to unmeasured confounding, but propensity scores accounted for differences in important patient characteristics (e.g., stroke severity) between those treated and not treated within the same hospital. We are uncertain whether TCM was actually billed during the clinic visits; rather, we extrapolated this based on meeting the TCM billing criteria. Future analyses of administrative claims will inform how often TCM billing actually occurred. Implementation costs were not measured concurrently during implementation due to PCORI funding stipulations, but future analyses will evaluate implementation costs retrospectively through an ancillary funding source.

Conclusion

Understanding the barriers and facilitators of COMPASS-TC implementation is essential for scaling and disseminating evidence-based, comprehensive TC to additional clinical settings. COMPASS-TC was designed to be consistent with CMS reimbursement requirements and the current US healthcare system. In this large-scale pragmatic trial, participating hospitals made substantial changes to both processes and structures of care to deliver COMPASS-TC within their everyday clinical practice using locally-determined hospital infrastructure and staffing. This systematic analysis of COMPASS-TC implementation informs how, and under which enabling contextual circumstances, this model of care may be effective. A future paper will describe the barriers and facilitators of implementation from the perspective of the front line providers.

Availability of data and materials

The datasets generated and/or analyzed during the current study are not publicly available because the trial is ongoing and data sharing will follow the PCORI Policy for Data Access and Sharing; they are available from the corresponding author on reasonable request.

Abbreviations

APP: Advanced practice provider

AVERT: A Very Early Rehabilitation for stroke Trial

CMS: Centers for Medicare and Medicaid Services

COMPASS: COMprehensive Post-Acute Stroke Services

COMPASS-TC: Comprehensive Post-Acute Stroke Services - Transitional Care model

CRN: Community resource networks

eCare Plan: Electronic care plan

ESD: Early supported discharge

GLMMs: Generalized linear mixed models

IRB: Institutional review board

NC: North Carolina

NP: Nurse practitioner

ORIC: Organizational Readiness to Implement Change

PA: Physician assistant

PAC: Post-acute care coordinator

PCORI: Patient-Centered Outcomes Research Institute

PCP: Primary care provider

QI: Quality improvement

RN: Registered nurse

RUCA: Rural-Urban Commuting Area

SIS-16: Stroke Impact Scale-16

StaRI: Standards for Reporting Implementation Studies

TC: Transitional care

TCM: Transitional care management

UK: United Kingdom

US: United States

References

  1. Mozaffarian D, Benjamin EJ, Go AS, Arnett DK, Blaha MJ, Cushman M, et al. Heart disease and stroke Statistics-2016 update: a report from the American Heart Association. Circulation. 2016;133:e38–360. https://doi.org/10.1161/CIR.0000000000000350.


  2. Winstein CJ, Stein J, Arena R, Bates B, Cherney LR, Cramer SC, et al. Guidelines for adult stroke rehabilitation and recovery: a guideline for healthcare professionals from the American Heart Association/American Stroke Association. Stroke. 2016;47:e98–e169. https://doi.org/10.1161/str.0000000000000098.


  3. Cameron JI, O'Connell C, Foley N, Salter K, Booth R, Boyle R, et al. Canadian stroke best practice recommendations: managing transitions of care following stroke, guidelines update 2016. Int J Stroke. 2016;11:807–22. https://doi.org/10.1177/1747493016660102.


  4. Fearon P, Langhorne P. Services for reducing duration of hospital care for acute stroke patients. Cochrane Database Syst Rev. 2012:Cd000443. https://doi.org/10.1002/14651858.CD000443.pub3.

  5. Davoody N, Koch S, Krakau I, Hagglund M. Post-discharge stroke patients' information needs as input to proposing patient-centred eHealth services. BMC Medical Inform Decis Mak. 2016;16:66. https://doi.org/10.1186/s12911-016-0307-2.


  6. Gallacher K, Morrison D, Jani B, Macdonald S, May CR, Montori VM, et al. Uncovering treatment burden as a key concept for stroke care: a systematic review of qualitative research. PLoS Med. 2013;10:e1001473. https://doi.org/10.1371/journal.pmed.1001473.


  7. Bindman AB, Cox DF. Changes in health care costs and mortality associated with transitional care management Services after a discharge among Medicare beneficiaries. JAMA Intern Med. 2018;178:1165–71. https://doi.org/10.1001/jamainternmed.2018.2572.


  8. Prvu Bettger J, Alexander KP, Dolor RJ, Olson DM, Kendrick AS, Wing L, et al. Transitional care after hospitalization for acute stroke or myocardial infarction: a systematic review. Ann Intern Med. 2012;157:407–16. https://doi.org/10.7326/0003-4819-157-6-201209180-00004.


  9. Duncan P, Bushnell CD, Jones SB, Psioda MA, Gesell SB, D'Agostino RB Jr, et al. A transitional care intervention to improve outcomes after stroke or TIA: the COMprehensive post-acute stroke Services (COMPASS) study. International Stroke Conference. Honolulu; 2019.

  10. Duncan PW, Bushnell CD, Rosamond WD, Jones SB, Gesell SB, D'Agostino RB Jr, et al. The comprehensive post-acute stroke Services (COMPASS) study: design and methods for a cluster-randomized pragmatic trial. BMC Neurol. 2017;17:133. https://doi.org/10.1186/s12883-017-0907-1.


  11. Johnson AM, Jones SB, Duncan PW, Bushnell CD, Coleman SW, Mettam LH, et al. Hospital recruitment for a pragmatic cluster-randomized clinical trial: lessons learned from the COMPASS study. Trials. 2018;19:74. https://doi.org/10.1186/s13063-017-2434-1.


  12. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89:1322–7. https://doi.org/10.2105/ajph.89.9.1322.


  13. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for reporting implementation studies (StaRI) statement. BMJ. 2017;356:i6795. https://doi.org/10.1136/bmj.i6795.


  14. Loudon K, Treweek S, Sullivan F, Donnan P, Thorpe K, Zwarenstein M. The PRECIS-2 tool: designing trials that are fit for purpose. BMJ. 2015;350:h2147. https://doi.org/10.1136/bmj.h2147.


  15. Duncan PW, Abbott R, Rushing S, Johnson AM, Condon C, Lycan S, et al. COMPASS-CP: an electronic application to capture patient-reported outcome measures to develop actionable stroke care plans. Circ Cardiovascular Qual Outcomes. 2018;11:e004444. https://doi.org/10.1161/CIRCOUTCOMES.117.004444.


  16. Bushnell CD, Duncan PW, Lycan SL, Condon CN, Pastva AM, Lutz BJ, et al. A person-centered approach to Poststroke care: the COMprehensive post-acute stroke Services model. J Am Geriatr Soc. 2018;66:1025–30. https://doi.org/10.1111/jgs.15322.


  17. Transitional Care Management Services. In: Medicare learning network. Centers for Medicare and Medicaid 2016. https://www.cms.gov/outreach-and-education/medicare-learning-network-mln/mlnproducts/downloads/transitional-care-management-services-fact-sheet-icn908628.pdf. Accessed 24 July 2019.

  18. Leeman J, Birken SA, Powell BJ, Rohweder C, Shea CM. Beyond "implementation strategies": classifying the full range of strategies used in implementation science and practice. Implement Sci. 2017;12:125. https://doi.org/10.1186/s13012-017-0657-x.


  19. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015;10:21. https://doi.org/10.1186/s13012-015-0209-1.


  20. Gaglio B, Shoup JA, Glasgow RE. The RE-AIM framework: a systematic review of use over time. Am J Public Health. 2013;103:e38–46. https://doi.org/10.2105/AJPH.2013.301299.


  21. Gaglio B. The fundamentals of the RE-AIM framework. Paper presented at: PCORI annual conference. Arlington; 2017. https://doi.org/10.2105/ajph.2013.301299.

  22. Andrews JE, Moore JB, Weinberg RB, Sissine M, Gesell S, Halladay J, et al. Ensuring respect for persons in COMPASS: a cluster randomised pragmatic clinical trial. J Med Ethics. 2018. https://doi.org/10.1136/medethics-2017-104478.

  23. Duncan PW, Lai SM, Bode RK, Perera S, DeRosa J. Stroke impact Scale-16: a brief assessment of physical function. Neurology. 2003;60:291–6. https://doi.org/10.1212/01.wnl.0000041493.65665.d6.


  24. Katzan IL, Fan Y, Uchino K, Griffith SD. The PROMIS physical function scale: a promising scale for use in patients with ischemic stroke. Neurology. 2016;86:1801–7. https://doi.org/10.1212/wnl.0000000000002652.


  25. Katzan IL, Lapin B. PROMIS GH (patient-reported outcomes measurement information system Global Health) scale in stroke: a validation study. Stroke. 2018;49:147–54. https://doi.org/10.1161/strokeaha.117.018766.


  26. Provider of Services Current Files. 2019; https://www.cms.gov/Research-Statistics-Data-and-Systems/Downloadable-Public-Use-Files/Provider-of-Services/. Accessed 3 March 2019.

  27. Prvu-Bettger J, Jones SB, Kucharska-Newton AM, Freburger JK, Coleman SW, Mettam LH, et al. Meeting Medicare requirements for transitional care: do stroke care and policy align? Neurology. 2019. https://doi.org/10.1212/WNL.0000000000006921.

  28. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. Organizational readiness for implementing change: a psychometric assessment of a new measure. Implement Sci. 2014;9:7. https://doi.org/10.1186/1748-5908-9-7.


  29. Oetzel JG, Zhou C, Duran B, Pearson C, Magarati M, Lucero J, et al. Establishing the psychometric properties of constructs in a community-based participatory research conceptual model. Am J Health Promotion. 2015;29:e188–202. https://doi.org/10.4278/ajhp.130731-QUAN-398.


  30. Duncan P, Lai S, Tyler D, Perera S, Reker D, Studenski S. Evaluation of proxy responses to the stroke impact scale. Stroke. 2002;33:2593–9. https://doi.org/10.1161/01.str.0000034395.06874.3e.


  31. Kwon S, Duncan P, Studenski S, Perera S, Lai SM, Reker D. Measuring stroke impact with SIS: construct validity of SIS telephone administration. Qual Life Res. 2006;15(3):367–76. https://doi.org/10.1007/s11136-005-2292-2.


  32. van Buuren S. Multiple imputation of discrete and continuous data by fully conditional specification. Stat Methods Med Res. 2007;16:23. https://doi.org/10.1177/0962280206074463.


  33. Rubin DB. Teaching statistical inference for causal effects in experiments and observational studies. J Edu Behav Stat. 2004;29:343–67. https://doi.org/10.3102/10769986029003343.


  34. RE-AIM. Figures and Tables. 2018; http://www.re-aim.org/resources-and-tools/figures-and-tables/. Accessed 13 November 2018.


  35. Gesell SB, Klein KP, Halladay J, Bettger JP, Freburger J, Cummings DM, et al. Methods guiding stakeholder engagement in planning a pragmatic study on changing stroke systems of care. J Clin Transl Sci. 2017;1:121–8. https://doi.org/10.1017/cts.2016.26.


  36. Xian Y, Xu H, Lytle B, Blevins J, Peterson ED, Hernandez AF, et al. Use of strategies to improve door-to-needle times with tissue-type plasminogen activator in acute ischemic stroke in clinical practice: findings from target: stroke. Circ Cardiovasc Qual Outcomes. 2017;10. https://doi.org/10.1161/circoutcomes.116.003227.

  37. Di Carlo A, Pezzella FR, Fraser A, Bovis F, Baeza J, McKevitt C, et al. Methods of implementation of evidence-based stroke Care in Europe: European implementation score collaboration. Stroke. 2015;46:2252–9. https://doi.org/10.1161/strokeaha.115.009299.


  38. Armenakis AA, Harris SG, Mossholder KW. Creating readiness for organizational change. Hum Relat. 1993;46:681–703. https://doi.org/10.1177/001872679304600601.


  39. Weiner BJ, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65:379–436. https://doi.org/10.1177/001872679304600601.


  40. Forster A, Dickerson J, Young J, Patel A, Kalra L, Nixon J, et al. A structured training programme for caregivers of inpatients after stroke (TRACS): a cluster randomised controlled trial and cost-effectiveness analysis. Lancet. 2013;382:2069–76. https://doi.org/10.1016/s0140-6736(13)61603-7.


  41. Langhorne P, Baylan S. Early supported discharge services for people with acute stroke. Cochrane Database Syst Rev. 2017;7:Cd000443. https://doi.org/10.1002/14651858.CD000443.pub4.


  42. Chouliara N, Fisher RJ, Kerr M, Walker MF. Implementing evidence-based stroke early supported discharge services: a qualitative study of challenges, facilitators and impact. Clin Rehabil. 2014;28:370–7. https://doi.org/10.1177/0269215513502212.


  43. Luker JA, Craig LE, Bennett L, Ellery F, Langhorne P, Wu O, et al. Implementing a complex rehabilitation intervention in a stroke trial: a qualitative process evaluation of AVERT. BMC Med Res Methodol. 2016;16:52. https://doi.org/10.1186/s12874-016-0156-9.



Acknowledgments

We thank our study participants and their caregivers for making this study possible and our study stakeholders for their contributions in shaping the study design and intervention; in particular, stroke program managers across NC, the Piedmont Triad Regional Council Area Agency on Aging, Community Pharmacy Enhanced Services Network (CPESN® Network), Justus-Warren Heart Disease and Stroke Prevention Task Force, Stroke Advisory Council, NC Department of Health and Human Services, Wake Forest Baptist Health, FaithHealth Division, Carolina Center for Cognitive Rehabilitation, and the American Heart Association/American Stroke Association. We thank participating hospitals for their commitment to this study and their patient care.

Funding

This project received funding from the Wake Forest CTSA through National Center for Advancing Translational Sciences (NCATS) National Institutes of Health (NIH) Grant Award UL1TR001420 and through PCORI Contract Award PCS-1403-14532.

All statements in this report, including its findings and conclusions, are solely those of the authors and do not necessarily represent the views of the NIH or PCORI, its Board of Governors, or Methodology Committee. The funders had no role in the design of the study, analysis, interpretation of data, or in the writing of the manuscript. The funders received monthly updates on data collection.

Author information

Contributions

SG, CB, and SJ contributed equally to the manuscript with substantial contributions to the conception and design of the work; the acquisition and interpretation of the data; they drafted and revised the manuscript critically for important intellectual content. SJ, SL, JX, and MP performed the statistical analysis. PD and WR supervised the research team. SG, CB, SJ, SC, JX, BL, JPB, JF, JH, AJ, AKN, LM, AP, MP, MR, WR, MS, JH, and PD made substantial contributions to the conception and design of the work; the acquisition and interpretation of the data; and revised the manuscript for important intellectual content. All authors read and approved the final manuscript for publication; and agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Corresponding author

Correspondence to Sabina B. Gesell.

Ethics declarations

Ethics approval and consent to participate

Institutional review board (IRB) approval was received through Wake Forest University Health Sciences (central IRB) or through local hospital IRBs. At 90 days, during the telephone survey, patients provided informed verbal consent [22] for collection of outcomes; the ethics committees agreed with the verbal consent and HIPAA authorization process and approved this procedure. Additionally, participants consented to collection of process measures, including the ORIC and Partnership Synergy Scale, via email survey.

Consent for publication

Not applicable.

Competing interests

Drs. Duncan and Bushnell: ownership interest in Care Directions, the company that integrates the electronic care plan application used in the study (COMPASS-CP) into electronic health records; Others: None.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1: Figure S1.

Days to Clinic Visit After Discharge. Histogram of the percentage of patients seen for a follow-up clinic visit, by days after discharge.

Additional file 2: Table S1.

Implementation Strategies Used in the COMPASS Cluster-Randomized Pragmatic Trial. Strategies used for the implementation of COMPASS.

Appendix

Quality Measure Definitions

Telephone follow-up by day two

Description: Follow-up telephone call within two business days of hospital discharge, or two attempts made (a computational sketch of this measure follows the definition)

Denominator:

Inclusion criteria:

  • COMPASS eligibility criteria

Exclusion criteria:

  • Patient hospitalized

  • Patient transferred to a skilled nursing facility

  • Patient deceased

Numerator: The metric was met if any of the following occurred

  • Follow-up call completed within two business days

  • Two attempts were made to complete the call

  • Call was attempted, but not completed because the patient refused, could not complete, or did not have a workable number
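The measure above can be computed mechanically once each record carries flags for the exclusion and numerator conditions; the sketch below does this for a toy table with invented field names, not the study's process data.

```python
import pandas as pd

# Toy per-patient process records; field names are illustrative only.
records = pd.DataFrame({
    "excluded":                [False, False, False, True],  # hospitalized, SNF, or deceased
    "call_completed_2bd":      [True, False, False, False],
    "two_attempts_documented": [False, True, False, False],
    "attempted_not_completed": [False, False, True, False],  # refused / unable / no workable number
})

denominator = records[~records["excluded"]]
numerator_met = (denominator["call_completed_2bd"]
                 | denominator["two_attempts_documented"]
                 | denominator["attempted_not_completed"])

print(f"Telephone follow-up by day two: {numerator_met.mean():.1%}")
```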

COMPASS clinic visit offered or scheduled by 14 days

Description: Receipt of follow-up visit within 14 days of hospital discharge, or documentation that a visit was offered/scheduled

Denominator:

Inclusion criteria:

  • COMPASS eligibility criteria

Exclusion criteria:

  • Patient transferred to nursing home

  • Patient hospitalized

  • Patient deceased

Numerator: The metric was met if any of the following occurred

  • Follow-up clinic visit completed by day 14

  • Documentation of visit scheduled or offered but patient refused or did not show at the clinic

Receipt of eCare Plan at a COMPASS TC visit

Description: Receipt of eCare Plan during follow-up clinic visit

Denominator:

Inclusion criteria:

  • COMPASS eligibility criteria

  • Attended a post-discharge clinic visit at any time after discharge

Numerator:

  • eCare Plan generated electronically using the iPad or on paper and shared with the patient.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Gesell, S.B., Bushnell, C.D., Jones, S.B. et al. Implementation of a billable transitional care model for stroke patients: the COMPASS study. BMC Health Serv Res 19, 978 (2019). https://doi.org/10.1186/s12913-019-4771-0


Keywords