
SIPsmartER delivered through rural, local health districts: adoption and implementation outcomes

Abstract

Background

SIPsmartER is a 6-month, evidence-based, multi-component behavioral intervention that targets sugar-sweetened beverage consumption among adults. It consists of three in-person group classes, one teach-back call, and 11 automated phone calls. Given SIPsmartER’s previously demonstrated effectiveness, understanding its adoption, implementation, and potential for integration within a system that reaches health disparate communities is important to enhance its public health impact. During this pilot dissemination and implementation trial, SIPsmartER was delivered by trained staff from local health districts (delivery agents) in rural, Appalachian Virginia. SIPsmartER’s execution was supported by consultee-centered implementation strategies.

Methods

In this mixed-methods process evaluation, adoption and implementation indicators of the program and its implementation strategy (e.g., fidelity, feasibility, appropriateness, acceptability) were measured using tracking logs, delivery agent surveys and interviews, and fidelity checklists. Quantitative data were analyzed with descriptive statistics. Qualitative data were inductively coded.

Results

Delivery agents delivered SIPsmartER to the expected number of cohorts (n = 12), recruited 89% of cohorts, and taught 86% of expected small group classes with > 90% fidelity. The planned implementation strategies were also executed with high fidelity: delivery agents completed the two-day training, pre-lesson meetings, fidelity checklists, and post-lesson meetings at rates of 86%, 75%, 100%, and 100%, respectively. Additionally, delivery agents completed 5% (n = 3 of 66) and 10% (n = 6 of 59) of teach-back and missed class calls, respectively. On survey items using 6-point scales, delivery agents reported, on average, higher feasibility, appropriateness, and acceptability related to delivering the group classes (range 4.3 to 5.6) than executing missed class and teach-back calls (range 2.6 to 4.6). They also, on average, found the implementation strategy activities to be helpful (range 4.9 to 6.0). Delivery agents identified strengths and weaknesses related to recruitment, lesson delivery, call completion, and the implementation strategy.

Conclusions

In-person classes and the consultee-centered implementation strategies were viewed as acceptable, appropriate, and feasible and were executed with high fidelity. However, implementation outcomes for teach-back and missed class calls and recruitment were not as strong. Findings will inform the future full-scale dissemination and implementation of SIPsmartER, as well as other evidence-based interventions, into rural health districts as a means to improve population health.


Background

Understanding the implementation of effective interventions is critical to promoting their sustained translation into practice-based settings and enhancing their potential impacts on population health. Successful implementation impacts both service and client outcomes [1, 2]. In combination with understanding the generalizability of an intervention’s reach into the intended population and its adoption and maintenance at the organizational level, implementation is a key factor in replicating effective interventions in typical community or clinical settings [1]. Further, additional implementation outcomes including acceptability, appropriateness, costs, and feasibility can inform the external validity of evidence-based interventions when delivered outside of the research context [2].

SIPsmartER is a six-month, multi-component, community-based, behavioral intervention designed to reduce the intake of sugar-sweetened beverages (SSBs) among rural, Appalachian adults [3, 4]. SSBs are non-alcoholic beverages that contain sugar and few other nutrients, such as soda/pop, energy drinks, sweet tea, and fruit drinks. SIPsmartER is grounded in the Theory of Planned Behavior [5] and health literacy principles [6]. The intervention design is described in more detail elsewhere [3]. It is one of only two known SSB-focused interventions for adults that have demonstrated significant improvements in SSB consumption [3, 4, 7,8,9]. SIPsmartER also is the only SSB-focused intervention included in the National Cancer Institute’s repository of Research-Tested Intervention Programs (RTIPs) [10]. Translating SIPsmartER into practice-based settings is important given the numerous preventable health conditions associated with excessive SSB intake (e.g., obesity, diabetes, heart disease, cancer, dental caries) [11,12,13,14,15], particularly among populations with low socio-economic status and/or living in rural areas [16,17,18,19]. Given SIPsmartER’s demonstrated effectiveness [4], it is important to explore how it could be disseminated, implemented, and integrated within a system that reaches health disparate communities in Appalachia.

A pilot dissemination and implementation (D&I) trial was collaboratively developed with medical directors and leadership staff from the four local health districts within the Virginia Department of Health (VDH) that service the rural Appalachian counties [20]. The trial design was grounded in the RE-AIM framework [1] and the Interactive Systems Framework [21, 22], and the evaluation was guided by RE-AIM. Specifically, this pilot trial was designed to measure the reach, effectiveness, adoption, and implementation of SIPsmartER when delivered through local health districts. To support the delivery of SIPsmartER, the research team and health department stakeholders developed and applied an implementation strategy [23] that would build both the general and innovation specific capacity necessary to deliver SIPsmartER. This strategy utilized the key elements of consultation identified by Edmunds (e.g., on-going instruction, self-evaluation, and feedback) [24].

The purpose of this paper is to describe SIPsmartER’s adoption and implementation when delivered through rural, local health districts [1, 2]. In addition, determinants of adoption, implementation, and organizational maintenance that align with the Interactive Systems Framework were assessed from the perspective of the delivery agents from the local health districts: (i) acceptability (satisfaction with aspects of the innovation), (ii) appropriateness (perceived fit, relevance, and suitability), and (iii) feasibility (actual fit). This paper specifically focuses on outcomes related to both delivery expectations and the implementation strategy.

Methods

This study is a mixed-methods process evaluation of a pilot type 2 hybrid effectiveness-implementation trial of SIPsmartER [20]. A type 2 hybrid effectiveness-implementation trial allows for the simultaneous testing of an intervention and an implementation strategy to support its delivery [20]. This paper specifically reports on SIPsmartER’s adoption and implementation at the organizational level. A complete description of effectiveness outcomes is outside the scope of this manuscript. However, during this trial, significant improvements in SSB intake from baseline to 6 months were observed (− 403 kcal/day; CI = − 528, − 278; p < 0.001), which were comparable to findings from the effectiveness trial [4]. Also, significant changes (all p < 0.05) in SSB-related attitudes, perceived behavioral control, behavioral intentions, and media literacy were observed.

Study procedures were approved by the Institutional Review Boards of Virginia Tech, the University of Virginia, and Virginia Department of Health. Written informed consent was obtained from health department staff.

SIPsmartER type 2 hybrid effectiveness-implementation trial

Identification and logistics of health districts and delivery agents

Each of the four southwest Virginia health districts invited to participate in this trial agreed to participate. These four districts serve the same counties and cities as the effectiveness trial [3]. These areas consistently score among the poorest across Virginia on the Health Opportunity Index [25], are federally designated as medically-underserved [26] and have an average rurality status of 6/9 [27].

Medical directors were asked to identify the staff within their district who would be ideal delivery agents to implement SIPsmartER. The number of SIPsmartER cohorts each district agreed to deliver was determined by budget and power calculations, with each health district expected to deliver two to four cohorts. Budgets were planned collaboratively with medical directors during the grant writing process. Each health district was provided with a sub-contract reflecting the expected percent effort necessary for delivery agents to implement SIPsmartER.

Following awarding of the grant, planning meetings were held between research staff and VDH medical directors and delivery agents to help ensure the compatibility of the approach with the health department delivery system [22, 28] and to devise a plan for the division of SIPsmartER implementation and research tasks between VDH delivery agents and research staff. This plan, detailed in Table 1, was adjusted to better reflect the needs of the delivery agents following the in-person training, the first implementation strategy activity. Notably, the extent of the role of the delivery agents in (i) the completion of teach-back and missed class calls and (ii) participant engagement activities was reduced. During one of these meetings, delivery agents created specific action plans for recruitment of participants within their districts.

Table 1 Distribution of SIPsmartER delivery tasks between Virginia Department of Health (VDH) staff and the research team

SIPsmartER intervention components

SIPsmartER consists of three lessons delivered through group classes, one teach-back call, and eleven interactive voice response (IVR) calls. Participants who did not attend classes had the opportunity to complete the lesson as a missed class phone call. In the classes, participants received instruction on core content necessary to increase motivation and skills to decrease SSBs. During the teach-back call, participants reviewed content from the first class and completed a personal action plan with a trained research assistant. Through the IVR calls, participants reported their SSB intake in ounces, completed an action plan, and received a motivational message. Intervention activities and materials have been described in detail elsewhere [3, 10, 29,30,31]. In addition to intervention components, there were specific activities to support participant retention, including re-engagement calls to participants who had not completed two activities in a row [32].

SIPsmartER delivery timeline

Implementation was initially planned to be staggered, with districts starting SIPsmartER cohorts at different times. A goal of 10 or more participants per cohort was set. However, during the trial, districts completed cohorts at similar times. First cohorts were completed between Fall 2016 and Spring 2017 (n = 6), with additional cohorts completed between Summer 2017 and Spring 2018 (n = 6).

Consultee-centered implementation strategy

An implementation strategy that utilized a consultee-centered approach was drafted by the research team based on the principles outlined by Edmunds and colleagues. This approach involves non-hierarchical interactions between a consultant (e.g., researcher) and consultee (e.g., delivery agent) through which the consultant provides guidance to the consultee related to a current work problem that is within the scope of the consultant’s expertise [24, 33]. Through these interactions, consultees master general skills and build specific skills related to problem-solving implementation barriers and appropriately adapting intervention components. They are also held accountable for delivering the evidence-based program [33].

Then, to ensure the training was compatible with the health department processes for adopting new interventions, the plan for the implementation strategy was presented to the medical directors and delivery agents, and changes were made based on their feedback. Final implementation strategies addressed three of the four consultation techniques identified in Edmunds’ review: on-going instruction, self-evaluation, and feedback [24]. The expectations for engagement in each of these activities varied by timing of cohort, as delivery agents were expected to complete more implementation strategy activities during their first cohort(s) than subsequent ones.

On-going instruction

Five implementation strategies related to on-going instruction were utilized.

Recruitment how-to handout

Materials were created for delivery agents and other VDH staff to support participant screening for enrollment. A handout provided scripting and frequently asked questions to engage potential participants. This strategy was seen as more compatible with medical directors’ expectations for recruitment strategies and, as such, provided a relative advantage [22, 28] over other proposed activities, including in-person or teleconference training.

Two-day training

A two-day (~ 12 h), in-person training was held in August 2016, prior to the start of the first cohorts. Two researchers facilitated the training. The training utilized several educational strategies: didactic presentation of foundational principles and key content areas of lessons, modeling lesson activities and calls, and interactive discussions to identify potential barriers and possible solutions. During this training, delivery agents received instruction in the core principles of SIPsmartER, a comprehensive review of the SIPsmartER intervention components, and practical tips for delivering lessons and calls. Additionally, they practiced delivering lesson activities. Delivery agents also received lesson manuals that included (i) lesson plans with lesson objectives, procedural steps for delivering lessons, background information on key lesson content, and supply lists, (ii) PowerPoint slides to support the execution of the lesson activities, and (iii) participant handouts. These activities aligned with reducing the complexity of the intervention by providing all training and participant-facing materials in a package that could be easily accessed by the delivery agents [22, 28].

Pre-lesson meetings

During the delivery of their first cohort, delivery agents had short (~ 20 min) meetings before each lesson with one of the researchers who had delivered SIPsmartER during the effectiveness trial. These conversations provided time for the delivery agents to have questions and concerns about the lesson addressed. Also, the researcher provided tips for executing lesson activities based on past experiences delivering the lessons. These meetings were optional after a delivery agent’s first cohort.

Co-teaching

Delivery agents had the option to co-teach each lesson during their first cohort with one of the researchers. If co-teaching was requested, the delivery agent and the researcher would discuss logistics for the co-teaching during the pre-lesson meeting.

Lesson recap videos

Lesson overview videos were developed for delivery agents to watch prior to delivering their second cohorts, which were approximately 10 months after the in-person training. These videos reviewed the foundations and design of the curriculum, content background, and specific lesson flow and activities.

Self-evaluation and feedback

Delivery agents completed a fidelity checklist immediately following the delivery of each lesson and were provided with feedback through lesson observations and post-lesson meetings. A member of the research team observed each delivery agent deliver SIPsmartER the first two times they taught each lesson. During the observations, the researcher completed a fidelity sheet and took field notes. Following observed lessons, the delivery agent(s) and the researcher had a short (< 10 min) audio-taped discussion about the lesson delivery, including highlights of the lesson and areas for improvement. Additional instruction was provided if aspects of the lesson delivery (e.g., execution of activities, inclusion of improper content) needed improvement.

Measures

To assess engagement in and perceptions of implementation strategy activities and actual implementation, data from eight measures were used: (i) cohort recruitment logs, (ii) delivery agent engagement logs, (iii) post-training surveys, (iv) post-training interviews, (v) fidelity checklists, (vi) post-cohort interviews, (vii) post-cohort surveys, and (viii) capacity surveys. These measures allowed for a concurrent mixed methods assessment of SIPsmartER’s implementation and adoption.

Cohort recruitment logs

Logs of health district recruitment activities were maintained as a means of tracking the number of cohorts each district recruited.

Delivery agent engagement log

Logs were maintained to track delivery agents’ fidelity to delivery expectations and implementation strategy activities.

Post-training surveys and interviews

Post-training surveys and interviews were completed after the two-day in-person training and before each delivery agent’s first cohort. Interviews were audio-recorded. These measures captured information from the delivery agents related to the appropriateness of SIPsmartER and its components within the health district and the delivery agents’ regular job functions. Survey items included questions about delivery agents’ confidence to complete delivery expectations and implementation strategies and their perceived feasibility of doing so. Items were measured using 6-point Likert scales. Post-training interviews also assessed the adoption and feasibility of program recruitment. Please see Additional file 1 for these instruments.

Fidelity checklists

Unique fidelity checklists were developed for each of the three lessons. These checklists assessed the degree to which a specific lesson’s activities were completed (none = 0, partial = .5, all = 1) and if the activity was modified (no = 0, yes = 1). There were also sections to enter specific notes about the implementation. These checklists were completed by delivery agents after each delivered lesson and by a researcher after each observed lesson. Please see Additional file 2 for these instruments.
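To illustrate how a completed checklist translates into the lesson-level fidelity and modification figures reported in the Results, the sketch below scores one hypothetical lesson. The activity names and values are invented for illustration and are not taken from the actual instruments (see Additional file 2).

```python
# Hypothetical checklist for one delivered lesson: each activity is rated for
# completion (none = 0, partial = 0.5, all = 1) and for modification (no = 0, yes = 1).
checklist = {
    "welcome_and_review": {"completed": 1.0, "modified": 0},
    "ssb_label_activity": {"completed": 1.0, "modified": 1},
    "action_planning":    {"completed": 0.5, "modified": 0},
    "wrap_up":            {"completed": 1.0, "modified": 0},
}

n_activities = len(checklist)
fidelity = sum(a["completed"] for a in checklist.values()) / n_activities
modified = sum(a["modified"] for a in checklist.values()) / n_activities

print(f"Lesson fidelity: {fidelity:.0%}")      # Lesson fidelity: 88%
print(f"Activities modified: {modified:.0%}")  # Activities modified: 25%
```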

Post-cohort surveys and interviews

Post-cohort surveys and interviews were completed after each delivery agent completed a round of cohorts. Interviews were conducted by a researcher who had limited involvement with delivery agents during intervention delivery activities. These measures captured information from the delivery agents related to the feasibility and acceptability of SIPsmartER, its components, and the implementation strategy. Post-cohort surveys also assessed the fidelity to delivery expectations. Scaled items on the post-cohort survey were measured using 4-point Likert scales. Please see Additional file 3 for these instruments.

Capacity surveys

After completing all their cohorts, delivery agents were asked to complete a survey with open-ended questions. Questions were related to the acceptability and appropriateness of maintaining SIPsmartER in their health district, including the resources they would need to sustain the program. Please see Additional file 4 for this instrument.

Analysis

A concurrent mixed-methods approach was used to analyze data [34]. Data from each measure were analyzed independently using the methods described below. Then, qualitative and quantitative data measuring the same adoption or implementation indicators were converged to allow findings to be compared across measures.

Quantitative analyses

Frequencies of completing delivery expectations and engaging in implementation strategy activities were calculated. Means and standard deviations were calculated for items on post-training surveys, post-cohort surveys, and fidelity checklists and are presented by district and overall. To make results from post-training and post-cohort surveys more comparable, post-cohort survey scores were transformed from 4-point Likert scale scores to 6-point Likert scale scores using linear stretch [35]. An average fidelity score and an average activity modification score for each delivered lesson were calculated by averaging the fidelity ratings and modification ratings across the lesson’s activities.
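As a concrete illustration of the scale conversion, the snippet below shows one way the 4-point post-cohort ratings could be mapped onto the 6-point scale. It is a minimal sketch assuming the standard linear stretch transformation (preserving a score’s relative position within the scale range), consistent with the scale homogenization approach cited above [35]; the function name is hypothetical.

```python
def linear_stretch(score: float, old_min: float = 1, old_max: float = 4,
                   new_min: float = 1, new_max: float = 6) -> float:
    """Map a rating from a 4-point range onto a 6-point range,
    preserving its relative position within the scale."""
    relative = (score - old_min) / (old_max - old_min)
    return new_min + relative * (new_max - new_min)

# Example: post-cohort ratings of 1, 2, 3, and 4 map to 1.0, 2.67, 4.33, and 6.0.
print([round(linear_stretch(s), 2) for s in (1, 2, 3, 4)])  # [1.0, 2.67, 4.33, 6.0]
```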

Qualitative analyses

Transcripts of post-training and post-cohort interviews and open-ended questions from capacity surveys were coded using a constant comparative approach by two researchers [36]. Transcripts and open-ended responses on capacity surveys were first organized into categories that reflected the major delivery expectations (recruitment, lesson delivery, and teach-back and missed class calls), the implementation strategy, and sustainability. Content within these categories was reviewed for emerging themes. These themes were reviewed and organized to create codes that were applied to the categories. One researcher applied the codes to the transcripts and surveys while another reviewed the coding to ensure the appropriate text was captured. Researchers discussed and resolved any differences. This process was repeated within codes to identify more discrete units as needed [37].

Results

Adoption and recruitment

Table 2 presents data related to adoption of SIPsmartER by rural, local health departments, a description of delivery agent roles within each district, and medical director turnover during the implementation process. Delivery agents led or participated in the recruitment of 89% (n = 17) of cohorts. More cohorts were recruited for than were enrolled (19 recruited, 12 enrolled). Two health districts met their target number of enrolled cohorts while one district enrolled one less and another enrolled one more.

Table 2 Rural, Local Health Districts Adoption of and Recruitment for SIPsmartER

Implementation fidelity

Implementation fidelity data are presented in Table 3.

Table 3 Delivery Agent Fidelity to Intervention Delivery and Implementation Strategy Activities

Intervention delivery

Lesson delivery

Of the anticipated 36 classes (one class for each of the three lessons planned per implemented cohort), 31 (86%) were delivered by delivery agents and one by a researcher. The two health districts with multiple delivery agents approached the distribution of teaching responsibilities differently: the delivery agents in one district co-taught all lessons, while those in the other district alternated classes.

An average of 93% fidelity was found across all lessons and districts based on fidelity checklists completed during researcher observations. Based on researcher observation, delivery agents modified 17% of activities. Most of these modifications were appropriate, including tailoring examples to be more locally relevant (e.g., adding pictures of popular regional sugary drinks), making activities more suitable for the audience (e.g., adapting activities using worksheets so they better met the needs of a very low literacy group), and adding extra content to make a concept clearer (e.g., adding a parody of a famous sugary drink ad created by a health watch group into a lesson addressing advertising) [38]. However, one modification was not appropriate: the addition of health risks that are not well supported by scientific literature.

Teach-back and missed class calls

Three delivery agents (43%) attempted teach-back calls and five delivery agents (71%) attempted missed class calls. Overall, agents completed three (of 66, 5%) teach-back calls and six (of 59, 10%) missed class calls.

The actual process for completing the teach-back and missed class calls evolved from the plan devised following the training. To maximize both call completion and agent involvement in calls, delivery agents were not assigned specific participants for the calls. Instead, during call periods, delivery agents provided times they could make calls within their work schedule. Then, the research staff would provide delivery agents with a list of participants to call based on whether participants’ preferred call times fell within the delivery agent’s window of availability. Researchers made the calls when the delivery agents were not available.

Implementation strategy

Six (86%) delivery agents attended both days of the in-person training; the other attended only 1 day (due to a family emergency). Eighteen pre-lesson meetings were held: nine of the expected 12 (75%) were held before each delivery agent’s first time teaching a lesson, and nine additional meetings were held during delivery agents’ second time teaching the lessons. Three (10%) of the delivered classes were co-taught. Each delivery agent completed a fidelity checklist every time after s/he taught a class, for a total of 43 checklists (100%). Twenty-seven classes (84%) were observed by researchers. The five classes that were not observed were the third time delivery agents taught the lessons, so they did not require an observation per the protocol. Post-lesson meetings were completed after all observed lessons.

Acceptability, appropriateness, and feasibility

Quantitative and qualitative findings about implementation acceptability, appropriateness, and feasibility are presented in Tables 4 and 5, respectively.

Table 4 Delivery Agent Quantitative Assessment of Delivery Expectations and Implementation Strategy
Table 5 Delivery Agent Qualitative Assessment of Recruitment, Delivery Expectations, Implementation Strategy

Recruitment

Delivery agents reported that recruitment was often difficult, noting the community seemed uninterested in the program and it was hard to engage other health department staff to support recruitment efforts. They also identified successful approaches to recruitment, including targeting intact groups and having handouts they could use to describe the program during recruitment.

Intervention delivery

Lesson delivery

After the training, delivery agents reported high self-ratings of confidence in preparing for lesson delivery (5.0/6), meeting lesson objectives when teaching (5.0/6), and meeting the learning needs of participants (4.8/6). They also reported moderate feasibility in their ability to adequately prepare for classes (4.3/6). After delivering SIPsmartER, delivery agents perceived their actual delivery performance to be positive, with average scores ranging from 5.2/6 to 5.6/6. Related to participant retention activities, they perceived their ability to complete reminder calls to participants about classes as slightly unfeasible (3.1/6) and their ability to track participant attendance at classes as slightly feasible (4.1/6).

From interviews and capacity surveys, delivery agents noted low class attendance was an issue for some cohorts. However, they also reported positive aspects of delivery, including observing participants change over time, interactions among participants, and their conversations with participants. They also expressed liking the flexibility to make lesson adjustments to better meet the needs of the population, and noted that the program was well developed and that the group lessons were within their scope of professional practice.

Teach-back and missed class calls

On the post-training survey, delivery agents reported moderate to high confidence in their ability to complete teach-back (4.6/6) and missed class calls (4.6/6). However, they rated the feasibility of conducting these calls as moderately unfeasible (2.6/6 and 2.4/6, respectively).

Delivery agents reported seeing the utility of the teach-back calls as a way to connect with participants one-on-one and to ensure they understood the information. However, they perceived both the teach-back and missed class calls as time consuming and potentially incompatible with their normal schedules.

Implementation strategy

Overall, delivery agents felt confident they would receive necessary support from SIPsmartER research staff (4.9/6) and reported they were highly satisfied that they received the support they needed (5.7/6). Delivery agents reported they liked that the research staff were knowledgeable and were able to communicate effectively about general needs and questions. One delivery agent expressed initial concern that the level of support, particularly the researcher observations, felt as though its purpose was to judge the delivery agents; however, after engaging in the implementation strategies, she realized they were designed to improve the delivery of the program.

In-person training

Delivery agents were moderately to completely satisfied with the length (5.3/6) and presentation (5.3/6) of the in-person training. After completing their first cohort, they rated the training as very helpful (5.4/6). Delivery agents commented they liked that the training provided a complete picture of the program, set expectations for roles, and allowed them to develop a network with the research team and peers in other health districts.

Other strategies

Following training, delivery agents reported that completing fidelity checklists (4.7/6) and post-lesson meetings (4.1/6) would be slightly to moderately feasible. Following completion of their first and second rounds of cohorts, delivery agents reported that the implementation strategies were helpful to very helpful, with a range of ratings from 4.9/6 to 6.0/6.

General impressions of acceptability, appropriateness, and feasibility

Through the interviews and capacity surveys, delivery agents from three of the four districts felt SIPsmartER filled a gap in their programming that targeted an important health need. However, agents from the other district felt that SIPsmartER was not different from already established efforts in their district that targeted the same health behavior. Delivery agents also expressed concerns about some of the program logistics, particularly the length of the program. Delivery agents across all districts reported concerns about funding, as many of the health education efforts of health department employees, especially health educators, are driven by grants.

Discussion

This hybrid effectiveness-implementation trial examined adoption and implementation factors related to SIPsmartER, an evidence-based intervention to reduce sugary beverage consumption in a region experiencing health disparities. Through the use of a multi-component implementation strategy that included a consultee-centered approach, it was demonstrated that SIPsmartER could be adopted and implemented with high fidelity across four rural public health districts and by delivery agents with different roles within the health districts. This study contributes to the Interactive Systems Framework (ISF) literature in that findings demonstrate that packaging an evidence-based intervention to reduce complexity and using a facilitation process that includes consultee-centered training can result in the adoption and high-fidelity implementation of an intervention to address sugary beverage consumption in underserved communities. However, findings also indicated that the potential for sustainability and broader adoption could be jeopardized by intervention features that were less feasible and required research facilitation due to a lack of compatibility with the health department context and a lack of perceived relative advantage. Specifically, delivery agent-initiated telephone contacts were difficult to consistently implement, while the in-person small group sessions were more compatible with health department practices. Similarly, qualitative feedback indicated that agent-delivered telephone calls and the recruitment processes may not be as feasible.

While public health systems are beginning to play an increasing role in the implementation of evidence-based interventions targeting chronic health conditions, little is known about how evidence-based programs can be best implemented and sustained in these systems [39,40,41,42]. Related to recruitment, fidelity to the protocol was high in terms of delivery agents being able to recruit 89% of the cohorts. However, findings from post-training and post-cohort interviews indicate low perceptions of feasibility, with most delivery agents clearly expressing their frustration with the recruitment process. Delivery agents employed two broad strategies to recruit participants: (1) surveying the community by canvassing health department customers or the general community through local events and (2) targeting established groups, including housing and work sites [43]. The former was the strategy most districts started with, and it was much less efficient. Recruiting adult participants from the community for a program consisting of group classes, with a threshold of approximately ten participants needed to start a class, was not a common practice for any of the districts. The lack of experience and protocols within the districts may have weakened the potential reach of the program. This finding highlights the potential usefulness of systematically collecting and recording key patient health behaviors as a means to efficiently identify patients who would be good candidates for interventions [44]. It also highlights the need to consider efficient recruitment strategies that could vary based upon community resources [32].

Delivery agents demonstrated high fidelity to the consultee-centered implementation strategies and reported high perceptions of their acceptability, appropriateness, and feasibility. This finding is notable as it provides support for the use of consultation as an implementation strategy in community-based interventions when using professional health district staff as delivery agents. Although consultation is regularly used as an implementation strategy and has potential for use across contexts [45], its use is most commonly reported within the context of supporting community and clinic-based mental health professionals to implement evidence-based programs [33]. The strong implementation fidelity may be due to the design of the consultee-centered approach [24]. Particularly, allowing for a non-hierarchical relationship between the researchers (consultants) and delivery agents (consultees) acknowledged the past training and professional experiences of the delivery agents. Also, adjusting the intensity of instruction and feedback activities to reflect the delivery agents’ growing skill and content mastery with the intervention allowed them to feel well supported without being overburdened. However, it is important to note that clear explanations of the purpose of a consultation approach, and of the specific reasons activities were chosen, are needed, as the relative intensity of the strategy made one delivery agent uncomfortable initially. She felt like she was being judged and that the level of support was not needed; yet, after engaging with the implementation strategy, she recognized its purpose was not to judge her but to support the program delivery.

Our findings about the implementation of SIPsmartER in these rural, local health districts reflect previously identified benefits, facilitators, and barriers of implementing evidence-based programs in public health agencies. Related to benefits, findings suggest that by implementing SIPsmartER, the districts added programming that better addressed common risk factors for disease [46]. Delivery agents from three districts demonstrated contrasting views on the relative advantage of the approach. These agents mentioned SIPsmartER was similar to another program they implement (a statewide intervention to reduce sugar-sweetened beverages that includes social marketing and single workshops for children, adolescents, and adults). However, delivery agents from two districts reflected that SIPsmartER was better designed to foster behavior change and could serve as a next step to that program. Future training approaches may want to address how SIPsmartER compares to existing programs to underscore its uniqueness and benefit in helping participants change behavior.

Our findings reflect barriers previously identified related to staff turnover, leadership, and agency structure [46,47,48]. Staff turnover was particularly noticeable in this D&I pilot, as 75% of the districts had changes in medical directors early in the study. The medical directors were key in the promotion and support of SIPsmartER within their districts. This occurrence is important as past research has demonstrated that leadership is a necessary factor for furthering the implementation of evidence-based programs within health departments [48,49,50]. Allen and colleagues’ findings suggest that effective leaders within health departments did not just talk about valuing and supporting evidence-based programs but also created an environment that fostered consistent conversation about the evidence-based programs and provided a supportive organizational environment [48]. Hu and colleagues identified that public health agencies with “high agency leadership” and “supportive workplace” were 2.08 times and 1.74 times more likely, respectively, to use research evidence in the workplace compared to unsupportive environments [50]. In this trial, the changes in medical directors in three districts may have affected specific leadership actions related to this pilot study, which in turn could have impacted implementation outcomes. However, researchers cannot control staff turnover, so this finding stresses the need to cultivate multiple formal and informal leaders within the health districts (e.g., opinion leaders, internal implementation leaders, and champions) from early on. In doing so, if one leader leaves the organization or the project, others remain to maintain the support and legitimacy of the program and to drive forward the implementation of the intervention [51].

Agency structures and processes influence the ability and motivation of delivery agents to complete delivery expectations and implementation strategy activities, and their impact on implementation outcomes was noticeable in this trial. Notably, from interviews and capacity surveys, it was evident that in three of the four districts, delivery agents adhered to work schedules that were within normal business hours. While there was flexibility to adjust schedules and there was specific funding for delivery agent time during the work day, they did not feel that it was appropriate, acceptable, or feasible, in terms of both time and resources, to make calls to participants outside of normal work hours and/or to adjust their schedules to accommodate the calls. Therefore, those agents who attempted teach-back and missed class calls did so during normal business hours. Also, as this type and scale of participant recruitment was not common practice in any of the districts, the health districts lacked both the structure and the capacity to efficiently recruit for the program.

The approach to this trial was pragmatic. Planning was guided by the RE-AIM framework, with a goal to design an implementation approach that would allow SIPsmartER to significantly impact SSB intake, have a broad reach, and be readily adopted, implemented, and sustained in a typical community delivery setting, i.e., local health departments. Additional pragmatic decisions were made to allow health departments to test out the program (i.e., trialability) [22, 28]. For example, it was decided from the outset that delivery agents would not play a role in the administration of the IVR call system, and it was later decided they would not manage participant retention activities. These decisions were purposeful, allowing delivery agents to gain experience with recruitment, lesson delivery, and execution of teach-back and missed class calls.

Taken together, the findings identify next steps and implications for the translation of SIPsmartER into practice. Potential next steps include working with these four and other rural, local health districts to create systems to identify potential participants and streamline recruitment efforts. Additionally, it will be important to assess the feasibility of health department staff managing the automated call and participant retention portions of the program while also delivering the classes and teach-back and missed class calls. Finally, testing the implementation and effectiveness of the program with different combinations of components (lessons, automated calls, teach-back calls, and missed class calls) would aid in determining the most effective and feasible model.

Limitations

Findings are limited by the small number of health districts and delivery agents included in this study and by the fact that all of the health districts fall within one state health department. While this may impact generalizability, it is important to note that our mixed-methods design allowed for a robust analysis and found differences in implementation experiences and perceptions across districts. Also, the districts represent the targeted region – rural Appalachia – for future dissemination.

Conclusions

Findings suggest SIPsmartER’s group classes and implementation strategy are appropriate and acceptable to health district delivery agents and can be feasibly implemented with high fidelity within the structure of rural, local health districts in Virginia. However, the execution of teach-back and missed class calls, as well as recruitment efforts, are perceived as less appropriate and acceptable and may, therefore, be less feasible to faithfully implement. These findings, in conjunction with those related to the effectiveness of the intervention when implemented in this system, will be used to inform the further translation of SIPsmartER into this system.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

IVR:

Interactive Voice Response

SSB:

Sugar-sweetened beverages

VDH:

Virginia Department of Health

References

  1. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.


  2. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin Pol Ment Health. 2011;38(2):65–76.


  3. Zoellner J, Chen Y, Davy B, You W, Hedrick V, Corsi T, et al. Talking health, a pragmatic randomized-controlled health literacy trial targeting sugar-sweetened beverage consumption among adults: rationale, design & methods. Contemp Clin Trials. 2014;37(1):43–57.


  4. Zoellner J, Hedrick V, You W, Chen Y, Davy BM, Porter KJ, et al. Effects of a behavioral and health literacy intervention to reduce sugar-sweetened beverages: a randomized-controlled trial. Int J Behav Nutr Phys Act. 2016;13(1):38.


  5. Ajzen I. The theory of planned behavior. Organ Behav Human Decis Process. 1991;50:179–211.


  6. National Research Council. Health literacy: A prescription to end confusion. Washington, D.C.: The National Academies Press; 2004.


  7. Vargas-Garcia EJ, Evans CEL, Prestwich A, Sykes-Muskett BJ, Hooson J, Cade JE. Interventions to reduce consumption of sugar-sweetened beverages or increase water intake: evidence from a systematic review and meta-analysis. Obes Rev. 2017;18(11):1350–63.


  8. Ostbye T, Krause KM, Stroo M, Lovelady CA, Evenson KR, Peterson BL, et al. Parent-focused change to prevent obesity in preschoolers: results from the KAN-DO study. Prev Med. 2012;55(3):188–95.


  9. Ostbye T, Zucker NL, Krause KM, Lovelady CA, Evenson KR, Peterson BL, et al. Kids and adults now! Defeat obesity (KAN-DO): rationale, design and baseline characteristics. Contemp Clin Trials. 2011;32(3):461–9.


  10. National Cancer Institute. Research-Tested Intervention Programs (RTIPs) 2018 [Available from: https://rtips.cancer.gov/rtips/index.do.


  11. Lauby-Secretan B, Scoccianti C, Loomis D, Grosse Y, Bianchini F, Straif K, et al. Body fatness and Cancer--viewpoint of the IARC working group. N Engl J Med. 2016;375(8):794–8.


  12. Malik VS, Pan A, Willett WC, Hu FB. Sugar-sweetened beverages and weight gain in children and adults: a systematic review and meta-analysis. Am J Clin Nutr. 2013;98(4):1084–102.


  13. Cheungpasitporn W, Thongprayoon C, Edmonds PJ, Srivali N, Ungprasert P, Kittanamongkolchai W, et al. Sugar and artificially sweetened soda consumption linked to hypertension: a systematic review and meta-analysis. Clin Exp Hypertens. 2015;37(7):587–93.


  14. Malik VS, Popkin BM, Bray GA, Despres JP, Hu FB. Sugar-sweetened beverages, obesity, type 2 diabetes mellitus, and cardiovascular disease risk. Circulation. 2010;121(11):1356–64.


  15. Bernabe E, Vehkalahti MM, Sheiham A, Aromaa A, Suominen AL. Sugar-sweetened beverages and dental caries in adults: a 4-year prospective study. J Dent. 2014;42(8):952–8.


  16. Popkin BM, Armstrong LE, Bray GM, Caballero B, Frei B, Willett WC. A new proposed guidance system for beverage consumption in the United States. Am J Clin Nutr. 2006;83(3):529–42.


  17. Kit BK, Fakhouri TH, Park S, Nielsen SJ, Ogden CL. Trends in sugar-sweetened beverage consumption among youth and adults in the United States: 1999-2010. Am J Clin Nutr. 2013;98(1):180–8.


  18. Rosinger A, Herrick K, Gahche J, Park S. Sugar-sweetened Beverage Consumption Among U.S. Adults, 2011-2014. NCHS Data Brief. 2017;(270):1–8.

  19. Zoellner J, You W, Connell C, Smith-Ray RL, Allen K, Tucker KL, et al. Health literacy is associated with healthy eating index scores and sugar-sweetened beverage intake: findings from the rural lower Mississippi Delta. J Am Diet Assoc. 2011;111(7):1012–20.


  20. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.


  21. Wandersman A, Chien VH, Katz J. Toward an evidence-based system for innovation support for implementing innovations with quality: tools, training, technical assistance, and quality assurance/quality improvement. Am J Community Psychol. 2012;50(3–4):445–59.


  22. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3–4):171–81.


  23. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.


  24. Edmunds JM, Beidas RS, Kendall PC. Dissemination and implementation of evidence-based practices: training and consultation as implementation strategies. Clin Psychol (New York). 2013;20(2):152–65.


  25. Virginia Department of Health. Virginia Health Opportunity Index 2018 [Available from: https://www.vdh.virginia.gov/omhhe/hoi/dashboards/counties.


  26. Health Resources & Services Administration. Shortage Areas 2018 [Available from: https://data.hrsa.gov/topics/health-workforce/shortage-areas.


  27. USDA Economic Research Service. Rural-Urban Continuum Codes 2016 [Available from: https://www.ers.usda.gov/data-products/rural-urban-continuum-codes/.


  28. Rogers EM. Diffusion of innovations. New York: Free Press; 2003.


  29. Bailey AN, Porter KJ, Hill JL, Chen Y, Estabrooks PA, Zoellner JM. The impact of health literacy on rural adults' satisfaction with a multi-component intervention to reduce sugar-sweetened beverage intake. Health Educ Res. 2016;31(4):492–508.


  30. Porter K, Chen Y, Estabrooks P, Noel L, Bailey A, Zoellner J. Using teach-Back to understand participant behavioral self-monitoring skills across health literacy level and behavioral condition. J Nutr Educ Behav. 2016;48(1):20–6 e1.


  31. Porter KJ, Alexander R, Perzynski KM, Kruzliakova N, Zoellner JM. Using the clear communication index to improve materials for a behavioral intervention. Health Commun. 2019;34(7):782–788.


  32. Estabrooks P, You W, Hedrick V, Reinholt M, Dohm E, Zoellner J. A pragmatic examination of active and passive recruitment methods to improve the reach of community lifestyle programs: the talking health trial. Int J Behav Nutr Phys Act. 2017;14:7.


  33. Nadeem E, Gleacher A, Beidas RS. Consultation as an implementation strategy for evidence-based practices across multiple contexts: unpacking the black box. Admin Pol Ment Health. 2013;40(6):439–50.


  34. Palinkas LA, Horwitz SM, Chamberlain P, Hurlburt MS, Landsverk J. Mixed-methods designs in mental health services research: a review. Psychiatr Serv. 2011;62(3):255–63.


  35. de Jonge T, Veenhoven R, Arends L. Homogenizing responses to different survey questions on the same topic: proposal of a scale homogenization method using a reference distribution. Soc Indic Res. 2014;117:275–300.


  36. Creswell JW. Qualitative inquiry and research design: choosing among five approaches. 2nd ed. Thousand Oaks: Sage; 2007.


  37. Saldana J. The coding manual for qualitative researchers. 2nd ed. Thousand Oaks: Sage; 2013.


  38. Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8:65.


  39. Hosler AS, Zeinomar N, Asare K. Diabetes-related services and programs in small local public health departments, 2009-2010. Prev Chronic Dis. 2012;9:E07.


  40. Stamatakis KA, Leatherdale ST, Marx CM, Yan Y, Colditz GA, Brownson RC. Where is obesity prevention on the map?: distribution and predictors of local health department prevention activities in relation to county-level obesity prevalence in the United States. J Public Health Manag Pract. 2012;18(5):402–11.


  41. Chen ZA, Roy K, Gotway Crawford CA. Obesity prevention: the impact of local health departments. Health Serv Res. 2013;48(2 Pt 1):603–27.


  42. Parks RG, Tabak RG, Allen P, Baker EA, Stamatakis KA, Poehler AR, et al. Enhancing evidence-based diabetes and chronic disease control among local health departments: a multi-phase dissemination study with a stepped-wedge cluster randomized trial component. Implement Sci. 2017;12(1):122.


  43. Porter KJ, Perzynski KM, Hecht ER, Kruzliakova N, Zoellner JM. Did the approach used to recruit rural adults into a behavioral intervention impact recruitment, enrollment, and engagement? Poster presented at. New Orleans: 39th Annual Meeting & Scientific Sessions of the Society of Behavioral Medicine; 2018.

  44. Estabrooks PA, Boyle M, Emmons KM, Glasgow RE, Hesse BW, Kaplan RM, et al. Harmonized patient-reported data elements in the electronic health record: supporting meaningful use by primary care action on health behaviors and key psychosocial factors. J Am Med Inform Assoc. 2012;19(4):575–82.


  45. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015;10:21.


  46. Allen P, Sequeira S, Best L, Jones E, Baker EA, Brownson RC. Perceived benefits and challenges of coordinated approaches to chronic disease prevention in state health departments. Prev Chronic Dis. 2014;11:E76.


  47. Brownson RC, Ballew P, Kittur ND, Elliott MB, Haire-Joshu D, Krebill H, et al. Developing competencies for training practitioners in evidence-based cancer control. J Cancer Educ. 2009;24(3):186–93.


  48. Allen P, Jacob RR, Lakshman M, Best LA, Bass K, Brownson RC. Lessons learned in promoting evidence-based public health: perspectives from managers in state public health departments. J Community Health. 2018;43(5):856–63.


  49. Duggan K, Aisaka K, Tabak RG, Smith C, Erwin P, Brownson RC. Implementing administrative evidence based practices: lessons from the field in six local health departments across the United States. BMC Health Serv Res. 2015;15:221.


  50. Hu H, Allen P, Yan Y, Reis RS, Jacob RR, Brownson RC. Organizational supports for research evidence use in state public health agencies: a latent class analysis. J Public Health Manag Pract. 2018. [Epub ahead of print]

  51. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.



Acknowledgements

Not applicable

Funding

NIH/NCI R21-CA202013 (PI: Zoellner). Funders did not have a role in the design, collection, analysis, or interpretation of data, or in the writing of this manuscript.

Author information

Authors and Affiliations

Authors

Contributions

JMZ, KJP, PAE, ESC conceptualized the study; KJP, DJB, KMP, EH, PR, NK, ESC executed the intervention; KJP, PR, NK, ESC, JMZ developed the implementation strategy; KJP, DJB, KMP, EH, NK collected data; KJP, DJB, KMP, EH analyzed data; and all authors provided feedback and approved the final manuscript.

Corresponding author

Correspondence to Kathleen J. Porter.

Ethics declarations

Ethics approval and consent to participate

Study procedures were approved by the Institutional Review Boards of Virginia Tech, University of Virginia, and Virginia Department of Health. Written informed consent was obtained from health department staff.

Consent for publication

Not applicable

Competing interests

The authors declare they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Post-training survey and interview guide. Data reported in Table 4 and Table 5. These measures captured delivery agents’ perceptions of the appropriateness of SIPsmartER and its components within the health district and their regular job functions, as well as their confidence to complete delivery expectations and implementation strategies and the perceived feasibility of doing so. (PDF 81 kb)

Additional file 2:

Fidelity checklists. Data reported in Table 3. Fidelity checklists captured the degree to which a specific lesson’s activities were completed and if the activity was modified. (PDF 142 kb)

Additional file 3:

Post-cohort survey and interview guide. Data reported in Table 4 and Table 5. These measures assessed feasibility and acceptability of SIPsmartER, its components, and the implementation strategy as well as fidelity to delivery expectations. (PDF 220 kb)

Additional file 4:

Capacity Survey. Data reported in Table 5. This survey assessed the acceptability and appropriateness of maintaining SIPsmartER in delivery agents’ health districts, including the resources needed to sustain the program. (PDF 35 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Porter, K.J., Brock, D.J., Estabrooks, P.A. et al. SIPsmartER delivered through rural, local health districts: adoption and implementation outcomes. BMC Public Health 19, 1273 (2019). https://doi.org/10.1186/s12889-019-7567-6



  • DOI: https://doi.org/10.1186/s12889-019-7567-6
