The iPRISM webtool: an interactive tool to pragmatically guide the iterative use of the Practical, Robust Implementation and Sustainability Model in public health and clinical settings

Abstract

Background

To increase uptake of implementation science (IS) methods by researchers and implementers, many have called for ways to make it more accessible and intuitive. The purpose of this paper is to describe the iPRISM webtool (Iterative, Practical, Robust Implementation and Sustainability Model) and how this interactive tool operationalizes PRISM to assess and guide a program’s (a) alignment with context, (b) progress on pragmatic outcomes, (c) potential adaptations, and (d) future sustainability across the stages of the implementation lifecycle.

Methods

We used an iterative human-centered design process to develop the iPRISM webtool.

Results

We conducted user testing with 28 potential users, both English and Spanish speaking, who worked as individuals or on teams in diverse settings and were at various stages of implementing different types of programs. Users provided input on all aspects of the webtool, including its purpose, content, assessment items, visual feedback displays, navigation, and potential application. Participants generally expressed interest in using the webtool and a high likelihood of recommending it to others. The iPRISM webtool guides English- and Spanish-speaking users through the process of iteratively applying PRISM across the lifecycle of a program to facilitate systematic assessment and alignment with context. The webtool summarizes assessment responses in graphical and tabular displays and then guides users to develop feasible and impactful adaptations and corresponding action plans. Equity considerations are integrated throughout.

Conclusions

The iPRISM webtool can intuitively guide individuals and teams from diverse settings through the process of using IS methods to iteratively assess and adapt different types of programs to align with the context across the implementation lifecycle. Future research and application will continue to develop and evaluate this IS resource.

Introduction

Implementation science (IS) facilitates and studies the translation of relevant, sustainable, and reproducible evidence-based programs (EBP) into routine settings [1,2,3,4,5]. IS theories, models, and frameworks (TMFs) and methods aim to enhance the value, relevance, and impact of research and improve the translation of evidence to practice [6, 7]. A key predictor of an EBP’s uptake, implementation, and sustainment in real-world settings is fit, or alignment, with the context [3, 8, 9]. Lack of fit makes research less relevant; this oversight in much research has contributed to what many refer to as the “leaky pipeline,” in which it takes 17 years for only 14% of evidence to ever reach routine practice settings [2, 10]. IS models and methods aim to improve alignment between an EBP and context in real-world settings in a manner that can be reproduced in other real-world settings, helping to address the “failure to replicate” crisis [11]. Some evidence suggests that by applying IS models and methods, the lag in translating evidence to practice can be decreased to as little as 3–5 years and the amount of research integration can be increased to as high as 80% [12, 13], which can enable rapid learning health systems [14].

Despite its promise and potential benefits, a key reason for the lack of IS uptake by the broad community of researchers and implementers lies in its complexity, which can make it difficult for newcomers to grasp [15, 16]. For example, the growing number of TMFs and issues related to inconsistent and niche jargon make it difficult for the broad community of researchers and implementers to understand, synthesize, or apply TMF concepts to elevate the impact of their research [7, 17,18,19,20,21,22]. Furthermore, TMFs generally offer limited guidance on how to operationalize them or make them actionable. Many TMFs and methods have also become overly complex or are perceived as inflexible, lacking the pragmatic characteristics that make their application feasible or practical across diverse types of EBPs [15]. As it grew, the field of IS—which prioritizes real-world relevance—may have unintentionally made its methods difficult for the broad community of researchers and implementers to understand or apply [15].

To improve the uptake of IS TMFs and methods, there is a clear need to make IS easier to apply and more intuitive for diverse researchers and implementers who are not IS experts. By making IS TMFs more digestible to the broader community, there is greater potential to mend the “leaky pipeline” and increase the relevance and impact of research. Others have recently published on such tools that aim to simplify the process of applying IS TMFs and methods. These tools include a questionnaire to guide self-assessment of contextual alignment [23], questionnaires to assess context using a specific TMF [24, 25], and a stepwise process for planning clinical trials [26].

The increasing number of initiatives to simplify IS TMFs and methods is encouraging, but further work is needed to provide comprehensive guidance across all stages of implementation and for diverse audiences in ways that allow for flexible, pragmatic application across diverse EBPs, especially if a goal is to create generalizable and sustainable programs [3, 27]. Because context changes dynamically, such efforts should support iterative consideration of contextual alignment longitudinally across the lifecycle of a program’s planning, implementation, and sustainment stages [28, 29].

We addressed these issues by creating an interactive webtool for diverse audiences to iteratively apply an IS TMF, the Practical, Robust Implementation and Sustainability Model (PRISM) [9]. PRISM was selected for multiple reasons. First, PRISM is intended to be used pragmatically and iteratively across the lifecycle of an EBP to maximize fit to the multilevel context while considering the impact on pragmatic clinical and implementation outcomes [9, 30, 31]. Its inclusion of both contextual determinants and implementation outcome measures (RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance) facilitates consideration of the interrelationship between contextual alignment and outcomes [9, 30, 32, 33]. PRISM is the contextually expanded version of the RE-AIM framework, and the RE-AIM dimensions serve as PRISM’s outcomes. RE-AIM considers key implementation and effectiveness outcomes from diverse perspectives and their representativeness. Because RE-AIM is the most widely used IS evaluation TMF [31, 32, 34], the breadth of familiarity with it may make the contextually expanded PRISM easier to digest and understand. PRISM’s contextual domains also consider multisectoral representation to promote representativeness and equity, which are key considerations for any EBP [31, 35, 36].

We selected a webtool as the format for this work because of its public accessibility, its relative ease of sustainment and modification once developed, and its automation capabilities. Through automation, a webtool can be interactive and guide users through the process of applying IS methods while also offering real-time, individualized feedback to improve their EBP’s contextual alignment.

The purpose of this article is to describe the iPRISM webtool (iterative, Practical, Robust Implementation and Sustainability Model), findings from our human-centered design process, and how this interactive tool operationalizes PRISM to guide and assess an EBP’s (a) alignment with context, (b) progress on implementation outcomes, (c) required adaptations, and (d) future sustainability and scalability across the implementation lifecycle. We provide suggestions of how the webtool can be used, discuss its strengths and limitations, and make suggestions for future research and practice. In future work, we will share the findings from ongoing evaluations of the webtool’s usability and impact across diverse settings, content areas, EBPs, and users.

Methods

We applied a human-centered design process to co-create an interactive IS webtool that would be broadly applicable across EBPs for diverse types of individual users and implementation teams. The goals of the webtool were to pragmatically guide users through the process of iteratively assessing, aligning, and adapting EBPs and implementation strategies to both the current context and progress on outcomes, thereby optimizing uptake, implementation, and sustainment.

Development of PRISM assessment items

As described above, the PRISM framework, which includes the pragmatic RE-AIM outcomes, was selected to address these goals. Figure 1 provides an overview of PRISM, which consists of the multilevel PRISM context domains and the RE-AIM outcomes. The six context domains include (1) organizational or setting characteristics, (2) patient or community characteristics, (3) patient or community perspectives on the EBP, (4) organizational perspectives on the EBP, (5) the implementation and sustainability infrastructure, and (6) the external environment. The RE-AIM outcomes include the five dimensions of reach, effectiveness, adoption, implementation, and maintenance. To give users direction on how to operationalize the PRISM context domains and RE-AIM dimensions, a set of prompting assessment questions was developed. These assessment items were initially developed by the research team, informed by our prior experience applying PRISM and RE-AIM in diverse contexts [37,38,39], and then iteratively refined throughout the human-centered design process. The purpose of these assessment items is to capture the general perceptions of individual users regarding potential areas for improvement and to facilitate discussion among teams. The first set of assessment items was drafted by two members of the research team (RG and BR). These items were created to align with the PRISM context domains and RE-AIM outcomes and were formulated to be broadly applicable across types of contexts, populations, and interventions. Members of the national RE-AIM Working Group [40] provided the first round of feedback for the refinement of the items. Most changes at this stage concerned the wording of the items for clarity and the most appropriate anchors for the response scale. During the human-centered design process, feedback from participants on the wording and response options for the items was documented. Differing perceptions across team members are anticipated and valued, and there are no “right answers or ultimate criterion” against which to validate responses; thus, interrater reliability testing was not relevant.

Fig. 1 Overview of the PRISM framework, which includes the context domains and RE-AIM outcome dimensions. Adapted from Feldstein and Glasgow [9]. On the left, the figure describes how PRISM is used to assess and align an intervention or program’s characteristics with the characteristics and perspectives of multiple levels of partners, including patients or community members and the organizational personnel (e.g., leadership, managers, staff), as well as the implementation and sustainability infrastructure (e.g., resources) and external environment (e.g., policy, guidelines). The figure also demonstrates how contextual alignment of an intervention influences the impact or outcomes. On the right of the figure are PRISM’s RE-AIM outcomes of Reach, Effectiveness, Adoption, Implementation, and Maintenance, which have interdependencies

Human-centered design process

We convened a trans-disciplinary research team to guide the design process, which included English and Spanish speakers with expertise in IS, clinical informatics, behavioral science, human factors engineering, computer science, public health, global health, health equity, pharmacy, and health care. The research team engaged multisectoral individuals and teams of researchers and practitioners who would be likely users of the webtool. Researchers and practitioners were identified using convenience and snowball sampling and represented a wide range of perspectives and types of EBPs, including government, public health, chronic and acute health care, and community settings, as well as different levels of IS expertise (none to expert).

The human-centered co-creation process consisted of two consecutive, iterative phases that led to the current version of the webtool: (1) design and (2) usability testing. Each of these phases included use of progressively higher fidelity prototypes. The various prototypes allowed us to nimbly refine the webtool within our resource constraints, notably given the expense of web developer time. In each of these phases, users were asked to follow the “think aloud” method [41] and simulate or apply the webtool to an EBP they were familiar with. The think aloud protocol involves participants verbalizing their actions and thoughts throughout to gain insight into their thought processes [41].

The simulations represented EBPs at various stages, including pre-implementation planning, implementation, and sustainment. At the end of each simulation, users were asked semi-structured questions about the experience and how they might use the actual webtool. These usability questions assessed the likelihood of using or recommending the webtool to others, whether more direction or instruction was needed to use it, and whether any aspects of the webtool were confusing. Between each simulation, the research team discussed the findings and made changes to the prototype as appropriate. Decisions about whether to make changes balanced user requests with resource availability and evidence-based principles of human-computer interaction, which consider the impact of usability errors on user experience and the usefulness of a technology [42].

Design testing entailed the research team developing initial, low-fidelity Excel-based prototypes of the webtool. The Excel-based prototypes consisted mostly of static images and allowed for minimal interactivity. The focus of design testing was to validate the general direction of the planned user experience and refine the content, including wording of the assessment questions and the types of graphical feedback displays. Because of the low fidelity of the prototype at this phase, participants had limited ability to simulate use of the webtool, but they were asked to think aloud as they reviewed the prototype, followed by a semi-structured discussion.

Usability testing included higher-fidelity Adobe XD prototypes of the webtool that allowed for more interactivity and functionality (e.g., hover states, clickable links, toggling between pages). The higher-fidelity prototype more closely resembled what the actual webtool might look like and allowed users to simulate how they would apply the webtool to an actual EBP with minimal input required from the research team. The purpose of usability testing was to identify any ergonomic issues and optimize ease of use, which included considerations of flow and identification of usability errors. Usability errors are defined as characteristics of the webtool that cause confusion or limit its potential to assist users in applying PRISM. At this stage of testing with prototypes, usability was evaluated qualitatively and did not include validated quantitative assessments.

Based on input from design and usability testing, an external web developer (Insight Designs LLC; Boulder, CO) built the current version of the webtool. The webtool will undergo additional usability and user testing, including validated assessments to quantify usability, acceptability, and feasibility of the actual webtool, which will be described in detail in a future paper. In the spirit of rapid dissemination and agile design, the current version of the webtool is publicly available for use and is described in detail here.

Results

Based on feedback from 28 potential target users, including those with and without IS or PRISM expertise, we refined the webtool’s purpose, content, navigation, target audience, and visual displays to optimize the user experience. Table 1 summarizes the types of feedback and usability errors identified during the think aloud sessions and semi-structured discussions, as well as the rationale for whether changes were made based on the research team’s discussion. When asked about the likelihood of using or recommending the webtool to others, users generally expressed interest in using the webtool and a high likelihood of recommending it to others. During simulated usability testing with a high-fidelity prototype, users were generally able to complete the webtool without additional support or clarification.

Table 1 Description of feedback or usability errors from the user-centered design process and decisions made

Description of the webtool

The iPRISM webtool is publicly available and can be found at https://prismtool.org. After a brief video introduction and summary of PRISM, the webtool guides users through the process of aligning an EBP with context to maximize impact during the planning, implementation, and sustainment stages of a program. It is available in English and Spanish and can be used by individuals or teams for diverse public health and healthcare-related EBPs. The same PRISM context and RE-AIM outcome assessment items are used across the planning, implementation, and sustainment stages, with minor modifications to the wording across stages. Users are encouraged to use the webtool early in the planning stage and repeatedly during implementation and sustainment, in the spirit of designing for dissemination, equity, and sustainment [43]. The webtool can also be used separately for any stage (i.e., planning, implementation, sustainment). Before using the webtool, users should already have an idea of the context, intervention, and intended outcomes of interest.

To support users with varying degrees of IS experience while also preserving streamlined user interfaces, the webtool includes embedded education and training, including a video tutorial and use of optional links and info buttons with hover effects for additional information and examples (e.g., more information on PRISM or RE-AIM). Upon completing the webtool, users are provided tabular and graphical summaries of their assessments of fit to context and impact on outcomes as well as a prioritized list of feasible and impactful implementation strategies and formal action plans for accountability. A menu bar allows users to efficiently toggle back and forth across these components and review or modify their content or their responses.

The webtool is organized into four sequential steps: Step 1: Set up; Step 2: Assessment of context and impact on outcomes; Step 3: Review of assessment results; and Step 4: Identification and prioritization of implementation strategies and action planning. Whether users complete the webtool as individuals or as teams, they complete each of these four steps, which span multiple web pages. Teams complete an additional Step 5: Team results report. Each step is described below and in greater detail in Table 2.

Table 2 Description of the iPRISM Webtool steps

Step 1: Set up

This step orients the user to the webtool, asks questions about their EBP (e.g., name, setting) and how they are completing the tool (individually or as a team; stage of their EBP), and allows them to select the Spanish version. Responses to many of these questions dictate how the data entered will be stored and ultimately presented to the user.
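To make this concrete, the following is a minimal sketch of how such Step 1 responses might be represented; the field names and value sets are illustrative assumptions, not the webtool’s actual data model.

```typescript
// Hypothetical shape of the Step 1 setup responses; every name here is an
// assumption for illustration. The point is that these choices can drive
// how data are stored and displayed (e.g., routing team users to Step 5).
interface SetupConfig {
  programName: string;                                  // name of the EBP
  setting: string;                                      // e.g., clinic, community
  mode: "individual" | "team";                          // how the tool is completed
  stage: "planning" | "implementation" | "sustainment"; // stage of the EBP
  language: "en" | "es";                                // English or Spanish version
}
```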

Step 2: Assessment of context and impact on outcomes

This step guides the user to systematically consider the multilevel context (e.g., the role of community members or patients and family; implementation staff; supervisors or decision makers; the larger organizational setting, community, and policy) based on the PRISM context domains and to evaluate the estimated relationship between the contextual alignment of the EBP and the perceived or actual impact on outcomes. Impact is based on the RE-AIM outcome measures (e.g., equitable reach, implementation, sustainment). The webtool includes 21 questions to operationalize the PRISM context domains and RE-AIM outcomes, with specific consideration of representativeness and equity by asking about multilevel perspectives on the context and intervention as well as representativeness of outcomes. Figure 2 provides an illustration of the questions and slider bar response options. Additional file 1 includes the full list of the itemized questions for each stage of implementation.

Fig. 2 Illustration of the assessment questions and slider bars. Depicted here are two assessment questions for the RE-AIM outcome dimensions
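As one way to picture how such items and slider responses could be structured, here is a minimal TypeScript sketch; the category labels, identifiers, prompt wording, and 0–100 slider range are our assumptions for illustration, not the iPRISM implementation.

```typescript
// Hypothetical model of an iPRISM-style assessment item and its slider
// response; all names and the 0-100 range are illustrative assumptions.
type Category =
  | "organizational characteristics"
  | "patient/community characteristics"
  | "patient/community perspectives on the EBP"
  | "organizational perspectives on the EBP"
  | "implementation and sustainability infrastructure"
  | "external environment"
  | "reach"
  | "effectiveness"
  | "adoption"
  | "implementation"
  | "maintenance";

interface AssessmentItem {
  id: string;
  category: Category; // PRISM context domain or RE-AIM dimension
  prompt: string;     // stage-specific wording of the question
  response?: number;  // slider position, assumed 0 (low) to 100 (high)
}

const exampleItem: AssessmentItem = {
  id: "reach-1",
  category: "reach",
  prompt: "How well will the program equitably reach those who could benefit?",
  response: 62,
};
```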

Step 3: Review of assessment results

In this step, the responses to the assessment questions are displayed in graphical format. Based on feedback during the design process, we selected a radar bar chart as the primary graphical display (Fig. 3), in which the PRISM and RE-AIM results are displayed side by side to facilitate consideration of the relationship between the contextual alignment of an EBP (PRISM) and outcomes (RE-AIM).

Fig. 3 Illustration of the radar bar graphs that summarize a user’s responses. A user can opt to (1) view alternative displays (table and bar graph format), (2) export and print the figures, and (3) hover over an area of the radar bar graph to see additional details of the questions and their responses
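A minimal sketch of how such a display could be driven, assuming the hypothetical AssessmentItem model above: slider responses are averaged within each PRISM domain or RE-AIM dimension to yield one value per radar-chart axis. The webtool’s actual aggregation method is not specified here, so the mean below is an assumption.

```typescript
// Average responses per category to produce one radar-chart axis value
// per PRISM domain or RE-AIM dimension; using the mean is an assumption.
function radarScores(items: AssessmentItem[]): Map<Category, number> {
  const sums = new Map<Category, { total: number; n: number }>();
  for (const item of items) {
    if (item.response === undefined) continue; // skip unanswered items
    const s = sums.get(item.category) ?? { total: 0, n: 0 };
    s.total += item.response;
    s.n += 1;
    sums.set(item.category, s);
  }
  const scores = new Map<Category, number>();
  for (const [category, { total, n }] of sums) scores.set(category, total / n);
  return scores;
}
```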

Step 4: Identification and prioritization of implementation strategies and action planning

In this step, the user is guided through the process of using their assessment scores to identify impactful and feasible implementation strategies that can improve contextual alignment and impact of their EBP. The definition of impactful implementation strategies includes consideration of representativeness or equity of outcomes.

The objectives of this step differ slightly depending on the stage of EBP implementation. In the planning stage, the webtool directs the user to develop strategies that optimize the initial contextual alignment of an EBP with the local setting before it is deployed, with consideration of the anticipated impact on outcomes. During the implementation stage, the webtool directs the user to identify strategies for mid-course adaptations based on any changes in context and the user’s reported or perceived progress on outcomes. For the sustainment stage, the webtool directs users to identify strategies that could improve ongoing maintenance and sustainability of the EBP based on current and anticipated progress on desired outcomes.

This step is split into four sub-sections on one page, in which responses to one section dynamically auto-populate subsequent sections to guide users through identifying, rating, and prioritizing strategies or adaptations. As illustrated in Fig. 4, the webtool prompts users to prioritize the strategies with the highest feasibility and impact ratings, which are displayed in the upper right quadrant of a scatterplot. After reviewing the scatterplot, users are encouraged to consider adjusting their implementation strategies if the impact and feasibility ratings are suboptimal. Changes to the implementation strategies dynamically update the subsequent sections. At the end of this component, users are offered a template to create a formal action plan for the implementation strategies they prioritized based on feasibility and impact. When users are done, they can select a button to complete their assessment, which ends their session.

Fig. 4 Illustration of the scatterplot that assists a user in prioritizing strategies based on impact and feasibility ratings
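An illustrative sketch of the prioritization logic the scatterplot supports: strategies rated high on both impact and feasibility (the upper right quadrant) rise to the top. The 1–5 rating scale and the quadrant threshold below are assumptions, not documented webtool behavior.

```typescript
// Rank strategies by combined impact x feasibility; the 1-5 scale and the
// threshold of 4 (defining the "upper right quadrant") are assumptions.
interface Strategy {
  description: string;
  impact: number;      // user rating, assumed 1 (low) to 5 (high)
  feasibility: number; // user rating, assumed 1 (low) to 5 (high)
}

function prioritize(strategies: Strategy[], threshold = 4): Strategy[] {
  return strategies
    .filter((s) => s.impact >= threshold && s.feasibility >= threshold)
    .sort((a, b) => b.impact * b.feasibility - a.impact * a.feasibility);
}
```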

Step 5 (only for teams): Team results report

Users who complete the webtool as part of a team will have the option of viewing and exporting their team’s responses. Figure 5 includes an example of what the team report includes.

Fig. 5 Illustration of the team summary report. Depicted here is the PRISM team summary report. The RE-AIM results are also summarized similarly for teams
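One plausible way a team report could be computed, sketched under the assumption that each member’s slider responses are stored per item: report the mean and range for each item to surface agreement and divergence across members. The statistics used in the actual report are not detailed in this paper.

```typescript
// Per-item team summary: mean, min, and max of members' responses.
// responsesByMember[m][i] = member m's response to item i (assumed layout).
function teamSummary(
  responsesByMember: number[][]
): { mean: number; min: number; max: number }[] {
  const nItems = responsesByMember[0]?.length ?? 0;
  return Array.from({ length: nItems }, (_, i) => {
    const values = responsesByMember.map((member) => member[i]);
    return {
      mean: values.reduce((a, b) => a + b, 0) / values.length,
      min: Math.min(...values),
      max: Math.max(...values),
    };
  });
}
```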

Discussion

The iPRISM webtool advances IS by making its TMFs and approaches more user-friendly for the broad community of researchers and implementers with and without IS expertise. The webtool aims to simplify the theoretical PRISM context domains and RE-AIM outcomes by using actionable assessment questions and then guiding users through the process of identifying and prioritizing strategies to align and adapt an EBP with the context (and with their progress on outcomes when used in later stages). By focusing on this broad audience, the webtool has the potential to result in greater adoption of IS TMFs and methods. For experienced implementation scientists, the iPRISM webtool may be used to support grant submissions or as a resource when collaborating with implementation teams with minimal to no IS experience.

The interactive feedback and guidance provided by the iPRISM webtool are similar in nature to the Program and Clinical Sustainability Assessment Tools (PSAT/CSAT) [44, 45]. While the PSAT/CSAT focus on assessing the sustainability of an EBP [44, 45], our webtool focuses on how EBP contextual alignment can be leveraged to optimize not just sustainability but also other implementation and effectiveness outcomes (e.g., reach, adoption, effectiveness). Our webtool is not the first tool to guide assessment, alignment, and adaptation of an EBP to the implementation context [23,24,25,26, 46, 47], but it is, to our knowledge, the first web-based interactive resource. In addition, and in contrast to other available tools, it integrates context with outcomes and is fully automated. The interactive design of our webtool aims to improve ease of use by guiding users through the process with prompts, and it makes the process more efficient by summarizing assessments with individualized feedback on how to improve contextual alignment and outcomes. Other strengths of our webtool are the guidance to identify, evaluate, and prioritize impactful implementation strategies as well as the prompts to create formal action plans for execution and accountability.

Although the web-based format of our tool is a strength, it does present some notable challenges. First, the cost of contracting with a web developer to build the webtool was relatively high, totaling an estimated $100,000. Table 3 illustrates the varying costs of different features of the webtool and translation. It was also difficult to identify a compatible web developer that was both within our budget and possessed the skillset we desired. We sought a web developer that not only had a track record of developing well-designed websites but could also provide expertise in best practices in web design, including data security standards, accessibility, and human-computer interaction. Ultimately, we found a web developer that met our needs, and we augmented their skillset by embedding within our research team an academic expert in human-computer interaction, data visualization, and software design (author SD). While the initial cost of developing the webtool was high, ongoing maintenance is anticipated to be low, limited to website domain and hosting costs and the research team’s in-kind time. To facilitate sustainability, the web developers strategically built the website to grant the research team administrative access and the ability to make changes over time, thus averting the need for ongoing maintenance costs from the web developer.

Table 3 Description of web development costs for different features of the webtool

Limitations

There are also limitations to the current webtool. Although we aimed to create an intuitive tool that could be used across multiple audiences with varying degrees of IS expertise, the webtool would still benefit from further refinements to minimize jargon and to expand its relevance to the global community of researchers and implementers, including different Spanish-speaking contexts and other languages. We tested and refined the webtool with Spanish speakers, with an emphasis on select countries in Latin America, potentially limiting its cultural relevance to other Spanish-speaking settings. It is also unclear how much facilitation from someone with IS expertise different types of webtool users will require, or whether users with limited IS experience have enough guidance from the webtool alone to select implementation strategies. The webtool described here is also not intended to be the final product but rather a living tool that will be refined over time. We have not yet comprehensively tested the usability or acceptability of the webtool. However, we formatively evaluated usability and acceptability with prototypes of the webtool and found them to be satisfactory, based on users’ reported interest in using and recommending the webtool as well as their ability to use the prototypes without additional direction or instruction; thus, we deemed the webtool ready for dissemination. Finally, due to web development cost and time constraints, we were not able to develop all the design features that we and our implementation partners desired, such as a professionally formatted PDF report that includes all figures and tabular results or the ability to save and return to a given assessment when using the webtool iteratively.

In future work, we will continue to refine the webtool to make it intuitive for diverse audiences in different settings. We are actively continuing user testing of the webtool and will continually update the webtool based on this work. As we are able, we will continue to add advanced design features, including features that will allow the user to customize the tool based on their preferences. We will also prioritize adding design features and automation that make the user experience more intuitive and efficient, in addition to continually de-jargonizing the language and content. To appeal to the informational needs of our broad intended audience, we will also continue to embed additional, optional training and tutorials, including examples of how the webtool can be used. Other areas for future work include testing the webtool under different conditions (e.g., with or without a facilitator for individuals or teams), assessing how and when the tool is used over time, and evaluating the impact of using the tool on various outcomes (e.g., user perceptions, project specific primary aims, RE-AIM outcomes, social and equity impacts).

Conclusion

In summary, we have created a new webtool designed to make it easy for diverse researchers and implementers to assess, align, and adapt EBPs to a specific clinical or public health issue using PRISM. By simplifying the use of PRISM and making it more actionable, this webtool is anticipated to increase uptake of this IS framework by more diverse audiences of researchers and practitioners, thereby resulting in more research that is relevant, reproducible, and sustainable. As IS advances, there is a clear need for ongoing development of this webtool and similar resources to make TMFs more intuitive and approachable. Future work should prioritize development and evaluation of user-friendly approaches to apply other TMFs and to guide sustainment and adaptations of EBPs.

Availability of data and materials

https://prismtool.org/

Abbreviations

IS: Implementation science
iPRISM: Iterative Practical, Robust Implementation and Sustainability Model
PRISM: Practical, Robust Implementation and Sustainability Model
TMFs: Theories, models, and frameworks
EBP: Evidence-based programs
RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance
PSAT/CSAT: Program and Clinical Sustainability Assessment Tools

References

  1. Kilbourne AM, Glasgow RE, Chambers DA. What can implementation science do for you? Key success stories from the field. J Gen Intern Med. 2020;35(Suppl 2):783–7. https://doi.org/10.1007/S11606-020-06174-6.

  2. Brownson R, Colditz G, Proctor E. Dissemination and implementation research in health: translating science to practice. 3rd ed. New York: Oxford University Press; 2023.

  3. Shelton RC, Lee M. Sustaining evidence-based interventions and policies: recent innovations and future directions in implementation science. Am J Public Health. 2019;109(S2):S132–4. https://doi.org/10.2105/AJPH.2018.304913.

  4. Glasgow RE, Chambers D. Developing robust, sustainable, implementation systems using rigorous, rapid and relevant science. Clin Transl Sci. 2012;5(1):48–55. https://doi.org/10.1111/J.1752-8062.2011.00383.X.

  5. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012;102(7):1274–81. https://doi.org/10.2105/AJPH.2012.300755.

  6. Moullin JC, Dickson KS, Stadnick NA, et al. Ten recommendations for using implementation frameworks in research and practice. Implement Sci Commun. 2020;1(1). https://doi.org/10.1186/S43058-020-00023-7.

  7. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50. https://doi.org/10.1016/j.amepre.2012.05.024.

  8. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50. https://doi.org/10.1186/1748-5908-4-50.

  9. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34(4):228–43. https://doi.org/10.1016/S1553-7250(08)34030-6.

  10. Densen P. Challenges and opportunities facing medical education. Trans Am Clin Climatol Assoc. 2011;122:48–58. http://www.ncbi.nlm.nih.gov/pubmed/21686208. Accessed 16 Aug 2018.

  11. Huebschmann AG, Leavitt IM, Glasgow RE. Making health research matter: a call to increase attention to external validity. Annu Rev Public Health. 2019;40:45–63. https://doi.org/10.1146/annurev-publhealth-040218-043945.

  12. Harden SM, Balis LE, Strayer T, Wilson ML. Assess, plan, do, evaluate, and report: iterative cycle to remove academic control of a community-based physical activity program. Prev Chronic Dis. 2021;18:E32. https://doi.org/10.5888/pcd18.200513.

  13. Fixsen DL, Blasé KA, Timbers GD, Wolf MM. In search of program implementation: 792 replications of the Teaching-Family Model. Behav Anal Today. 2007;8(1):96. https://doi.org/10.1037/h0100104.

  14. Trinkley KE, Ho PM, Glasgow RE, Huebschmann AG. How dissemination and implementation science can contribute to the advancement of learning health systems. Acad Med. 2022;97(10):1447–58. https://doi.org/10.1097/ACM.0000000000004801.

  15. Beidas RS, Dorsey S, Lewis CC, et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implement Sci. 2022;17(1):1–15. https://doi.org/10.1186/S13012-022-01226-3.

  16. Curran GM. Implementation science made too simple: a teaching tool. Implement Sci Commun. 2020;1(1). https://doi.org/10.1186/S43058-020-00001-Z.

  17. Rabin B, Viglione C, Brownson R. Developing terminology for dissemination and implementation research. In: Brownson R, Colditz G, Proctor EK, editors. Dissemination and implementation research in health. 3rd ed. New York: Oxford University Press; 2023. pp. 27-65.

  18. Colquhoun H, Leeman J, Michie S, et al. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9(1):1–6. https://doi.org/10.1186/1748-5908-9-51/TABLES/1.

  19. Mitchell SA, Chambers DA. Leveraging implementation science to improve cancer care delivery and patient outcomes. J Oncol Pract. 2017;13(8):523. https://doi.org/10.1200/JOP.2017.024729.

  20. Rabin B. Dissemination-Implementation.org. https://dissemination-implementation.org/. Accessed 15 Nov 2022.

  21. Tabak RG, Khoong EC, Chambers D, Brownson RC. Models in dissemination and implementation research: useful tools in public health services and systems research. 2013. http://uknowledge.uky.edu/cgi/viewcontent.cgi?article=1012&context=frontiersinphssr&sei-redir=1&referer=http://www.bing.com/search?q=Models+in+dissemination+and+implementationresearch%3A+useful+tools+in+public+health+services+andsystems+research&form=DLRD. Accessed 18 Mar 2015.

  22. Strifler L, Cardoso R, McGowan J, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102. https://doi.org/10.1016/j.jclinepi.2018.04.008.

  23. Coyle K, Carcone AI, Butame S, Pooler-Burgess M, Chang J, Naar S. Adapting the self-assessment of contextual fit scale for implementation of evidence-based practices in adolescent HIV settings. Implement Sci Commun. 2022;3(1):115. https://doi.org/10.1186/S43058-022-00349-4.

  24. Hunter SC, Kim B, Kitson AL. Mobilising Implementation of i-PARIHS (Mi-PARIHS): development of a facilitation planning tool to accompany the Integrated Promoting Action on Research Implementation in Health Services framework. Implement Sci Commun. 2023;4(1):2. https://doi.org/10.1186/S43058-022-00379-Y.

  25. Robinson CH, Damschroder LJ. A pragmatic context assessment tool (pCAT): using a Think Aloud method to develop an assessment of contextual barriers to change. Implement Sci Commun. 2023;4(1):3. https://doi.org/10.1186/S43058-022-00380-5.

  26. Kowalski CP, Kawentel LM, Kyriakides TC, et al. Facilitating future implementation and translation to clinical practice: the Implementation Planning Assessment Tool for clinical trials. J Clin Transl Sci. 2022;6(1):e131. https://doi.org/10.1017/CTS.2022.467.

  27. Palinkas LA, Spear SE, Mendon SJ, et al. Conceptualizing and measuring sustainability of prevention programs, policies, and practices. Transl Behav Med. 2020;10(1):136–45. https://doi.org/10.1093/TBM/IBZ170.

  28. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117. https://doi.org/10.1186/1748-5908-8-117.

  29. Chambers DA, Norton WE. The Adaptome: advancing the science of intervention adaptation. Am J Prev Med. 2016;51(4):S124–31. https://doi.org/10.1016/j.amepre.2016.05.011.

  30. Rabin BA, Cakici J, Golden CA, Estabrooks PA, Glasgow RE, Gaglio B. A citation analysis and scoping systematic review of the operationalization of the Practical, Robust Implementation and Sustainability Model (PRISM). Implement Sci. 2022;17(1):62. https://doi.org/10.1186/S13012-022-01234-3.

  31. Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting health equity over time. Front Public Health. 2020;8:134. https://doi.org/10.3389/fpubh.2020.00134.

  32. Glasgow RE, Harden SM, Gaglio B, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7(MAR):64. https://doi.org/10.3389/fpubh.2019.00064.

  33. Studts C, Glasgow R. The RE-AIM framework: evolutions and applications in health psychology. In: Brown K, editor. Sage handbook of health psychology. 2nd ed. Thousand Oaks: Sage Publications; 2024.

  34. Vinson C, Stamatakis K, Kerner J. Dissemination and implementation research in community and public health settings. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating research to practice. 2nd ed. New York: Oxford University Press; 2018. pp. 355–370.

  35. Shelton RC, Adsul P, Oh A. Recommendations for addressing structural racism in implementation science: a call to the field. Ethn Dis. 2021;31(Suppl 1):357–64. https://doi.org/10.18865/ed.31.S1.357.

  36. Adsul P, Chambers D, Brandt HM, et al. Grounding implementation science in health equity for cancer prevention and control. Implement Sci Commun. 2022;3(1):56. https://doi.org/10.1186/S43058-022-00311-4.

  37. Maw AM, Morris MA, Glasgow RE, et al. Using Iterative RE-AIM to enhance hospitalist adoption of lung ultrasound in the management of patients with COVID-19: an implementation pilot study. Implement Sci Commun. 2022;3(1):89. https://doi.org/10.1186/S43058-022-00334-X.

  38. Rabin BA, McCreight M, Battaglia C, et al. Systematic, multimethod assessment of adaptations across four diverse health systems interventions. Front Public Health. 2018;6(APR). https://doi.org/10.3389/FPUBH.2018.00102.

  39. Glasgow RE, Battaglia C, McCreight M, Ayele RA, Rabin BA. Making implementation science more rapid: use of the RE-AIM framework for mid-course adaptations across five health services research projects in the Veterans Health Administration. Front Public Health. 2020;8:194. https://doi.org/10.3389/FPUBH.2020.00194.

  40. Harden SM, Strayer TE, Smith ML, et al. National working group on the RE-AIM planning and evaluation framework: goals, resources, and future directions. Front Public Health. 2020;7:390. https://doi.org/10.3389/FPUBH.2019.00390/BIBTEX.

  41. Ericsson KA, Simon HA. Protocol analysis: verbal reports as data. Vol. 23. 1993. https://doi.org/10.2307/3151491.

  42. Dix A, Finlay J, Abowd GD, Beale R, editors. Human-computer interaction. 3rd ed. New York: Pearson Education Limited; 2004.

  43. Kwan BM, Brownson RC, Glasgow RE, Morrato EH, Luke DA. Designing for dissemination and sustainability to promote equitable impacts on health. Annu Rev Public Health. 2022;43(1). https://doi.org/10.1146/ANNUREV-PUBLHEALTH-052220-112457.

  44. Luke DA, Calhoun A, Robichaux CB, Moreland-Russell S, Elliott MB. The Program Sustainability Assessment Tool: a new instrument for public health programs. Prev Chronic Dis. 2014;11(2014). https://doi.org/10.5888/PCD11.130184.

  45. Malone S, Prewitt K, Hackett R, et al. The Clinical Sustainability Assessment Tool: measuring organizational capacity to promote sustainability in healthcare. Implement Sci Commun. 2021;2(1). https://doi.org/10.1186/S43058-021-00181-2.

  46. Tools and Templates – The Consolidated Framework for Implementation Research. https://cfirguide.org/tools/tools-and-templates/. Accessed 6 Nov 2022.

  47. The Hexagon: an exploration tool | NIRN. https://nirn.fpg.unc.edu/resources/hexagon-exploration-tool. Accessed 6 Nov 2022.

Acknowledgements

Not applicable.

Reporting guidelines

We chose GUIDED because this was an intervention development study. We used TIDieR because it helps articulate the intervention we created and is recommended by GUIDED.

Funding

This work was supported in part by the NCI Center P50 Grant (5P50CA244688), the National Cancer Institute’s Consortium for Cancer Implementation Science, and the University of Colorado’s Department of Family Medicine. Dr. Trinkley’s time was supported in part by the NHLBI (K12HL137862 and 1K23HL161352).

Author information

Authors and Affiliations

Authors

Contributions

All authors, especially KT, RG, and BR, contributed to the content, functionality, and assessments of the iPRISM webtool as well as the study design and writing of this paper. SD contributed specific expertise in human-computer interaction. MF and BF contributed specific expertise in translation and appropriate cultural adaptation of the Spanish version.

Authors’ information

Not applicable.

Corresponding author

Correspondence to Katy E. Trinkley.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

Russell Glasgow is a member of the Editorial Board for the journal. The authors declare that they have no other competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

iPRISM Webtool Assessment Questions.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Trinkley, K.E., Glasgow, R.E., D’Mello, S. et al. The iPRISM webtool: an interactive tool to pragmatically guide the iterative use of the Practical, Robust Implementation and Sustainability Model in public health and clinical settings. Implement Sci Commun 4, 116 (2023). https://doi.org/10.1186/s43058-023-00494-4
