
Impact of surgical intervention trials on healthcare: A systematic review of assessment methods, healthcare outcomes, and determinants

  • Juliëtte J. C. M. van Munster ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Visualization, Writing – original draft, Writing – review & editing

    j.j.c.m.van_munster@lumc.nl

    Affiliations Department of Otorhinolaryngology and Head and Neck Surgery, Leiden University Medical Center (LUMC), Leiden University, Leiden, the Netherlands, Leiden University Neurosurgical Center Holland (UNCH), LUMC and The Hague Medical Center (HMC), Leiden, the Netherlands

  • Amir H. Zamanipoor Najafabadi,

    Roles Conceptualization, Methodology, Supervision, Writing – review & editing

    Affiliation Leiden University Neurosurgical Center Holland (UNCH), LUMC and The Hague Medical Center (HMC), Leiden, the Netherlands

  • Nick P. de Boer,

    Roles Data curation, Investigation, Writing – review & editing

    Affiliation Department of Otorhinolaryngology and Head and Neck Surgery, Leiden University Medical Center (LUMC), Leiden University, Leiden, the Netherlands

  • Wilco C. Peul,

    Roles Investigation, Methodology, Supervision, Writing – review & editing

    Affiliation Leiden University Neurosurgical Center Holland (UNCH), LUMC and The Hague Medical Center (HMC), Leiden, the Netherlands

  • Wilbert B. van den Hout,

    Roles Conceptualization, Investigation, Methodology, Supervision, Writing – review & editing

    Affiliation Department of Biomedical Data Science–Medical Decision Making, Leiden University Medical Center, Leiden University, Leiden, the Netherlands

  • Peter Paul G. van Benthem

    Roles Conceptualization, Methodology, Supervision, Writing – review & editing

    Affiliation Department of Otorhinolaryngology and Head and Neck Surgery, Leiden University Medical Center (LUMC), Leiden University, Leiden, the Netherlands

Abstract

Background

Frameworks used in research impact evaluation studies vary widely and it remains unclear which methods are most appropriate for evaluating research impact in the field of surgical research. Therefore, we aimed to identify and review the methods used to assess the impact of surgical intervention trials on healthcare and to identify determinants for surgical impact.

Methods

We searched journal databases up to March 10, 2020 for papers assessing the impact of surgical effectiveness trials on healthcare. Two researchers independently screened the papers for eligibility and performed a Risk of Bias assessment. Characteristics of both impact papers and trial papers were summarized. Univariate analyses were performed to identify determinants for finding research impact, which was defined as a change in healthcare practice.

Results

Sixty-one impact assessments were performed in the 37 included impact papers. Some surgical trial papers were evaluated in more than one impact paper, yielding a total of 38 evaluated trial papers. Most impact papers were published after 2010 (n = 29). Medical records (n = 10), administrative databases (n = 22), and physicians' opinions collected through surveys (n = 5) were used for data collection. These data were analyzed purely descriptively (n = 3), by comparing data before and after publication (n = 29), or through time series analyses (n = 5). Significant healthcare impact was reported in 49 assessments, more often in more recent publications. Finding impact was positively associated with using medical records or administrative databases (ref.: surveys), a longer timeframe for impact evaluation, more months between publication of the trial paper and the impact paper, data collection in North America (ref.: Europe), no economic evaluation of the intervention, finding no significant difference in surgical outcomes, and suggesting de-implementation in the original trial paper.

Conclusions and implications

Research impact evaluation is receiving growing interest, but only a small number of impact papers per year was identified. The analysis showed that characteristics of both the surgical trial papers and the impact papers were associated with finding research impact. We advise collecting data from either medical records or administrative databases, with an evaluation timeframe of at least 4 years after trial publication.

Introduction

Research impact is defined as an effect on, change or benefit to the economy, society, culture, public policy or services, health, environment or quality of life[1–7]. Despite the introduction of multiple research impact evaluation frameworks by governments and funding bodies (e.g. the Research Excellence Framework and the Payback framework[8]), the actual methods used in case studies vary widely, and therefore it remains unclear which methods are most appropriate for evaluating research impact in different fields of healthcare research.

In the field of surgical research, the translational impact of surgical trials on clinical practice is rarely evaluated, hampering optimal implementation and de-implementation of surgical interventions[9]. It has been suggested that reducing low-value surgical interventions, based on high-quality evidence, could save €153 million per year in the United Kingdom alone[10–14]. High-quality surgical research has increased worldwide in the past decades[15], but to actually reduce the use of low-value interventions, high-quality research evaluating the impact of clinical trials is warranted as well, measuring relevant and actionable healthcare outcomes[5, 8, 16]. This statement is supported by the Innovation, Development, Exploration, Assessment, and Long-term study (IDEAL) framework, which was introduced to improve the quantity and quality of surgical research[9, 17]. For example, Ainsworth et al. showed that a trial on the effectiveness of axillary lymph node clearance did not significantly change practice, even though it had important implications for clinical practice; based on the outcomes of the impact assessment, they recommended better informing patients of their treatment options[18].

A standardized approach to research impact evaluation could address methodological discrepancies and better inform decision makers and healthcare practitioners[5, 7, 19–23]. Therefore, the aim of this systematic review was to identify and review the methods used in case studies to assess the impact of surgical intervention trials on healthcare, in order to provide a strategy for surgical research evaluation to researchers, healthcare practitioners, and decision makers. In addition, we assessed possible determinants for finding surgical impact in terms of characteristics of the original trial and characteristics of the impact study.

Methods

This systematic review was reported according to the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA) guidelines and was registered in the PROSPERO register (registration number: CRD42018106812) before title-abstract screening and full-text screening were performed[24].

Literature search and eligibility criteria

PubMed, Embase, Web of Science, and The Cochrane Library were searched systematically on March 10, 2020. Together with a trained librarian, we compiled our search strategy for impact papers, consisting of four concepts: “surgery”, “clinical trials”, “impact”, and “clinical practice”. The full search strategy can be found in the Appendix. We included papers that investigated the impact of surgical intervention trials as defined in the Research Excellence Framework[1]: “Research impact was defined as an effect on, change or benefit to the economy, society, culture, public policy or services, health, environment or quality of life, beyond academia”. Papers were excluded when they investigated the impact of non-surgical trials, the impact of surgical treatments on healthcare not related to trial publication, or the impact of future research or guideline implementation without the impact of the actual surgical trial on healthcare. Also, (descriptions of) original investigations, study protocols, expert opinions, letters to the editor, (economic) analyses of interventions, and papers describing methodological implications for impact studies were excluded. Screening of eligible articles was performed independently by two authors (JM and NB). If agreement could not be reached between the two authors, the opinion of two other authors (WH and AZ) was requested to reach consensus. For each impact paper, the associated trial paper (or papers) was (were) identified from the provided references. We also searched databases for the most important research impact frameworks mentioned in previous reviews[5, 7, 19–21, 23], but did not discover additional impact papers.

Risk of bias assessment

Two authors (JM and AZ) independently assessed the Risk of Bias (RoB) of both the articles describing the surgical trials (trial papers) and the impact papers. For the trial papers, quality was assessed using the Methodological Index for Non-Randomized Studies (MINORS), which includes important quality items that are applicable to randomized studies as well[25]. The ideal score is 24 points for comparative studies and 16 points for non-comparative studies. No tool exists to specifically assess the quality of studies estimating the impact of trials on healthcare practice; we used the ROBINS-I tool to assess Risk of Bias since we felt it best fits the analysis of impact assessments[26].

Data extraction

Data were extracted from each impact paper and each trial paper by two authors (JM and NB). From the impact papers, we extracted the following data: primary author, publication date, surgical specialty, region of data collection and data collection methods, timeframe of evaluation in years, outcome measurement, number of time points for outcome measurement, analysis methods, limitations, and main results. Conclusions as reported by the authors of the impact papers were divided into two groups: yes (research impact occurred) and no (no research impact or no clear statements made by the authors). From the trial papers, we extracted the following data: publication date, type of comparison (surgery vs. surgery, surgery vs. watchful waiting, or surgery vs. non-surgical treatment), implementation vs. de-implementation, sample size, economic evaluation performed (possibly in a separate paper), study design, external funding, and conclusion made by the authors.
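For readers who want to set up a comparable extraction, the per-paper record can be represented as a simple data structure. The sketch below is illustrative only: the field names and types are assumptions and do not reproduce the authors' actual extraction form.

```python
# Minimal sketch of the extraction record; field names are illustrative assumptions,
# not the authors' extraction form.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImpactPaperRecord:
    first_author: str
    publication_year: int
    specialty: str
    region: str                  # e.g. "North America" or "Europe"
    data_source: str             # "survey", "medical records", or "administrative database"
    timeframe_years: float       # range of years covered by the evaluation
    analysis_design: str         # "descriptive", "before-after comparison", or "time series"
    robins_i: str                # "low", "moderate", "serious", or "critical"
    impact_found: bool           # conclusion reported by the impact-paper authors

@dataclass
class TrialPaperRecord:
    publication_year: int
    comparison: str              # e.g. "surgery vs. surgery", "surgery vs. watchful waiting"
    supports_deimplementation: bool
    sample_size: int
    economic_evaluation: bool
    externally_funded: bool
    minors_score: Optional[int]  # MINORS quality score
    significant_difference: bool # primary outcome of the trial
```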

Analyses

Univariate analyses were performed to identify determinants of both the trial papers and the impact papers for finding research impact. Conclusions made by the authors of the impact papers were used to define whether impact papers did or did not find research impact. The following characteristics of the trial papers were analyzed: time since publication of the trial paper in months, economic evaluation performed (yes vs. no), type of comparison (surgery vs. surgery, surgery vs. watchful waiting, and surgery vs. non-surgical treatment), implementation versus de-implementation, specialty (oncological surgery as a subspecialty of general surgery versus other specialties, e.g. non-oncological general surgery, neurosurgery, trauma surgery), external funding (yes versus no), sample size, RoB score (MINORS), and whether a significant difference was found for the treatment outcomes (yes versus no). For the impact papers we examined: design (purely descriptive, comparative analysis, or time series analysis), data collection (opinion of physicians, medical records, administrative databases), case-mix presentation (yes versus no), the continent where the evaluation was performed (North America versus Europe), timeframe of evaluation (range between years that were evaluated), months between publication of the impact paper and the trial paper, months between literature search and impact paper, and RoB score (ROBINS-I). For continuous variables, we performed an independent t-test, or a Mann-Whitney U test in case of non-parametric data; for categorical variables, we performed Chi-square tests, or Fisher exact tests in case of fewer than five observed values per category; all tests were two-sided with a statistical significance level of P<0.05. Post-hoc analyses were performed for significant findings for possible determinants with more than two groups, using Fisher exact tests for all possible comparisons between groups, with a Bonferroni correction for multiple testing. SPSS Statistics software (version 26; IBM Corp) was used for all statistical analyses.
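To make the analysis concrete, the sketch below reproduces the same univariate tests outside SPSS, using Python with pandas and SciPy. The input file and the column names (`impact_found`, `timeframe_years`, `data_source`) are hypothetical placeholders for the extracted data, not the authors' dataset.

```python
# Minimal sketch (not the authors' code) of the univariate tests described above.
import pandas as pd
from scipy import stats

df = pd.read_csv("impact_assessments.csv")   # hypothetical extraction file, one row per assessment
impact = df["impact_found"] == "yes"         # conclusion of the impact paper (yes/no)

# Continuous determinant, e.g. timeframe of evaluation in years:
# independent t-test, or Mann-Whitney U when the data are non-parametric.
years_yes = df.loc[impact, "timeframe_years"]
years_no = df.loc[~impact, "timeframe_years"]
t_stat, p_t = stats.ttest_ind(years_yes, years_no)
u_stat, p_u = stats.mannwhitneyu(years_yes, years_no, alternative="two-sided")

# Categorical determinant, e.g. data collection method: chi-square test on the
# determinant-by-impact contingency table (Fisher's exact test for sparse 2x2 tables).
table = pd.crosstab(df["data_source"], impact)
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Post-hoc pairwise Fisher tests with Bonferroni correction for a significant
# determinant with three groups (e.g. surveys vs. medical records vs. databases).
groups = list(table.index)
n_comparisons = len(groups) * (len(groups) - 1) // 2
for i in range(len(groups)):
    for j in range(i + 1, len(groups)):
        sub = table.loc[[groups[i], groups[j]]]          # 2x2 sub-table
        _, p = stats.fisher_exact(sub.values)
        print(groups[i], "vs", groups[j], "corrected p =", min(1.0, p * n_comparisons))
```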

Results

Search strategy and selection

The search identified 5237 unique publications, of which 108 full-text articles were evaluated for eligibility and 37 were included in the analysis. Reasons for exclusion are presented in the Flow Diagram (Fig 1).

Characteristics and quality assessment of the impact papers

The number of papers increased over time, with a maximum of 6 surgical intervention trial impact papers per year issued in 2017[27–32] (Fig 2).

Surprisingly, none of the included impact papers mentioned the use of a methodological framework to assess the impact of the trial papers. Most impact papers were published in the surgical oncology field[28, 33–43] or neurosurgical field[27, 30, 44–50] (Table 1, details in S2 Table) and were conducted in North America[27, 30–32, 34–40, 43, 44, 46, 47, 49–57], less often in Europe[33, 41, 42, 45, 48, 58–63], and not on other continents.

Medical records or hospital data and administrative databases were most often used as sources of data[27–34, 36–42, 44–57, 59, 62, 63]. Furthermore, most impact papers compared data before and after publication by performing a pre-trial and post-trial comparison, a trend analysis, or a mixture of those two methods. Five articles performed an interrupted time series analysis or spline regression analysis[31, 50, 52, 53, 57]. Five papers (14%) applied an economic evaluation by comparing total charges between time periods before and after trial publication[28, 37, 47, 50, 52]. Impact categories that were studied are outlined in Table 2.

Table 2. Analysis of findings from the surgical impact papers.

https://doi.org/10.1371/journal.pone.0233318.t002

All studies investigated changes in clinical practice, whereas some studies investigated changes in policy and health gain. RoB assessment of the impact papers is presented in Table 3. We appraised 1 study as ‘low RoB’[53], 5 studies as ‘moderate RoB’[37, 41, 47, 52, 56], 13 studies as ‘serious RoB’[28, 30–33, 39, 40, 43, 44, 48, 50, 57, 59], and 18 studies as ‘critical RoB’[27, 34–36, 38, 45, 46, 49, 51, 54, 55, 58, 60–65].

Table 3. Risk of bias assessment of impact papers (ROBINS-I).

https://doi.org/10.1371/journal.pone.0233318.t003

Conclusions of the impact papers

The impact of 7 surgical intervention trials was evaluated more than once[66–73], resulting in 61 conclusions by the authors concerning the impact of the trial papers; a significant impact on healthcare or policy was reported 49 times (80%) (S2 Table). In more recent years, a significantly greater proportion of the articles reported impact on healthcare (P = 0.04) (Fig 2 and Table 5). Impact was primarily found as a change in healthcare practice (mostly a change in procedure rate after publication; n = 48, 98%), but also as a change in policy, e.g. a guideline revision (n = 17; 52%), and as a change in patient benefit, such as an increase or decrease in complications and mortality (n = 8; 24%). Additionally, 3 out of 5 papers that performed a cost evaluation reported cost savings[28, 50, 52] and 2 papers noticed a rise in healthcare costs[37, 47] after publication of the surgical trial.

The trial by Mendelow et al.[70], which evaluated early surgery versus conservative treatment for intracerebral hemorrhage, was evaluated by 3 impact papers. Two of the impact papers reported a decrease in procedures[48, 49], whereas one paper did not observe a change in procedure rate[44]; this may be due to different study periods. The trials by Prinssen et al. and by the EVAR trial participants[67, 74], which compared the effectiveness of endovascular aneurysm repair for abdominal aortic aneurysm with open repair, were evaluated twice[58, 59]. One paper did not find research impact by surveying Dutch surgeons before and after trial publication, while the other paper observed an increasing trend in the number of endovascular procedures in the United Kingdom. For the remaining six papers that were examined more than once, no differences in conclusions were found between papers reporting on the same trial.

Characteristics and quality assessment of the trial papers

Most trial papers were non-blinded multicenter RCTs (Table 4, details in S3 Table).

Table 4. Summary of the characteristics of the surgical intervention trial papers.

https://doi.org/10.1371/journal.pone.0233318.t004

The median sample size was 461 (interquartile range (IQR): 131–991). Fifteen studies (39%) evaluated surgery vs. watchful waiting, 19 studies (50%) evaluated surgery vs. surgery, and 4 studies (11%) evaluated surgery vs. a non-surgical treatment. According to the authors, the outcomes of the trials were heterogeneous, and half of the papers supported de-implementation while the other half supported implementation of the evaluated procedure. The average score on the MINORS scale was 21 (SD 2.8).

Determinants of impact

Outcomes on determinants of impact are shown in Table 5.

Impact was found more often when impact was studied through administrative databases or medical records than when it was studied through the opinion of physicians. Post-hoc analysis between the three groups showed significant differences between the use of administrative data or medical records and the opinion of physicians (administrative database vs. opinion of physicians, P<0.001; medical records vs. opinion of physicians, P = 0.04). Additionally, impact papers from North America were more likely to report an impact on practice patterns than those from Europe. Consistent with Fig 2, more impact was found in more recent years (fewer months between our literature search and the publication date of the impact paper). Also, a longer timeframe (in years) for impact evaluation was associated with finding impact, and more time (in months) between publication of the trial paper and publication of the impact paper was associated with more healthcare impact. When no economic evaluation was performed in addition to the trial paper, it was more likely that impact on healthcare was found. Furthermore, when the trial paper did not find a significant difference, the impact paper was more likely to find an impact. We also noticed that all surgical oncology papers (n = 14) translated research into practice, but this was not significantly different from other specialties. No differences were found for the other characteristics of the trial papers.

Discussion

This systematic review of surgical impact papers found an increase in the number of published impact papers over the years. Neurosurgical and surgical oncology research were most often evaluated. However, of the large number of surgical trials that have been published[14], the healthcare impact has been evaluated for only a very small percentage. Moreover, the impact papers did not use frameworks, and the Risk of Bias assessment showed that many impact papers have a high RoB, which hampers the reduction of low-value surgical interventions and the provision of ongoing feedback to decision makers[13, 75]. The analyses of impact determinants showed that certain methodological aspects of both the surgical trial papers and the impact papers are advantageous for impact evaluation, such as a sufficiently long timeframe to measure impact and the use of administrative databases rather than surveys assessing physician opinion.

Impact frameworks

It is remarkable that not one of the identified impact papers mentioned the use of a framework to assess healthcare impact. In contrast, a review of multi-project research programs, including non-surgical projects, found that most impact papers did use a conceptual framework[23]. One explanation for this contrast could be that existing frameworks are designed for general research programs[8, 76–78], while, as described in the IDEAL recommendations, important differences exist between surgical intervention research and other research fields[17]. A standardized, surgery-specific approach for impact assessments, as an addition to the IDEAL framework, could improve methods and inform clinicians, researchers, and funding bodies.

Importance of proper study design and data collection to evaluate healthcare impact of surgical trials

Our results showed that administrative databases and hospital medical data were most frequently used as data sources for surgical intervention research impact, and they were more often associated with healthcare impact. The IDEAL framework also recommends using registries and routine databases for long-term studies[79]. Not only is the use of administrative databases more objective than the opinion of experts, it might also be more representative of surgical research impact, as it includes a wider population and a relatively longer follow-up compared to hospital data[79, 80]. Conversely, data on specific case-mix variables, which are important for proper comparison over time and between regions, are sometimes lacking in registries and could be retrieved more easily in studies using patient charts.

We found more impact in more recently published impact papers, which might indicate more attention to research implementation in recent years, but it could also indicate that more recent impact papers evaluated a longer time lag. This matters because impact cannot have occurred yet when not enough time has passed since publication of the trial paper: on average, it takes 17 years before 14% of all research is translated into clinical practice[81–83]. Still, 80% of the trial papers in this review had an impact on clinical practice within an average time span of approximately 4 years. This might be explained by the fact that only pivotal, high-quality surgical trials are selected for evaluation. The results showed that surgical impact papers were only published by authors from Europe or North America. Nevertheless, the largest increase in the publication of randomized surgical intervention trials was observed in Asia, implying more surgical impact assessments are needed there[15]. Additionally, the results showed that impact papers from North America more often found an impact on healthcare than those from Europe. This could indicate that practice in North America is more susceptible to research, but it could also be that researchers in Europe are more likely to study and publish trials with an unclear research impact. We found only one paper that focused on impact in terms of changes in geographic variation. Since it has been suggested that practice variation can partly be explained by gaps in scientific knowledge, future research could also focus on evaluating impact on practice variation[84].

Analysis of impact

In this review, impact on healthcare practice was found in most of the papers. However, it is important to assess the impact of published trials independent of pre-existing time trends in the frequency of treatment, in treatment approach, or both[85]. Unfortunately, this was only done in a minority of studies. One possibility to correct for time trends is the use of a difference-in-differences analysis[86]. The ideal control group would be a group that is unaware of a certain trial publication, but randomizing for knowledge of trial results would be impossible and unethical; one option could be to compare with another intervention that was not evaluated. Another possibility is to perform an interrupted time series analysis and control for secular trends in the data by using segmented regression to measure the changes in procedures before and after trial publication[86, 87]. Three impact papers showed data that were measured after publication only, although impact studies require a comparative design[85, 88]. Hopefully, it is now easier to perform comparative studies with the rise and availability of multiple healthcare administrative databases. In addition, we found limited numbers of studies that analyzed costs before and after trial publication. The authors feel that cost analyses, for example return-on-investment analysis or cost-benefit analysis, could be beneficial in the impact assessment of surgical research, especially since substantial savings from reducing low-value surgery have been predicted[13].
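As an illustration of the segmented-regression approach described above, the sketch below fits an interrupted time series model around a trial publication date using Python and statsmodels. The input file, column names, and publication month are hypothetical; this is a generic example under those assumptions, not a reconstruction of any included impact paper.

```python
# Minimal sketch of a segmented-regression interrupted time series analysis
# around a trial publication date; data layout is assumed, not taken from any paper.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_procedure_counts.csv")   # hypothetical file: one row per month
df = df.sort_values("month").reset_index(drop=True)

publication_index = 60                              # hypothetical: trial published at month 60
df["time"] = range(len(df))                         # underlying secular trend
df["post"] = (df["time"] >= publication_index).astype(int)         # level change at publication
df["time_after"] = (df["time"] - publication_index).clip(lower=0)  # slope change after publication

# rate = procedures per 100,000 population; coefficient interpretation:
#   time       -> pre-existing trend before publication
#   post       -> immediate step change at publication
#   time_after -> change in trend after publication
# HAC (Newey-West) standard errors allow for autocorrelation in monthly counts.
model = smf.ols("rate ~ time + post + time_after", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 12}
)
print(model.summary())
```

In this formulation, the `post` and `time_after` coefficients separate an immediate shift in procedure rates from a gradual change in trend, which is the distinction the segmented-regression papers cited above aim to capture.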

Trial paper determinants of impact

Impact was more frequently reported for trials that did not find statistically significant differences, although previous research concluded the opposite[89, 90]. However, in the surgical field in particular there is special attention to reducing low-value interventions[13], which might explain this result: when no differences are found between, for instance, an interventional procedure and watchful waiting, the intervention can be considered of low value and a change in procedure numbers is expected[13]. Indeed, a majority of the surgical trial papers supported de-implementation of a surgical technique, which was also a determinant for finding research impact. Furthermore, more impact papers found impact when no additional economic analysis had been performed on the original trial, even though in most cases the economic evaluation supported the outcomes of the RCT, making the evidence even stronger. It might be difficult to publish an additional economic analysis when strong evidence on the effectiveness of a surgical intervention has already been published and widely accepted, which is reflected in impact on healthcare. Although not significant, it is notable that all surgical oncology impact papers reported impact on healthcare, which may imply more attention to research or evidence-based medicine in the surgical oncology field compared to other surgical specialties.

Ideas on improvement of knowledge translation in the current era

In this review, we focused on the impact of surgical research, which can support prompter implementation and thereby improve the quality of healthcare. In the Dutch program ‘Leading the change’, five factors that influence implementation were identified, one of which is the use of audit and feedback for healthcare quality evaluation[91]. Encouragement of the use of impact evaluations by governments and funding bodies is needed to underline the importance of these studies. More research on methodological issues and reporting guidelines for healthcare evaluations is needed to provide universal guidelines for research impact evaluations. Also, more research is needed on why some study results are translated into clinical practice whereas other results are not. It would also be interesting to investigate the impact of research on regional variation in healthcare, as stated in the IDEAL framework[92]. Moreover, it is believed that little variation is seen in clinical practice when there is strong evidence and a professional consensus for interventions[93].

Strengths and limitations

To our knowledge, the present review is the first that specifically focuses on the impact of surgical research. This is necessary since there are some inherent differences with non-surgical studies, and therefore different approaches to evaluating research impact are needed for the two research fields[5, 19, 21]. A limitation of this review is the small number of papers reporting ‘no impact’, which impeded multivariate analyses. Despite the focus on surgical trials, we found heterogeneous outcomes and evaluated procedures, which may have hidden the influence determinants can have in a more homogeneous setting. Last, previous reviews on methodological frameworks for research impact mentioned that they found part of their included publications through grey literature (papers not indexed in bibliographic databases)[7, 20, 21]. This might be similar for impact papers, since this is a relatively new research field, resulting in a possible underestimation of the number of surgical impact papers.

Conclusions

In conclusion, more impact papers are needed to track changes in healthcare practice over time and to provide knowledge on the impact of surgical research to researchers, funders, physicians, and policy makers. Eventually, this knowledge can help to reduce low-value surgical procedures. However, improvement of the methods used in published impact papers is necessary to draw valid conclusions, especially since we found that the timeframe of evaluation and the data source of the impact papers are associated with finding research impact. We advise collecting data from either medical records or administrative databases and performing comparative studies with a timeframe of at least 4 years after publication. By routinely using valid methods as a completion of stage 4 of the IDEAL framework, societal research impact can be demonstrated, thereby providing feedback on the overall quality of care.

Supporting information

S1 Table. Extensive information on impact papers.

https://doi.org/10.1371/journal.pone.0233318.s002

(DOCX)

S2 Table. Extensive information on trial papers.

https://doi.org/10.1371/journal.pone.0233318.s003

(DOCX)

S3 Table. Risk of Bias surgical intervention trial papers.

https://doi.org/10.1371/journal.pone.0233318.s004

(DOCX)

Acknowledgments

We would like to thank Jan Schoones (Library LUMC) for his assistance with the literature search.

References

  1. HEFCE. REF 2014: Assessment framework and guidance on submissions. 2011.
  2. Searles A, Doran C, Attia J, Knight D, Wiggers J, Deeming S, et al. An approach to measuring and encouraging research translation and research impact. Health research policy and systems. 2016;14(1):60. pmid:27507300.
  3. LSE Public Policy Group. Maximizing the Impacts of Your Research: A Handbook for Social Scientists. London: London School of Economics; 2011.
  4. Canadian Academy of Health Sciences. Making an Impact: A Preferred Framework and Indicators to Measure Returns on Investment in Health Research. 2009.
  5. Cruz Rivera S, Kyte DG, Aiyegbusi OL, Keeley TJ, Calvert MJ. Assessing the impact of healthcare research: A systematic review of methodological frameworks. PLoS medicine. 2017;14(8):e1002370. Epub 2017/08/10. pmid:28792957; PubMed Central PMCID: PMC5549933.
  6. Deeming S, Searles A, Reeves P, Nilsson M. Measuring research impact in Australia's medical research institutes: a scoping literature review of the objectives for and an assessment of the capabilities of research impact assessment frameworks. Health research policy and systems. 2017;15(1):22. pmid:28327199.
  7. Milat AJ, Bauman AE, Redman S. A narrative review of research impact assessment models and methods. Health Res Policy Syst. 2015;13:18. Epub 2015/04/18. pmid:25884944; PubMed Central PMCID: PMC4377031.
  8. Buxton MJ, Hanney S. [Developing and applying the Payback Framework to assess the socioeconomic impact of health research]. Med Clin (Barc). 2008;131 Suppl 5:36–41. Epub 2009/07/28. pmid:19631821.
  9. Barkun JS, Aronson JK, Feldman LS, Maddern GJ, Strasberg SM, Altman DG, et al. Evaluation and stages of surgical innovations. Lancet (London, England). 2009;374(9695):1089–96. Epub 2009/09/29. pmid:19782874.
  10. Howes N, Chagla L, Thorpe M, McCulloch P. Surgical practice is evidence based. The British journal of surgery. 1997;84(9):1220–3. Epub 1997/10/06. pmid:9313697.
  11. Shawhan RR, Hatch QM, Bingham JR, Nelson DW, Fitzpatrick EB, McLeod R, et al. Have we progressed in the surgical literature? Thirty-year trends in clinical studies in 3 surgical journals. Diseases of the colon and rectum. 2015;58(1):115–21. Epub 2014/12/10. pmid:25489703.
  12. Wells CI, Robertson JP, O'Grady G, Bissett IP. Trends in publication of general surgical research in New Zealand, 1996–2015. ANZ journal of surgery. 2017;87(1–2):76–9. Epub 2016/11/03. pmid:27804200.
  13. Malik HT, Marti J, Darzi A, Mossialos E. Savings from reducing low-value general surgical interventions. Br J Surg. 2018;105(1):13–25. Epub 2017/11/09. pmid:29114846.
  14. McCulloch P, Feinberg J, Philippou Y, Kolias A, Kehoe S, Lancaster G, et al. Progress in clinical research in surgery and IDEAL. The Lancet. 2018;392(10141):88–94.
  15. Ahmed Ali U, van der Sluis PC, Issa Y, Habaga IA, Gooszen HG, Flum DR, et al. Trends in worldwide volume and methodological quality of surgical randomized controlled trials. Ann Surg. 2013;258(2):199–207. Epub 2013/06/19. pmid:23774315.
  16. Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JP, et al. Biomedical research: increasing value, reducing waste. Lancet (London, England). 2014;383(9912):101–4. Epub 2014/01/15. pmid:24411643.
  17. McCulloch P, Taylor I, Sasako M, Lovett B, Griffin D. Randomised trials in surgery: problems and possible solutions. BMJ (Clinical research ed). 2002;324(7351):1448–51. Epub 2002/06/18. pmid:12065273; PubMed Central PMCID: PMC1123389.
  18. Ainsworth RK, Kollias J, Le Blanc A, De Silva P. The clinical impact of the American College of Surgeons Oncology Group Z-0011 trial—results from the BreastSurgANZ National Breast Cancer Audit. Breast (Edinburgh, Scotland). 2013;22(5):733–5. Epub 2013/01/08. pmid:23290275.
  19. Greenhalgh T, Raftery J, Hanney S, Glover M. Research impact: a narrative review. BMC medicine. 2016;14:78. Epub 2016/05/24. pmid:27211576; PubMed Central PMCID: PMC4876557.
  20. Raftery J, Hanney S, Greenhalgh T, Glover M, Blatch-Jones A. Models and applications for measuring the impact of health research: update of a systematic review for the Health Technology Assessment programme. Health technology assessment (Winchester, England). 2016;20(76):1–254. Epub 2016/10/22. pmid:27767013; PubMed Central PMCID: PMC5086596.
  21. Banzi R, Moja L, Pistotti V, Facchini A, Liberati A. Conceptual frameworks and empirical approaches used to assess the impact of health research: an overview of reviews. Health research policy and systems. 2011;9:26. Epub 2011/06/28. pmid:21702930; PubMed Central PMCID: PMC3141787.
  22. Hanney SR, Gonzalez-Block MA. Health research improves healthcare: now we have the evidence and the chance to help the WHO spread such benefits globally. Health research policy and systems. 2015;13:12. Epub 2015/04/19. pmid:25888723; PubMed Central PMCID: PMC4351695.
  23. Hanney S, Greenhalgh T, Blatch-Jones A, Glover M, Raftery J. The impact on healthcare, policy and practice from 36 multi-project research programmes: findings from two reviews. Health research policy and systems. 2017;15(1):26. Epub 2017/03/30. pmid:28351391; PubMed Central PMCID: PMC5371238.
  24. Shamseer L, Moher D, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ (Clinical research ed). 2015;350:g7647. Epub 2015/01/04. pmid:25555855.
  25. Slim K, Nini E, Forestier D, Kwiatkowski F, Panis Y, Chipponi J. Methodological Index for Non-randomized Studies (MINORS): development and validation of a new instrument. ANZ journal of surgery. 2003;73:712–6. pmid:12956787.
  26. Sterne JA, Hernan MA, Reeves BC, Savovic J, Berkman ND, Viswanathan M, et al. ROBINS-I: a tool for assessing risk of bias in non-randomised studies of interventions. BMJ. 2016;355:i4919. Epub 2016/10/14. pmid:27733354; PubMed Central PMCID: PMC5062054.
  27. Degnan AJ, Hemingway J, Hughes DR. Medicare Utilization of Vertebral Augmentation 2001 to 2014: Effects of Randomized Clinical Trials and Guidelines on Vertebroplasty and Kyphoplasty. Journal of the American College of Radiology: JACR. 2017;14(8):1001–6. pmid:28778222.
  28. Fillion MM, Glass KE, Hayek J, Wehr A, Phillips G, Terando A, et al. Healthcare Costs Reduced After Incorporating the Results of the American College of Surgeons Oncology Group Z0011 Trial into Clinical Practice. Breast J. 2017;23(3):275–81. pmid:27900818.
  29. Palmer JAV, Flippo-Morton T, Walsh KK, Gusic LH, Sarantou T, Robinson MM, et al. Application of ACOSOG Z1071: Effect of Results on Patient Care and Surgical Decision-Making. Clinical Breast Cancer. 2017:270–5. pmid:29129549.
  30. Rosenbaum BP, Kshettry VR, Kelly ML, Mroz TE, Weil RJ. Trends in Inpatient Vertebroplasty and Kyphoplasty Volume in the United States, 2005–2011: Assessing the Impact of Randomized Controlled Trials. Clin Spine Surg. 2017;30(3):E276–E82. pmid:28323712.
  31. Sheth U, Wasserstein D, Jenkinson R, Moineddin R, Kreder H, Jaglal S. Practice patterns in the care of acute Achilles tendon ruptures: is there an association with level I evidence? The bone & joint journal. 2017;99-B(12):1629–36. pmid:29212686.
  32. Amin NH, Hussain W, Ryan J, Morrison S, Miniaci A, Jones MH. Changes Within Clinical Practice After a Randomized Controlled Trial of Knee Arthroscopy for Osteoarthritis. Orthop J Sports Med. 2017;5(4):2325967117698439. Epub 2017/04/30. pmid:28451610; PubMed Central PMCID: PMC5400146.
  33. Ahern TP, Larsson H, Garne JP, Cronin-Fenton DP, Sorensen HT, Lash TL. Trends in breast-conserving surgery in Denmark, 1982–2002. Eur J Epidemiol. 2008;23(2):109–14. pmid:17987392.
  34. Caudle AS, Bedrosian I, Milton DR, DeSnyder SM, Kuerer HM, Hunt KK, et al. Use of Sentinel Lymph Node Dissection After Neoadjuvant Chemotherapy in Patients with Node-Positive Breast Cancer at Diagnosis: Practice Patterns of American Society of Breast Surgeons Members. Annals of Surgical Oncology. 2017;24(10):2925–34. pmid:28766207.
  35. Gainer SM, Hunt KK, Beitsch P, Caudle AS, Mittendorf EA, Lucci A. Changing behavior in clinical practice in response to the ACOSOG Z0011 trial: a survey of the American Society of Breast Surgeons. Annals of surgical oncology. 2012;19(10):3152–8. Epub 2012/07/24. pmid:22820938.
  36. Le VH, Brant KN, Blackhurst DW, Schammel CM, Schammel DP, Cornett WR, et al. The impact of the American College of Surgeons Oncology Group (ACOSOG) Z0011 trial: An institutional review. Breast (Edinburgh, Scotland). 2016;29:117–9. Epub 2016/08/02. pmid:27479042.
  37. Rea JD, Cone MM, Diggs BS, Deveney KE, Lu KC, Herzig DO. Utilization of laparoscopic colectomy in the United States before and after the clinical outcomes of surgical therapy study group trial. Annals of surgery. 2011;254(2):281–8. pmid:21685791.
  38. Robinson KA, Pockaj BA, Wasif N, Kaufman K, Gray RJ. Have the American College of Surgeons Oncology Group Z0011 trial results influenced the number of lymph nodes removed during sentinel lymph node dissection? Am J Surg. 2014;208(6):1060–4; discussion 3–4. Epub 2014/10/15. pmid:25312842.
  39. Yao K, Liederbach E, Pesce C, Wang CH, Winchester DJ. Impact of the American College of Surgeons Oncology Group Z0011 Randomized Trial on the Number of Axillary Nodes Removed for Patients with Early-Stage Breast Cancer. J Am Coll Surg. 2015;221(1):71–81. Epub 2015/04/23. pmid:25899731.
  40. Bazan JG, Fisher JL, Parka KU, Marcus EA, Bittoni MA, White JR. Assessing the Impact of CALGB 9343 on Surgical Trends in Elderly-Women With Stage I ER plus Breast Cancer: A SEER-Based Analysis. Frontiers in Oncology. 2019;9:9. pmid:30723704.
  41. Garcia-Etienne CA, Mansel RE, Tomatis M, Heil J, Biganzoli L, Ferrari A, et al. Trends in axillary lymph node dissection for early-stage breast cancer in Europe: Impact of evidence on practice. Breast. 2019;45:89–96. Epub 2019/03/30. pmid:30925382.
  42. Joyce DP, Lowery AJ, McGrath-Soo LB, Downey E, Kelly L, O'Donoghue GT, et al. Management of the axilla: has Z0011 had an impact? Ir J Med Sci. 2016;185(1):145–9. pmid:25595827.
  43. Palmer JAV, Flippo-Morton T, Walsh KK, Gusic LH, Sarantou T, Robinson MM, et al. Application of ACOSOG Z1071: Effect of Results on Patient Care and Surgical Decision-Making. Clinical breast cancer. 2017. pmid:29129549.
  44. Adeoye O, Ringer A, Hornung R, Khatri P, Zuccarello M, Kleindorfer D. Trends in surgical management and mortality of intracerebral hemorrhage in the United States before and after the STICH trial. Neurocrit Care. 2010;13(1):82–6. Epub 2010/04/13. pmid:20383612.
  45. Beez T, Steiger HJ. Impact of randomized controlled trials on neurosurgical practice in decompressive craniectomy for ischemic stroke. Neurosurgical review. 2018. pmid:29556835.
  46. Cox M, Levin DC, Parker L, Morrison W, Long S, Rao VM. Vertebral Augmentation After Recent Randomized Controlled Trials: A New Rise in Kyphoplasty Volumes. Journal of the American College of Radiology: JACR. 2016;13(1):28–32. Epub 2015/11/08. pmid:26546300.
  47. Kelly ML, Kshettry VR, Rosenbaum BP, Seicean A, Weil RJ. Effect of a randomized controlled trial on the surgical treatment of spinal metastasis, 2000 through 2010: a population-based cohort study. Cancer. 2014;120(6):901–8. pmid:24327422.
  48. Kirkman MA, Mahattanakul W, Gregson BA, Mendelow AD. The effect of the results of the STICH trial on the management of spontaneous supratentorial intracerebral haemorrhage in Newcastle. Br J Neurosurg. 2008;22(6):739–46; discussion 47. Epub 2008/12/17. pmid:19085356.
  49. Simon SD, Koyama T, Zacharia BE, Schirmer CM, Cheng JS. Impact of clinical trials on neurosurgical practice: an assessment of case volume. World Neurosurg. 2015;83(4):431–7. Epub 2015/02/07. pmid:25655690.
  50. Smieliauskas F, Lam S, Howard DH. Impact of negative clinical trial results for vertebroplasty on vertebral augmentation procedure rates. J Am Coll Surg. 2014;219(3):525–33.e1. Epub 2014/07/17. pmid:25026873.
  51. Halm EA, Tuhrim S, Wang JJ, Rojas M, Hannan EL, Chassin MR. Has evidence changed practice?: Appropriateness of carotid endarterectomy after the clinical trials. Neurology. 2007;68:187–94. pmid:17224571.
  52. Howard D, Brophy R, Howell S. Evidence of no benefit from knee surgery for osteoarthritis led to coverage changes and is linked to decline in procedures. Health Aff (Millwood). 2012;31(10):2242–9. Epub 2012/10/11. pmid:23048105.
  53. Hussain MA, Mamdani M, Tu JV, Saposnik G, Khoushhal Z, Aljabri B, et al. Impact of Clinical Trial Results on the Temporal Trends of Carotid Endarterectomy and Stenting From 2002 to 2014. Stroke. 2016;47(12):2923–30. Epub 2016/11/12. pmid:27834754; PubMed Central PMCID: PMC5120767.
  54. Mahan ST, Osborn E, Bae DS, Waters PM, Kasser JR, Kocher MS, et al. Changing practice patterns: the impact of a randomized clinical trial on surgeons preference for treatment of type 3 supracondylar humerus fractures. J Pediatr Orthop. 2012;32(4):340–5. Epub 2012/05/16. pmid:22584832.
  55. Potts A, Harrast JJ, Harner CD, Miniaci A, Jones MH. Practice patterns for arthroscopy of osteoarthritis of the knee in the United States. Am J Sports Med. 2012;40(6):1247–51. Epub 2012/05/09. pmid:22562787.
  56. Williams RF, Interiano RB, Paton E, Eubanks JW, Huang EY, Langham MR, et al. Impact of a randomized clinical trial on children with perforated appendicitis. Surgery. 2014;156(2):462–6. pmid:24878457.
  57. Salata K, Hussain MA, Mestral C, Greco E, Mamdani M, Tu JV, et al. The impact of randomized trial results on abdominal aortic aneurysm repair rates from 2003 to 2016: A population-based time-series analysis. Vascular. 2019;27(4):417–26. Epub 2019/03/26. pmid:30907272.
  58. Baas AF, Grobbee DE, Blankensteijn JD. Impact of randomized trials comparing conventional and endovascular abdominal aortic aneurysm repair on clinical practice. Journal of endovascular therapy: an official journal of the International Society of Endovascular Specialists. 2007;14(4):536–40. Epub 2007/08/19. pmid:17696629.
  59. Brown CN, Sangal S, Stevens S, Sayers RD, Fishwick G, Nasim A. The EVAR Trial 1: has it led to a change in practice? The surgeon: journal of the Royal Colleges of Surgeons of Edinburgh and Ireland. 2009;7(6):326–31. Epub 2010/08/05. pmid:20681373.
  60. Rovers MM, van der Bij S, Ingels K, van der Wilt GJ, Zielhuis GA. Does a trial on the effects of ventilation tubes influence clinical practice? Clin Otolaryngol Allied Sci. 2003;28(4):355–9. pmid:12871252.
  61. Rovers MM, Hoes AW, Klinkhamer S, Schilder AG. Influence of single-trial results on clinical practice: example of adenotonsillectomy in children. Arch Otolaryngol Head Neck Surg. 2009;135(10):970–5. pmid:19841333.
  62. Mc Colgan R, Dalton DM, Cassar-Gheiti AJ, Fox CM, O'Sullivan ME. Trends in the management of fractures of the distal radius in Ireland: did the Distal Radius Acute Fracture Fixation Trial (DRAFFT) change practice? Bone Joint J. 2019;101-B(12):1550–6. Epub 2019/12/04. pmid:31786993.
  63. Costa ML, Jameson SS, Reed MR. Do large pragmatic randomised trials change clinical practice?: assessing the impact of the Distal Radius Acute Fracture Fixation Trial (DRAFFT). The bone & joint journal. 2016;98-B(3):410–3. pmid:26920968.
  64. Knook MT, Stassen LP, Bonjer HJ. Impact of randomized trials on the application of endoscopic techniques for inguinal hernia repair in The Netherlands. Surgical endoscopy. 2001;15(1):55–8. pmid:11178764.
  65. Joyce DP, Lowery AJ, McGrath-Soo LB, Downey E, Kelly L, O'Donoghue GT, et al. Management of the axilla: has Z0011 had an impact? Ir J Med Sci. 2016;185(1):145–9. Epub 2015/01/18. pmid:25595827.
  66. Buchbinder R, Osborne RH, Ebeling PR, Wark JD, Mitchell P, Wriedt C, et al. A randomized trial of vertebroplasty for painful osteoporotic vertebral fractures. N Engl J Med. 2009;361(6):557–68. Epub 2009/08/07. pmid:19657121.
  67. EVAR trial participants. Endovascular aneurysm repair versus open repair in patients with abdominal aortic aneurysm (EVAR trial 1): randomised controlled trial. Lancet (London, England). 2005;365(9478):2179–86. Epub 2005/06/28. pmid:15978925.
  68. Giuliano AE, McCall L, Beitsch P, Whitworth PW, Blumencranz P, Leitch AM, et al. Locoregional recurrence after sentinel lymph node dissection with or without axillary dissection in patients with sentinel lymph node metastases: the American College of Surgeons Oncology Group Z0011 randomized trial. Annals of surgery. 2010;252(3):426–32; discussion 32–3. Epub 2010/08/27. pmid:20739842; PubMed Central PMCID: PMC5593421.
  69. Kallmes DF, Comstock BA, Heagerty PJ, Turner JA, Wilson DJ, Diamond TH, et al. A randomized trial of vertebroplasty for osteoporotic spinal fractures. N Engl J Med. 2009;361(6):569–79. Epub 2009/08/07. pmid:19657122; PubMed Central PMCID: PMC2930487.
  70. Mendelow AD, Gregson BA, Fernandes HM, Murray GD, Teasdale GM, Hope DT, et al. Early surgery versus initial conservative treatment in patients with spontaneous supratentorial intracerebral haematomas in the International Surgical Trial in Intracerebral Haemorrhage (STICH): a randomised trial. Lancet (London, England). 2005;365(9457):387–97. Epub 2005/02/01. pmid:15680453.
  71. Moseley JB, O'Malley K, Petersen NJ, Menke TJ, Brody BA, Kuykendall DH, et al. A controlled trial of arthroscopic surgery for osteoarthritis of the knee. N Engl J Med. 2002;347(2):81–8. Epub 2002/07/12. pmid:12110735.
  72. Rousing R, Andersen MO, Jespersen SM, Thomsen K, Lauritsen J. Percutaneous vertebroplasty compared to conservative treatment in patients with painful acute or subacute osteoporotic vertebral fractures: three-months follow-up in a clinical randomized study. Spine (Phila Pa 1976). 2009;34(13):1349–54. Epub 2009/05/30. pmid:19478654.
  73. Costa ML, Achten J, Parsons NR, Rangan A, Griffin D, Tubeuf S, et al. Percutaneous fixation with Kirschner wires versus volar locking plate fixation in adults with dorsally displaced fracture of distal radius: randomised controlled trial. BMJ. 2014;349:g4807. Epub 2014/08/07. pmid:25096595; PubMed Central PMCID: PMC4122170.
  74. Prinssen M, Verhoeven EL, Buth J, Cuypers PW, van Sambeek MR, Balm R, et al. A randomized trial comparing conventional and endovascular repair of abdominal aortic aneurysms. N Engl J Med. 2004;351(16):1607–18. Epub 2004/10/16. pmid:15483279.
  75. Timbie JW, Fox DS, Van Busum K, Schneider EC. Five reasons that many comparative effectiveness studies fail to change patient care and clinical practice. Health affairs (Project Hope). 2012;31(10):2168–75. Epub 2012/10/11. pmid:23048092.
  76. Parker J, van Teijlingen E. The Research Excellence Framework (REF): Assessing the Impact of Social Work Research on Society. Practice. 2012;24(1):41–52.
  77. Landry R, Amara N, Lamari M. Climbing the Ladder of Research Utilization: Evidence from Social Science Research. Science Communication. 2001;22(4):396–422.
  78. Weiss AP. Measuring the impact of medical research: moving from outputs to outcomes. The American journal of psychiatry. 2007;164(2):206–14. Epub 2007/02/03. pmid:17267781.
  79. McCulloch P, Altman DG, Campbell WB, Flum DR, Glasziou P, Marshall JC, et al. No surgical innovation without evaluation: the IDEAL recommendations. The Lancet. 2009;374(9695):1105–12.
  80. Chowdhury TT, Hemmelgarn B. Evidence-based decision-making 6: Utilization of administrative databases for health services research. Methods in molecular biology (Clifton, NJ). 2015;1281:469–84. Epub 2015/02/20. pmid:25694328.
  81. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annual review of public health. 2009;30:151–74. Epub 2009/08/26. pmid:19705558.
  82. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. Journal of the Royal Society of Medicine. 2011;104(12):510–20. pmid:22179294.
  83. Westfall JM, Mold J, Fagnan L. Practice-based research—"Blue Highways" on the NIH roadmap. JAMA. 2007;297(4):403–6. Epub 2007/01/25. pmid:17244837.
  84. Wennberg JE. Dealing with medical practice variations: a proposal for action. Health affairs (Project Hope). 1984;3(2):6–32. Epub 1984/01/01. pmid:6432667.
  85. Moons KGM, Altman DG, Vergouwe Y, Royston P. Prognosis and prognostic research: application and impact of prognostic models in clinical practice. BMJ. 2009;338:b606. pmid:19502216.
  86. Laverty AA, Laudicella M, Smith PC, Millett C. Impact of ‘high-profile’ public reporting on utilization and quality of maternity care in England: a difference-in-difference analysis. Journal of health services research & policy. 2015;20(2):100–8. pmid:25712568.
  87. Penfold RB, Zhang F. Use of interrupted time series analysis in evaluating health care quality improvements. Academic pediatrics. 2013;13(6 Suppl):S38–44. Epub 2013/12/07. pmid:24268083.
  88. Reilly BM, Evans AT. Translating clinical research into clinical practice: impact of using prediction rules to make decisions. Annals of internal medicine. 2006;144(3):201–9. Epub 2006/02/08. pmid:16461965.
  89. Newson R, King L, Rychetnik L, Bauman AE, Redman S, Milat AJ, et al. A mixed methods study of the factors that influence whether intervention research has policy and practice impacts: perceptions of Australian researchers. BMJ open. 2015;5(7):e008153. Epub 2015/07/23. pmid:26198428; PubMed Central PMCID: PMC4513518.
  90. Hudgins JD, Fine AM, Bourgeois FT. Effect of Randomized Clinical Trial Findings on Emergency Management. Academic emergency medicine: official journal of the Society for Academic Emergency Medicine. 2016;23(1):36–47. Epub 2016/01/01. pmid:26720855.
  91. Hijden EJEV, M.D.H., Koolman X. Vertaling van zorgevaluaties naar de praktijk: een voorstel vanuit de intrinsieke verbeterdrang van zorgprofessionals [Translating care evaluations into clinical practice: a proposal based on the intrinsic drive of healthcare professionals to improve]. Nederlands Tijdschrift voor Geneeskunde. 2019;163:D4175.
  92. Hirst A, Philippou Y, Blazeby J, Campbell B, Campbell M, Feinberg J, et al. No Surgical Innovation Without Evaluation: Evolution and Further Development of the IDEAL Framework and Recommendations. Ann Surg. 2019;269(2):211–20. Epub 2018/04/27. pmid:29697448.
  93. Wennberg JE. The paradox of appropriate care. JAMA. 1987;258(18):2568–9. Epub 1987/11/13. pmid:3669227.