
Assessing Markers of Reproducibility and Transparency in Smoking Behaviour Change Intervention Evaluations

Published online by Cambridge University Press:  01 January 2024

Emma Norris*
Affiliation:
Health Behaviour Change Research Group, Department of Health Sciences, Brunel University, UK Centre for Behaviour Change, University College London, UK
Yiwei He
Affiliation:
Psychology & Language Sciences, University College London, UK
Rachel Loh
Affiliation:
Psychology & Language Sciences, University College London, UK
Robert West
Affiliation:
Research Department of Epidemiology & Public Health, University College London, UK
Susan Michie
Affiliation:
Centre for Behaviour Change, University College London, UK
*
Correspondence should be addressed to Emma Norris; emma.norris@brunel.ac.uk

Abstract

Introduction. Activities promoting research reproducibility and transparency are crucial for generating trustworthy evidence. Evaluation of smoking interventions is one area where vested interests may motivate reduced reproducibility and transparency. Aims. To assess markers of transparency and reproducibility in smoking behaviour change intervention evaluation reports. Methods. One hundred evaluation reports of smoking behaviour change intervention randomised controlled trials published in 2018-2019 were identified. Reproducibility markers of pre-registration; protocol sharing; data, material, and analysis script sharing; replication of a previous study; and open access publication were coded in identified reports. Transparency markers of funding and conflict of interest declarations were also coded. Coding was performed by two researchers, with inter-rater reliability calculated using Krippendorff’s alpha. Results. Seventy-one percent of reports were open access and 73% were pre-registered. However, only 13% provided accessible materials, 7% accessible data, and 1% accessible analysis scripts. No reports were replication studies. Ninety-four percent of reports provided a funding source statement, and 88% provided a conflict of interest statement. Conclusions. Open data, materials, analysis, and replications are rare in smoking behaviour change interventions, whereas funding source and conflict of interest declarations are common. Future smoking research should be more reproducible to enable knowledge accumulation. This study was pre-registered: https://osf.io/yqj5p.

Type
Research Article
Creative Commons
Creative Commons License (CC BY)
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
Copyright © 2021 Emma Norris et al.

1. Introduction

Researchers are becoming increasingly aware of the importance of reproducibility and transparency in scientific research and reporting [Reference Munafò, Nosek and Bishop1, Reference Nosek, Alter and Banks2]. A well-documented “replication crisis” in psychology and other disciplines has shown that engrained academic incentives encouraging novel research have led to biased and irreproducible findings [Reference Ioannidis3Reference Open Science Collaboration6]. Researchers, journals, and funding organisations across psychology and health sciences are contributing to reforming scientific practice to improve the credibility and accessibility of research [Reference Munafò, Nosek and Bishop1, Reference Norris and O’Connor7].

“Open Science,” where some or all parts of the research process are made publicly and freely available, is essential for increasing research transparency, credibility, reproducibility, and accessibility [Reference Kathawalla, Silverstein and Syed8]. Reproducibility-facilitating research behaviours are varied and occur throughout the research life cycle. During study design, pre-registrations and protocols, deposited in repositories such as the Open Science Framework and AsPredicted, specify the hypotheses, methods, and analysis plan to be used in subsequent research. Such specification is designed to reduce researcher degrees of freedom and undisclosed flexibility, ensuring features such as primary and secondary hypotheses and analysis plans remain fixed and preventing “p-hacking” [Reference Head, Holman, Lanfear, Kahn and Jennions9]. Within health research, pre-registration and protocol sharing also facilitate future replication and real-world adoption of medical and behavioural interventions [Reference Huebschmann, Leavitt and Glasgow10]. During data analysis, scripts can be made more reproducible by marking their code with step-by-step comments, improving clarity and replication [Reference van Vliet11]. During dissemination, materials (such as intervention protocols and questionnaires), data, and analysis scripts can be made available by uploading to repositories such as the Open Science Framework or GitHub [Reference Klein, Hardwicke and Aust12], facilitating the replication of effective research and interventions [Reference Heirene13]. Making data and trial reports available regardless of their findings enables a more accurate picture of the full state of research, minimising the “file drawer” problem by which positive findings are more likely to be published than negative findings [Reference Rotton, Foos, Van Meek and Levitt14].
Sharing data and analysis code also allows for checking of research findings and conclusions, as well as easier synthesis of related findings via meta-analyses [Reference Ross15]. Transparency-facilitating research behaviours include reporting sources of research funding and conflicts of interest [Reference Fontanarosa, Flanagin and DeAngelis16, Reference Smith17]. These are important in that they help readers to make informed judgements about potential risks of bias [Reference Cristea and Ioannidis18].

Metascience studies have assessed markers of reproducibility and transparency in the related domains of psychology and life sciences. A recent study exploring 250 psychology studies of varying study designs published between 2014 and 2017 found transparency and reproducibility behaviours to be infrequent [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Although public availability of studies via open access was common (65%), sharing of research resources was low for materials (14%), raw data (2%), and analysis scripts (1%). Pre-registration (3%) and study protocols (0%) were also infrequent [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Transparency of reporting was inconsistent for funding statements (62%) and conflict of interest disclosure statements (39%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Metascience studies have also assessed reproducibility and transparency across other disciplines, including 250 studies in the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], 149 studies in biomedicine [Reference Wallach, Boyack and Ioannidis21], and 480 studies across two journals in biostatistics [Reference Rowhani-Farid and Barnett22], all with no restrictions on study designs. Other research has focused on the prevalence of specific reproducibility behaviours, such as open access publication, found to be about 45% across scientific disciplines in 2015 [Reference Piwowar, Priem and Larivière23].

However, the extent of reproducibility and transparency behaviours in public health research, including smoking cessation, is currently unclear. A recent investigation of randomised controlled trials addressing addiction found data sharing to be nonexistent: none of the 394 trials examined made their data publicly available, with 31.7% of included trials addressing tobacco addiction [Reference Vassar, Jellison, Wendelbo and Wayant24]. It must be noted that various persistent barriers to data sharing exist, including technical, motivational, economic, political, legal, and ethical considerations (van Panhuis et al., 2014), which may limit the uptake of this specific Open Science behaviour. Markers of wider reproducibility behaviours are yet to be assessed in addiction research.

Transparent reporting in terms of funding and conflicts of interest is especially crucial for smoking cessation, where tobacco and pharmaceutical companies fund some research directly or indirectly [Reference Garne, Watson, Chapman and Byrne25]. Such vested interests may distort the reporting and interpretation of results, especially in areas of controversy such as e-cigarette research [Reference Heirene13, Reference Smith17, Reference Munafò and West26, Reference West27]. The aim of the current study is to assess markers of (i) reproducibility and (ii) transparency within smoking intervention evaluation reports.

2. Methods

2.1. Study Design

This was a retrospective observational study with a cross-sectional design. Sampling units were individual behaviour change intervention reports. This study applied a methodology used to assess reproducibility and transparency in the wider psychological sciences [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19] and social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20] to the context of smoking randomised controlled trial intervention reports. This study was pre-registered: https://osf.io/yqj5p. All deviations from this protocol are explicitly acknowledged in the appendix.

2.2. Sample of Reports

The Cochrane Tobacco Group Specialised Register of controlled trials was searched in November 2019, identifying 1630 reports from 2018 to 2019. Inclusion criteria were randomised controlled trials published in 2018 and 2019. Exclusion criteria were trial protocols, abstract-only entries, and economic or process evaluations. Of the 157 reports remaining after applying these criteria, 100 were selected using a random number generator, due to time and resource constraints. PDFs were obtained from journal websites. These reports were also already included in the ongoing Human Behaviour-Change Project ([Reference Michie, Thomas and Johnston28, Reference Michie, Thomas and Mac Aonghusa29], https://osf.io/efp4x/), which is working to synthesise published evidence in behaviour change, beginning with smoking intervention evaluations. A list of all 100 reports included in this study is available at https://osf.io/4pfxm/.
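The sampling step described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the authors' actual procedure (the report identifiers and seed are placeholders); it simply shows how 100 of the 157 eligible reports could be drawn without replacement using a seeded random number generator so that the draw is itself reproducible.

```python
import random

# Placeholder IDs standing in for the 157 eligible reports.
eligible_reports = [f"report_{i:03d}" for i in range(1, 158)]

# A fixed seed makes the random draw reproducible (seed value is illustrative).
rng = random.Random(2019)

# Sample 100 distinct reports without replacement.
sample = rng.sample(eligible_reports, k=100)
print(len(sample), len(set(sample)))  # 100 100
```

Seeding the generator means anyone re-running the script recovers the identical sample, which is itself a small reproducibility practice.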

2.3. Measures

Article characteristics extracted in this study were as follows: (i) 2018 journal impact factor for each report using the Thomson Reuters Journal Citation Reports facility and (ii) country of the corresponding author (Table 1). Additional article characteristics already extracted as part of the Human Behaviour-Change Project are also reported: (iii) smoking outcome behaviour (smoking abstinence, onset, reduction, quit attempt, or second-hand smoking) and (iv) behaviour change techniques (BCTs) in the most complex intervention group, coded using the Behaviour Change Techniques Taxonomy v1 [Reference Michie, Richardson and Johnston30]. In short, data from the Human Behaviour-Change Project were extracted using EPPI-Reviewer software [Reference Thomas, Brunton and Graziosi31] by two independent reviewers before their coding was reconciled and agreed. The full process of manual data extraction within the Human Behaviour-Change Project is described elsewhere [Reference Bonin, Gleize and Finnerty32]. All extracted data on included papers are available at https://osf.io/zafyg/.

Table 1: Measured variables and operationalization.

∗If a response marked with an asterisk is selected, the coder is asked to provide more detail in a free text response box. Note: identified measured variables have been adapted from a previous study assessing the transparency and reproducibility in psychological sciences [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19].

Markers of research reproducibility were assessed by recording the presence of the following in included reports: (i) pre-registration: whether pre-registration was reported as carried out, where the pre-registration was hosted (e.g., Open Science Framework and AsPredicted), whether it could be accessed, and what aspects of the study were pre-registered; (ii) protocol sharing: whether a protocol was reported as carried out and what aspects of the study were included in the protocol; (iii) data sharing: whether data was available, where it was available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), whether the data was downloadable and accessible, whether data files were clearly documented, and whether data files were sufficient to allow replication of reported findings; (iv) material sharing: whether study materials were available, where they were available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), and whether the materials were downloadable and accessible; (v) analysis script sharing: whether analysis scripts were available, where they were available (e.g., online repository such as Open Science Framework, upon request from authors, as a journal supplementary file), and whether the analysis scripts were downloadable and accessible; (vi) replication of a previous study: whether the study claimed to be a replication attempt of a previous study; and (vii) open access publication: whether the study was published as open access.

Markers of research transparency were assessed by recording the presence of the following in included reports: (i) funding sources: whether funding sources were declared and if research was funded by public organisations (such as research councils or charities), pharmaceutical, tobacco, or other companies; (ii) conflicts of interest: whether conflicts of interest were declared and whether conflicts were with public organisations (such as research councils or charities), pharmaceutical, tobacco, or other companies. All measured variables are shown in Table 1.

2.4. Procedure

Data collection took place between February and March 2020. Data for all measures were extracted onto a Google Form (https://osf.io/xvwjz/). All reports were independently coded by two researchers. Any discrepancies were resolved through discussion, with input from a third researcher if required.

2.5. Analysis

Research reproducibility was assessed using the markers of pre-registration; sharing of protocols, data, materials, and analysis scripts; replication; and open access publishing (Table 1). Research transparency was assessed using the markers of funding source and conflicts of interest declarations. Inter-rater reliability of the independent coding of the two researchers was calculated using Krippendorff’s alpha [Reference Hayes and Krippendorff33] using Python 3.6 (https://github.com/HumanBehaviourChangeProject/Automation-InterRater-Reliability).
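For two coders and nominal (categorical) codings with no missing values, Krippendorff's alpha can be computed from a coincidence matrix. The sketch below is not the project's own script (that is linked on GitHub above); it is a minimal pure-Python illustration of the statistic, with hypothetical ratings.

```python
from collections import Counter

def krippendorff_alpha_nominal(pairs):
    """Krippendorff's alpha for two coders, nominal data, no missing values.

    `pairs` is a list of (coder1, coder2) ratings, one tuple per unit.
    """
    # Coincidence matrix: each unit of m = 2 values contributes both
    # ordered pairs, each with weight 1 / (m - 1) = 1.
    o = Counter()
    for a, b in pairs:
        o[(a, b)] += 1
        o[(b, a)] += 1
    # Marginal totals n_c and grand total n.
    n_c = Counter()
    for (c, _k), count in o.items():
        n_c[c] += count
    n = sum(n_c.values())
    # Observed disagreement (nominal distance: 1 if categories differ).
    d_o = sum(count for (c, k), count in o.items() if c != k)
    # Expected disagreement under chance pairing.
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    if d_e == 0:  # only one category ever used: alpha is undefined
        return float("nan")
    return 1 - d_o / d_e

# Hypothetical double-coding of one binary marker across four reports:
ratings = [(1, 1), (1, 1), (0, 0), (1, 0)]
print(round(krippendorff_alpha_nominal(ratings), 3))  # 0.533
```

Unlike simple percent agreement, alpha corrects for agreement expected by chance, which is why it is preferred for reliability of content coding.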

3. Results

Inter-rater reliability was assessed as excellent across all coding, Krippendorff’s α = 0.87. Full data are provided on OSF: https://osf.io/sw63b/.

3.1. Sample Characteristics

Seventy-one out of 100 smoking behaviour change intervention reports were published in 2018 and 29 in 2019. Out of the 100 reports, four had no 2018 journal impact factor, with the remaining 96 having impact factors ranging from 0.888 to 70.67 (mean = 4.95). Fifty-four out of 100 reports took place in the United States of America (https://osf.io/j2zp3/). Data from the Human Behaviour-Change Project identified that out of the 100 reports, 94 had a primary outcome behaviour of smoking abstinence, two each of smoking onset and smoking reduction, and one each of quit attempts and second-hand smoking. Forty-six out of the total 93 behaviour change techniques (BCTs) within the Behaviour Change Techniques Taxonomy (BCTTv1) were identified in the included reports. An average of 4.41 BCTs was identified in each report. The most commonly identified BCTs were as follows: social support (unspecified) (BCT 3.1, n=65/100), pharmacological support (BCT 11.1, n=61/100), problem solving (BCT 1.2, n=42/100), and goal setting (behaviour) (BCT 1.1, n=34/100). A figure of all outcome behaviour and BCT codings can be found at https://osf.io/6w3f4/.

3.2. Markers of Reproducibility in Smoking Behaviour Change Intervention Evaluation Reports

Final reconciled coding of reproducibility and transparency for all smoking behaviour change intervention reports can be found at https://osf.io/jcgx6/.

3.2.1. Article Availability (Open Access)

Seventy-one out of 100 smoking behaviour change intervention reports were available via open access, with 29 only accessible through a paywall (Figure 1(a)).

Figure 1:

3.2.2. Pre-registration

Seventy-three out of 100 smoking behaviour change intervention reports stated that they were pre-registered, with 72 of these being accessible. Fifty-four studies were pre-registered at ClinicalTrials.gov, with the remainder pre-registered at the International Standard Randomized Clinical Trial Number registry (ISRCTN; n=7), the Australian and New Zealand Clinical Trials Registry (ANZCTR; n=4), Chinese Clinical Trial Registry (ChCTR; n=2), Netherlands Trial Register (NTR; n=2), Iranian Clinical Trials Registry (IRCT; n=1), Clinical Research Information Service in Korea (CRIS; n=1), or the UMIN Clinical Trials Registry in Japan (UMIN-CTR; n=1).

All of the 72 accessible pre-registrations reported methods, but only two also included hypotheses and analysis plans. Twenty-six of the 100 reports did not include any statement of pre-registration. One report stated the study was not pre-registered (Figure 1(b)).

3.2.3. Protocol Availability

Seventy-one out of 100 smoking behaviour change intervention reports did not include a statement about protocol availability. Of the 29 reports with accessible protocols, 23 included hypotheses, methods, and analysis plans; three included methods only; two included hypotheses and methods; and one included methods and analysis plans (Figure 1(c)).

3.2.4. Material Availability

Twenty-two out of 100 reports included a statement saying the intervention materials used were available. Sixteen of these reports provided materials via journal supplementary files, and six reports stated that their materials were only available upon request from the authors (Figure 1(d)).

3.2.5. Data Availability

Sixteen out of 100 reports included a data availability statement. Nine reports stated data was available upon request from the authors, and one stated the data was not available. The remaining six articles included their data in supplementary files hosted by the journals, but one article’s data file could not be opened. Four of the remaining articles had clearly documented data files, but only two of them contained all necessary raw data. In total, only seven reports provided links to data that was actually accessible (Figure 1(e)).

3.2.6. Analysis Script Availability

Three out of 100 reports included an analysis script availability statement. However, only one provided an accessible script as a supplementary file, with the remaining two stating that analysis scripts were available upon request from the authors (Figure 1(f)).

3.2.7. Replication Study

None of the 100 smoking behaviour change intervention reports were described as replication studies (Figure 1(g)).

3.3. Markers of Transparency in Smoking Behaviour Change Intervention Evaluation Reports

Final reconciled coding of reproducibility and transparency markers for all smoking behaviour change intervention reports can be found at https://osf.io/jcgx6/.

3.3.1. Funding

Ninety-four of the 100 smoking behaviour change intervention reports included a statement about funding sources. Most disclosed public funding only, such as government-funded research grants, charities, or universities (n=80). Eight reports disclosed both public funding and funding from private companies. Five reports disclosed funding from private companies only, including pharmaceutical (n=3), tobacco (n=1), and other companies (n=1). One report stated it received no funding (Figure 1(h)).

3.3.2. Conflicts of Interest

Eighty-eight of the 100 articles provided a conflict of interest statement. Most of these reported no conflicts of interest (n=51). Thirty-seven declared at least one conflict of interest, including with a pharmaceutical company (n=27), private company (n=17), public organisation (n=13), or tobacco company (n=3) (Figure 1(i)).

4. Discussion

This assessment of 100 smoking behaviour change intervention evaluation reports identified varying levels of research reproducibility markers. Most reports were open access and pre-registered; however, research materials, data, and analysis scripts were not frequently provided and no replication studies were identified. Markers of transparency assessed here by funding source and conflicts of interest declarations were common.

4.1. Assessment of Reproducibility Markers in Smoking Behaviour Change Intervention Evaluation Reports

Pre-registration, as a marker of research reproducibility, was found to be far more common in smoking RCTs (73%) than in wider psychological research of varying study designs (3%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Open access publication was at a similarly moderate level (71%) to psychology (65%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], but greater than the 45% observed in the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], 25% in biomedicine [Reference Wallach, Boyack and Ioannidis21], and 45% across scientific literature published in 2015 [Reference Piwowar, Priem and Larivière23]. This high rate of open access publishing in smoking interventions may reflect increasing requirements by health funding bodies for funded researchers to publish in open access outlets [Reference Severin, Egger, Eve and Hürlimann34, Reference Tennant, Waldner, Jacques, Masuzzo, Collister and Hartgerink35] and increasing usage of preprint publication outlets such as PsyArXiv for the psychological sciences and medRxiv for medical sciences.

The proportion of open materials was lower than in biomedicine (13% vs. 33%) [Reference Wallach, Boyack and Ioannidis21] but similar to the 11% of the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20]. Open analysis scripts were found to be as infrequently provided in smoking interventions as in wider psychological research (both 1%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], and biostatistics [Reference Rowhani-Farid and Barnett22].

Open data of smoking interventions was found to be very low (7%), but greater than the 0% estimate in a larger sample of 394 smoking RCTs [Reference Vassar, Jellison, Wendelbo and Wayant24] and than the 2% observed in wider psychological research [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19]. Raw data are essential for meta-analyses to make sense of the diverse smoking cessation evidence. Common barriers to including studies in meta-analyses include a lack of available data, often even after requests to authors [Reference Greco, Zangrillo, Biondi-Zoccai and Landoni36, Reference Ioannidis, Patsopoulos and Rothstein37]. Provision of raw data as supplementary files to published intervention reports or via trusted third-party repositories such as the Open Science Framework [Reference Klein, Hardwicke and Aust12] is important to facilitate evidence synthesis, especially in a field as important for global health as smoking cessation.

No replication attempts were identified in this sample of smoking intervention reports, compared to 5% in wider psychology studies [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19] and 1% in the social sciences [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20]. This lack of replication may be due to a lack of available resources for smoking interventions to facilitate replication, as identified in this study, or may reflect a lack of research prioritisation and funding for replication, with novel rather than confirmatory research prioritised at global and institutional levels [Reference Munafò, Nosek and Bishop1, Reference Open Science Collaboration6].

4.2. Assessment of Transparency Markers in Smoking Behaviour Change Intervention Evaluation Reports

Declaration of funding sources and conflicts of interest, as markers of research transparency, was found here to be commonly provided in smoking intervention evaluation reports. Funding sources were declared in more smoking reports (94%) than wider psychology (62%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], social sciences (31%) [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], and biomedical science reports (69%) [Reference Wallach, Boyack and Ioannidis21]. Similarly, a statement on conflicts of interest was provided more commonly in smoking interventions (88%) than wider psychology (39%) [Reference Hardwicke, Thibault, Kosie, Wallach, Kidwell and Ioannidis19], social sciences (39%) [Reference Hardwicke, Wallach, Kidwell, Bendixen, Crüwell and Ioannidis20], and biomedical science reports (65%) [Reference Wallach, Boyack and Ioannidis21]. Seventeen percent of studies reported conflicts with private companies and 3% with tobacco companies. The comparatively high level of transparency markers observed here in smoking interventions compared to other fields is likely to reflect improved reporting following previous controversies in the field [Reference Garne, Watson, Chapman and Byrne25, Reference Bero38, Reference Malone and Bero39]. Funding and disclosure statements are now commonly mandated by journals related to smoking cessation [Reference Cristea and Ioannidis18, Reference Munafò and West26, Reference Nutu, Gentili, Naudet and Cristea40].

4.3. Strengths and Limitations

A strength of this study is its use of double coding by two independent researchers of all reproducibility and transparency markers, enabling inter-rater reliability assessment. A limitation is that this study is based on a random sample of 100 evaluation reports of smoking behaviour change interventions, so assessments of reproducibility and transparency may not generalise to smoking interventions more broadly. Second, markers of reproducibility and transparency were dependent on what was described within evaluation reports. Direct requests to authors or additional wider searching of third-party registries such as the Open Science Framework may have identified additional information indicating reproducibility. The absence of explicit statements on protocol, material, data, and analysis script availability may not necessarily signal that resources will not be shared by authors, but arguably does add an extra step for researchers to seek out this information. Third, this approach of assessing Open Science behaviours in reported research may omit more nuanced approaches to Open Science taken by journals or authors, which may make the figures assessed here lower than actual practice.

4.4. Future Steps to Increase Reproducibility and Transparency of Smoking Interventions

Urgent initiatives are needed to address the low levels of reproducibility markers observed here in smoking intervention research, especially in the areas of open materials, data, analysis scripts, and replication attempts. As with any complex behaviour change, this transformation requires system change across bodies involved in smoking cessation research: researchers, research institutions, funding organisations, journals, and beyond [Reference Munafò, Nosek and Bishop1, Reference Norris and O’Connor7]. Interventions are needed to increase the capability, opportunity, and motivation of these bodies to facilitate behaviour change towards reproducible research in smoking interventions [Reference Michie, Thomas and Johnston28, Reference Michie, van Stralen and West41]. For example, capability can be addressed by providing researcher training, equipping researchers with the skills needed to make their research open and reproducible, such as how to use the Open Science Framework, how to use preprint servers, and how to make their analyses reproducible. Opportunity to engage in reproducible research in smoking interventions can be created within institutions by facilitating discussions around open and reproducible working [Reference Orben42] and by developing a culture that values progressive and open research behaviours [Reference Norris and O’Connor7].

Motivation to research reproducibly can be addressed by providing researcher incentives [Reference Norris and O’Connor7]. Open Science badges recognising open data, materials, and pre-registration have been adopted by journals as a simple, low-cost scheme to increase researcher motivation to engage in these reproducibility behaviours [Reference Kidwell, Lazarević and Baranski43]. Open Science badges have been identified as the only evidence-based incentive programme associated with increased data sharing [Reference Rowhani-Farid, Allen and Barnett44]. However, adoption of Open Science badges in smoking cessation journals is currently low, indicating this as one important initiative currently missing in this field. Future research could use this study’s baseline assessment of reproducibility and transparency markers in smoking cessation intervention evaluation reports as a point of comparison for assessing changes in reporting and researcher behaviour.

5. Conclusions

Reproducibility markers of smoking behaviour change intervention evaluation reports were varied. Pre-registration of research plans and open access publication were common, whereas the provision of open data, materials, and analysis was rare and replication attempts were nonexistent. Transparency markers were common, with funding sources and conflicts of interest usually declared. Urgent initiatives are needed to improve reproducibility in open materials, data, analysis scripts, and replication attempts. Future research can compare later reports against this baseline assessment of reproducibility and transparency in the field of smoking interventions to assess change over time.

Appendix

Updates to Preregistered Protocol

During the course of this study and peer review, we made minor adjustments to the preregistered protocol as follows:

  1. (1) We revised the remit of “smoking cessation” to instead refer to “smoking behaviour change” more broadly. This allowed inclusion of cessation, reduction, and second-hand smoke intervention reports included within the Human Behaviour-Change Project knowledge system

  2. (2) Within the article characteristics measured variables, we added “smoking cessation behaviour” to identify whether each report addressed smoking cessation, reduction, or second-hand smoke specifically

  3. (3) Within the article characteristics measured variables, we added “behaviour change techniques” to specify the intervention content identified within each report. Behaviour change techniques were already coded within the parallel Human Behaviour-Change Project, which is working to synthesise published evidence in behaviour change, beginning with smoking intervention evaluations

Data Availability

All data are provided on OSF: https://osf.io/5rwsq/.

Conflicts of Interest

RW has undertaken research and consultancy for companies that develop and manufacture smoking cessation medications (Pfizer, J&J, and GSK). He is an unpaid advisor to the UK’s National Centre for Smoking Cessation and Training and a director of the not-for-profit Community Interest Company, Unlocking Behaviour Change Ltd. No other competing interests to disclose.

Acknowledgments

The authors would like to thank Ailbhe N. Finnerty for calculating inter-rater reliability. EN was employed during this study on The Human Behaviour-Change Project, funded by a Wellcome Trust collaborative award (grant number 201524/Z/16/Z).

Table 1: Measured variables and operationalization.