
Ten simple rules for open human health research

Introduction

We are witnessing a dramatic transformation in the way we do science. In recent years, significant flaws with existing scientific methods have come to light, including a lack of transparency, insufficient stakeholder involvement, disconnection from the public, and limited reproducibility of research findings [1–7]. These concerns have sparked the global Open Science movement, which seeks to revolutionize the practice of science. This new approach to science extends principles of openness to the entire research cycle, from hypothesis generation to data collection, analysis, interpretation, and dissemination. Open Science seeks to remove all barriers to conducting high quality, rigorous, and impactful scientific research by ensuring that the data, methods, and opportunities for collaboration are open to all. Emerging digital technologies and "big data" (see "Ten simple rules for responsible big data research" [8]) have further accelerated the Open Science movement by affording new approaches to data sharing, connecting researcher networks, and facilitating the dissemination of research findings.

Open scientific practices are also having a profound impact on the health sciences and medical research and, specifically, on how we conduct clinical research with human participants. Human health research demands careful consideration of how to practice science ethically. Given the particular urgency of human health research, a discipline with direct implications for people's health and wellbeing, doing good science takes on a different meaning than simply doing science well. It also requires the scientist to reassess the conventional view of human health research as a pursuit conducted by scientists on human subjects, and places greater emphasis on inclusive and ethical practices to ensure that the research takes into account the interests of those who would be most affected by it. Openness in the context of human health research comes with risks, raising concerns about privacy and security. However, openness also presents opportunities for people, including participants of research studies, to contribute in every capacity. At the core of open health research, scientific discoveries are not only the product of collaboration across disciplines but must also be owned by a community that includes researchers, health workers, and patients and their families. To guide successful open health research practices, it is essential to carefully consider and delineate its guiding principles.

This Editorial is aimed at individuals participating in health science in any capacity, including but not limited to people living with medical conditions, health professionals, study participants, and researchers spanning all types of disciplines. We present ten simple rules (see Fig 1) that, while not comprehensive, offer guidance for conducting health research with human participants in an open, ethical, and rigorous manner. Implementing these rules can be difficult and resource intensive, and the rules can, at times, overlap with or even conflict with one another. They present a challenge and may not be implemented all at once, but they are intended to accelerate and improve the quality of human health research. Work that fails to follow these rules is not necessarily poor quality research [9], especially if the reasons for breaking the rules are carefully considered and openly articulated (see Rule 6: document everything). While most of the responsibility for following these rules falls on researchers, anyone involved in human health research in any capacity [10] can apply them.

Fig 1. Our 10 simple rules for open human health research, presented graphically.

Image sources: [147–156].

https://doi.org/10.1371/journal.pcbi.1007846.g001

For each rule, we provide a very brief background motivating inclusion of the rule, followed by a few recommendations.

Rule 1: Integrate ethical principles

Health research is no longer exclusive to scholars or medical professionals. Technology developers are increasingly engaging in and leading biomedical research, participants are taking on a more active role as partners in research, and nonscientists are even designing and deploying their own health research projects [11]. While this greater involvement of nontraditional parties in health research has the potential to advance the research in novel ways, it is critical for these parties to abide by ethical and responsible practices to ensure privacy and safety. The tech industry continues to generate increasingly sophisticated digital technologies, such as wearable devices, mobile phone apps, and social media, that can record more and more aspects of daily life [12–15] without direct, voluntary consent or clear information about how data will be used, shared, or reported. This can lead to unintended consequences, such as inappropriate disclosure of personal information and the spread of inaccurate or misleading information.

In a rapidly changing research landscape with shifting roles, it is crucial to emphasize and enforce core ethical principles, including respect for persons, justice, and beneficence (doing what is right) [16], as well as respect for law and the public interest [17]. The well-established, tried and tested rules and regulations for behavioral and biomedical research involving human participants [18] require researchers to demonstrate voluntary participation via informed consent [19–22], perform risk assessments to determine whether the probability and magnitude of potential harms are balanced against potential benefits, include those who may benefit most from the knowledge gained, consider downstream societal implications, conduct an external review of study procedures before initiating any project, and develop additional protections for vulnerable stakeholders. We consider stakeholders broadly as any entity with the potential to be affected, directly or indirectly, by the project in question.

Include people on your team who bring expertise in research ethics, methods, and data management. This is especially important for successfully guiding open human health research, in which efforts to mitigate risks for human participants and uphold key ethical principles must be kept open and transparent. Carefully choose what data to collect and how to represent and store those data, remembering that while data storage costs have rapidly shrunk, other costs, including but not limited to compromised privacy and unauthorized access, are inherent in any data collection [23–26].

Take responsibility for building a peer review process into each study design with periodic checks and balances. Do not simply delegate consideration of ethical and responsible research practices solely to research ethics boards (also known as institutional review boards [IRBs]). The Connected and Open Research Ethics (CORE) initiative [27] is a global community interested in collaborating to shape ethical practices in digital research, and their resource library contains shared IRB-approved protocols and a forum for sharing expertise and answering questions. The Citizen Science Association has also developed and shared materials for conducting an IRB review [28], to help build an ethics review process for the citizen science community. Further resources to advance our understanding of the ethical, legal, and social implications of this emerging digital research ecosystem are provided by the CORE initiative [29,30], MobileELSI [31], Pervasive Data Ethics for Computational Research (PERVADE) [32], and Clinical Trials Transformation Initiative (CTTI) [33].

Rule 2: Involve nonscientists

There are many roles nonscientists can take to advance human health research, beyond participation in traditional, computer task- or game-based citizen science projects [34,35]. First and foremost are the patients who are best served by the research, who can not only enroll as participants but also help define problems, goals, and measures of success. Any interested party, including patients, clinicians, ethicists, policy makers, funding agencies, and individuals from the general public [36], can and should partner with the research community at the different stages of research—soliciting ideas for funding, designing or coordinating studies, recruiting participants, collecting or analyzing data, interpreting or broadcasting results, participating in the peer review process [37–39], and so on. The website of the Office of Research Integrity provides Basic Research Concepts [40], a resource for learning about responsible scientific methods.

Include nonscientist stakeholders throughout the scientific process in meaningful, informative, accessible, and engaging ways. From the very inception of a study, encourage and support the active participation of patients and other interested parties in defining research questions. Patient-led innovation platforms and patient-driven networks in health, such as PatientsLikeMe [41], help connect people suffering from common diseases to share their experiences and have spawned scientific studies [42]. When technologies are involved, collaborate with technology developers and end-users to ensure products are scientifically validated, evidence-based, and user-friendly.

For community-facing projects, hold meetings with community members to allow for concerns and questions to be voiced and responded to. Seek out opportunities to bridge divides among communities and their access to resources. For example, work to match stakeholder ideas and needs to other stakeholders’ skills and resources. Make efforts to raise awareness of complementary literatures and overcome disciplinary divides. Participate in funding opportunities for projects that involve non-research stakeholders and patient-centered outcomes, such as from the Patient-Centered Outcomes Research Institute (PCORI) [43].

Invite nonscientist stakeholders to take part in scientific events, such as conferences, seminars, workshops, and lab meetings [44,45]. Participate in such events as a nonscientist in research outside of your areas of expertise (i.e., step outside of your "comfort zone"). Actively engage with nonscientists and participants outside one's discipline; listen to, respect, and value their perspectives and opinions. Strive to engage a diverse population (e.g., in demographics, gender representation, employment, and education). Such diversity will ensure a better informed approach to the research, a greater interest in the research results, and broader generalizability of the research findings. This is especially important because the views and perspectives of patient groups who stand to benefit most from research are rarely considered or acknowledged, representing a persistent challenge across many areas of health research.

Rule 3: Clarify roles and rewards

There are obvious benefits to clearly articulating what roles different contributors will play in a given research study and how they will be acknowledged or rewarded accordingly. Not only does it set up reasonable expectations for all parties, but it also avoids conflicts and misunderstandings commonly found in the academic research community related to authorship and allocation of funds and other resources [46]. Human health research raises the stakes considerably, given that it involves human participants, who are rarely acknowledged for their participation. Open human health research raises the bar further, as it engages many different stakeholders and increases the number of potential contributors who should be rewarded for their contributions.

Rewards in research for nonscientists, aside from the satisfaction of having contributed to science and possible monetary compensation or prizes [47–50], typically include information about their health or access to experimental treatment. Rewards for scientists are also often driven by forces beyond the individual's control, such as funding, promotion, and tenure. While individual scientists do write proposals to request resources and support, it is rare for them to take a more hands-on approach and launch a crowdsourcing campaign, and many are reluctant to promote their work through social media. We therefore focus our recommendations, for both scientists and nonscientists, on different forms of recognition as means of conferring and receiving rewards, rather than direct monetary or career gains.

At the outset of a research project, clarify contributor roles, acknowledgments, rewards, and the code of conduct (e.g., see the Conference Code of Conduct [51]). Use resources like "Ten Simple Rules for a Successful Collaboration" [52] and Collaboration and Team Science: A Field Guide [53] for guidance in defining these roles. Also clarify when data or software can be released, how they will be released (e.g., GitHub, Figshare, Google Drive), and cite the resources you use [54]. Think beyond the usual contributor acknowledgments of "author," "editor," "contributor," "acknowledgment," etc. [55] and reconsider author order. In other words, clearly define and state what contributions would lead to what acknowledgments or rewards [56]. The International Committee of Medical Journal Editors provides guidance (the Vancouver Recommendations) that many journals require for submissions and that is good practice to follow regardless of publisher requirements [57,58]. The Committee on Publication Ethics also provides hundreds of guiding documents, including flowcharts, specifically relating to authorship and contributorship [59–61].

Even outside of your own research, acknowledge where good, open, ethical, inclusive human health research practices are conducted. Be especially mindful to acknowledge open practices [62–64], research in languages in addition to English [65], and research from nontraditional actors [66]. Point out where greater efforts could be made toward better scientific practices. Lead by example, but also, when attending another’s talk or lecture, do ask for clarification on who contributed what, so as to encourage this practice in others.

Engage in more quantitative approaches to acknowledgment and reward. For example, rigorously quantify the degree to which your research, and the resources you contribute or adopt, embrace openness, ethical practices, inclusiveness, etc. Think carefully about what "impact" means in relation to your work. For example, rather than (or in addition to) tracking academic citations, you may be more interested in fostering collaboration between previously siloed, isolated bodies of knowledge or in tracking some aspect of your research into practice. Make use of indicators that measure or estimate those types of impact [6,67–73].

Rule 4: Replicate prior work

It is incumbent on researchers to ground their research in the context of prior work. The first step is often to confirm prior work by reproducing past results (apply the same methods to the same data to get the same results). To ensure that this prior work translates to a new study population or reimplementation of old methods, a researcher tries to corroborate prior work by replicating past results (collect new data and apply similar methods to get similar results). Replication in science is presently in a woeful but improving state [74,75]. Science is by its nature uncertain, improving and replacing current models with better models over time. Replicating prior work helps to reduce this uncertainty and increase our confidence in the findings [76–79]. Conversely, past work can be reassessed in light of new findings as well [80], and past data collected by others can be independently reused or integrated with newer datasets [81,82].

Replication does not necessarily mean running a past study or its analysis again in exactly the same way—this may be a waste of resources if the original study was conducted on a small, nonrepresentative population using outdated approaches. Instead, use best available practices and sufficiently powered sample sizes from relevant populations to evaluate the state of knowledge and establish a sound foundation for a research program. Some organizations, such as the Organization for Human Brain Mapping, have given replication awards at their conferences to encourage such studies [83].

Designate some of your time and research efforts to replication and confirmatory studies. Find prior work related to your research questions. Carry out replication studies by following published methods with new or existing open data, explaining your deliberate data acquisition choices [82]. Be mindful of the fact that validity and replicability are different, and that the goal of replication is to test validity or generalizability of the models in question [80]. Perform complementary analyses on published open data to further explore the data behind published findings [44].

Rule 5: Make research reproducible

Just as it is crucial to try to replicate prior work to ground current research, it is likewise crucial to make your own research work reproducible as a foundation for future research. While replicability is the ability of a method to be repeated to obtain a consistent result, reproducibility is the extent to which the same conclusions can be drawn from the same data by using either the same or different methods. Data and methods must be subjected to scrutiny and evaluated for robustness and generalizability. This practice is not an act of generosity—if you do not make your data and methods available and clear to others, you undermine the credibility of your work and hinder the advance of science.

Follow FAIR (findable, accessible, interoperable, and reusable) principles in your scientific practices [84,85]. The following two rules regarding documentation and accessible presentation are most closely related to reproducibility. Specifically, for documentation to aid reproducibility it must be shared, just like presentations, and shared in formats (languages, descriptions, file types) that are easily accessible [86]. In practice, "data […] exist in small units, are linked to many other related units, and are difficult to interpret without considerable documentation and context" [54]. Adequate data documentation can be difficult and resource intensive [82,87], while inadequate data management can severely compromise the scientific value and interpretability of the associated research. See "Ten Simple Rules for the Care and Feeding of Scientific Data" [88] for guidance.
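As one concrete illustration, documentation can live in a small machine-readable file stored next to the data it describes. The Python sketch below writes a minimal "data dictionary" as a JSON sidecar; the file name, variables, and contact address are hypothetical placeholders, and real studies may prefer established community metadata standards where they exist.

```python
import json

# A minimal, hypothetical data dictionary stored next to the data file it
# documents. All names and values below are illustrative placeholders.
data_dictionary = {
    "dataset": "sleep_study_wave1.csv",
    "description": "Self-reported sleep outcomes, baseline wave.",
    "variables": {
        "participant_id": {"type": "string", "description": "Pseudonymous ID"},
        "sleep_hours": {"type": "float", "units": "hours/night"},
        "phq9_score": {"type": "integer", "range": [0, 27],
                       "description": "PHQ-9 depression screening total"},
    },
    "license": "CC-BY-4.0",
    "contact": "study-team@example.org",
}

# Write the documentation as a JSON "sidecar" that both humans and machines can read.
with open("sleep_study_wave1.json", "w") as f:
    json.dump(data_dictionary, f, indent=2)
```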

Share data, methods, and documentation in open-access repositories [88]. At the very least, this practice enables consumers of your research to scrutinize your work. More importantly, other methods can be applied to your data, and your methods can be applied to other data, to test assumptions, hypotheses, methods, as well as data quality and generalizability. Digital containers (e.g., Docker and Singularity) make it much easier to conduct reproducible research within self-contained environments and help mitigate concerns about maintaining software and dependencies in different computing environments.
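Short of full containerization, even a lightweight, automated snapshot of the computing environment helps others reconstruct your setup. The following Python sketch is our own illustration, not a prescribed tool: it records the interpreter, platform, and installed package versions so they can be archived alongside analysis outputs.

```python
import importlib.metadata
import json
import platform
import sys

# Snapshot the computing environment so others can rebuild a comparable setup;
# containers such as Docker or Singularity go further by packaging it outright.
env = {
    "python": sys.version,
    "platform": platform.platform(),
    "packages": {dist.metadata["Name"]: dist.version
                 for dist in importlib.metadata.distributions()},
}

with open("environment_snapshot.json", "w") as f:
    json.dump(env, f, indent=2, sort_keys=True)
```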

Since data breaches are a persistent challenge, provide participants with clear and accessible information about how data will be collected, stored, shared, and used in the future, while making it clear that no one can provide absolute guarantees about future data security. Do not collect identifiable information if you do not need it for your research. Otherwise, separate identifiable from currently nonidentifiable information and, if possible, destroy the identifiable information at the conclusion of the study. Be sure to scrub software and other documentation of any references to participant-specific information. Apply best practices for data deidentification, such as mixing data or adding noise to data (differential privacy, face removal from images, etc.). In cases where data cannot be made fully open, deposit metadata-only records in a repository with instructions for who can gain access to the data and how. There is a variety of options available when choosing a data repository to store and share data and metadata, such as Open Science Framework (osf.io), Zenodo (zenodo.org), Synapse (synapse.org), Dryad (datadryad.org), and Harvard Dataverse (dataverse.harvard.edu). Directories of data repositories include re3data (re3data.org) and OpenDOAR (v2.sherpa.ac.uk/opendoar).
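To make the noise-addition idea concrete, here is a minimal sketch of the Laplace mechanism, a textbook building block of differential privacy. The statistic, sensitivity, and privacy budget below are hypothetical; real deployments require careful privacy-budget accounting and are best built on vetted libraries rather than hand-rolled code.

```python
import numpy as np

def laplace_release(true_value, sensitivity, epsilon, rng=None):
    """Release a statistic with Laplace noise scaled for epsilon-differential privacy.

    sensitivity: the most one participant's data can change the statistic.
    epsilon: the privacy budget; smaller values mean stronger privacy.
    """
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical example: a cohort count, which any one participant changes by at most 1.
noisy_count = laplace_release(true_value=412, sensitivity=1.0, epsilon=0.5)
print(f"Released (noisy) count: {noisy_count:.0f}")
```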

In your published methods and results, be as clear as you can about your assumptions, hypotheses, measures, and methods. Summary statistics and thresholds can be useful, but commonly reported statistics such as p values are not one-size-fits-all measures of research quality or reproducibility [89–91]. Where word limits or other constraints prevent adequate articulation for clarity, publish the details as supplementary information elsewhere (see Rule 6: document everything and Rule 7: publish and present accessibly).

Rule 6: Document everything

In service of the "kind of transparency which is the opposite of secrecy" definition of openness [92], each step of research requires clear, accurate, and precise documentation. Comprehensive, clear, and accurate documentation is critical for replicability and reproducibility of research but is also critical for communicating to a larger audience than the research community and can encompass elements beyond those required to conduct the research. People can benefit from insights into the entire process, such as how and why a research question was formulated, what significance and impact answering the question could have, how the question relates to prior work, how the study was designed and executed, how the results were interpreted and presented, and what lessons were learned [92].

Prior to recruiting any participants and collecting any data, preregister your literature review, ethics statement [93], and methods. Preregistration can consist simply of documentation of plans for conducting a study, independent of peer review or a publisher, and can be submitted to an online preregistration site (for example, osf.io, aspredicted.org, or the PROSPERO registry for systematic reviews at crd.york.ac.uk/prospero/). Preregistration can also involve submitting to a publisher to be "externally reviewed, and those that meet criteria will be accepted in principle prior to data collection" [94] (see Fig 2). A preregistration manuscript submitted for review and accepted in principle by a publisher is called a "registered report" [95,96]. Preregistration is not always optional: human health research that involves a clinical trial typically requires registration of study plans (e.g., ClinicalTrials.gov in the US, the WHO International Clinical Trials Registry Platform [ICTRP], and many other country-specific clinical trials registries across the globe).

Fig 2. Each blue arrow in the diagram represents a research step that requires documentation.

Each red arrow is an opportunity for a preprint. Image source: [157].

https://doi.org/10.1371/journal.pcbi.1007846.g002

Document any change or amendment as a project progresses. To make the documentation process easier, seek out established templates [97–101]. Strive toward reproducibility [74] (even for oneself in the future!) by providing self-contained, clear, and updated documentation and retaining data, code, recruitment documents, and other research artifacts to build upon in the future [102–104].

After submitting registered reports and articles for publication, post your articles to preprint servers such as bioRxiv or arXiv, to share your knowledge and stake your claims without waiting for the full publication cycle [105,106]. Publish raw materials of your research such as data, lab notebooks, and software in appropriate venues, such as data and methods journals, and in the trusted repositories mentioned above.

Document and publish often and in detail, including experimental designs and negative results, to receive feedback and detect and resolve errors early in the process [107]. Errors occur in public and in private, and while "making code and data open does not prevent errors, […] it does make it possible to detect them. […] People often worry that if they make their code and data open, errors will be found, but that is really the whole point: We need to make code and data open because this is how the errors can be found" [108]. The individuals who document the research don’t have to be the same people who conduct the research: assigning different people to document versus run a study encourages generally understandable documentation. Finally, link to your publications, shared data, and other documentation on your professional website, social media, and curriculum vitae (CV) [106]. Let colleagues know about innovative documentation practices you are trying.

Rule 7: Publish and present accessibly

To best serve health research, communications at every stage of the research endeavor must be findable, accessible, interoperable, and reusable (FAIR; see Rule 5: make research reproducible) [84,85]. By accessible, we mean both easily retrievable and expressed in a manner that is clear and intelligible to the widest possible audience without unduly compromising the integrity of the information to be conveyed. This is a challenge not only because there are technical and abstract elements to any scientific study, but also because many scientists consider scientific journals as the sole conduit by which they convey results of their research.

When you have control over the license under which your work is published, choose a permissive license (e.g., [109,110]) and encourage consumers to use and share your work. Publish in open-access journals, being careful to choose appropriate, nonpredatory publications. Use checklists [111,112] to evaluate potential venues. Unless you must submit to a journal that disallows preprints, always submit your manuscript to a preprint server as well as a peer-reviewed journal. Tools like RoMEO (an online, community-driven database of open-access policies) can help you navigate publisher licensing policies [113].

When you must publish under a closed license, deposit your article in a postpublication archive (e.g., Hyper Articles en Ligne [HAL] [114]) or on your own website once you are legally able. Some jurisdictions legally grant you the right to openly publish your closed-license work after a specified embargo period; these laws may specify different embargo periods for different disciplines [115,116]. Some institutions (e.g., Harvard University [117]) require open access for noncommercial use of their research. Consider making your work available in real time on public platforms, such as Open Lab Notebooks, Open Science Framework, Labstep, GitHub, GitLab, Figshare, Zenodo, Dryad, protocols.io, and Aperture [104,118–126]. By making these products open and accessible, the scientific community will be able to build on your research more rapidly and more effectively.

Research publications and other informational websites are often dominated by a few languages, especially English. Translate your work and the work of others into different languages, and account for cultural and social factors; the French-language open-access publisher Science Afrique [127] is an example of a regionally focused effort. Create or update Wikipedia pages on published research findings [128,129], in multiple languages.

Strive to make research, not just your own, accessible to nonscientists and scientists alike. For broader dissemination, feedback, and engagement than traditional publishing venues provide, researchers should also consider publishing in social media, blogs, and other platforms as a project progresses [104,118–125]. Evidence indicates benefits to both data creators and the wider research community when research objects beyond books and articles are openly shared [70,130]. Even when submitting a manuscript to a traditional publisher, you can write a summary and/or a glossary of key terms [131], using language devoid of scientific jargon [132], add it as supplementary information to your manuscript, post it on your lab website, and share a link through social media to relevant groups. Consider using annotation tools [133–135] to make papers you are interested in accessible to a wider community. Demystify the scientific funding process by reporting research costs and citing successful examples of return on investment (ratio of benefit to cost).

Finally, in the future, accessibility will increasingly refer to machine readability for computer mining and interpretation of the literature. Placing data, metadata, and any other structured documentation into data repositories will make them more easily discovered, cited, and tracked by humans today and machines tomorrow. Permanent, versioned, and unique identifiers (such as DOIs) will make it easier for computers to help us more rapidly navigate and analyze the vast literature in the future.
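As a glimpse of that machine readability in practice, the DOI system already supports content negotiation: a client can ask the DOI resolver for structured metadata instead of the human-facing landing page. A minimal Python sketch follows, assuming network access and the requests library; the DOI shown is this article's own.

```python
import requests

# Ask the DOI resolver for machine-readable metadata (CSL JSON) via content
# negotiation, rather than following the redirect to the landing page.
doi = "10.1371/journal.pcbi.1007846"  # this article's DOI
response = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
response.raise_for_status()

metadata = response.json()
print(metadata["title"])                    # article title
print(metadata.get("container-title", ""))  # journal name
```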

Rule 8: Emphasize research significance

Researchers all too often take for granted that the audience for their work is restricted to a narrow group of specialists who read and review their scientific articles and that the implications and significance of their work are readily apparent. However, because human health is a topic of immense interest, there will always be a great deal of attention on topics that relate to people's hopes and concerns, especially from news media, and therefore there is a danger that the significance of a body of research will be misinterpreted. The onus is therefore often on researchers to communicate the meaning of their results, provide clear context for their work, and convey the sense of purpose that motivates an experiment's design and drives any applications derived from the work.

For participants, let others know why you take part in the research you do. For researchers, succinctly state the goals of each project, so that participants may understand not just their direct benefit but how their contributions promote positive scientific research outcomes. Clearly publicize to all stakeholders the tangible, realizable benefits of individual involvement in the research. Report on the implications of your research to wider audiences through traditional and nontraditional venues, from "news and views" pieces and press releases to Tweets, YouTube videos, and Science Cafe presentations.

For a researcher, the term "significance" carries an additional meaning distinct from importance: "statistical significance." Statistical significance is a commonly misunderstood and widely reported benchmark for the believability of a study's results. A critique of statistical significance reporting is beyond the scope of this editorial, but, generally, in statistical analyses, reporting a p value and using that value as a binary threshold is insufficient at best [1,5,90,91,136,137]. Thoroughly articulate statistical significance, including an explanation of both the selection and the practical interpretation of the statistical tests you performed in the context in which you performed them, the assumptions involved, and any alternative tests and assumptions that were considered but rejected. Put your research findings in context and communicate them clearly and cautiously, with appropriate caveats and considerations. Consider the relative size of the observed effects, and discuss not only the statistical but also the biological significance of your results.
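To make this concrete, here is a minimal Python sketch, using simulated and purely illustrative data, that reports an effect size and an interval estimate alongside the p value rather than the p value alone; the group sizes and distributions are assumptions for the example.

```python
import numpy as np
from scipy import stats

# Simulated, purely illustrative data: outcome scores for two groups.
rng = np.random.default_rng(42)
treatment = rng.normal(loc=1.2, scale=1.0, size=50)
control = rng.normal(loc=1.0, scale=1.0, size=50)

t_stat, p_value = stats.ttest_ind(treatment, control)

# Cohen's d with a pooled standard deviation, as one common effect size.
n1, n2 = len(treatment), len(control)
pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1)
                     + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = (treatment.mean() - control.mean()) / pooled_sd

# Percentile bootstrap 95% CI for the mean difference, to convey uncertainty directly.
boot_diffs = [rng.choice(treatment, n1).mean() - rng.choice(control, n2).mean()
              for _ in range(5000)]
ci_low, ci_high = np.percentile(boot_diffs, [2.5, 97.5])

print(f"p = {p_value:.3f}, Cohen's d = {cohens_d:.2f}, "
      f"95% CI for mean difference: [{ci_low:.2f}, {ci_high:.2f}]")
```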

Rule 9: Advocate open principles

Practicing open science is best done not in isolation but in a community of open science practitioners. This is never more true than in human health research, where health data can be difficult to collect, share, and analyze, and the research itself is most often done in silos. Coordinating the activities among people, the interoperability of methods, the sharing of data, and the inclusion of more diverse stakeholders is not only desirable but essential. For open health research to be successful, we must build such a community, and this is possible only if we strongly and persistently advocate for the principles that underpin it. To ensure our efforts are effective and genuine, we must identify and focus on priorities for advocacy. The Transparency and Openness Promotion (TOP) guidelines, released in 2015, provide community-driven standards for publishers and funders [138–140]. For individuals, promoting open health research can be as simple as initiating discussions in classrooms, conferences, and social events, and can be exercised in informal gatherings, such as dedicated Wikipedia editing sessions on open science topics or open review sessions of articles on PREreview [141,142]. There are many steps that you can take to lead by example and promote the practice of open science today. We include some examples below from a list of recommendations we have curated [143].

Within your home institution: Catalyze open science practices through seminars, workshops, hackathons, and contests. Join groups that advocate evaluation or promotion criteria in support of open science. Pursue funding opportunities that require or permit open intellectual property. Opt for open methods rather than proprietary, licensed products. Apply liberal licenses to documents and software. Store data in free and open-access repositories.

In collaborations: Forge ties across labs to share resources. Collaborate with institutions that require open standards. Use collaborative software and collaborative software engineering practices. Publish a code of conduct for each project to clarify roles and help resolve disputes. Clarify contributor roles at the outset of a project to assign appropriate credit and accountability, especially for open contributions. Clarify when contributions to a project can be released. Avail yourself of experts in alternative and complementary methods to reduce bias, evaluate methods, and corroborate results. Participate in interdisciplinary open science and collaboration events.

In publications and presentations: Publish in open-access venues and follow FAIR principles. Publish in open data and open methods journals. Follow community-supported data format and reporting guidelines. Insist on publishing experimental protocols and negative results. Boycott review or submission for publishers and publications that flout open standards. When reviewing others’ work, acknowledge attempts and provide recommendations toward more open science practices. Participate in open peer review, especially in languages other than English. Include an ethics section to articulate ethical considerations and implications. Make it clear where people can access open resources that you mention. When someone else mentions a resource, ask about access and usage restrictions. Include open resources on your webpage and CV.

Rule 10: Take calculated risks

A variety of risks are inherent in research with human subjects, in communications that can influence health practices, and in open practices. Honest and open deliberation about these potential risks across the lifespan of the research is essential to trustworthy, impactful human health research. These various risks can arrive in isolation or in combination and can be known in advance or realized over time. As such, we should justify the decision of whether to assume these risks based on our ability to mitigate potential harms, weighed against the benefits of the knowledge gained.

Return on investment must be considered in choosing which risks to take, and some risks may be too costly even if the potential rewards are great [80,144]. Openness is a buzzword today, particularly in science, and as such openwashing ("to spin a product or company as open, although it is not" [145]) is both a practice to watch out for and an example of a risk that would be hard to justify in terms of value but easy to justify in terms of cost. Legal frameworks, particularly as relating to personal data and privacy, are a rapidly changing factor in assessing these risks. Consequently, cost-benefit analyses should be undertaken frequently. These analyses should be documented (see Rule 6: document everything) and shared (see Rule 8: emphasize research significance).

Acknowledge good-faith efforts that fail and encourage publication of negative results. Push back against closed institutional traditions, challenge secretive practices [146], and explore nontraditional methods. Risks related to the other rules include going beyond accepted norms of ethical protections and of partnership with nonscientists, and systematically establishing greater clarity and accessibility about who does what and how, in service of better appreciation, understanding, reproducibility, and advocacy.

Seek feedback from external stakeholders (i.e., target populations, funding agencies, local government and university officials) on your experimental design and methods before participating in or conducting an experiment; act on the feedback collected when it is well founded rather than mere opinion or conventional wisdom. Also seek outside training for students and employees that includes options for nonacademic paths.

Seek interdisciplinary collaborations and spend a percentage of your time and research effort working on projects outside your comfort zone. For example, have researchers spend 10% to 20% of their time on other projects of interest that they are passionate about. These can include topics that have received pushback in the field, that are deemed "too large to tackle," or that are unlikely to produce confirmatory results but have the potential to incite new areas of research.

Conclusion

We hope that the above list of simple rules is a helpful guide to follow best practices in open human health research. More importantly, we hope that you will use these as a starting point to address broken conventional practices of science and, where these rules fall short, share your own rules to improve the state of open, ethical, inclusive human health research. These rules are not comprehensive, but we are confident they capture many of the most salient, timely, and important principles that can guide open health research going forward. Be the change you seek in science [143] and strive to make human health research a more humane, effective, and, importantly, open endeavor.

Acknowledgments

Thanks to the long-term partnership with the Bettencourt Schueller Foundation, the workshop that gave rise to this paper was partially supported by funding from the CRI Research Collaboratory. We would also like to thank all participants of the Open Health Research workshop (https://research.cri-paris.org/workshops#04._open-health) who contributed to discussions, and the CRI Paris for hosting this three-day event!

References

  1. 1. Gigerenzer G. Statistical Rituals: The Replication Delusion and How We Got There. Adv Methods Pract Psychol Sci. 2018;1: 198–218.
  2. 2. Owens B. Replication failures in psychology not due to differences in study populations. Nature. 2018 [cited 3 Dec 2018].
  3. 3. Law Y-H. Replication Failures Highlight Biases in Ecology and Evolution Science. The Scientist Magazine. 1 Aug 2018. Available: https://www.the-scientist.com/features/replication-failures-highlight-biases-in-ecology-and-evolution-science-64475. Accessed 3 Dec 2018.
  4. 4. Baker M. Over half of psychology studies fail reproducibility test. Nature News. 27 Aug 2015.
  5. 5. Colling LJ, Szűcs D. Statistical Inference and the Replication Crisis. Rev Philos Psychol. 2018 [cited 7 Aug 2019].
  6. 6. Young NS, Ioannidis JPA, Al-Ubaydli O. Why Current Publication Practices May Distort Science. PLoS Med. 2008;5: e201. pmid:18844432
  7. 7. Ioannidis JPA. Why Most Published Research Findings Are False. PLoS Med. 2005;2: e124. pmid:16060722
  8. 8. Zook M, Barocas S, boyd danah, Crawford K, Keller E, Gangadharan SP, et al. Ten simple rules for responsible big data research. PLoS Comput Biol. 2017;13: e1005399. pmid:28358831
  9. 9. Levy KE, Johns DM. When open data is a Trojan Horse: The weaponization of transparency in science and governance. Big Data Soc. 2016;3: 2053951715621568.
  10. 10. How to Contribute to Open Source. In: Open Source Guides [Internet]. 10 Jul 2019 [cited 12 Aug 2019]. Available: https://opensource.guide/how-to-contribute/
  11. 11. Grant AD, Wolf GI, Nebeker C. Approaches to governance of participant-led research: a qualitative case study. BMJ Open. 2019;9: e025633. pmid:30944134
  12. 12. Gregory K. Big data, like Soylent Green, is made of people. In: Digital Labor Working Group [Internet]. 5 Nov 2014 [cited 12 Aug 2019]. Available: https://digitallabor.commons.gc.cuny.edu/2014/11/05/big-data-like-soylent-green-is-made-of-people/
  13. 13. Clough PT, Gregory K, Haber B, Scannell RJ. The Datalogical Turn. Non-Representational Methodologies: Re-Envisioning Research. 2015. pp. 146–164.
  14. 14. Leonelli S. What difference does quantity make? On the epistemology of Big Data in biology. Big Data Soc. 2014;1: 2053951714534395. pmid:25729586
  15. 15. Sadowski J. When data is capital: Datafication, accumulation, and extraction. Big Data Soc. 2019;6: 2053951718820549.
  16. 16. National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. 1979 Apr. Available: https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/index.html
  17. 17. Dittrich D, Kenneally E. The Menlo Report: Ethical Principles Guiding Information and Communication Technology Research. 2012 Aug. Available: http://www.caida.org/publications/papers/2012/menlo_report_actual_formatted/index.xml
  18. 18. Office for Human Research Protections. International Compilation of Human Research Standards. 2019 Edition. U.S. Department of Health and Human Services; 2018. Available: https://www.hhs.gov/ohrp/sites/default/files/2019-International-Compilation-of-Human-Research-Standards.pdf
  19. 19. Informed consent. Wikipedia. 2019. Available: https://en.wikipedia.org/w/index.php?title=Informed_consent&oldid=883882699
  20. 20. Manson NC. Rethinking Informed Consent in Bioethics. 1 edition. Cambridge; New York: Cambridge University Press; 2007.
  21. 21. Informed assent. Wikipedia. 2018. Available: https://en.wikipedia.org/w/index.php?title=Informed_assent&oldid=864244196
  22. 22. General requirements for informed consent. CFR. Sect. 46.116 Jul 19, 2018. Available: https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=83cd09e1c0f5c6937cd9d7513160fc3f&pitd=20180719&n=pt45.1.46&r=PART&ty=HTML#se45.1.46_1116
  23. 23. Radin J. “Digital Natives”: How Medical and Indigenous Histories Matter for Big Data. Osiris. 2017;32: 43–64.
  24. 24. The Π Research Network. The Philosophy of Information: An Introduction. 1.0. 2013. Available: https://socphilinfo.github.io/resources/i2pi_2013.pdf
  25. 25. Shmueli G. Research Dilemmas with Behavioral Big Data. Big Data. 2017;5: 98–119. pmid:28632441
  26. 26. boyd danah, Crawford K. Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Inf Commun Soc. 2012;15: 662–679.
  27. 27. Torous J, Nebeker C. Navigating Ethics in the Digital Age: Introducing Connected and Open Research Ethics (CORE), a Tool for Researchers and Institutional Review Boards. J Med Internet Res. 2017;19: e38. pmid:28179216
  28. 28. Santos-Lang C. Initial Formation of a Committee. Belleville Research Ethics Committee Procedures. 2017.
  29. 29. CORE T. Platform. In: The CORE [Internet]. 2019 [cited 5 Aug 2019]. Available: https://thecore-platform.ucsd.edu/
  30. 30. Research Center for Optimal Digital Ethics Health (ReCODE Health). Welcome. In: ReCODE Health [Internet]. [cited 5 Aug 2019]. Available: https://recode.health/
  31. 31. University of Louisville. MobileELSI. In: Mobile ELSI Research Project Website [Internet]. 2017 [cited 5 Aug 2019]. Available: https://louisville.edu/mobileelsi
  32. 32. CSST 2016 Summer Institute Catalyst Working Group T. In: PERVADE–Pervasive Data Ethics for Computational Research [Internet]. 2018 [cited 5 Aug 2019]. Available: https://pervade.umd.edu/
  33. 33. Clinical Trials Transformation Initiative. In: Clinical Trials Transformation Initiative [Internet]. 2019 [cited 5 Aug 2019]. Available: https://www.ctti-clinicaltrials.org/
  34. 34. Foldit. Available: https://fold.it/portal/
  35. 35. Zooniverse. Available: https://www.zooniverse.org
  36. 36. Landis SC, Amara SG, Asadullah K, Austin CP, Blumenstein R, Bradley EW, et al. A call for transparent reporting to optimize the predictive value of preclinical research. Nature. 2012;490: 187–191. pmid:23060188
  37. 37. Frontiers for Young Minds. In: Frontiers for Young Minds [Internet]. 2018 [cited 20 Nov 2018]. Available: https://kids.frontiersin.org/
  38. 38. Patient and public partnership. In: The BMJ [Internet]. 2019 [cited 13 Aug 2019]. Available: https://www.bmj.com/campaign/patient-partnership
  39. 39. Schroter S, Price A, Flemyng E, Demaine A, Elliot J, Harmston RR, et al. Perspectives on involvement in the peer-review process: surveys of patient and public reviewers at two journals. BMJ Open. 2018;8: e023357. pmid:30185581
  40. 40. Nebeker C, Simon G, Kalichman M, Talavera A, Booen E, Lopez-Arenas A. Basic Research Concepts (BRC). Building Research Integrity and Capacity (BRIC): An Interactive Guide for Promotores/Community Health Workers. San Diego, CA: BRIC Academy; 2015. Available: https://ori.hhs.gov/content/basic-research-concepts-brc
  41. 41. About PatientsLikeMe. In: PatientsLikeMe [Internet]. 2018 [cited 20 Nov 2018]. Available: https://news.patientslikeme.com/about
  42. 42. PatientsLikeMe. Research manuscripts bibliography: The complete collection of PatientsLikeMe research publications. Cambridge, MA; 2019 Jan. Available: https://patientslikeme-bibliography.s3.amazonaws.com/PLM%20Research%20Manuscripts%20Bibliography.pdf
  43. 43. Patient-Centered Outcomes, Research Institute. PCORI. 2018 [cited 20 Nov 2018]. Available: https://www.pcori.org/
  44. 44. Silberzahn R, Uhlmann EL. Crowdsourced research: Many hands make tight work. Nat News. 2015;526: 189. pmid:26450041
  45. 45. Accredited conferences. In: Patients Included [Internet]. 2019 [cited 13 Aug 2019]. Available: https://patientsincluded.org/conferences/accredited-conferences/
  46. 46. Chivers T. Does psychology have a conflict-of-interest problem? Nature. 2019;571: 20–23. pmid:31267062
  47. 47. Mozilla, LRNG, IMS Global Learning Consortium. What’s an Open Badge? In: Open Badges [Internet]. 2016 [cited 13 Aug 2019]. Available: https://openbadges.org/get-started/
  48. 48. Thürridl C, Kamleitner B. What Goes around Comes Around? Rewards as Strategic Assets in Crowdfunding. Calif Manage Rev. 2016;58: 88–110.
  49. 49. Weinschenk S. How to Get People to Do Stuff: Master the art and science of persuasion and motivation. 1 edition. Berkeley, CA: New Riders; 2013.
  50. 50. Reinforcement. Wikipedia. 2019. Available: https://en.wikipedia.org/w/index.php?title=Reinforcement&oldid=910942384
  51. 51. Conference Code of Conduct. A code of conduct template for conferences. Conference Code of Conduct; 2018. Available: https://github.com/confcodeofconduct/confcodeofconduct.com
  52. 52. Vicens Q, Bourne PE. Ten Simple Rules for a Successful Collaboration. PLoS Comput Biol. 2007;3. pmid:17397252
  53. 53. Bennett LM, Gadlin H, Marchand C. Collaboration and Team Science: A Field Guide. Second Edition. NIH National Cancer Institute Center for Research Strategy; 2018. Available: https://www.cancer.gov/about-nci/organization/crs/research-initiatives/team-science-field-guide
  54. 54. Wallis JC, Rolando E, Borgman CL. If We Share Data, Will Anyone Use Them? Data Sharing and Reuse in the Long Tail of Science and Technology. PLoS ONE. 2013;8: e67332. pmid:23935830
  55. 55. Allen L, Scott J, Brand A, Hlava M, Altman M. Publishing: Credit where credit is due. Nat News. 2014;508: 312. pmid:24745070
  56. 56. Holcombe A. Farewell authors, hello contributors. Nature. 2019;571: 147–147. pmid:31278394
  57. 57. International Committee of Medical Journal Editors. Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals. In: ICMJE [Internet]. Dec 2018 [cited 5 Aug 2019]. Available: http://www.icmje.org/recommendations/
  58. 58. ICMJE recommendations—Wikipedia. [cited 5 Aug 2019]. Available: https://en.wikipedia.org/wiki/ICMJE_recommendations
  59. 59. Committee on Publication Ethics. Guidelines. In: COPE [Internet]. 2019 [cited 5 Aug 2019]. Available: https://publicationethics.org/guidance/Guidelines
  60. 60. Committee on Publication Ethics. Guidance: Authorship and contributorship. In: COPE [Internet]. 2019 [cited 5 Aug 2019]. Available: https://publicationethics.org/guidance?classification=2772
  61. 61. Committee on Publication Ethics. Promoting integrity in research and its publication. In: COPE [Internet]. 2019 [cited 5 Aug 2019]. Available: https://publicationethics.org/
  62. 62. Center for Open Science. Badges to Acknowledge Open Practices. 19 Feb 2013 [cited 13 Aug 2019]. Available: https://osf.io/tvyxz/
  63. 63. Roediger HL III, Eich E. What’s New at Psychological Science. APS Observer. 31 Oct 201326. Available: https://www.psychologicalscience.org/observer/whats-new-at-psychological-science. Accessed 13 Aug 2019.
  64. 64. Baker M. Digital badges motivate scientists to share data. Nat News. 2016 [cited 13 Aug 2019].
  65. 65. Woolston C, Osório J. When English is not your mother tongue. Nature. 2019;570: 265–267. pmid:31182832
  66. 66. Patients Included. In: Patients Included [Internet]. 2018 [cited 13 Aug 2019]. Available: https://patientsincluded.org/
  67. 67. San Francisco Declaration on Research Assessment. In: Read the declaration—DORA [Internet]. 2012. Available: https://sfdora.org/read/
  68. 68. What are altmetrics? In: Altmetric [Internet]. 2 Jun 2015 [cited 24 Nov 2018]. Available: https://www.altmetric.com/about-altmetrics/what-are-altmetrics/
  69. 69. Hicks D, Wouters P, Waltman L, de Rijcke S, Rafols I. Bibliometrics: The Leiden Manifesto for research metrics. Nat News. 2015;520: 429. pmid:25903611
  70. 70. Milham MP, Craddock RC, Son JJ, Fleischmann M, Clucas J, Xu H, et al. Assessment of the impact of shared brain imaging data on the scientific literature. Nat Commun. 2018;9: 2818. pmid:30026557
  71. 71. Ali-Khan SE, Jean A, MacDonald E, Gold ER. Defining Success in Open Science. MNI Open Res. 2018;2: 2. pmid:29553146
  72. 72. 5-star Open Data. [cited 9 Nov 2018]. Available: http://5stardata.info/en/
  73. 73. Track Impact with ALMs. In: PLoS [Internet]. [cited 13 Aug 2019]. Available: https://www.plos.org/article-level-metrics
  74. 74. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349: aac4716. pmid:26315443
  75. 75. Pashler H, Wagenmakers E. Editors’ Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? Perspect Psychol Sci. 2012;7: 528–530. pmid:26168108
  76. 76. McNutt M. Reproducibility. Science. 2014;343: 229–229. pmid:24436391
  77. 77. Ince DC, Hatton L, Graham-Cumming J. The case for open computer programs. Nature. 2012;482: 485–488. pmid:22358837
  78. 78. Wicherts JM, Bakker M, Molenaar D. Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results. PLoS ONE. 2011;6: e26828. pmid:22073203
  79. 79. Calin-Jageman B. The Cookie-Monster Study: The highly influential memory of a long-lost study. In: Introduction to the New Statistics [Internet]. 22 May 2019 [cited 13 Aug 2019]. Available: https://thenewstatistics.com/itns/2019/05/22/the-cookie-monster-study-the-highly-influential-memory-of-a-long-lost-study/
  80. 80. Shiffrin RM, Börner K, Stigler SM. Scientific progress despite irreproducibility: A seeming paradox. Proc Natl Acad Sci. 2018;115: 2632–2639. pmid:29531095
  81. 81. Pasquetto I, Randles B, Borgman C. On the Reuse of Scientific Data. Data Sci J. 2017;16: 8.
  82. 82. Tenopir C, Dalton ED, Allard S, Frame M, Pjesivac I, Birch B, et al. Changes in Data Sharing and Data Reuse Practices and Perceptions among Scientists Worldwide. PLoS ONE. 2015;10: e0134826. pmid:26308551
  83. 83. Organization for Human Brain Mapping. Replication Award. 2019 [cited 5 Aug 2019]. Available: https://www.humanbrainmapping.org/m/pages.cfm?pageid=3731
  84. 84. Wilkinson MD, Dumontier M, Aalbersberg IjJ, Appleton G, Axton M, Baak A, et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016;3: 160018. pmid:26978244
  85. 85. FAIR Principles. In: GO FAIR [Internet]. [cited 9 Aug 2019]. Available: https://www.go-fair.org/fair-principles/
  86. 86. Federer LM, Belter CW, Joubert DJ, Livinski A, Lu Y-L, Snyders LN, et al. Data sharing in PLOS ONE: An analysis of Data Availability Statements. PLoS ONE. 2018;13: e0194768. pmid:29719004
  87. 87. Tenopir C, Allard S, Douglass K, Aydinoglu AU, Wu L, Read E, et al. Data Sharing by Scientists: Practices and Perceptions. PLoS ONE. 2011;6: e21101. pmid:21738610
  88. 88. Goodman A, Pepe A, Blocker AW, Borgman CL, Cranmer K, Crosas M, et al. Ten Simple Rules for the Care and Feeding of Scientific Data. PLoS Comput Biol. 2014;10. pmid:24763340
  89. 89. Amrhein V, Gelman A, Greenland S, McShane BB. Abandoning statistical significance is both sensible and practical. PeerJ Inc.; 2019 Apr. Report No.: e27657v1.
  90. 90. Wasserstein RL, Schirm AL, Lazar NA. Moving to a World Beyond “p < 0.05.” Am Stat. 2019;73: 1–19.
  91. 91. Cassidy SA, Dimova R, Giguère B, Spence JR, Stanley DJ. Failing Grade: 89% of Introduction-to-Psychology Textbooks That Define or Explain Statistical Significance Do So Incorrectly. Adv Methods Pract Psychol Sci. 2019; 2515245919858072.
  92. 92. Peters MA. Open Education and Education for Openness. Encyclopaedia of Educational Philosophy and Theory. 2014. Available: http://archive.fo/JaBJt
  93. 93. Inkster B. Ethics-In-Action. In: Dr Becky Inkster [Internet]. [cited 9 Nov 2018]. Available: https://www.beckyinkster.com/ethicsinaction/
  94. 94. Lindsay DS. Preregistered Direct Replications in Psychological Science. Psychol Sci. 2017;28: 1191–1192. pmid:28793201
  95. 95. Chambers CD. Registered reports: a new publishing initiative at Cortex. Cortex J Devoted Study Nerv Syst Behav. 2013;49: 609–610. pmid:23347556
  96. 96. Center for Open Science. Registered Reports. 2017 [cited 9 Nov 2018]. Available: https://cos.io/rr/
  97. 97. Mensh B, Kording K. Ten simple rules for structuring papers. PLoS Comput Biol. 2017;13: e1005619. pmid:28957311
  98. 98. UW eScience Institute. shablona. UW eScience Institute; 2018. Available: https://github.com/uwescience/shablona
  99. 99. Chan A-W, Tetzlaff JM, Gøtzsche PC, Altman DG, Mann H, Berlin JA, et al. SPIRIT 2013 explanation and elaboration: guidance for protocols of clinical trials. BMJ. 2013;346: e7586. pmid:23303884
  100. 100. EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network T. The EQUATOR Network | Enhancing the QUAlity and Transparency Of Health Research. 2018 [cited 9 Nov 2018]. Available: https://www.equator-network.org/
  101. 101. Poldrack RA, Gorgolewski KJ, Varoquaux G. Computational and informatics advances for reproducible data analysis in neuroimaging. ArXiv180910024 Cs Q-Bio Stat. 2018 [cited 9 Nov 2018]. Available: http://arxiv.org/abs/1809.10024
  102. 102. Ghosh SS, Klein A, Avants B, Millman KJ. Learning from open source software projects to improve scientific review. Front Comput Neurosci. 2012;6. pmid:22529798
  103. 103. Bechhofer S, Buchan I, De Roure D, Missier P, Ainsworth J, Bhagat J, et al. Why linked data is not enough for scientists. Future Gener Comput Syst. 2013;29: 599–611.
  104. 104. Schapira M, The Open Lab Notebook Consortium, Harding R. Open laboratory notebooks: good for science, good for society, good for scientists. F1000Research. 2019;8. https://doi.org/10.12688/f1000research.17710.2
  105. 105. Wagenmakers E-J, Dutilh G, de Kort S. Seven Selfish Reasons for Preregistration. APS Observer. 31 Oct 201629. Available: https://www.psychologicalscience.org/observer/seven-selfish-reasons-for-preregistration. Accessed 9 Nov 2018.
106. Tennant J. Promoting your articles to increase your digital identity and research impact. In: ScienceOpen Blog [Internet]. 29 Sep 2017 [cited 9 Nov 2018]. Available: http://blog.scienceopen.com/2017/03/promoting-your-articles-to-increase-your-digital-identity-and-research-impact/
107. Lindsay DS. Sharing Data and Materials in Psychological Science. Psychol Sci. 2017;28: 699–702. pmid:28414920
108. Bishop DVM. Fallibility in Science: Responding to Errors in the Work of Oneself and Others. Adv Methods Pract Psychol Sci. 2018;1: 432–438.
109. Creative Commons. About The Licenses. In: Creative Commons [Internet]. 2017 [cited 11 Nov 2018]. Available: https://creativecommons.org/licenses/
110. Creative Commons license. Wikipedia. 2019. Available: https://en.wikipedia.org/w/index.php?title=Creative_Commons_license&oldid=909655146
111. Think. Check. Submit. Check. In: thinkchecksubmit [Internet]. 2019 [cited 6 Aug 2019]. Available: http://thinkchecksubmit.org/check/
112. Think. Check. Attend. Check. In: Think Check Attend [Internet]. 2019 [cited 6 Aug 2019]. Available: https://thinkcheckattend.org/check/
113. SHERPA. About RoMEO. In: SHERPA/RoMEO [Internet]. [cited 7 Aug 2019]. Available: http://sherpa.ac.uk/romeo/about.php?la=en&fIDnum=|&mode=simple
114. Le Centre pour la Communication Scientifique Directe (CCSD). In: Archive ouverte HAL [Internet]. 2019. Available: https://hal.archives-ouvertes.fr/
115. Gruson-Daniel C. Numérique et régime français des savoirs en action: l’open en sciences. Le cas de la consultation « République numérique ». Université Paris Descartes. 2018.
116. Lomazzi L, Chartron G. The implementation of the European Commission recommendation on open access to scientific information: Comparison of national policies. Inf Serv Use. 2014;34: 233–240.
117. Harvard Office for Scholarly Communication. Open Access Policies. In: Harvard OSC [Internet]. 2018 [cited 7 Aug 2019]. Available: https://osc.hul.harvard.edu/policies/
118. Center for Open Science. Open Science Framework. 2018 [cited 9 Nov 2018]. Available: https://osf.io/
119. Labstep Ltd. Labstep. 2018 [cited 9 Nov 2018]. Available: https://www.labstep.com/
120. protocols.io. protocols.io—Life Sciences Protocol Repository. 2018 [cited 9 Nov 2018]. Available: https://www.protocols.io/
121. GitHub, Inc. GitHub. 2018. Available: https://github.com/
122. GitLab. The first single application for the entire DevOps lifecycle. In: GitLab [Internet]. [cited 9 Nov 2018]. Available: https://about.gitlab.com/
123. figshare. figshare—credit for all your research. 2018 [cited 9 Nov 2018]. Available: https://figshare.com/
124. CERN Data Centre, Invenio. Zenodo—Research. Shared. 2018 [cited 9 Nov 2018]. Available: https://zenodo.org/
125. Dryad. Dryad Digital Repository. 2018.
126. Stikov N, Poline J-B. Announcing Aperture—the OHBM Publishing Platform. In: organization for human brain mapping [Internet]. 15 Jun 2018 [cited 9 Nov 2018]. Available: http://www.ohbmbrainmappingblog.com/1/post/2018/06/announcing-aperture-the-ohbm-publishing-platform.html
127. Science Afrique. Le Grenier des savoirs–Des revues africaines en libre accès pour nourrir l’humanité de savoirs de qualité. [cited 13 Aug 2019]. Available: https://www.revues.scienceafrique.org/
128. Citing Yourself. In: Wikipedia:Conflict of interest. Wikipedia. 2018. Available: https://en.wikipedia.org/w/index.php?title=Wikipedia:Conflict_of_interest&oldid=869559811
129. Wikipedia:No original research. Wikipedia. 2018. Available: https://en.wikipedia.org/w/index.php?title=Wikipedia:No_original_research&oldid=868806183
130. Piwowar HA, Vision TJ. Data reuse and the open data citation advantage. PeerJ. 2013;1: e175. pmid:24109559
131. O’Reilly T, Wang Z, Sabatini J. How Much Knowledge Is Too Little? When a Lack of Knowledge Becomes a Barrier to Comprehension. Psychol Sci. 2019 [cited 19 Aug 2019]. pmid:31343951
132. Rakedzon T, Segev E, Chapnik N, Yosef R, Baram-Tsabari A. Automatic jargon identifier for scientists engaging with the public and science communication educators. PLoS ONE. 2017;12: e0181742. pmid:28792945
133. American Association for the Advancement of Science. Science in the Classroom—Annotated research papers and accompanying teaching materials. In: Science in the Classroom [Internet]. 2018 [cited 20 Nov 2018]. Available: https://www.scienceintheclassroom.org/
134. Hartnell J. To enable a conversation over the world’s knowledge. In: Hypothesis [Internet]. 4 Sep 2018 [cited 20 Nov 2018]. Available: https://web.hypothes.is/about/
135. Udell J. Science in the Classroom. In: Hypothesis [Internet]. 6 Jun 2016 [cited 20 Nov 2018]. Available: https://web.hypothes.is/blog/science-in-the-classroom/
136. Cumming G. Understanding The New Statistics: Effect Sizes, Confidence Intervals, and Meta-Analysis. Routledge; 2013. https://doi.org/10.4324/9780203807002
137. Amrhein V, Greenland S, McShane B. Scientists rise up against statistical significance. Nature. 2019;567: 305–307. pmid:30894741
138. Gewin V. Data sharing: An open mind on open data. Nature. 2016;529: 117–119. pmid:26744755
139. Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, et al. Promoting an open research culture. Science. 2015;348: 1422–1425. pmid:26113702
140. Center for Open Science. TOP Guidelines. In: Center for Open Science [Internet]. 2015 [cited 13 Aug 2019]. Available: https://cos.io/top/
141. PREreview Team, Saderi D, Hindle S, Nicholson J. Welcome to PREreview! In: Authorea [Internet]. 1 Mar 2019 [cited 16 Aug 2019]. Available: https://www.authorea.com/users/8850/articles/198235-welcome-to-prereview
142. PREreview. 2019 [cited 16 Aug 2019]. Available: https://www.prereview.org/
143. Milham MP, Klein A. Be the change you seek in science. BMC Biol. 2019;17: 27. pmid:30914050
144. Knappenberger B. The Internet’s Own Boy: The Story of Aaron Swartz. Participant Media; 2014.
145. Thorne M. Openwashing. In: Michelle Thorne [Internet]. 14 Mar 2009 [cited 13 Aug 2019]. Available: https://michellethorne.cc/2009/03/openwashing/
146. Savage CJ, Vickers AJ. Empirical Study of Data Sharing by Authors Publishing in PLoS Journals. PLoS ONE. 2009;4: e7078. pmid:19763261
147. Altmann G. Gear, Globe, Europe, Asia, Africa, Continents, America [Internet]. 2015 [cited 2019 Aug 19]. Available from: https://pixabay.com/illustrations/gear-globe-europe-asia-africa-995772/ Pixabay License. https://pixabay.com/service/license/.
148. ben. Microscope [Internet]. 2014 [cited 2019 Aug 19]. Available from: https://openclipart.org/detail/192464/microscope-by-ben-192464 Unlimited commercial use. https://openclipart.org/share.
149. Clker-Free-Vector-Images. Paper, Looseleaf, Notebook, Blank, Notepad, Lined [Internet]. 2012 [cited 2019 Aug 19]. Available from: https://pixabay.com/vectors/paper-looseleaf-notebook-blank-48639/ Pixabay License. https://pixabay.com/service/license/.
150. CycledeKrebs. Microscope, Science, Scientist [Internet]. 2015 [cited 2019 Aug 19]. Available from: https://pixabay.com/illustrations/microscope-science-scientist-1079880/ Pixabay License. https://pixabay.com/service/license/.
151. DigitaLink. Blank T-Shirt [Internet]. 2007 [cited 2019 Mar 16]. Available from: https://openclipart.org/detail/3491/blank-tshirt Unlimited commercial use. https://openclipart.org/share.
152. Google Inc. Noto Sans [Internet]. Google Noto Fonts. 2017 [cited 2019 Sep 3]. Available from: https://www.google.com/get/noto/#sans-lgc SIL Open Font License, Version 1.1. http://scripts.sil.org/cms/scripts/page.php?site_id=nrsi&id=OFL.
153. Hassan mohamed. Silhouette, Marketing, Megaphone, Woman, Screaming [Internet]. 2018 [cited 2019 Aug 19]. Available from: https://pixabay.com/vectors/silhouette-marketing-megaphone-3265766/ Pixabay License. https://pixabay.com/service/license/.
154. OpenClipart-Vectors. Pencil, Pen, Write, Education, Drawing [Internet]. 2017 [cited 2019 Aug 19]. Available from: https://pixabay.com/vectors/pencil-pen-write-education-drawing-2026452/ Pixabay License. https://pixabay.com/service/license/.
155. Treis T. Calendar, Icon, Minimalist, Time, Black, White [Internet]. 2016 [cited 2019 Aug 19]. Available from: https://pixabay.com/illustrations/calendar-icon-minimalist-time-1559935/ Pixabay License. https://pixabay.com/service/license/.
156. Venita O. Justice, Scale, Scales Of Justice, Judge, Law, Balance [Internet]. 2015 [cited 2019 Aug 19]. Available from: https://pixabay.com/illustrations/justice-scale-scales-of-justice-914228/ Pixabay License. https://pixabay.com/service/license/.
157. Center for Open Science. Registered Reports [Internet]. 2017 [cited 2018 Nov 9]. Available from: https://cos.io/rr/ CC BY 4.0. https://creativecommons.org/licenses/by/4.0/.