Good science requires transparency

Ideally, science is characterized by a ‘show me’ norm, meaning that claims should be based on observations that are reported transparently, honestly and completely1. When parts of the scientific process remain hidden, the trustworthiness of the associated conclusions is eroded. This erosion of trust affects the credibility not only of specific articles, but—when a lack of transparency is the norm—perhaps even entire disciplines. Transparency is required not only for evaluating and reproducing results (from the same data), but also for research synthesis and meta-analysis from the raw data and for effective replication and extension of that work. Particularly when the research is funded by public resources, transparency and openness constitute a societal obligation.

In recent years many social and behavioural scientists have expressed a lack of confidence in some past findings2, partly due to unsuccessful replications. Among the causes of this low replication rate are underspecified methods, analyses and reporting practices. Such practices can be difficult to detect and can easily produce unjustifiably optimistic research reports. This lack of transparency need not be intentional or deliberately deceptive. Human reasoning is vulnerable to a host of pernicious and often subtle biases, such as hindsight bias, confirmation bias and motivated reasoning, all of which can drive researchers to unwittingly present a distorted picture of their results.

The practical side of transparency

How can scientists increase the transparency of their work? To begin with, they could adopt open research practices such as study preregistration and data sharing3,4,5. Many journals, institutions and funders now encourage or require researchers to adopt these practices. Some scientific subfields have seen broad initiatives to promote transparency standards for reporting and summarizing research findings, such as START, SPIRIT, PRISMA, STROBE and CONSORT (see https://www.equator-network.org). A few journals ask authors to answer checklist questions about statistical and methodological practices (for example, the Nature Life Sciences Reporting Summary)6 and transparency (for example, Psychological Science). Journals can signal that they value open practices by offering ‘badges’ that acknowledge open data, code and materials7. The Transparency and Openness Promotion (TOP) guidelines8, endorsed by many journals, promote the availability of all research items, including data, materials and code. Authors can declare their adherence to these TOP standards by adding a transparency statement in their articles (TOP Statement)9. Collectively, these somewhat piecemeal innovations illustrate a science-wide shift toward greater transparency in research reports.

Transparency Checklist

We provide a consensus-based, comprehensive transparency checklist that behavioural and social science researchers can use to improve and document the transparency of their research, especially for confirmatory work. The checklist reinforces the norm of transparency by identifying concrete actions that researchers can take to enhance transparency at all the major stages of the research process. Responses to the checklist items can be submitted along with a manuscript, providing reviewers, editors and, eventually, readers with critical information about the research process necessary to evaluate the robustness of a finding. Journals could adopt this checklist as a standard part of the submission process, thereby improving documentation of the transparency of the research that they publish.

We developed the checklist contents using a preregistered ‘reactive-Delphi’ expert consensus process10, with the goal of ensuring that the contents cover most of the elements relevant to transparency and accountability in behavioural research. The initial set of items was evaluated by 45 behavioural and social science journal editors-in-chief and associate editors, as well as 18 open-science advocates. The Transparency Checklist was iteratively modified by deleting, adding and rewording items until a sufficiently high level of acceptability and consensus was reached and no strong counterarguments against individual items remained (for the selection of participants and details of the consensus procedure, see the Supplementary Information). As a result, the checklist represents a consensus among these experts.
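To make the consensus criterion concrete, the sketch below shows one way a round of such expert ratings could be scored. It is a minimal, hypothetical illustration: the 1–7 acceptability scale, the cut-off of 5 and the 80% agreement threshold are assumptions made here for clarity, not the criteria documented in the Supplementary Information.

```python
# Hypothetical scoring of one 'reactive-Delphi' rating round.
# The 1-7 acceptability scale, the cut-off of 5 ("acceptable") and the 80%
# agreement threshold are illustrative assumptions, not the study's criteria.

def acceptance_rate(ratings, cutoff=5):
    """Proportion of experts who rated the item at or above the cutoff."""
    return sum(r >= cutoff for r in ratings) / len(ratings)

def items_needing_revision(item_ratings, required=0.80, cutoff=5):
    """Items whose acceptance rate falls short of the required level;
    these would be reworded, merged or dropped before the next round."""
    flagged = {}
    for item, ratings in item_ratings.items():
        rate = acceptance_rate(ratings, cutoff)
        if rate < required:
            flagged[item] = rate
    return flagged

# Example: three candidate items (placeholder wording) rated by five experts.
round_1 = {
    "data availability statement": [7, 6, 6, 5, 7],
    "power analysis reported":     [4, 5, 3, 6, 4],
    "code availability statement": [6, 7, 5, 6, 6],
}

print(items_needing_revision(round_1))
# {'power analysis reported': 0.4} -> revise and re-rate in the next round
```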

The final version of the Transparency Checklist 1.0 contains 36 items that cover four components of a study: preregistration; methods; results and discussion; and data, code and materials availability. For each item, authors select the appropriate answer from prespecified options. It is important to emphasize that none of the responses on the checklist is a priori good or bad; rather, the transparency report gives researchers the opportunity to explain their choices at the end of each section.

In addition to the full checklist, we provide a shortened 12-item version (Fig. 1). By reducing the demands on researchers’ time to a minimum, the shortened list may facilitate broader adoption, especially among journals that intend to promote transparency but are reluctant to ask authors to complete a 36-item list. We created online applications for the two checklists that allow users to complete the form and generate a report that they can submit with their manuscript and/or post to a public repository (Box 1). The checklist is subject to continual improvement, and users can always access the most current version on the checklist website; access to previous versions will be provided on a subpage.

Fig. 1 | The Shortened Transparency Checklist 1.0. After each section, researchers can add free text if further explanation of their responses is needed. The full version of the checklist can be found at http://www.shinyapps.org/apps/TransparencyChecklist/.
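As a rough, hypothetical illustration of the kind of structure behind such a form (the actual online applications are interactive web apps, and the item wording, answer options and section names below are abridged placeholders rather than the official checklist text), a section of checklist responses could be represented and rendered into a submission-ready report along these lines.

```python
# Hypothetical sketch: representing checklist items with prespecified answer
# options and rendering a plain-text transparency report. Wording, options and
# section names are placeholders, not the official Transparency Checklist text.

from dataclasses import dataclass

@dataclass
class Item:
    question: str
    options: tuple        # prespecified answers, e.g. ("Yes", "No", "N/A")
    answer: str = ""

@dataclass
class Section:
    title: str
    items: list
    explanation: str = ""  # free text added at the end of the section

def render_report(sections):
    """Assemble a plain-text report to accompany a manuscript submission."""
    lines = ["Transparency Report (illustrative)"]
    for sec in sections:
        lines.append("")
        lines.append(sec.title)
        for item in sec.items:
            lines.append(f"- {item.question}: {item.answer or '[unanswered]'}")
        if sec.explanation:
            lines.append(f"  Explanation: {sec.explanation}")
    return "\n".join(lines)

preregistration = Section(
    title="Preregistration",
    items=[Item("Was the study preregistered before data collection?",
                ("Yes", "No", "N/A"), answer="Yes")],
    explanation="Minor deviations from the preregistered plan are noted in the Methods.",
)

print(render_report([preregistration]))
```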

This checklist presents a consensus-based solution to a difficult task: identifying the most important steps needed to achieve transparent research in the social and behavioural sciences. Although the checklist was developed for social and behavioural researchers who conduct and report confirmatory research on primary data, other research approaches and disciplines may find value in it and adapt it to their fields’ needs. We believe that consensus-based solutions and user-friendly tools are necessary to achieve meaningful change in scientific practice. Important topics may well remain that the current version fails to cover; nonetheless, we trust that this version provides a useful starting point for transparency reporting. Because the checklist is subject to continual improvement, we encourage researchers, funding agencies and journals to provide feedback and recommendations. We also encourage meta-researchers to assess the use of the checklist and its impact on the transparency of research.