Science accumulates evidence through time, but how do you know when you have enough evidence to make robust generalizations about a research question? Synthesis tools borrowed from the medical sciences, such as meta-analysis, have emerged as a powerful means to assess the rapidly growing literature in ecology and evolution. However, this rapid growth may itself create problems if the results of syntheses quickly become outdated as new data refine the evidence base1.

Two Comment articles in this issue of Nature Ecology & Evolution about different stages of the research process propose new strategies for how synthesis methods can tackle this flux in knowledge. They also raise important questions about the degree to which the ecology, evolution and conservation communities are sufficiently prepared to embrace a major shift in the way synthesis projects are embedded within research workflows.

In the first Comment, Grainger et al. argue that synthesis tools should be used to guide the question-setting stage of research, so that effort is not wasted on questions that the existing weight of evidence already answers. Among the tools they propose is cumulative meta-analysis: study effect sizes extracted from a literature search are pooled in chronological order, so that when the cumulative estimate stabilizes it becomes clear that further new evidence is unlikely to alter the main conclusion. By routinely applying such tools before a study begins, they hope to reduce the accumulation of redundant information, so that efforts can be concentrated on higher-priority questions. One example the authors explore is an applied question — can acoustic recorders replace human observers for estimating bird abundance? — but identifying shifts in evidence is just as applicable to more fundamental questions. Indeed, a recent re-appraisal of the literature investigating whether male house sparrows use their black chest bibs as a signal of social status found very little support for this well-known example in behavioural ecology2.
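
The logic of cumulative meta-analysis can be illustrated with a minimal sketch: hypothetical effect sizes are pooled in chronological order with fixed-effect, inverse-variance weighting, and the conclusion can be considered stable once successive pooled estimates and their confidence intervals stop moving. All study values below are invented for illustration; a real analysis would typically rely on a dedicated meta-analysis package and often a random-effects model.

# A minimal sketch of cumulative meta-analysis (illustrative values only):
# studies are pooled one at a time in chronological order using
# fixed-effect, inverse-variance weighting, and the pooled estimate is
# printed after each addition to show whether it has stabilized.
import numpy as np

years     = np.array([2003, 2006, 2008, 2011, 2014, 2017, 2020])
effects   = np.array([0.62, 0.35, 0.48, 0.41, 0.44, 0.40, 0.43])   # hypothetical effect sizes
variances = np.array([0.20, 0.15, 0.12, 0.10, 0.08, 0.06, 0.05])   # their sampling variances

order   = np.argsort(years)          # chronological order
weights = 1.0 / variances[order]     # inverse-variance weights
ordered_effects = effects[order]

for k in range(1, len(order) + 1):
    w, e   = weights[:k], ordered_effects[:k]
    pooled = np.sum(w * e) / np.sum(w)        # pooled effect after k studies
    se     = np.sqrt(1.0 / np.sum(w))         # standard error of the pooled effect
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    print(f"up to {years[order][k - 1]}: pooled = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")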

Although tools like meta-analysis can be powerful for identifying generalities, studies like the sparrow-bib example also highlight some of their limitations. A lack of negative results in the published literature (the ‘file drawer effect’) can alter the magnitude, or even the direction, of a reported effect. Taxonomic and geographic biases that are widespread in the ecological literature are also likely to influence syntheses3, and highly influential research networks may skew the types of research questions being asked4. Research published in non-English languages or in the grey literature also represents a potentially huge source of overlooked information5 and is often excluded from search strategies.
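
A small numerical sketch can make the file drawer effect concrete: pooling all hypothetical studies, including the null and negative ones, gives a markedly smaller estimate than pooling only the ‘publishable’ positive results. All values are invented for illustration.

# Illustrative sketch of the file drawer effect (invented values):
# dropping null or negative results inflates the pooled effect.
import numpy as np

effects   = np.array([0.50, 0.30, -0.10, 0.05, 0.40, -0.20])
variances = np.array([0.10, 0.08,  0.09, 0.07, 0.12,  0.11])
weights   = 1.0 / variances

def pooled(e, w):
    # Fixed-effect, inverse-variance pooled estimate.
    return np.sum(w * e) / np.sum(w)

published = effects > 0.1   # suppose only clearly positive results reach print
print("all studies:      ", round(pooled(effects, weights), 2))
print("'published' only: ", round(pooled(effects[published], weights[published]), 2))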

To some extent these issues can be minimized through transparent reporting of literature searches, but even then biases may only be identified post-publication6. One possible solution, long used as a gold standard in the medical sciences (for example, in Cochrane Reviews), is to scrutinize systematic review protocols before data are extracted from the literature. Although it is possible to publish such protocols for environmental meta-analyses (for example, in the journal Environmental Evidence), it is not clear how widespread the practice is, and there is certainly an argument for incentivizing its adoption as a gold standard in ecology and evolution too7. Publishing systematic review protocols may also reduce the reviewing burden on the current pool of synthesis experts, as study methodologies would only need to be reviewed once, with the biological interpretation evaluated by subject-specific experts.

Further biases can be mitigated by adopting specific reporting checklists, such as PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), the TOP (Transparency and Openness Promotion) Guidelines, or ROSES (RepOrting standards for Systematic Evidence Syntheses), and by ensuring that synthesis articles are assessed by referees with specific expertise in these techniques. For example, at Nature Ecology & Evolution, every relevant article must be accompanied by one of these checklists, in addition to our own Reporting Summary. We also routinely require that every meta-analysis or synthesis article submitted to the journal is assessed by at least one referee with a demonstrable background in evidence synthesis.

An alternative vision of the future of research workflows is proposed in the second Comment this month, by Nakagawa et al., who argue that synthesis projects need to be embedded within all aspects of the scientific process. Such integrated workflows could democratize the process by bringing together empirical scientists and synthesists across entire disciplines to create ‘living syntheses’ that evolve in real time with the evidence base, incorporate null results and unpublished work, and provide an open and transparent platform for policymakers to access.

Such a system is a bold vision that would likely require a major reorganization of the way science is conducted, and it presents both challenges and opportunities for rethinking the way science is appraised. For example, when whole disciplines become integrated in this way, evidence gaps may become clearer, but the boundaries between independent assessment and conflicts of interest may become blurred. Open projects like this will provide opportunities to broaden participation by underrepresented groups, but they will also require appropriate mechanisms for apportioning credit fairly among many collaborators, and training to ensure that all members of the community can critically appraise the synthesis methodology of the projects in which they are embedded.

With an ever-increasing volume of published synthesis papers, and an ever-expanding set of tools and methodologies, it is vital that the field continues to empower the next generation of ecologists and evolutionary biologists with the skills needed, for example by ensuring that every graduate student receives training in synthesis methods. As the global COVID-19 pandemic continues, mining the literature and open datasets is likely to be a viable alternative to the field campaigns and lab work that have been disrupted now that a large proportion of the research community is confined to their homes. Whatever direction research workflows take in the future, it is clear that these tools will continue to play a major role, and that the whole discipline will benefit from a fuller understanding and appreciation of their methods.