Communicating the absence of evidence for microplastics risk: Balancing sensation and reflection

Communicating scientific evidence to decision makers and other stakeholders is an important task for scientists (SAPEA, 2019a, 2019b; Environmental and Health Risks of Microplastic Pollution, 2019). In this context, and specifically referring to recent evidence reviews on microplastics (SAPEA, 2019b; World Health Organization, 2019), Leslie and Depledge (2020; henceforth L&D) argue that both SAPEA (a consortium of European academies, part of the European Commission’s Scientific Advice Mechanism) and the UN’s World Health Organization make the mistake of “assuming risk is absent in the absence of evidence”. In what follows, we reflect on their criticism and raise some broader issues about communicating science on an emerging issue. We make four points. Of the four, the first two reply directly to Leslie and Depledge. The third point briefly discusses the philosophical issue of whether, and to what extent, an absence of evidence constitutes evidence of absence, a topic whose origins can be traced back to the Enlightenment philosopher Locke (1689/1823). The fourth point provides a broader perspective on science communication for policy. We focus largely on Leslie and Depledge’s criticisms of the way the SAPEA report on microplastics (SAPEA, 2019b) was communicated, although we expect that similar responses might be made to their criticisms directed towards the WHO. Our effort is of course undertaken in Locke’s spirit of “the common offices of humanity and friendship in the diversity of opinions” (Locke, 1689/1823).

The reports acknowledge the current challenges facing scientists attempting to gather robust information, and recommend further work to fill the knowledge gaps. The SAPEA report states on p. 116 that 'the absence of evidence of microplastics risks currently does not allow one to conclude that risk is either present or absent with sufficient certainty' (SAPEA, 2019b). Given this absence of evidence, Leslie and Depledge find it surprising that SAPEA's homepage states that the final 'verdict' of SAPEA's Evidence Review Report is that 'The best available evidence suggests that microplastics and nanoplastics do not pose widespread risk to humans and the environment'.
But their quote from SAPEA is in fact a half-quote. The complete sentences from the SAPEA website (https://www.sapea.info/topics/microplastics), also reproduced in printed materials intended for a general audience, are: 'The best available evidence suggests that microplastics and nanoplastics do not pose a widespread risk to humans or the environment, except in small pockets. But that evidence is limited, and the situation could change if pollution continues at the current rate.'
The words that L&D omit from their quote ('except in small pockets' and the entire second sentence) are crucial. The SAPEA Working Group's position, as expressed both by the lay summary on the website and by the report itself, is not that the evidence tells us that there is no widespread risk. It is that the evidence we have so far suggests there is no widespread risk, with exceptions, but that evidence is limited. Only with more evidence could we draw a firmer conclusion, either that there is no risk or that the risk is significant.
When the summary is read in its entirety, including the words omitted by L&D, its meaning is quite different, because of both the word "suggests" and the careful qualifications "but that evidence is limited" and "the situation could change if pollution continues at the current rate". So it is certainly not the case (as L&D state) that the SAPEA Working Group has made a "slip up", nor that the group claims (as L&D imply) that "a conclusion of 'no risk' [can] be supported by 'no data'". Of course, a two-sentence lay summary cannot capture all the intricacies of a 176-page evidence review report that cites 459 references. But it is not the case that the summary and the report present the strength or meaning of the evidence differently. The summary is nuanced and makes no stronger claims about the evidence than the report itself. The 'verdict' is not a 'conclusion of no risk'.

Overstating Scientific Ignorance
L&D contrast an overstated characterisation of SAPEA's lay position with an overstated characterisation of the limitations of current evidence.
If the current state of the evidence were really that there is "no data", or that "having plastic particles in your body is safe" (both phrases that L&D present in quotation marks while discussing the SAPEA and WHO reports, but do not attribute to any source), then a summary suggesting otherwise would indeed be questionable. But that is not the current state of the evidence as reported by the expert working group that wrote the SAPEA evidence review report. While the report does not find that we have conclusive evidence about the risks posed by microplastics, it also does not find that we are "flying blind", as L&D put it.
Rather, the report surveys a growing body of evidence about the levels of microplastics in the natural environment, and about the risks that such levels pose to the environment and to human health. It charts a careful middle course, drawing qualified conclusions where the evidence justifies them, but highlighting where it does not. In fact, assessments of the strength of the evidence run through the entire report, with the authors adopting a visual device (margins shaded in different colours) in the chapter on the natural sciences, to indicate which claims are strongly supported by current evidence, which are weakly supported, and where there are gaps in knowledge.
The report's conclusion is indeed that the evidence we have so far, while limited and clearly insufficient to settle the question of risk, nonetheless does suggest that there is no widespread risk yet. Now, L&D or others can dispute that conclusion. For instance, they might argue that important evidence was not considered, or that the evidence was misjudged and should be interpreted differently. This is part of the normal day-to-day process of doing science. But to disagree with the conclusions of a report is not the same as to suggest that a summary of that report is inaccurate.

L&D write:
Many mainstream media have picked up the 'no risk' soundbite. These statements raise a fundamental epistemological problem. Can the conclusion of 'no risk' be supported by 'no data'? One of the common pitfalls in critical thinking is to neglect the logic that the absence of evidence is not evidence of absence. The 'having plastic particles in your body is safe' conclusion conjures up a classic error known as the 'appeal to ignorance' fallacy Locke (1690), which is, 'there is no evidence against x. Therefore x is true.' This type of statement has no place in rational thinking. Note that to propagate claims of this type is to unduly shift the burden of proof onto those seeking conclusive 'evidence'.
This passage misses some important subtleties of the appeal to ignorance fallacy. The phrase "absence of evidence is not evidence of absence" is often recited as a truism, but strictly speaking it is false: when it results from a well-designed experiment, absence of evidence is indeed evidence, though not proof, of absence. This is because, on the standard (Bayesian) understanding of experimental design, each observation that fails to confirm the alternative hypothesis confirms, to a degree, the null hypothesis.
To put it simply, it makes a difference whether you have looked for evidence in the right place or not. Consider, as an illustrative example, the hypothesis '5G technology increases cancer risk in humans'. If there had been no serious investigation of this hypothesis, and if we had no background information about its likelihood, then of course it would be fallacious to reject it out of hand. At the other end of the spectrum, if infinitely many good experiments had all failed to turn up evidence, we would be maximally justified in rejecting the hypothesis. Between these two extremes lies a continuum of evidence strength: with each well-designed experiment we conduct, a result that favours the null hypothesis nudges our credence in the alternative closer to 0, and a result that favours the alternative nudges our credence closer to 1.
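The nudging described above can be made concrete with a few lines of Bayesian arithmetic. The following is a minimal, purely illustrative sketch; the detection probability (0.8) and false-positive probability (0.05) are assumed values for a hypothetical well-designed experiment, not estimates for any real study:

```python
def update_on_null(prior, p_detect_if_true=0.8, p_false_alarm=0.05):
    """Posterior credence in a hypothesis after one null (no-effect) result.

    p_detect_if_true: assumed chance an experiment finds the effect if the
    hypothesis is true; p_false_alarm: assumed chance of a spurious positive
    if it is false. Both values are illustrative only.
    """
    p_null_if_true = 1.0 - p_detect_if_true   # null result despite a real effect
    p_null_if_false = 1.0 - p_false_alarm     # null result, no real effect
    numerator = p_null_if_true * prior
    # Bayes' rule: P(H | null) = P(null | H) P(H) / P(null)
    return numerator / (numerator + p_null_if_false * (1.0 - prior))

credence = 0.5  # start agnostic
for n in range(1, 6):
    credence = update_on_null(credence)
    print(f"after {n} null result(s): credence = {credence:.4f}")
```

Each successive null result lowers the credence further, but never to exactly zero: evidence of absence accumulates without ever becoming proof of absence, which is precisely the point made in the text.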
As a second, perhaps even clearer example, sometimes used in texts on the philosophy of knowledge, consider the hypothesis 'Leprechauns live under toadstools'. Every time we look under a toadstool and find no leprechaun, this absence of evidence is indeed evidence of absence. It is not proof of absence, unless we can be sure that we have reliably checked every toadstool in the universe (that point, by the way, was made by Locke), but the evidence of absence gets a little stronger each time.
A fallacious appeal to ignorance is dangerous not because it tempts us to reject a hypothesis in the absence of evidence, but because it tempts us to reject a hypothesis in the absence of testing. And it is particularly important to think carefully about this when we consider questions of harm. As noted above, there is room for disagreement over the strength of the evidence about the risks posed by microplastics. But the main conclusion of the SAPEA report is that the evidence we have so far is patchy. We have reason to believe that the current level of microplastics pollution in the natural environment does not yet pose significant risk, but that is uncertain. The level of risk will certainly change if pollution levels continue to rise, and our understanding will improve as more evidence comes in.

Communicating Scientific Evidence for Policy in Practice
Engaging in scientific research and gathering evidence is a worthwhile endeavour in itself, but will only have impact outside of academia if it is communicated and made accessible to relevant audiences. L&D highlight the perils of miscommunication in their commentary.
When science communication is counterproductive, this can be for one or more of several reasons. Broadly, (1) it may be inaccurate in itself: for example, it may summarise existing evidence inaccurately; or (2) it may be accurate, but irresponsible: it may correctly summarise the evidence but make an unhelpful contribution to the public debate or decision-making process. In this latter case, the problem could lie either in the way the communication is worded by the author, or in the way it is interpreted by the reader, or both.
L&D object to the way the report summary was written, but they also make claims about the way it has been received, such as when they write "it is important to realize that a statement of absence of evidence of risk can be all too easily perceived as a statement of no risk" or "many mainstream media have picked up the 'no risk' soundbite".
We would caution against such speculation. It would indeed be valuable to study the reception of the report and its lay summary scientifically, to understand which specific messages and conclusions have been picked up and shared further (e.g., no risk, absence of evidence, or presence of risk). In the absence of such a rigorous analysis specifically for the SAPEA report, we should not be so confident about the balance of positive and negative outcomes.
But we can draw on decades of scientific evidence on science communication to understand some basic principles (SAPEA, 2019a; Fischhoff, 2013). Negativity bias describes how negative information dominates human information processing, e.g., by drawing more attention and being more memorable than neutral or positive information (Rozin and Royzman, 2001). Motivated reasoning and confirmation bias describe how humans have a natural tendency to seek confirmation of their preexisting beliefs and expectations (Nickerson, 1998). Scientists can take account of these tendencies and exercise caution, especially when formulating messages of risk that correspond to concerns voiced by the public and other stakeholders such as NGOs. Transparent and honest communication maintains trust in science, and this includes communicating uncertainty and lack of evidence. Where previous work had suggested that communicating uncertainty lowers trust, because it is interpreted as a lack of competence, recent research has demonstrated that any such decrease is small and is observed only for verbal (rather than numerical) expressions of uncertainty (Van Der Bles et al., 2020).
We believe that the first priority of science advice is to present the evidence honestly and in its entirety. We fully acknowledge that there are other priorities too: when bodies such as SAPEA publish their evidence reviews in the interests of transparency, and in the hope that they will contribute to the wider public debate on important topics, careful thought should be given to how the evidence might be interpreted by non-experts and by those on different sides of any ongoing debate. But this does not overrule the basic mandate to present the evidence to policymakers 'as it is'.
Here, we wholeheartedly agree with L&D that "miscommunicating" the strength and significance of evidence is dangerous. Communication efforts must tread a fine line between raising concerns and not being alarmist, in a context of different audiences and different risks competing for attention. Our contention is that the way we have communicated the evidence examined in the SAPEA report treads that fine line in as careful a way as possible.

Prospect
As the risks are surrounded by uncertainty and the concerns are considerable, plastics have rightly become an issue of public and political interest (Koelmans et al., 2017). The challenge for decision makers is to balance the risks and benefits of action versus inaction, often under such conditions of uncertainty. We agree that acting with foresight and precaution is a crucial principle. Indeed, the SAPEA report has already triggered positive change in this direction. For instance, in January 2019, the European Chemicals Agency (ECHA) proposed a wide-ranging restriction on intentional uses of microplastics in products placed on the EU/EEA market, to avoid or reduce environmental pollution. Alongside their evidence gathering and consultations, they cite this SAPEA report as evidence to support that decision (https://echa.europa.eu/registry-of-restriction-intentions/-/dislist/details/0b0236e18244cd73). The SAPEA report also provides an evidence base for calls for further research, for example under the EU's Horizon 2020 programme (https://etendering.ted.europa.eu/cft/cft-display.html?cftId=6492), intended to plug some of the evidence gaps.
Many gaps remain in our understanding of the fate, effects and risks of microplastics. The field is arguably still in its infancy, and more science on the questions of risk is crucial. At the same time, policy makers, industry and society have begun to take action on macro- and microplastics, while scientists work together across opinions and disciplines to fill research gaps (SAPEA, 2019b; Rochman et al., 2016). This enables policy makers and regulators to continuously refine and prioritize options for mitigation (SAPEA, 2019b). The good news is that when it comes to the problem of plastic debris, all stakeholders seem to be on the same page: plastic does not belong in the environment, and its leakage into the environment should be stopped. There may be some debate on the detail, but not on the important issues: there is no 'plastic denial'. So we will continue in the quest for "true knowledge between sensation and reflection" (Locke, 1689/1823), undertaking both empirical measurement of effects associated with microplastics to improve the body of evidence, and reflection on the meaning of this evidence for the environment and society, to do justice to our role as science advisors and communicators.

Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.