Exploring the quality of evidence for complex and contested policy decisions

Policy decisions on complex environmental risks often involve contested science. Typically there are no ‘facts’ that entail a unique correct policy. The evidence that is embodied in scientific policy advice requires quality assessment. Advice should be relevant to the policy issue, scientifically tenable and robust under societal scrutiny. In 2003, the Netherlands Environmental Assessment Agency adopted a standardized method, referred to as ‘guidance’, whereby key quality aspects of knowledge production and use are exhibited through a checklist for uncertainty assessment and communication. Although the guidance is not yet used in all projects, its use is increasing, attitudes towards dealing with uncertainty in performing and reporting environmental assessments have changed, and communication on uncertainty in the agency’s reports has improved over the past five years. In this letter, we present results from the application of the guidance to controversies on the risks of ambient particulate matter. The active deliberation on uncertainty in the policy–advisory setting brings about a joint learning process for advisors and policy makers, which leads to a deeper understanding and increased awareness of the phenomenon of uncertainty and its policy implications.


Introduction
Policy issues such as climate change, biodiversity loss, genetically modified crops, and environmental health risks of ambient particulate matter are complex and contested (e.g., Funtowicz and Ravetz 1990, Funtowicz 2006, Pilkey and Pilkey-Jarvis 2007, Beck 2007). For lengthy periods of debate, there are few 'facts' that command universal assent. Every side may possess total certainty in the validity of their arguments, but they cannot all be correct in their conviction. Decisions need to be made before conclusive supporting evidence is available, while at the same time the potential impacts of wrong decisions can be huge. Questions that cannot be answered due to inconclusive evidence include: how likely are human-caused abrupt climate changes, such as nonlinear sea level rise (Hansen 2007)? How much mitigation of greenhouse gas emissions is needed to prevent dangerous anthropogenic interference with the climate system (Harvey 2007)? What will be the future impact of human activities on biodiversity? Can there be science-based precaution (Weiss 2006, van der Sluijs 2007)? What is the fraction of particulate matter that causes health risks (Pope and Dockery 2006)?
Governmental and intergovernmental agencies that inform the public about such risks increasingly recognize that uncertainty and disagreement can no longer be suppressed or denied, but needs to be dealt with in a transparent and effective manner. For instance, in its recent report Models in Environmental Regulatory Decision Making (NRC 2007), the US National Research Council recommends that the US-EPA pay more attention to the systematic treatment and communication of uncertainties.
The problems of inconclusive and uncertain evidence in science-for-policy can be addressed along different lines. On the one hand, there are formal methods for sensitivity and uncertainty analysis (Saltelli et al 2008) as well as methods for making inferences from (uncertain) evidence. For instance, in Bayesian methods, evidence is used to update or to newly infer the probability that a hypothesis is true. In Dempster-Shafer theory (Dempster 1967, Shafer 1976), belief is assigned to sets of possible events rather than to single events, whereby evidence can be associated with multiple possible events or sets of events. The theory provides rules for combining evidence from multiple sources and conflicting evidence. To give a third example of formal approaches, in epidemiological work there is a practice of performing meta-analysis to combine evidence from multiple studies (e.g. Schwartz 1994). On the other hand, it is increasingly recognized that not all uncertainties can be quantified or handled in a formal way, and complementary, reflective approaches to explore the quality of evidence have been developed. Examples of such methods are pedigree analysis (van der Sluijs et al 2005a, 2005b), the model quality checklist, data quality indicators (SETAC 1994), and the data attribute rating system (Beck and Wilson 1997).
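As a concrete illustration of the Bayesian approach mentioned above, the sketch below updates the probability of a hypothesis as successive pieces of evidence arrive. All likelihood values are hypothetical and serve only to show the mechanics of Bayes' rule, not any actual PM risk assessment.

```python
# Illustrative Bayesian updating: revise the probability that a hypothesis H
# (e.g., "fine particles cause the observed excess mortality") is true as
# independent pieces of evidence arrive. All numbers are hypothetical.

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior P(H | E) via Bayes' rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

p_h = 0.5  # noncommittal prior on the hypothesis H
# Each tuple: (P(evidence | H), P(evidence | not H)) for one study's finding.
evidence = [(0.8, 0.4), (0.7, 0.5), (0.9, 0.3)]
for p_eh, p_enh in evidence:
    p_h = bayes_update(p_h, p_eh, p_enh)

print(round(p_h, 3))
```

Note how three individually inconclusive findings can jointly move a noncommittal prior substantially; this is the sense in which evidence is "used to update the probability that a hypothesis is true".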
In response to emerging needs, several institutions that interface science and policy have adopted Knowledge Quality Assessment (KQA) approaches, which include both formal and reflective methods (IPCC 2005, EPA 2003, UK Strategy Unit 2002, MNP/UU 2003, Kinzig et al 2003, Janssen et al 2005, Ha-Duong et al 2007). One of these is the Netherlands Environmental Assessment Agency (PBL in Dutch; until May 2008 the Dutch acronym, used in this letter, was MNP, a part of RIVM until 2006), a governmental agency that performs independent scientific assessments and policy evaluations. MNP has recently implemented a comprehensive, multi-disciplinary approach to KQA, which takes into account the societal context of knowledge production. This approach constitutes a major innovation. It has resulted in a more transparent and systematic treatment and communication of uncertainties in Dutch environmental assessments. In this letter, we present results from the application of this approach to an environmental problem. First, we outline the theoretical and conceptual background of the approach and how it is implemented in a diagnostic checklist. Subsequently, we illustrate how it can be applied in the case of the environmental health risks pertaining to ambient particulate matter. We use the KQA approach to explore the crucial uncertainties in this particular environmental problem and how these can be made explicit and dealt with in a transparent and responsible way.

Evidence in policy advice
In the modern view of scientific policy advice, science produces objective, valid, and reliable knowledge. That view resembles the situation in a mono-disciplinary textbook: for each problem, there is just one correct solution, derived from the facts produced by science. But in the real world of policy advice on complex issues, we face uncertainty and controversy, and questions arise as to what extent the information can really be objective, valid, and reliable (Oreskes et al 1994, Petersen 2000, Funtowicz 2006). Scientific assessments of complex policy issues have to integrate information covering the entire spectrum from well-established scientific knowledge to educated guesses, preliminary models, and tentative assumptions. The genuine policy debates that take place use all such materials, not as certain, established facts, but as evidence, whose quality must be assessed through appropriate procedures. Since the evaluation of uncertain information can involve an assignment of burden of proof (is a substance deemed harmless until proved otherwise?), the analogy with jurisprudence is much closer than had previously been realized by spokesmen for science. And when we consider such materials as evidence brought into an argument, rather than as imperfect facts or defective knowledge, the need for analysis of their quality, including uncertainties, becomes natural and obvious.
Social studies of scientific advice show that for many complex problems, the processes within the scientific community as well as between this community and the 'external' world-policy makers, stakeholders and civil society-determine the acceptability of a scientific assessment as a shared basis for action. These processes concern, among others, the framing of the problem, the choice of methods, the strategy to gather the data, the review and interpretation of results, the distribution of roles in knowledge production and assessment, and the function of the results in the policy arena. Although assumptions underlying the design of these processes are rarely discussed openly, they are important for the knowledge becoming either 'contested' or 'robust'. More research on complex issues sometimes reveals more uncertainties and can even lead to more intense controversy and weaker evidence if these implicit assumptions are not adequately dealt with (Sarewitz 2004).
In contrast to the general practice, it is not enough to analyze uncertainty as a 'technical' problem or merely seek for consensus interpretations of inconclusive evidence. In addition, the production of knowledge and the assessment of uncertainty have to address deeper uncertainties that reside in problem framings, expert judgments, assumed model structures, etc. Because scientists are generally not well prepared for this new task, systematic guidance is needed.

Good practice guidance
The challenge to scientific advisers is to be as transparent and clear as possible in their treatment of uncertainties. Recognizing this challenge, MNP commissioned Utrecht University to develop, together with MNP, the RIVM/MNP Guidance for Uncertainty Assessment and Communication (MNP/UU 2003, Janssen et al 2005). This guidance aims to facilitate the process of dealing with uncertainties throughout the whole scientific assessment process (see table 1). It explicitly addresses institutional aspects of knowledge development, and openly deals with indeterminacy, ignorance, assumptions and value loadings. It thereby facilitates a profound societal debate and a negotiated management of risks. The guidance is not set up as a protocol. Instead, it provides a heuristic that encourages self-evaluative systematic critical reflection in order to become aware of pitfalls (Ravetz 1971) in knowledge production and use. It also provides diagnostic help as to where uncertainty may occur and why. This can contribute to more conscious, explicit, argued, and well-documented choices.
Following a checklist approach inspired by Risbey et al (2005), the guidance consists of a layered set of instruments (mini-checklist, quickscan, detailed guidance, and tool catalog) with increasing level of detail and sophistication (see figure 1). It can be used by practitioners as a (self-) elicitation instrument or by project managers as a guiding instrument in problem framing and project design. Using the mini-checklist and quickscan questionnaire, the analyst can flag key issues that need further consideration. Depending on what is flagged as salient, the analyst is referred to specific sections in a separate hints and actions document and in the detailed guidance. Since the number of cross-references between the documents comprising the guidance is quite large, a publicly available interactive web application has been implemented (www.mnp.nl/guidance). This web application also offers a prioritized to-do list of actions regarding what uncertainties require further assessment and what methods are suitable for that assessment. Further, it generates reports of sessions to ensure that all information is traceable and documented, which enables internal and external review. It also includes an easily accessible version of the tool catalog.
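The flagging-and-referral mechanism just described can be pictured as a simple lookup. The sketch below is a hypothetical structure, not the guidance's actual implementation: quickscan answers flag foci as salient, and each salient focus is referred to a section and a suggested follow-up action, yielding a prioritized to-do list.

```python
# Hypothetical sketch of the quickscan referral mechanism: salient foci are
# mapped to guidance sections and follow-up actions. Section labels and
# actions are illustrative, not the official guidance content.

# Quickscan answers: focus -> flagged as salient? (hypothetical answers)
flags = {
    "problem framing": True,
    "involvement of stakeholders": False,
    "selection of indicators": True,
    "appraisal of knowledge base": True,
    "mapping and assessment of relevant uncertainties": True,
    "reporting of uncertainty information": False,
}

# Referrals: focus -> (guidance section, suggested action) (hypothetical)
referrals = {
    "problem framing": ("detailed guidance, problem framing", "revisit system boundaries"),
    "selection of indicators": ("detailed guidance, indicators", "compare alternative indicators"),
    "appraisal of knowledge base": ("detailed guidance, knowledge base", "identify knowledge bottlenecks"),
    "mapping and assessment of relevant uncertainties": ("tool catalog", "select assessment tools"),
}

# Build the prioritized to-do list from the flagged foci.
todo = [(focus, *referrals[focus]) for focus, salient in flags.items()
        if salient and focus in referrals]
for focus, section, action in todo:
    print(f"{focus}: see {section} -> {action}")
```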
In the remainder of this section, we describe each of the foci depicted in table 1, which provide the structure of the minichecklist and quickscan shown in figure 1.

Problem framing
'Problem framing' relates to the inclusion and exclusion of different viewpoints on the policy problem and the connections the policy analysis should make to other policy problems. It also involves choices where the system boundary is drawn for the assessment of the problem. Decisions on problem framing influence, for instance, the choice of models (what domains should they cover, which processes should be included, etc) and the choice of what knowledge is considered relevant to include in the analysis.

Involvement of stakeholders
'Involvement of stakeholders' concerns the identification of the relevant stakeholders (e.g., government; parliament; governmental advisory councils; other governmental actors at local, national or international levels; research institutes; scientists; sector-specific stakeholders; employers organizations; labor unions; environmental and consumer organizations; unorganized stakeholders; citizens; media; etc) and their views on the problem, including disagreements among them. There are several ways in which stakeholders can be involved in the assessment. They can either be involved directly or, alternatively, analysts can try to incorporate their perspectives.

Selection of indicators
In the selection of indicators for scientific policy advice, choices are made with respect to output processing and interpretation: decisions are taken on what indicators are calculated and included in the study. One should realize that alternative choices can always be made and that sometimes alternatives are brought forward and are advocated by participants in the societal and political debate. The uncertainties associated with indicators may differ depending on the indicators chosen, and indicators may be more or less representative of a problem.

Appraisal of knowledge base
In the appraisal of the knowledge base, one establishes what quality of information is needed for answering the questions posed, which depends on the required quality of the answers. Bottlenecks in the knowledge and methods which are needed for the assessment may be identified and decisions to pursue further research may be taken in the case of deficiencies. Often, however, it will not be possible to reduce the uncertainty.

Mapping and assessment of relevant uncertainties
The uncertainties in the scientific evidence can be characterized according to an uncertainty typology, and subsequently plans can be made for assessing these uncertainties more thoroughly by using standardized uncertainty assessment tools, e.g., taken from the tool catalog (van der Sluijs et al 2004). All these activities take place with an eye on enabling one to state the consequences of these uncertainties for the most policy-relevant conclusions of the study. We here offer brief descriptions of the uncertainty typology and the tool catalog.

Table 2. Uncertainty matrix (its columns include the level of uncertainty, running from determinism, through probability and possibility, to ignorance). The labels refer to the sources of uncertainty mentioned in the introduction to section 4 in the text. The function of this matrix is to identify the most salient uncertainty types that should be addressed in uncertainty assessment and communication. The assignment of uncertainties to types reflects the authors' judgment.

Nature of uncertainty
In order to facilitate communication about the different types of uncertainty that arise in scientific assessments, an uncertainty typology is part of the MNP guidance for Uncertainty Assessment and Communication (Janssen et al 2005, Petersen 2006).
The typology is based on a conceptual framework that resulted from a process involving an international group of uncertainty experts most of whom participated in developing or reviewing the guidance (Walker et al 2003). Uncertainty can be classified along the following dimensions: its 'location' (where it occurs), its 'level' (whether it can best be characterized as statistical uncertainty, scenario uncertainty or recognized ignorance) and its 'nature' (whether uncertainty primarily stems from knowledge imperfection or is a direct consequence of inherent variability). In addition, the typology distinguishes the dimensions 'qualification of knowledge base' (what are weak and strong parts in the assessment) and 'value-ladenness of choices' (what biases may shape the assessment). The typology is presented as a matrix (see table 2). This uncertainty matrix is used as an instrument for generating an overview of where one expects the most important (policy-relevant) uncertainties to be located (the first dimension), and how these can be further characterized in terms of the other uncertainty dimensions mentioned. The matrix can be used as a scanning tool to identify areas where a more elaborate uncertainty assessment is required. The different cells in the matrix are linked to available uncertainty assessment tools suitable for tackling that particular uncertainty type. These tools are described in a tool catalog that aims to assist the analyst in choosing appropriate methods.
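The link between matrix cells and assessment tools can be sketched as a lookup from (location, dimension) pairs to candidate methods. The cell contents below are illustrative examples in the spirit of the tool catalog, not the official matrix.

```python
# Sketch of the uncertainty matrix as a lookup table: each cell, identified
# by (location, dimension), points to assessment tools suited to that
# uncertainty type. Cell assignments here are illustrative only.

matrix = {
    ("model structure", "level"): ["scenario analysis", "multiple model comparison"],
    ("model structure", "knowledge base"): ["pedigree analysis", "model quality checklist"],
    ("model structure", "value-ladenness"): ["critical review of assumptions"],
    ("input data", "level"): ["Monte Carlo simulation", "sensitivity analysis"],
    ("input data", "nature"): ["variability analysis"],
    ("expert judgment", "knowledge base"): ["expert elicitation", "pedigree analysis"],
}

def tools_for(location: str) -> set:
    """Collect all tools linked to a given uncertainty location."""
    return {t for (loc, _dim), tools in matrix.items() if loc == location
            for t in tools}

print(sorted(tools_for("model structure")))
```

Scanning the matrix row for a location then amounts to a single lookup, which mirrors how the matrix is used as a scanning tool to select methods for elaborate assessment.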
The tool catalog provides practical ('how to') information on state-of-the-art quantitative and qualitative uncertainty assessment techniques, including sensitivity analysis, NUSAP (Funtowicz and Ravetz 1990, van der Sluijs 2005), expert elicitation, scenario analysis, and model quality assistance. A brief description of each tool is given along with its goals, strengths and limitations, required resources, as well as guidelines for its use and warnings for typical pitfalls (see also Refsgaard et al 2007). It is supplemented by references to handbooks, software, example case studies, web resources, and experts. The tool catalog is a 'living document', available on the web (at www.nusap.net/guidance), to which new tools can be added.

Reporting of uncertainty information
The assessors must ensure that the uncertainties are adequately communicated, mainly through formulating messages that are robust with respect to these uncertainties; that is, the strength of the policy-relevant statements made is tailored to the reliability of the underlying evidence.

An illustrative example: environmental health risks from particulate matter
We will illustrate the application of the guidance for a particular environmental problem: the case of health risks of ambient particulate matter. Particulate matter is air pollution consisting of a complex mixture of particles of various diameters and various chemical compositions. Depending on the diameter of the particles, either the abbreviation PM 10 is used (for particles with a diameter up to 10 µm) or the abbreviation PM 2.5 (for particles with a diameter up to 2.5 µm). Exposure to PM in ambient air has been linked to a number of different health outcomes, ranging from modest transient changes in the respiratory tract and impaired pulmonary function, through increased risk of symptoms requiring emergency room or hospital treatment, to increased risk of death from cardiovascular and respiratory diseases or lung cancer. This evidence stems from studies of both acute and chronic exposure, and from toxicological studies (reviewed in, e.g., WHO 2006). Because of these health risks, the regulation of ambient particulate matter has become part of air quality policies across the world. For instance, the US and EU are regularly updating their air quality standards for PM on the basis of progressing scientific evidence. And cities in Asia are beginning to implement command-and-control policies to reduce PM emissions from traffic and industry.
The issue of concern in our example is the effect of different PM policy strategies (e.g., setting of PM standards) on public health. Even though the evidence from epidemiological studies accumulates and consistently shows statistically significant associations between health effects and PM 10 or PM 2.5 concentrations (Pope and Dockery 2006), large uncertainties and controversy remain about the sources, exposure and causes of health effects (RIVM 2002, MNP 2005, Moolgavkar 2005, Maas 2007). Both quantitative and qualitative approaches can be followed to analyze these uncertainties. The US-EPA, for example, has a long tradition in quantitative uncertainty analysis.
In 1997, the US-EPA established guiding principles for Monte Carlo analysis, which recommend a two-dimensional approach in which variability and epistemic uncertainty are treated separately (EPA 1997). An application in air pollution emissions is, for instance, Frey and Bharvirkar (2002). The US-EPA is further developing guidance on the development, evaluation, and application of regulatory environmental models along the lines of quantitative uncertainty analysis (EPA 2003). In this letter we apply a more qualitative approach to the uncertainties in the environmental health risks from particulate matter.
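To make the two-dimensional idea concrete, the following toy simulation separates the two dimensions in nested loops: the outer loop samples epistemic uncertainty, the inner loop samples population variability. All distributions and parameter values are hypothetical and chosen only to illustrate the structure, not any real PM risk model.

```python
# Toy two-dimensional Monte Carlo in the spirit of the EPA (1997) guiding
# principles: the outer loop samples epistemic uncertainty (an uncertain
# exposure-response slope), the inner loop samples inter-individual
# variability (personal exposure). All numbers are hypothetical.
import random

random.seed(1)
N_OUTER, N_INNER = 200, 500

p95_risks = []
for _ in range(N_OUTER):
    # Epistemic: the slope is only known to lie within a range (risk per ug/m3).
    slope = random.uniform(0.004, 0.008)
    # Variability: exposures differ across individuals (lognormal, ug/m3).
    risks = sorted(slope * random.lognormvariate(2.5, 0.5) for _ in range(N_INNER))
    p95_risks.append(risks[int(0.95 * N_INNER)])  # 95th-percentile individual risk

p95_risks.sort()
# An epistemic credible band around the variability percentile:
lo, hi = p95_risks[int(0.05 * N_OUTER)], p95_risks[int(0.95 * N_OUTER)]
print(f"95th-percentile individual risk: 90% band [{lo:.4f}, {hi:.4f}]")
```

Keeping the two loops separate is the point of the method: the inner loop answers "how does risk vary across people?", while the outer loop answers "how sure are we of that answer?", so the two kinds of uncertainty are never conflated into one distribution.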
On the basis of an expert meeting with Dutch experts on particulate matter and health organized by MNP and Utrecht University in May 2005 (Kloprogge and van der Sluijs 2006), combined with the Impact Assessment of the EU's thematic strategy on air pollution (EC 2005), we have identified the following key sources of uncertainty relevant to policy decisions regarding the PM and health problem: (a) attribution of effects to individual species of particle (causal fraction) or other pollutants or stressors; (b) quantification of the mortality impact of exposure to fine particles; (c) distribution of risk over subgroups of the population (to what extent is the relative risk age-dependent?); (d) assessment of effects of chronic exposure to particles on the prevalence of bronchitis; (e) inter-annual variability in meteorology; (f) emission data; (g) poor understanding of secondary organic particles; and (h) measurement uncertainty.
Below, we use the structure of the MNP guidance for Uncertainty Assessment and Communication, shown in table 1, for systematically reflecting on issues of uncertainty management and communication in the case of health risks from PM.

Problem framing
Four scientific problem views can be distinguished in the policy debate on particulate matter: 'PM 2.5 is the problem', 'PM 10 is the problem', 'Specific traffic related species are the problem (e.g., diesel soot)', and 'It is mainly a socioeconomic problem (PM not the main cause)' (Maas 2007). The conclusions of scientific assessments of the PM problem are critically sensitive to the problem frame chosen, while the present state of knowledge is inconclusive regarding which framing is most adequate. For instance, strong associations can also be found between cardiopulmonary diseases and traffic noise (Kempen et al 2002), the quality of housing and the diet of low income families (Eschenroeder and Norris 2003). Even though recent attempts to correct for such confounding effects have strengthened the evidence for low-dose PM effects on health, it can still not be ruled out that the observed health effects are largely caused by an accumulation of other causes in low income neighborhoods close to highways. The degree to which such interwovenness with other problems is taken into account and the choices made for the system boundary may influence the conclusions. This requires systematic reflection by science advisers. The way uncertainties about the health risks of PM should be dealt with in policy advice depends on the role of such advice in the policy process. In some countries, such as the United States, PM 2.5 has already been regulated since 1997, while in the European Union such regulation will come into effect in 2008. In the first case, the focus of recent assessments is more on the effectiveness of regulation and on uncertainty in emissions than on the need for setting new air quality standards. And thus the types of uncertainties that are most important to deal with differ among these cases.

Involvement of stakeholders
Participation of stakeholders in knowledge production can help to increase the quality of the risk assessment. Participation stimulates the inclusion of more viewpoints (e.g., NRC 1996), which in turn helps to rule out the possibility that important dimensions of the problem are overlooked. Participation of stakeholders in assessment can also improve the use of assessments. For instance, in the US, proposals for new air quality standards, such as the revision of the US National Ambient Air Quality Standards (NAAQS) for PM proposed in 2005, undergo a public review that aims to build a widely shared scientific basis. As another example: in the Clean Air for Europe program, over one hundred stakeholder meetings were organized to disseminate results, to share experiences on the use of different policy instruments (including economic instruments), to discuss issues relating to the implementation of current air quality legislation, and to review the uncertainties and their implications.

Selection of indicators
Metrics matter in air quality assessments (Bell et al 2005). For particulate matter, the decision to use particle size to assess and regulate health risks of PM does not necessarily lead to the best protection of human health. Generally speaking, particle size is an imperfect proxy for toxicity. The chemical composition and reactive surface of the particles may be of much more importance, but are difficult to measure and monitor. PM 10 and PM 2.5 may not be the most relevant indicators for the health risks from PM; depending on the problem frame chosen, other indicators become more relevant (e.g., specific chemical fractions). If specific chemical PM fractions are suspected to be primarily responsible for the health impacts (e.g., particles emitted from cars), then reducing SO 2 emissions from electric utilities may not be an effective way to reduce health risks, despite the fact that PM 2.5 concentrations (secondary particles) decrease. The choice of indicators makes a huge difference in practice.

Appraisal of knowledge base
There is a broad consensus among scientists and policy makers that PM constitutes health risks that need regulation. However, the evidence from toxicological and biological studies is still weak (Moolgavkar 2005). While there are several plausible hypotheses, it is recognized that we are ignorant of the true underlying mechanism that explains the association between PM and health effects. Only a small number of long-term epidemiological cohort studies have been performed, mainly in the United States. It is questionable whether the results are representative for other countries. Furthermore, it is difficult to determine the exposure to PM (exposure depends on the behavior of individuals, for which assumptions have to be made), to establish a reliable exposure-effect relationship, and to account for multi-causality and synergies. Statistical uncertainty in epidemiological models and data prevails. Finally, there are bottlenecks in determining PM emissions and concentrations: measurements are often unreliable or not representative for larger areas; and models often give estimates that are not in agreement with measurements.

Mapping and assessing relevant uncertainties
In table 2, the key sources of uncertainty (a) through (h) are mapped on the uncertainty typology of the guidance. In the table we can see that model structure uncertainty and data uncertainty are particularly pertinent in this case, that the quality of the evidence for the causal models is considered problematic, that model assumptions may be subject to subjective choices, and that the data uncertainties for emissions, meteorology and concentrations are largely characterized by variability and can thus not be fully reduced. This analysis shows that the classical statistical uncertainty methods are not sufficient to deal with the key uncertainties in the PM and health case, and that other methods are required to address the uncertainties. For instance, scenario uncertainties can be addressed by scenario analysis techniques. For an assessment of the qualification of the knowledge base for a particular model structure, for example, pedigree analysis (see box 1; Refsgaard et al 2006) or a model quality checklist can be used. The value-ladenness of a model can be assessed, for instance, by way of critical analysis of assumptions or perspective-based scenarios (van Asselt 2000). In CAFE (Clean Air For Europe), some of the uncertainties have been analyzed by way of sensitivity analysis, focusing particularly on uncertainties in energy demand and agricultural production, emission data and emissions abatement factors, the various ambition levels, or target-setting methods.
This mapping and assessment of relevant uncertainties ends with a prioritization of uncertainties. In the Dutch expert workshop mentioned earlier (Kloprogge and van der Sluijs 2006), the experts ranked the following as the top three: (a) attribution of effects to individual species of particle (causal fraction) or other pollutants or stressors; (b) quantification of the mortality impact of exposure to fine particles; and (c) distribution of risk over subgroups of the population (to what extent is the relative risk age-dependent?). This implies that uncertainties that are very hard to quantify in a reliable way dominate in this case study.

Box 1. Pedigree analysis
Pedigree analysis is part of the NUSAP (numeral unit spread assessment pedigree) system for uncertainty assessment (Funtowicz and Ravetz 1990, van der Sluijs et al 2005a, 2005b). Pedigree conveys an evaluative account of the production process of information, and indicates different aspects of the underpinning of the numbers and scientific status of the knowledge used. Pedigree is expressed by means of a set of pedigree criteria to assess these different aspects. Assessment of pedigree involves qualitative expert judgment and can be done in an expert workshop or in expert elicitation interviews. To minimize arbitrariness and subjectivity in measuring strength, a pedigree matrix is used to code qualitative expert judgments for each criterion into a discrete numeral scale from a low (0) to high (4) level of knowledge, with linguistic descriptions (modes) of each level on the scale. For example, the criterion 'empirical basis' has a scale running from 'crude speculation (0)' to 'large sample of direct measurements (4)'. Each special sort of information has its own aspects that are key to its pedigree, so different pedigree matrices using different pedigree criteria can be used to qualify different sorts of information. An example of the outcome of a pedigree analysis applied to Netherlands national emission inventories is given in figure 2.
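The pedigree coding described in box 1 can be sketched in a few lines: qualitative judgments are mapped to the 0-4 scale per criterion and combined into a normalized strength score. The criteria shown follow the box's 'empirical basis' example; the second criterion, the elicited scores, and the averaging rule are hypothetical illustrations, not the official pedigree matrices.

```python
# Sketch of pedigree scoring: expert judgments are coded on a 0 (weak) to
# 4 (strong) scale per criterion, each score carrying a linguistic mode.
# The second criterion, the scores, and the averaging are hypothetical.

PEDIGREE_MATRIX = {
    "empirical basis": ["crude speculation", "educated guess", "indirect estimate",
                        "modelled/derived data", "large sample of direct measurements"],
    "theoretical understanding": ["crude speculation", "preliminary theory",
                                  "partially accepted theory", "accepted theory",
                                  "well-established theory"],
}

def describe(criterion: str, score: int) -> str:
    """Translate a 0-4 score into its linguistic mode for a criterion."""
    return PEDIGREE_MATRIX[criterion][score]

# Hypothetical elicited scores for one number (e.g., a PM emission factor):
scores = {"empirical basis": 2, "theoretical understanding": 1}
strength = sum(scores.values()) / (4 * len(scores))  # normalized to 0..1
print(f"empirical basis: {describe('empirical basis', 2)}; strength = {strength:.2f}")
```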

Reporting uncertainty information
In the end, uncertainty assessment should assist analysts in crafting robust policy-relevant conclusions, for instance conclusions about the health benefits of introducing specific PM standards. Such conclusions may contain qualifiers that indicate the quality of the evidence behind them (e.g., through the use of subjective probability statements) or that make clear that particular scenario assumptions have been made. Since we have seen in the preceding analysis that recognized ignorance is highly salient, it is furthermore important to communicate this ignorance and its implications for society and politics, e.g., in terms of risks. It is important to acknowledge that communication involves more than reporting. Ideally, the policy meaning of the scientific uncertainty should be jointly established in a dialog with all stakeholders.

Institutional dimensions of KQA
The institutional challenges of implementing this new approach to knowledge quality assessment (KQA) should not be underestimated. It entails much more than disseminating the documents through an organization. For example, MNP's top management commissioned and subsequently endorsed the guidance; MNP's methodology group led the development of the mini-checklist and quickscan; the use of the guidance is now mandatory as part of the agency's quality assurance procedures; and the staff is actively trained to acquire the necessary skills. In addition, a methodological support unit is available in the agency to assist and advise in assessment projects. The required process of cultural change within the institute was consciously managed over the period 2003-2005. Although the guidance is not yet used in all projects, its use is increasing, attitudes towards dealing with uncertainty in performing and reporting environmental assessments have changed, and communication on uncertainty in MNP reports has improved over this period (Wardekker et al 2008).

Conclusion
The application of the guidance for Uncertainty Assessment and Communication to the PM and health problem delivers several insights with respect to the guidance's six foci (table 1). First, the problem frame chosen determines which types of uncertainties are most important to deal with. Second, the representativeness and acceptance of a scientific assessment depends critically on how well the assessment relates to stakeholders' views on what aspects of the problem are relevant. Third, the choice and precise definition of indicators makes a huge difference for managing the PM and health problem. Fourth, with respect to the strength of the knowledge base, the main bottlenecks are: ignorance about the true underlying mechanisms that explain the association between PM and health effects, representativeness of the few available US cohort studies for other countries, and accounting for multi-causality and synergies in establishing exposure-effect relations. Fifth, from the characterization of the uncertainties in terms of the uncertainty matrix, it becomes clear that model structure uncertainty is salient. Furthermore, it appears not to be enough to address only statistical uncertainty. Recognized ignorance and scenario uncertainty are crucial in this case. Finally, uncertainty assessment should assist analysts in crafting robust policy-relevant conclusions, for instance by making use of qualifiers and by recognizing ignorance. The societal and political risks associated with the underlying uncertainties should be discussed in a dialog with policy makers and stakeholders.
Overall, the structured approach of the guidance promotes critical reflection without prescribing a strict protocol. The active deliberation on uncertainty in the policy-advisory setting brings about a joint learning process for advisors and policy makers, which leads to a deeper understanding and increased awareness of the phenomenon of uncertainty and its policy implications. The expectation is that this process may lead to a more responsible, accountable, more transparent, and ultimately more effective use of intrinsically uncertain science in decision making. Looking at the institutional dimensions, we stress the key importance of training and facilitation to support the process of knowledge quality assessment (KQA).
Transparent and effective uncertainty management in science-for-policy requires systematic reflection and argued choice. KQA approaches such as the one exemplified here can enhance societies' capacity to deal with uncertainties surrounding knowledge production and knowledge use in the management of complex environmental risks.