Trapped between two tails: trading off scientific uncertainties via climate targets

Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

Analyses of greenhouse gas targets typically emphasize uncertainty about the warming produced by emission pathways [1][2][3][4], uncertainty about the damages incurred by each degree of warming [5][6][7][8], and uncertainty about the cost of reducing greenhouse gas emissions [9,10]. Much work has focused on how fundamental nonlinearities in the climate system create skewed distributions for temperature change [11][12][13]. Economic analyses of greenhouse gas policies must further account for nonlinearities in the economic impacts of climate change and for the nonlinear benefits of potential advances in low-carbon technologies. We show how these fundamental economic nonlinearities interact with the fundamental climatic nonlinearities to create skewed distributions for both the costs and the benefits of climate policy.
However, estimating these skewed distributions for practical use requires contentious assumptions. Imagine that a forecaster in 1900 had to predict today's cost of generating electricity and how it interacts with the climate. Would she have forecast the rise of natural gas turbines and nuclear plants? Would she have foreseen how air conditioning would create summer-peaking load curves? In light of this difficulty, some methods for long-range policy analysis have turned from emphasizing the optimality of a particular policy to evaluating the robustness of multiple policy options across a range of plausible futures [14,15]. In the second part of the paper, we compare how warming scenarios, impact scenarios, and technology scenarios affect policy evaluations. In doing so, we use a novel combination of abatement costs generated by a technology-rich 'cost-effectiveness' integrated assessment model and climate benefits defined by the damage functions used in more stylized 'policy-optimization' integrated assessment models. We find that a 500 ppm carbon target provides positive net benefits relative to a 550 ppm target across nearly the full range of scenarios. In contrast, increasing the target's stringency to 450 ppm provides positive net benefits only in high-damage futures, in good-technology futures, or with policymakers who heavily weight future costs and benefits. While warming uncertainty has historically received the most scientific attention, impact uncertainty and technology uncertainty are currently more crucial to policy evaluations.

Trading tails via emission policies
We analyze the implications of different climate targets by modeling uncertainty about warming, by interacting it with uncertainty about economic damages, and by connecting technological uncertainty to cost uncertainty. We first demonstrate general propositions about how each type of uncertainty depends on greenhouse gas policies. We later quantitatively compare the relative importance of each type of uncertainty for policy evaluations.

Analyzing the distribution of climate damages
The benefits of a greenhouse gas mitigation policy come from avoiding the damages associated with future warming. We begin by analyzing the distribution of annual benefits 100 years after a step-increase in radiative forcing. Linear feedback analysis explains the distribution of long-term, 'equilibrium' warming in terms of uncertainty about individual feedback processes [13]. To approximate the distribution for nearer-term, 'transient' temperature change, we follow [16] in adapting the linear feedback framework by representing ocean heat capacity as a time-varying negative feedback. We then have the transient temperature change T (K) one century after a step-increase of R (W m^-2) as a function of aggregate equilibrium feedbacks f_e (nondimensional):

T = λ0 R / (1 − f_e − f_o).

The reference system (blackbody) sensitivity to forcing is λ0 [17], and the change in forcing from carbon dioxide (CO2) concentration M is R = 5.35 ln(M/M0) with M0 being pre-industrial CO2 of 280 ppm [18]. The moderating ocean 'feedback' is f_o = −0.16 [16]. Figure 1(a) demonstrates how ocean heat uptake moderates the response of temperature to strong feedbacks. The nonlinearity of temperature in feedback strength produces the familiar skewed distribution for long-term equilibrium warming (right axis). However, the distribution for the next century's warming demonstrates much less skew (left axis) because, as is well understood in the scientific literature, the ocean's heat capacity initially moderates high-warming outcomes.
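A small Monte Carlo sketch illustrates how the ocean term damps the skew of the equilibrium distribution. The feedback mean (0.65) and standard deviation (0.13) are those used in figure 1; the reference sensitivity value λ0 = 0.32 K per W m^-2 is an assumed illustrative number (the paper cites [17] for the actual value), and samples with f_e ≥ 1 are discarded, as is conventional when the linear feedback framework is applied to strong feedbacks.

```python
import math
import random

random.seed(0)

LAMBDA0 = 0.32           # assumed blackbody sensitivity, K per (W m^-2)
F_OCEAN = -0.16          # moderating ocean 'feedback' [16]
R = 5.35 * math.log(2)   # forcing from doubling CO2, ~3.7 W m^-2 [18]

def skewness(xs):
    """Sample skewness: third central moment over std dev cubed."""
    n = len(xs)
    mu = sum(xs) / n
    m2 = sum((x - mu) ** 2 for x in xs) / n
    m3 = sum((x - mu) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

# Normally distributed equilibrium feedbacks, truncated at f_e < 1
fe = [f for f in (random.gauss(0.65, 0.13) for _ in range(200_000)) if f < 1]

t_equilibrium = [LAMBDA0 * R / (1 - f) for f in fe]
t_transient = [LAMBDA0 * R / (1 - f - F_OCEAN) for f in fe]

print(skewness(t_equilibrium), skewness(t_transient))
```

With these inputs the equilibrium warming distribution shows markedly more positive skew than the century-scale transient distribution, consistent with figure 1(a).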
A damage function converts temperature change into economic losses. Highly aggregated integrated assessment models commonly represent warming as reducing economic output by a fraction D(T) that increases with a polynomial of temperature [20,21,19]:

D(T) = a T^b for a, b > 0.

The parameter a is an economic loss coefficient calibrated to studies of moderate warming. The parameter b controls how the damage function extrapolates from impact assessments for moderate warming to impacts at much greater levels of warming. A quadratic term is the most common assumption (b = 2), but some have used linear (b = 1), cubic (b = 3), and higher-degree polynomials inside the damage function. In accord with this convention, we refer to a damage function as, e.g., 'quadratic' based on its choice of b.
The convexity of the damage function over moderate temperatures (controlled by b) amplifies the skew in the transient temperature distribution. The distribution h_D for damages follows from the distribution h_f for equilibrium feedbacks by a change of variables:

h_D(D) = h_f(f(D)) |df/dD|.

The online supplementary material (available at stacks.iop.org/ERL/8/034019/mmedia) derives approximations to the mean, variance, and skewness of the damage distribution. It shows that raising the CO2 concentration not only increases expected damages but also increases the spread of the distribution. The skewness of damages is positive, indicating that the tail extends towards high-damage outcomes. As CO2 reaches high levels, the increasing spread flattens the damage distribution and thereby reduces its skew. Figure 1(b) shows how increasing (i.e., relaxing) the CO2 target not only shifts the damage distribution towards less favorable outcomes but also increases its dispersion: tightening a CO2 target not only reduces expected damages but also trims the damage distribution's right tail. Nonetheless, the positive skew is pronounced across the entire range of CO2 targets. For a given value of the economic loss coefficient, a cubic damage function places substantially more probability mass on extreme outcomes than does a quadratic damage function.
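These comparative statics can be checked numerically. The sketch below pushes a truncated feedback distribution through the quadratic damage function, calibrated so that 2.5 C of warming costs 1.7% of output (the figure 1 calibration); the reference sensitivity λ0 = 0.32 is again an assumed placeholder value.

```python
import math
import random

random.seed(1)

LAMBDA0, F_OCEAN = 0.32, -0.16   # assumed placeholder / ocean feedback from [16]
A_QUAD = 0.017 / 2.5 ** 2        # quadratic loss coefficient: 1.7% at 2.5 C [19]

def forcing(ppm):
    return 5.35 * math.log(ppm / 280)   # forcing from CO2 concentration [18]

def moments(xs):
    """Return (mean, std dev, skewness) of a sample."""
    n = len(xs)
    mu = sum(xs) / n
    m2 = sum((x - mu) ** 2 for x in xs) / n
    m3 = sum((x - mu) ** 3 for x in xs) / n
    return mu, math.sqrt(m2), m3 / m2 ** 1.5

# Normally distributed equilibrium feedbacks, truncated at f_e < 1
fe = [f for f in (random.gauss(0.65, 0.13) for _ in range(200_000)) if f < 1]

def damages(ppm, b=2):
    """Fractional output loss from transient warming at a given CO2 level."""
    r = forcing(ppm)
    return [A_QUAD * (LAMBDA0 * r / (1 - f - F_OCEAN)) ** b for f in fe]

for ppm in (450, 550, 650):
    mu, sd, sk = moments(damages(ppm))
    print(ppm, round(mu, 4), round(sd, 4), round(sk, 2))
```

Relaxing the target raises both the mean and the spread of damages, and the skew stays positive throughout, as the analytic approximations indicate.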

Analyzing the distribution of abatement cost
By the foregoing analysis, more stringent climate targets reduce both expected damages and the spread of possible damages. Using a parsimonious model of technology and abatement cost, we now show that tightening the CO2 target can also induce an opposing skew in the distribution for the cost of the required emission reductions. Let z = Σ_i z_i be an aggregate technology index, with z_i = 1 indicating a breakthrough in technology i and z_i = 0 indicating that technology i stays at its baseline level. The cost of reducing emissions to reach CO2 target x is C(z; x). Costs decrease in the target and in the technology index, with better technology most valuable when the CO2 target is stringent (see online supplementary material). If technologies substitute for each other (as might sources of renewable electricity), then we have costs convex in z, but if technologies complement each other (as might batteries and solar panels), then we have costs concave in z.

Figure 1. Normally distributed uncertainty about equilibrium feedbacks and about an aggregate technology index translates into distributions for damages and abatement cost that are skewed towards undesirable outcomes. The feedback distribution follows [13] in using normally distributed equilibrium feedbacks with a mean of 0.65 and a standard deviation of 0.13. The economic loss coefficient a in the damage calculations has 2.5°C of warming reducing output by 1.7% under the quadratic specification [19]. The abatement cost function assumes a constant elasticity of −0.5 with respect to technology (i.e., achieving 1% more breakthroughs lowers abatement cost by 0.5%) and is normalized to the 650 ppm target's no-breakthrough scenario. The cost of each CO2 target under the no-breakthrough scenario comes from the Global Change Assessment Model 3.0.
Imagine there are n possible low-carbon technologies under development. Each achieves a breakthrough with identical, independent probability p, irrespective of the choice of CO2 target. This assumption induces a binomial distribution on z, which is approximately normal for np sufficiently large. The density function for abatement cost follows from the technology distribution h_z by a change of variables:

h_C(C) = h_z(z(C)) / |∂C/∂z|.

The abatement cost distribution is a nonlinear transformation of the technology distribution. If technologies substitute for each other, then the probabilities of low-technology outcomes stretch to cover a relatively broader region of the cost space (the denominator decreases in z). Even when the technology distribution is itself symmetric, the cost distribution has a positive skew, with the tail reaching towards higher costs. The opposite result holds when technologies complement each other. The relationship between technologies therefore determines the direction in which the cost distribution skews. The spread of the abatement cost distribution depends on the CO2 target. We approximate its standard deviation as:

σ_C ≈ |∂C/∂z| σ_z.

Tighter CO2 targets make costs depend more strongly on technology (i.e., the cross-partial of C(z; x) is positive). They therefore increase the spread of the abatement cost distribution. While tighter targets trim the tail of the damage distribution, they extend the tail of the abatement cost distribution. The right axis of figure 1(c) shows that more stringent targets can affect the spread of the cost distribution as strongly as the mean.
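A minimal simulation of this setup reproduces both claims. The values n = 20 and p = 0.3 are hypothetical, and the substitutes-case cost function C(z) = C0(1 + z)^(−0.5) (convex and decreasing in z, normalized to a no-breakthrough cost C0, and defined at z = 0) is an illustrative stand-in for the paper's calibrated function; a tighter target is modeled here simply as a larger no-breakthrough cost.

```python
import math
import random

random.seed(2)

N_TECH, P = 20, 0.3   # hypothetical technology count and breakthrough probability

def draw_z():
    """Binomial aggregate technology index: count of successful breakthroughs."""
    return sum(random.random() < P for _ in range(N_TECH))

def cost(z, c0):
    """Substitutes case: convex, decreasing in z; c0 is the no-breakthrough cost."""
    return c0 * (1 + z) ** -0.5

def moments(xs):
    n = len(xs)
    mu = sum(xs) / n
    m2 = sum((x - mu) ** 2 for x in xs) / n
    m3 = sum((x - mu) ** 3 for x in xs) / n
    return mu, math.sqrt(m2), m3 / m2 ** 1.5

zs = [draw_z() for _ in range(100_000)]
costs_450 = [cost(z, 3.0) for z in zs]   # tighter target: larger c0 (hypothetical)
costs_550 = [cost(z, 1.0) for z in zs]   # laxer target: smaller c0 (hypothetical)

_, sd_450, sk_450 = moments(costs_450)
_, sd_550, sk_550 = moments(costs_550)
print(sd_450, sd_550, sk_450)
```

The cost distribution skews towards high-cost outcomes even though the technology distribution is nearly symmetric, and the tighter target's cost distribution is the wider one.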
To trim both tails, emission policies must provide effective innovation incentives or form a portfolio with research and development (R&D) policies. Induced innovation or targeted R&D policies might increase the number of possible technologies or might raise the probability of a breakthrough in each technology. For n and p sufficiently small, these technology policies further increase the spread of the abatement cost distribution because they make low-cost outcomes more attainable (see online supplementary material). However, if technologies are substitutes and n and p are sufficiently large, these technology policies decrease the spread of the abatement cost distribution: by making it more likely that some combination of technologies will achieve success, they shift some probability mass from higher-to lower-cost outcomes, but because additional breakthroughs do not lower cost substantially once some breakthroughs have already been obtained, the probability mass already concentrated around low-cost outcomes does not shift far towards still-lower-cost outcomes (left axis of figure 1(c)). The main effect of these technology policies is not to make the lowest-cost possible outcomes much cheaper but to make the set of lower-cost possible outcomes more likely.
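This non-monotone effect of technology policy on cost spread can be seen in the same stylized setup (hypothetical n = 20 technologies and the illustrative substitutes-case cost C0(1 + z)^(−0.5)): raising the per-technology breakthrough probability p widens the cost distribution while np is small, but narrows it once np is large, because probability mass moves into the flat low-cost region of the cost function.

```python
import math
import random

random.seed(3)

def cost_sd(p, n=20, c0=1.0, draws=100_000):
    """Std dev of abatement cost when each of n technologies succeeds w.p. p."""
    costs = []
    for _ in range(draws):
        z = sum(random.random() < p for _ in range(n))   # binomial technology index
        costs.append(c0 * (1 + z) ** -0.5)               # substitutes: convex in z
    mu = sum(costs) / draws
    return math.sqrt(sum((c - mu) ** 2 for c in costs) / draws)

# Spread first rises with p (low-cost outcomes become attainable at all),
# then falls as success becomes near-certain in some combination of technologies.
print(cost_sd(0.005), cost_sd(0.05), cost_sd(0.3), cost_sd(0.6))
```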
Thus far we have shown that common assumptions about damages and technology imply that both the damage and abatement cost distributions have tails extending towards undesirable outcomes. These tails become more salient as the distributions grow more dispersed. Greenhouse gas targets directly affect the expected values, but they also affect the spreads. If emissions are not reduced at all, abatement cost is guaranteed to equal zero at the risk of high-damage outcomes. Depending on how fast each tail grows with the climate target, a middling target could leave both distributions with significant tails. Meanwhile, holding technology constant, deep emission reductions trim the tail of the damage distribution while inflating the tail of the abatement cost distribution. Only a policy portfolio that combines deep emission reductions with effective innovation incentives can successfully trim both tails at once.

Climate targets' robustness to each type of uncertainty
We have seen that uncertainties about warming, damages, and abatement cost together skew the distributions of benefits and costs towards undesirable outcomes. More stringent greenhouse gas targets narrow the benefit distribution while inflating the cost distribution. However, actual policy evaluations must go beyond these basic facts into the troubled waters of forecasting both long-run technology development and the social damages incurred by potential future warming. The relative policy importance of major uncertainties has previously been assessed via Monte Carlo analyses that use highly aggregated integrated assessment models to generate consumption paths [19,[22][23][24][25]. We here adopt the perspective of robust decision-making [14,15]. Rather than assign probability distributions to each uncertain factor, we instead assess which policies perform well over a broad set of plausible futures. We also learn which possibilities most strongly influence policy evaluations and thereby learn about valuable directions for future research.
We assess the robustness of two proposed greenhouse gas constraints to a range of technology, damage, and warming scenarios. Uncertainty about warming is the best understood of the three and so the most amenable to probabilistic treatment. It has long been represented as uncertainty about a metric called climate sensitivity [26], which is the equilibrium warming induced by doubling the carbon dioxide concentration. However, the probabilities of different warming outcomes themselves depend on how one combines climate models' estimates of the strength of individual feedback processes [27]. We evaluate this uncertainty by using two markedly different distributions for equilibrium climate sensitivity in addition to exploring three distinct values of climate sensitivity from the Intergovernmental Panel on Climate Change's (IPCC's) 'likely' range.
The probabilities of technology outcomes or damage functions are far less understood: the complex relations between the climate and the economy restrict our ability to forecast the economic impacts of moderate to high levels of warming [7,28], and neither expert assessments nor historical experience provide a sure means of predicting technological change beyond the immediate future. We account for uncertainty about the damage function by applying four different functional forms from the literature, and we capture uncertainty about technological change by undertaking a full-permutation scenario analysis. Our analysis is a novel combination of a technology-rich integrated assessment model for calculating the costs of climate targets and damage functions for defining the benefits of adopting those targets. The results demonstrate the sensitivity of climate cost-benefit analysis to each of the uncertain factors and to their interactions. Further, by measuring the benefit-cost ratio of a climate target in terms of its 'break-even' damages from moderate warming, our results also illustrate the range of moderate warming damages compatible with positive net benefits from choosing a given emission constraint instead of a slightly weaker one.

Integrated assessment framework
We represent uncertainty about technological change by exploring a set of 384 possible technology futures presented in [35]. These technology futures are processed through the Global Change Assessment Model (GCAM 3.0), extended to 2290 for assessing long-term warming effects. GCAM is an open-source global integrated assessment model with detailed energy technology representations. The physical Earth system in GCAM is represented by the Model for the Assessment of Greenhouse gas Induced Climate Change (MAGICC). The human system in GCAM links three major modules representing the economy, energy systems, and agriculture and land-use. The simulation process determines a consistent set of market-clearing prices for all energy, agricultural, and forest products. It tracks the emission of 16 greenhouse gases, aerosols, and short-lived species up to 2095. After 2095, non-CO2 emissions are held constant while CO2 emissions are fully simulated for another two centuries. These technology futures are combined with three sets of CO2 emission constraints (450, 500, and 550 ppm) to yield 1152 least-cost abatement paths and corresponding emission scenarios. By coupling these with 17 climate sensitivity values, we obtain a total of 19 584 data points that define the costs and benefits of each emission policy.
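The scenario bookkeeping implied by this design can be verified directly:

```python
technology_futures = 384      # technology futures from [35]
emission_constraints = 3      # 450, 500, and 550 ppm CO2 targets
abatement_paths = technology_futures * emission_constraints

# Climate sensitivity varied from 0 to 8 C at 0.5 C intervals -> 17 values
climate_sensitivities = len([0.5 * k for k in range(17)])
data_points = abatement_paths * climate_sensitivities

print(abatement_paths, data_points)   # 1152 paths, 19584 data points
```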
We ultimately measure the benefits of a CO2 target x compared to one 50 ppm higher, in accord with the policy focus on carbon concentration targets and as an approximation to the marginal tradeoffs captured in more stylized policy-optimizing integrated assessment models. Under technology scenario z, the incremental cost of attaining CO2 target x instead of one 50 ppm higher is C_xz, a present value over 2005-2290. The abatement cost histograms in figure 2(a) have a positive skew, with the expensive tail growing as the CO2 target becomes more stringent.
Following from the previous section's analysis, this positive skew occurs because abatement technologies imperfectly substitute for each other within GCAM's optimized pathways. If these frequency histograms were interpreted as probability distributions, they would assign more probability to extreme outcomes than would the binomial assumption in the earlier analytic framework.
In order to estimate the benefits of adopting each CO2 target, we use MAGICC 5.3 to generate warming outcomes along each least-cost emission path from GCAM (figure 2(c)). Different values of climate sensitivity produce different warming paths in MAGICC. (Equilibrium climate sensitivity is a parameter in MAGICC 5.3: increasing it affects the transient evolution of the climate system by increasing the surface temperature change required to maintain energy balance in each period for a given change in ocean heat content, with the exact effect in any given period also depending on how ocean heat content evolves in a more sensitive climate.) We undertake runs with climate sensitivity varied between 0 and 8°C at 0.5°C intervals. We report values for three runs with a known value of climate sensitivity in the IPCC's 'likely' range of 2-4.5°C [40]. We also integrate the full set of 17 climate sensitivity values using two discretized distributions for climate sensitivity (figure 2(b)): a narrow distribution that concentrates probability mass on the feedback values estimated by climate models, and a broad distribution that recognizes the difficulty of learning about feedback strength from the available ensemble of climate models [27]. These two distributions each have a mode near 3°C.
Altogether we have temperature T_xzct for each CO2 constraint x, technology scenario z, climate sensitivity c, and year t from 2005 to 2290. A damage function then converts each temperature outcome into economic losses. Climate damages (as a fraction of pre-damage output) are commonly represented as a function of temperature, exogenous no-damages global output Y_t, and the economic loss coefficient a. We consider four damage functions from the literature (figure 2(d)), with each polynomial form calibrated through a to a common loss at 2.5°C of warming. First, a quadratic polynomial in temperature has been the traditional benchmark in highly aggregated integrated assessment models [19,20]: D_t = a T_t^2 Y_t. Second, some models have assessed a linear function [21,23]: D_t = a T_t Y_t. The linear specification assigns smaller damages to high temperatures than does the quadratic specification, but it also assigns greater damages to low temperatures. Third, these same studies have also applied a cubic function [21,23]: D_t = a T_t^3 Y_t. The cubic specification assigns greater damages to high temperatures than do the quadratic or linear specifications, but it also assigns smaller damages to low temperatures. Finally, an 'additive' function recognizes limited ability to substitute consumption for deteriorating environmental quality [42,43]. Under this additive function, damages increase with global output because we value additional environmental quality relatively more than additional consumption once we have already attained high levels of consumption.
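The crossing behavior of the three polynomial forms follows from calibrating each to the same loss at 2.5°C. The sketch below uses the common normalization loss(T) = ℓ(T/2.5)^b with ℓ = 1.7% (the quadratic calibration cited for figure 1) purely for illustration; the additive function is omitted because its form depends on substitution assumptions not reproduced here.

```python
def loss_fraction(temp, b, loss_at_2p5=0.017):
    """Fractional output loss for polynomial damage exponent b,
    calibrated so every form gives the same loss at 2.5 C of warming."""
    return loss_at_2p5 * (temp / 2.5) ** b

# Bind b via default argument so each lambda keeps its own exponent
linear, quadratic, cubic = (lambda t, b=b: loss_fraction(t, b) for b in (1, 2, 3))

# Below 2.5 C the less convex forms are more damaging ...
print(linear(1.0), quadratic(1.0), cubic(1.0))
# ... above 2.5 C the ranking reverses.
print(linear(5.0), quadratic(5.0), cubic(5.0))
```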
Adopting CO2 target x instead of one 50 ppm higher produces incremental benefits B_xzd(c; a) by reducing the present value of future damages:

B_xzd(c; a) = Σ_t [D_(x+50)zdt(c; a) − D_xzdt(c; a)] / (1 + r)^(t−2005),

where d indicates the damage function. The consumption discount rate r is the same one used in calculating the present value of abatement cost. GCAM's baseline rate is 5% yr^-1, close to the 5.5% annual rate in [19]. We assess sensitivity to this assumption by also using a consumption discount rate of 1.6% yr^-1 to match preference parameters in [21]. Averaging over climate sensitivity gives expected benefits:

E[B_xzd | a] = Σ_c h_c(c) B_xzd(c; a),

where climate sensitivity has probability mass function h_c(·). These expected benefits define the present value of a climate policy given a technology scenario and damage assumption. Finally, we derive the value of the economic loss coefficient that exactly equates abatement cost C_xz and expected benefits E[B_xzd | a]. We call this value the 'break-even coefficient'. It in turn implies a 'break-even loss', defined as the damages from 2.5°C of warming at which a climate target exactly pays for itself under given assumptions about the damage function, climate sensitivity, and technology future. The loss from 2.5°C of warming is a previously studied parameter that determines the level of a given damage function, whereas the functional form assumption discussed above determines how the function extrapolates damages from 2.5°C to other levels of warming. Framing the results in terms of the break-even loss provides a familiar basis for assessing the economic attractiveness of each emission constraint. The average estimate in the economics literature is that 2.5°C of warming reduces global output by 0.7%, with the estimates having a standard deviation of 1.2% [8]. If an emission constraint has a break-even loss below 0.5%, it probably provides net benefits in that given future. If its break-even loss is above 2%, then it only provides net benefits in that future if losses from 2.5°C are at the high end of economic estimates.
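Because damages, and hence benefits, scale linearly in a for the polynomial forms, the break-even coefficient can be computed in one step as a* = C_xz / E[B_xzd | a = 1]. The sketch below applies that identity to a single invented scenario with one climate sensitivity: the warming paths, output level, growth rate, and abatement cost are hypothetical stand-ins, not GCAM outputs. It also illustrates why a lower discount rate lowers the break-even loss: costs arrive early while benefits accrue late.

```python
import math

YEARS = range(2005, 2291)

def warming_path(t_max):
    """Hypothetical transient warming path approaching t_max (C)."""
    return {t: t_max * (1 - math.exp(-(t - 2005) / 60.0)) for t in YEARS}

def break_even_loss(abatement_cost, r, t_tight=2.8, t_lax=3.1,
                    y0=60.0, growth=0.02):
    """Loss at 2.5 C (percent of output) at which the tighter target exactly
    pays for itself, under quadratic damages D_t = a * T_t^2 * Y_t.
    All inputs are illustrative; abatement_cost is a hypothetical present value."""
    path_tight, path_lax = warming_path(t_tight), warming_path(t_lax)
    pv_benefit_per_a = sum(
        (path_lax[t] ** 2 - path_tight[t] ** 2)
        * y0 * (1 + growth) ** (t - 2005)     # growing no-damages output Y_t
        / (1 + r) ** (t - 2005)               # discount back to 2005
        for t in YEARS)
    a_star = abatement_cost / pv_benefit_per_a
    return 100 * a_star * 2.5 ** 2            # percent loss at 2.5 C

print(break_even_loss(10.0, r=0.05), break_even_loss(10.0, r=0.016))
```

Holding everything else fixed, moving from a 5% to a 1.6% discount rate shrinks the break-even loss substantially, matching the pattern reported for figure 4.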
Figure 3 presents the results of each of the 15 360 cases (384 technology futures × 5 climate sensitivity scenarios × 4 damage functions × 2 CO2 target comparisons, all with a 5% annual discount rate) in a common analytic framework. Figure 4 does the same for the lower annual discount rate of 1.6%. Break-even losses within the darker shaded areas make policy attractive under the central damage estimates from [8], while higher break-even losses make policy attractive only if damages from moderate warming are above the bulk of the estimates summarized in [8]. The relative importance of each type of uncertainty becomes apparent by visually comparing how it affects the spread of break-even losses and their relation to the conventional economic estimates.

Results: determinants of break-even loss
First, each box-and-whiskers plot indicates the role of technology uncertainty. We see that technology uncertainty creates high variance in the cost of attaining the 450 ppm CO2 target. Technology outcomes have a smaller impact on the 500 ppm CO2 target's cost-effectiveness because it requires less abatement effort and hence can be achieved at low cost by a limited set of technologies. As forecast by the above theoretical analysis, the distribution of technology outcomes skews strongly towards high break-even losses, reflecting that different low-carbon technologies partially substitute for each other. The first few breakthroughs are the most helpful, and there are many ways to get them. The 500 ppm target's cost-effectiveness depends only on avoiding the least favorable combination of technology outcomes, while making the 450 ppm target cost-effective instead requires achieving multiple technological breakthroughs.

Figure 3. The damages from 2.5°C of warming (as a percentage of global output) that would equalize the additional abatement cost C_xz and expected climate benefits E[B_xzd | a] from adopting CO2 target x instead of one 50 ppm higher, all calculated with a 5% consumption discount rate. The range of technology scenarios (n = 384) is represented by each box (median and interquartile range) and its whiskers (minimum and maximum). Comparing across columns within a group reveals the effect of changing the distribution for climate sensitivity, comparing across groups reveals the effect of changing the damage function, and comparing across plots reveals the effect of changing the CO2 target. The darker shaded region indicates output losses within one standard deviation (σ) of the average best estimate (µ) summarized in [8], and the lighter shaded region indicates those losses within two standard deviations. (a) 450 ppm CO2 target (versus 500 ppm). (b) 500 ppm CO2 target (versus 550 ppm).
Second, comparing across columns within a grouping gives the effect of changing the climate sensitivity distribution. As expected, increasing climate sensitivity makes climate policy more valuable: we see the boxes step steadily down. Further, learning that climate sensitivity is at 2°C rather than 3°C would be more important for policy evaluations than learning it is at 4°C rather than 3°C. Representing climate sensitivity as unknown but obeying a distribution with a mode near 3°C does not produce dramatically different outcomes from assuming it is known to be exactly 3°C. However, using the broad distribution does slightly reduce the expected benefits of a greenhouse gas policy because the additional probability attached to low-warming outcomes has slightly more influence on benefit calculations than does the additional probability attached to high-warming outcomes.
We assess the value of induced innovation or targeted R&D policies by comparing the effect of better technology to the effect of altered climate sensitivity. Switching from the 75th percentile technology outcome to the 50th percentile technology outcome is similar in magnitude to increasing climate sensitivity by 1°C. Switching from the 75th to the 25th percentile is similar to raising climate sensitivity by 2°C with the quadratic damage function. Estimates of climate sensitivity have been remarkably stable over the past century. A 1°C shift in our best estimate of climate sensitivity would be a dramatic change, but it would affect a climate policy's net incremental benefits by no more than would attaining a few additional technology breakthroughs. These additional breakthroughs are especially important for the more stringent CO2 targets. Policies that advance low-carbon technologies have substantial value in making stringent CO2 targets cost-effective. Further, stringent CO2 targets and improved low-carbon technologies are probably correlated: pricing carbon emissions should induce innovation that reduces the cost of attaining the carbon target, depending on how that innovation crowds out other research investments [44].
Third, by comparing groups of columns within a plot, we see the effect of the functional form for damages. The quadratic specification is the most common assumption in the literature. While one might think that making the damage function more convex would decrease the break-even loss by increasing the damage from high-warming outcomes, these more convex polynomials also decrease the damage from low-warming outcomes (for fixed damage from 2.5°C). Because the oceans moderate near-term warming, the linear damage function actually imposes the greatest damages over the next half-century (figure 2(d)).
We see these conflicting effects of greater convexity at work in the break-even loss. The low climate sensitivity value of 2°C generates relatively little warming. It does not matter much whether one applies a linear or a quadratic damage function to these pathways, and applying a cubic damage function can actually raise the break-even loss. For higher values of climate sensitivity, increasing the damage function's convexity does indeed lower the break-even loss. Using a linear function instead of the quadratic tends to shift the median above what had been the 75th percentile break-even loss. This change makes the 450 ppm target cost-effective only with the very best technology outcomes or with relatively high damages from 2.5°C. Using a cubic function instead of the quadratic has an effect similar to increasing climate sensitivity from 3 to 4°C. By increasing the impact of high levels of warming, the cubic function increases the benefit of reducing CO2 with expected climate sensitivity above 2°C and reduces the importance of the particular technology future.
Using the additive damage representation has a pronounced effect on the value of more stringent climate targets. The effect of switching from the quadratic (polynomial) damage function to the additive damage function is of similar magnitude as the combined effect of increasing climate sensitivity from 3 to 4°C and switching from the quadratic to the cubic damage function. The additive function assumes that we cannot easily substitute consumption goods for environmental quality. As consumption reaches high levels, additional environmental quality becomes more important relative to additional consumption. Damages thereby become very large in the coming centuries as global output continues to grow. Given that all of these damage functions are plausible and have always been pure assumptions, our results indicate high value to better understanding what type of function may best approximate the impact of future warming. (Reference [23] similarly finds that climate sensitivity has only a small effect on policy evaluations under a quadratic damage function, and they also find that climate sensitivity matters strongly under quartic or quintic damage functions. However, they report that changing from a quadratic to a cubic damage function has only a small effect on welfare measures. While not directly comparable, our results do appear more sensitive to these variations in the damage function's curvature (as do the older results of [45,46]). Our results' greater sensitivity could be due to differences in evaluation frameworks or to our more complex climate system.)
Finally, figure 4 demonstrates the effect of using a lower discount rate of 1.6%. This lower discount rate derives from reducing favoritism for the present and from reducing aversion to consumption inequality over time [21]. The climate economics literature has extensively debated the choice of discount rate. We here only demonstrate the sensitivity of the break-even loss to the choice of discount rate and how the relative importance of each type of uncertainty changes under a lower discount rate. First, the smaller discount rate substantially reduces the break-even loss. This is because most costs are borne relatively quickly while most benefits accrue relatively late (see online supplementary material, available at stacks.iop.org/ERL/8/034019/mmedia). Even the 450 ppm target now has a break-even loss below the literature's mean estimate across most technology, warming, and damage scenarios. Second, the lower discount rate further reduces the importance of climate sensitivity, and it makes the break-even loss more sensitive to the choice of damage function than to the technology future. The high near-term cost of the worst technology outcomes no longer affects cost-effectiveness so strongly, while more distant increases in damages now receive more weight in the policy evaluation. In fact, under all but the lowest climate sensitivity values, the cubic and additive functions suggest such strong benefits to policy that they render technology uncertainty nearly irrelevant.
While the previous section's theoretical analysis broadly framed the climate challenge, the present numerical analysis concretely compares the cost-effectiveness of potential emission constraints. Restricting the CO2 concentration to 500 ppm yields positive net benefits across a range of technology, climate, and damage futures. Conventional economic estimates support this target unless climate change ends up at the extreme low end of scientific projections. In contrast, the cost-effectiveness of limiting the CO2 concentration to 450 ppm depends on assumptions about long-range unknowables and on preferences about consumption over time.^10 The 450 ppm target is cost-effective in a world with multiple breakthroughs in low-carbon technology or in a world in which high-warming outcomes cause large losses. It is also cost-effective when policymakers give nearly as much weight to future consumption as to present consumption. Refining understanding of technology pathways and high-warming impacts is crucial to more precisely evaluating the cost-effectiveness of the 450 ppm target. These results highlight the importance of focusing climate science research on simulating the regional variables that are important for damage assessments.

^10 Previous work has found that uncertainty about the preference parameters controlling the consumption discount rate is more important for policy than uncertainty about climate, cost, and damage parameters, but many of these studies do not consider uncertainty about the curvature of the damage function [19, 47]. In line with [22], our results underscore the importance of the consumption discount rate but also highlight the importance of the assumed functional form for climate damages. See [48] for more on how preferences for smooth consumption profiles interact with skewed distributions for climate sensitivity and with assumptions about damages from high-warming outcomes.
They also suggest orienting economics research towards more flexibly estimating damage functions and analyzing the ability of policy instruments and portfolios to direct and accelerate technological change.

Conclusion
Our theoretical analysis demonstrated how fundamental properties of the economic and climate systems combine to generate distributions for the costs and benefits of climate policy with tails extending towards undesirable outcomes. Climate policy remains trapped between these tails unless policies to reduce emissions effectively induce technological change or form a portfolio with targeted innovation policies.
Our numerical analysis directly compared climate targets' robustness to plausible pathways for warming, damages, and technology. In contrast, current numerical methods and computational resources sharply constrain the number of uncertainties that can be addressed in stochastic policy-optimizing models and the level of detail in their climatic and economic systems. Throughout the analysis we used a common population and economic growth assumption from GCAM's baseline scenario. Future work could extend our framework to incorporate uncertainty ranges for population and economic growth as projections emerge for their evolution to 2290. We also mostly ignored uncertainties associated with preference parameters, policy effectiveness, economic or physical stochasticity, threshold behavior, and damages from sources such as precipitation or the rate of warming. Continuing to compare these remaining uncertainties directly in detail-rich policy evaluation models is important for prioritizing which factors to integrate into theory-rich optimization models.
Much scientific research has gone into estimating climate sensitivity and transient warming. We find that policy evaluations are relatively robust to the remaining uncertainty. In contrast, policy evaluations are not robust to anticipating additional breakthroughs in abatement technology or to the functional form for extrapolating high-warming impacts. Yet integrated assessment models must still use representations of technological change based more on convenience than on confidently known underpinnings, and little work has gone into estimating the shape of the damage function that translates warming into economic losses. Advancing understanding of technological change requires implementing more sophisticated models of technology development, competition, and adoption. Advancing understanding of appropriate damage functions requires better modeling of the regional implications of given warming paths, better economic and ecological understanding of the pathways by which climate change matters, and better communication between disciplines so that climate scientists predict the types of variables that policymakers, analysts, and economists find most crucial.