Accounting for soil carbon sequestration in national inventories: a soil scientist’s perspective

As nations debate whether and how best to include the agricultural sector in greenhouse gas pollution reduction schemes, the role of soil organic carbon as a potential large carbon sink has been thrust onto center stage. Results from most agricultural field trials indicate a relative increase in soil carbon stocks with the adoption of various improved management practices. However, the few available studies with time series data suggest that this relative gain is often due to a reduction or cessation of soil carbon losses rather than an actual increase in stocks. On the basis of this observation, we argue here that stock change data from agricultural field trials may have limited predictive power when the state of the soil carbon system is unknown and that current IPCC (Intergovernmental Panel on Climate Change) accounting methodologies developed from the trial results may not properly credit these management activities. In particular, the use of response ratios is inconsistent with the current scientific understanding of carbon cycling in soils, and response ratios will overestimate the net-net sequestration of soil carbon if the baseline is not at steady state.


Introduction
Globally, approximately 14% of recent anthropogenic greenhouse gas (GHG) emissions were attributable to agricultural activity (IPCC 2007). Improved management of agricultural land has the potential both to reduce net GHG emissions (Smith et al 2008) and to act as a direct CO2 sink through soil carbon sequestration (Lal 2004). The recognition that agricultural soil carbon sequestration could be a successful 'win-win' or 'no regrets' policy (e.g. Lal 2004, Smith 2004), simultaneously reducing atmospheric GHG levels at potentially very low abatement costs (Smith et al 2008) and increasing food security through improved soil health, has thrust the issue onto numerous national political agendas.
Existing agricultural field trials represent one of the greatest immediately available resources for the study of management impacts on soil carbon sequestration. These data sets have both informed soil carbon modeling efforts (e.g. Parton et al 1987, Skjemstad et al 2004) and formed the basis for the stock change factors used in current IPCC inventory guidelines (IPCC 2006, Ogle et al 2005). A myriad of experimental designs exists for agricultural field trials (Petersen 1994). The core feature of most of these designs is to subdivide a formerly homogeneously managed plot into a number of replicated treatments in an effort to minimize confounding factors. In fact, some of the best statistical designs are not amenable to SOC research because the treatments are rotated through numerous blocks to completely eliminate the influence of soil variability. The measurement of soil organic carbon (SOC) content at the onset of a trial provides a baseline from which to calculate the absolute impact of imposed management treatments on rates of SOC change. Unfortunately, the great majority of these studies were designed to define the influence of agricultural management practices on plant dry matter production, grain yields and other agronomic properties, and as a result few long-term trials accurately measured SOC stocks at the onset of experimentation (Sanderman et al 2010). Useful data on SOC stock changes have been and can still be gathered without the aid of the baseline data (e.g. Ogle et al 2005). However, as we demonstrate in this letter, there are numerous uncertainties in using these data in a predictive capacity.
In this letter, we first discuss typical field trial results in relation to changes in SOC stocks. Next, we highlight some of the difficulties in using these data in a predictive capacity to account for changes in SOC stocks in general. Finally, based upon these observations, we present a critical evaluation of current IPCC Guidelines (IPCC 2006) for accounting for emissions or removals resulting from SOC stock changes in national inventories under article 3.4 of the Kyoto Protocol.

Discussion of typical results
Less than 50% of the studies in major reviews of SOC stock changes (e.g. Ogle et al 2005, Sanderman et al 2010) have actually followed a change in management through time; the remainder have compared SOC stocks under contrasting management practices after a defined number of years of implementation. Without baseline data at the inception of a trial or a temporal sequence of measurements, it is impossible to determine whether or not a currently measured difference in SOC between two treatments has resulted in a net sequestration of atmospheric CO2. In a comparison of the influence of two management practices (A and B) on SOC stocks, the five scenarios depicted in figure 1 would all lead to the measurement of a greater stock of SOC under practice A. However, a net sequestration of atmospheric CO2 would only occur in three of the five scenarios (i.e. scenarios 1, 2 and 3). In scenario 1 both management practices would lead to a net sequestration, while in scenario 5 both practices would lead to a net loss of carbon back to the atmosphere; yet with a snapshot-in-time approach both scenarios would be interpreted as having resulted in the same relative gain in SOC.

Figure 2. Results from a hypothetical field trial comparing conventional and improved management practices initiated at three different times (A, B and C) after converting a natural ecosystem to agricultural production in year zero. All three points show the same relative gain of 5 Mg C ha−1 in the improved management practice over a five year period; however, the actual rate of change is completely different.
A second consideration in defining the influence of applied management practices on SOC stocks, and often the underlying reason for the various scenarios in figure 1, is whether SOC has stabilized at a new steady state value indicative of the original management practice or is still changing and progressing towards a new equilibrium value. Evidence suggests that the imposition of agriculture on previously undisturbed soil will result in a 20-50% loss of SOC (Mann 1986, Davidson and Ackerman 1993, Lal 2004), with the rate of loss being greatest initially and then diminishing over time (dashed line in figure 2), and with a new equilibrium not being reached for 20-100 years. If two management practices (conventional and best practice in terms of SOC accumulation) are initiated at different times after clearing (points A, B and C), different SOC sequestration outcomes are obtained (figure 2). The relative difference in SOC content measured between the two management treatments at all three times is similar (5 Mg C ha−1 over a five year period); however, the benefit in terms of sequestration of atmospheric CO2 relative to the conditions present at the start of the three experiments is completely different. Without SOC measurements taken at the start of each of the experiments (A, B and C), the different carbon sequestration scenarios depicted in figure 2 would not be evident and the best management system may be inappropriately considered to have sequestered atmospheric carbon.
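The distinction between a snapshot relative difference and the actual change from baseline can be sketched numerically. The decline curve, rate constant and the simplifying assumption that the improved practice merely halts losses are all illustrative choices, not values from the trials discussed in this letter:

```python
import math

def soc_conventional(t, c0=60.0, c_eq=40.0, k=0.05):
    """First-order decline toward a new, lower equilibrium after
    cultivation (illustrative parameters only)."""
    return c_eq + (c0 - c_eq) * math.exp(-k * t)

# Start identical trials at different times after clearing (cf. points A, B, C)
for start in (2, 10, 40):
    base = soc_conventional(start)          # baseline stock at trial start
    conv_end = soc_conventional(start + 5)  # conventional keeps declining
    impr_end = base                         # assume improved practice halts losses
    rel_gain = impr_end - conv_end          # what a snapshot comparison reports
    abs_change = impr_end - base            # what the atmosphere actually sees
    print(f"start yr {start:2d}: relative gain {rel_gain:5.2f} Mg C/ha, "
          f"actual change under improved {abs_change:5.2f} Mg C/ha")
```

Under these assumptions every trial reports a positive relative gain, yet the improved treatment never sequesters any carbon: the entire difference is avoided emission from the still-declining baseline.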
Results from replicated field trials with time series data in Australia (table 1) indicate that many Australian agricultural soils are somewhere between scenarios A and B in figure 2. The relative change between treatments consistently showed the presence of a greater amount of SOC under the improved management practice; however, this was due to the conventional practice losing SOC and the improved management practice losing SOC at a smaller or insignificant rate. This situation is not unique to Australia. Recently, Senthilkumar et al (2009) found that despite nearly 100 years of conventional management, tillage trials in the northern Great Plains of the United States were still losing SOC and that 20 years of improved no-tillage management had only been able to halt the decline in SOC stocks. From an atmospheric standpoint, all three scenarios in figure 2 have the same net impact, i.e. 18.3 Mg CO2 ha−1 less CO2 in the atmosphere after five years compared to no change from the conventional management practice. However, as we will outline below, the differences between avoided emissions and actual carbon sequestration in soils are more than rhetorical and, from a carbon accounting standpoint, can have important ramifications (see section 4).

Table 1. Selected Australian field trial results comparing sequestration rates if calculated as the linear slope of SOC measurements through time (rate of change) versus the relative difference between treatments if only measured at the end of the trial period. In all comparisons, treatment #2 is the management practice which is thought to increase SOC levels over treatment #1 (e.g. no-till (NT) versus conventional tillage (CT)). Example row: Wongan Hills, WA; WW; CT versus NT; rate of change −0.40** (#1) and −0.25** (#2) Mg C ha−1 yr−1; relative difference 0.14; White (1990).
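The 18.3 Mg CO2 ha−1 figure is simply the 5 Mg C ha−1 relative gain from figure 2 converted to CO2-equivalents via the molar mass ratio of CO2 to C; a one-line check:

```python
# Convert the 5 Mg C/ha relative gain (figure 2) to CO2-equivalents
# using molar masses: CO2 ~ 44 g/mol, C ~ 12 g/mol.
C_TO_CO2 = 44.0 / 12.0          # ~ 3.67 Mg CO2 per Mg C
relative_gain_c = 5.0           # Mg C/ha over five years
co2_avoided = relative_gain_c * C_TO_CO2
print(f"{co2_avoided:.1f} Mg CO2/ha")  # 18.3, matching the text
```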

Predictive value of these results?
Do observed differences in point-in-time measurements of field trials translate to the case of adopting the new management? If the conventional management treatment was at steady state at the beginning of the trial (e.g. point C in figure 2), then we can reasonably assume that the difference was due to increased SOC stocks under the improved management system and that adoption of this practice at some point in the future will result in similar gains. This scenario forms the conceptual framework for the IPCC stock change accounting methodology (Ogle et al 2005). However, if the conventional management was not at steady state and was continuing to lose SOC (i.e. points A and B in figure 2), and the relative gain in the improved management treatment was really due to a reduction or cessation of losses, then it is uncertain whether these data can answer the posed question, because the accumulation and loss of SOC may not be symmetrical processes.
There is evidence from both mechanistic and modeling studies that SOC is typically lost more rapidly than it is gained. First, the formation of stable aggregates that retard SOC decomposition may be much slower than their destruction during tillage (Jastrow et al 1996, Six et al 2000); thus SOC stocks may not build nearly as rapidly as they appear to be lost (Balesdent et al 2000). This concept of hysteresis was demonstrated by Pankhurst et al (2002), who switched management practices after 14 years of a trial and then resampled three years later. Applying tillage to the previously NT plots resulted in large losses of SOC from the upper 10 cm of soil; however, applying NT to previously tilled plots resulted in non-significant SOC changes after three years (Pankhurst et al 2002). Second, studies of rangeland management have shown that re-establishment of healthy, diverse and productive plant communities takes much longer (up to 5-10 times longer) than the degradation of these systems (Harrington et al 1984, McKeon et al 2004), leading to potentially rapid loss of SOC but gradual gains (Hill et al 2006). Third, most multiple-pool soil carbon models (e.g. Parton et al 1987, Skjemstad et al 2004) will show greater loss rates than sequestration rates when modeling input changes over decadal timescales, because changes typically take on the order of a century to fully propagate through all the various SOC pools (figure 3). These examples all suggest that reducing SOC loss rates may be easier than actually increasing stocks.

Figure 3. At year 0, cultivation began and C inputs were halved to 2.8 Mg C ha−1 yr−1. Then at year 30, the management shifted so that C return to the soil doubled, back to pre-treatment levels. Roth-C models total organic carbon (TOC) dynamics through three pools: fast cycling (primarily particulate organic carbon, POC), slow cycling (soil humus, HUM), and inert (dominated by char). Loss of SOC is shown in solid lines, while the build-up of C (dashed lines) following the management shift is plotted in reverse (upper axis) to illustrate the hysteresis pattern of stock changes. Since the inert pool is not responsive on decadal timescales, this pool was assumed to be zero for this simulation.
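The multi-pool hysteresis argument can be illustrated with a minimal two-pool model (fast + slow, first-order decay; the inert pool is omitted, as in the Roth-C simulation described for figure 3). The rate constants and input allocation below are illustrative, not calibrated Roth-C values:

```python
def simulate(inputs_by_year, initial_input, k=(0.5, 0.02),
             alloc=(0.6, 0.4), dt=0.1):
    """Euler integration of dC_j/dt = alloc_j*I - k_j*C_j for two pools,
    starting from steady state for `initial_input`. Returns year-end totals."""
    pools = [alloc[j] * initial_input / k[j] for j in range(2)]
    trajectory = []
    for inp in inputs_by_year:
        for _ in range(int(round(1 / dt))):   # sub-annual time steps
            for j in range(2):
                pools[j] += (alloc[j] * inp - k[j] * pools[j]) * dt
        trajectory.append(sum(pools))
    return trajectory

# Halve inputs (5.6 -> 2.8 Mg C/ha/yr) for 30 yr, then restore (cf. figure 3)
inputs = [2.8] * 30 + [5.6] * 30
traj = simulate(inputs, initial_input=5.6)
start = sum(a * 5.6 / kk for a, kk in zip((0.6, 0.4), (0.5, 0.02)))
loss_30yr = start - traj[29]       # carbon lost during the low-input phase
gain_30yr = traj[59] - traj[29]    # carbon regained after inputs restored
print(f"lost {loss_30yr:.1f} vs regained {gain_30yr:.1f} Mg C/ha over 30 yr")
```

Because the slow pool equilibrates over roughly half a century, the carbon regained in 30 years of restored inputs is substantially less than the carbon lost in the preceding 30 years, reproducing the asymmetry the text describes.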

Carbon accounting implications
Following the passage of the Marrakech Accords (UNFCCC 2001), biospheric carbon sinks and sources can be included in attempts to meet national emission reduction obligations under article 3.4 of the Kyoto Protocol (UNFCCC 1992). Carbon emissions and removals for article 3.4 activities most pertinent to the agricultural sector need to be accounted for on a 'net-net' basis; meaning that net emissions and removals from activities during a commitment period are to be compared to net emissions and removals in a base year (Schlamadinger et al 2007). Accounting for net uptake or release of CO 2 from soils presents a unique challenge for the AFOLU (Agriculture, Forestry and Other Land Uses) sector because accurate measurements of net fluxes are extremely difficult, thus necessitating monitoring changes in SOC stocks instead of net emissions. The analysis of field trial data in sections 2 and 3 leads to several important considerations for carbon accounting in soils.
Currently, the agreed-upon IPCC Guidelines for National Greenhouse Gas Inventories (IPCC 2006) organize accounting methodology into a three-tiered system depending on the level of desired detail and availability of suitable information. In tier I and II accounting approaches, annual SOC stock changes are calculated as the difference between SOC stocks in the last year of the inventory period (t = 0) and at the beginning of the inventory period (t = 0 − T), divided by the inventory time period (T, default is 20 years):

ΔC = (SOC_0 − SOC_(0−T))/T.

At each time point, stocks are calculated by multiplying a reference stock value (IPCC default values for tier I and country/region specific values for tier II) by a series of stock change factors (F_LU = land use, F_MG = management regime, F_I = input of organic matter), which are assumed to have linear effects for 20 years before reaching a new equilibrium (IPCC 2006):

SOC = SOC_REF × F_LU × F_MG × F_I.

The default stock change factors, which represent the ratio of the SOC content under an applied management practice to the SOC content of the baseline or conventional practice after 20 years of implementation, termed the response ratio, were derived by Ogle et al (2005) from a global data set composed of point-in-time measurements of field trials. In order to maximize sample size in this linear mixed-effects modeling analysis, all sample points in studies with time series data were included, with the time-dependent interaction between repeated measures accounted for using random effects. While maximizing the statistical robustness of the model, this data treatment lost the ability to track whether or not the baseline management SOC stocks were changing, and thus the ability to distinguish between sequestration and emissions avoidance.
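The tier I/II stock change calculation described above reduces to two small functions. The factor values used below are illustrative placeholders, not the IPCC (2006) default values:

```python
def soc_stock(soc_ref, f_lu, f_mg, f_i):
    """SOC per unit area = SOC_REF * F_LU * F_MG * F_I (tier I/II form)."""
    return soc_ref * f_lu * f_mg * f_i

def annual_stock_change(soc_t0, soc_t0_minus_T, T=20.0):
    """Delta C = (SOC_0 - SOC_(0-T)) / T, in Mg C/ha/yr."""
    return (soc_t0 - soc_t0_minus_T) / T

soc_ref = 60.0                                  # Mg C/ha reference stock
before = soc_stock(soc_ref, 0.8, 1.0, 1.0)      # cultivated, conventional tillage
after = soc_stock(soc_ref, 0.8, 1.1, 1.0)       # same land use, improved F_MG
print(annual_stock_change(after, before))       # Mg C/ha/yr over the 20 yr period
```

Note that any positive management factor yields a positive annual change here, regardless of whether the baseline itself was declining, which is precisely the limitation the following paragraphs examine.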
In terms of net-net accounting for SOC change in national GHG inventories, differentiating between sequestration and emissions avoidance should not be important. However, as we will show in the following paragraphs, current tier I and II good practice accounting methodologies will overestimate the net emissions reduction in the case of emissions avoidance.
Under the IPCC guidelines, SOC stocks following all management changes, including initial cultivation of native soil, are assumed to stabilize at a new steady state after 20 years (IPCC 2006). Thus the baseline condition is always assumed to be at steady state and any positive change in SOC stocks resulting from a management shift is treated as sequestering atmospheric carbon. For many soil types, especially clay-rich soils high in SOC, the 20 year timeframe is likely too short to re-establish equilibrium following initial cultivation of the land, and some conventionally managed soils may continue to lose carbon for 50-100 years (Dalal and Mayer 1986, Senthilkumar et al 2009). Additionally, modern agriculture is undergoing rapid changes and what is considered 'conventional practice' today is often completely different than two decades ago. The overall effect of these incremental changes in conventional practice on SOC stocks is not known, but results such as those presented in table 1 suggest that a sustained loss of SOC may be common.
An implicit assumption in calculating SOC responses as ratios is that the stock change is proportional to the size of the stock. Carbon losses due to enhanced decomposition should be proportional to the amount of SOC present, especially the amount of C in fast cycling pools (i.e. figure 3). However, carbon gains will primarily scale with inputs (i.e. residue return from plant production), which are mostly decoupled from the size of the current SOC stock. This model of zero-order kinetics for inputs and first-order kinetics for losses is the foundation of nearly all mathematical modeling of soil carbon dynamics (e.g. Jenny et al 1949, Parton et al 1987). If, as most researchers believe, SOC sequestration is primarily driven by enhanced C inputs to the soil system (Sanderman et al 2010), then scaling responses by stock size is likely not the most appropriate methodology and, in fact, contradicts the concept of carbon saturation in soils (e.g. Six et al 2002, Stewart et al 2007). Additionally, if SOC stocks in the baseline year were not at steady state, a second important consideration with the use of response ratios is that there will be a non-symmetrical response between increases in the numerator (newly applied management) and decreases in the denominator (baseline management). If the baseline is at steady state, there will be a straightforward linear relationship between the response ratio and the relative annual stock change between management practices (figure 4). However, if the baseline management is continuing to lose carbon, then for any given response ratio, the actual stock change will be lower than for the steady state baseline scenario (figure 4). For scenarios 3, 4 and 5 from figure 1, the erroneous assumption of steady state will result in an overestimate of the actual relative carbon gain by 9, 20 and 33%, respectively.
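A toy calculation, using made-up stocks rather than values from Ogle et al (2005), shows how assuming a steady baseline inflates the credited gain whenever the conventional treatment is still losing carbon:

```python
def credited_gain(soc_init, response_ratio, T=20.0):
    """Annual gain credited if the baseline is assumed steady at soc_init."""
    return soc_init * (response_ratio - 1.0) / T

def actual_gain(soc_init, soc_conv_final, response_ratio, T=20.0):
    """Net-net annual gain when the conventional baseline has in fact
    declined to soc_conv_final over the same period."""
    soc_impr_final = response_ratio * soc_conv_final
    return (soc_impr_final - soc_init) / T

R = 1.10                              # measured response ratio after 20 years
credit = credited_gain(60.0, R)       # steady state assumed
actual = actual_gain(60.0, 55.0, R)   # baseline actually lost 5 Mg C/ha
print(f"credited {credit:.3f} vs actual {actual:.3f} Mg C/ha/yr")
```

With these numbers the steady state assumption credits roughly an order of magnitude more sequestration than actually occurred, even though the same response ratio was observed in both cases.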
Under IPCC guidelines, the recommended highest-order (tier III) accounting system for monitoring changes in SOC is a spatially explicit, detailed modeling- and/or inventory-based system (IPCC 2006). In an inventory-based approach, SOC stocks would only be followed through time, so the relative SOC gains (i.e. emission reductions) found in many field trials would not be 'seen' and only the actual change in stocks over time would be reported. If the results in table 1 were broadly applicable, there would actually be a real net increase in CO2 emissions from Australian soils of 0.4 ± 0.2 Mg CO2 ha−1 yr−1 (mean ± s.e. of the #2 treatments in table 1) over the baseline year across the land area where the improved management practices have been adopted.
From a modeling standpoint, knowledge of only the relative difference between management styles at a given point in time would not be sufficient to predict SOC stock changes. Additionally, time series data are absolutely critical for accurate model calibration (cf. Skjemstad et al 2004). Even with a robustly calibrated SOC model, one would also need to know, or be able to model, whether the baseline, when projected into the commitment period, is changing (i.e. scenarios A and B in figure 2) in order to fully account for the net-net carbon abatement occurring in soils.
Figure 4. Visualization of the differences in the relative annual SOC stock change between improved and conventional management practices compared to the corresponding response ratios (SOC_improved/SOC_conventional after 20 years under new management) for the five hypothetical field trial scenarios given in figure 1. Scenario #2 (conventional at steady state and improved management gaining SOC) represents the assumed trajectory in IPCC stock change factor accounting. Reference SOC stocks were assumed to be 60 Mg C ha−1 for this exercise.

Thus, under current IPCC recommended accounting guidelines, there may be a perverse situation where tier I and II approaches will yield emissions reduction credits, albeit with questionable accuracy, for all positive management shifts regardless of whether the shift results in net sequestration or simply a reduction in SOC losses, while SOC accounting under the more detailed tier III approach could result in net liabilities for a country if the positive change in management only resulted in a reduction in the SOC loss rate. Given that even a reduction in the loss rate of SOC is of value in terms of meeting GHG abatement targets and is consistent with the goals of net-net accounting (Schlamadinger et al 2007), this analysis suggests:

(1) Current tier I and II methodologies based on stock change factors should be re-evaluated and, at a minimum, the management factors need to be based on annual rates of SOC stock change (Mg C ha−1 yr−1) and not response ratios. If implemented well, this type of system would provide a simple means of net-net accounting for soil carbon stock change at the national scale.
(2) A tier III methodology that simultaneously tracks the actual rate of change in SOC stocks and the SOC stocks under a 'business-as-usual' scenario for the land area that has implemented a management change will likely provide the most robust estimates of the net-net emissions/removals resulting from management induced soil organic carbon stock changes.

Conclusions
Results from agronomic field trials generally show a relative gain in carbon stocks with the implementation of management practices that return or retain more of the carbon captured by growing plants. However, much of the data used to support this conclusion have been derived from point-in-time measurements, which are ambiguous as to whether the relative difference was due to net sequestration or simply a cessation of losses during the trial (i.e. an avoidance of emissions). While all of the scenarios in figure 1 represent a real net benefit to GHG abatement, we have argued here that (1) the predictive power of results from most agronomic field trials for alternative situations where these management practices have been implemented is questionable without detailed knowledge of the state of the soil carbon system; and (2) the current recommended IPCC accounting methodologies may not properly credit these activities and may indeed yield contradictory results when accounted for using tier I or II versus tier III approaches. Given that GHG credits for soil carbon sequestration will not be widely included during the first commitment period (2008-2012) of the Kyoto Protocol, there is time to develop more robust accounting systems that correctly credit agricultural management activities.