Modest capacity of no-till farming to offset emissions over 21st century

‘No-till’ (NT) agriculture, which eliminates nearly all physical disturbance of the soil surface on croplands, has been widely promoted as a means of soil organic carbon (SOC) sequestration with the potential to mitigate climate change. Here we provide the first global estimates of the SOC sequestration potential of NT adoption using a global land surface model (LSM). We use an LSM to simulate losses of SOC due to intensive tillage (IT) over the historical time period (1850–2014), followed by future simulations (2015–2100) assessing the SOC sequestration potential of adopting NT globally. Historical losses due to simulated IT practices ranged from 6.8 to 16.8 Gt C, or roughly 5%–13% of the 133 Gt C of global cumulative SOC losses attributable to agriculture reported elsewhere. Cumulative SOC sequestration in NT simulations over the entire 21st century was equivalent to approximately one year of current fossil fuel emissions and ranged between 6.6 and 14.4 Gt C (0.08–0.17 Gt C yr−1). Modeled increases in SOC sequestration under NT were concentrated in cool, humid temperate regions, with minimal SOC gains in the tropics. These results indicate that the global potential for SOC sequestration from NT adoption may be more limited than reported in some studies and promoted by policymakers. Our incorporation of tillage practices into an LSM is a major step toward integration of soil tillage as a management practice into LSMs and associated Earth system models. Future work should focus on improving process-understanding of tillage practices and their integration into LSMs, as well as resolving modeled versus observed estimates of SOC sequestration from NT adoption, particularly in the tropics.


Introduction
Agricultural practices that increase soil organic carbon (SOC) storage have been widely researched as a means of offsetting greenhouse gas emissions while also improving soil health and food security (Smith et al 2008, Stockmann et al 2013, Palm et al 2014, Lipper et al 2018, Ogle et al 2019). SOC sequestration is a key component of many countries' Nationally Determined Contributions to reducing emissions under the Paris Agreement (Richards et al 2016). Mitigating climate change through SOC sequestration on agricultural land underpins other major international initiatives, such as the '4 per 1000' program introduced at the United Nations Framework Convention on Climate Change Conference of the Parties (COP21 in 2015). As summarized by Minasny et al (2017), the '4 per 1000' program seeks to offset yearly fossil fuel emissions through an equivalent yearly increase in SOC, and 'no-till' (NT) is one of the main methods cited by the initiative for increasing SOC on agricultural land. NT is the most extreme form of conservation tillage and completely eliminates both the mechanical breakup of the soil surface through plowing and the cultivation of the soil to prepare the seedbed; planting operations and sweeping away of crop residue are typically the only forms of soil disturbance in NT systems.
Few studies have attempted to estimate cumulative global SOC gains from NT exclusively, and most studies have examined NT as part of a wider suite of agricultural management practices aimed at SOC storage (Lal 2004, Elzen et al 2013, Powlson et al 2014, Sommer and Bossio 2014). Attempts at estimating global SOC gains from NT have been hindered by uncertainty in the magnitude of gains under NT from plot-scale field studies, which show widely varying capacity of NT to increase SOC depending on experimental design and geographic context (Angers and Eriksen-Hamel 2008, Luo et al 2010, Stockmann et al 2013). Despite this uncertainty, NT agriculture continues to be endorsed as an effective climate change mitigation measure by some studies and among policymakers (Lal et al 2004, Smith et al 2008, Smith et al 2014, Soussana et al 2017, Baveye et al 2018). Several methods exist for estimating changes in SOC from NT at the global scale, though all are subject to considerable uncertainty. Most of these methods are empirical approaches that rely on linear scaling, which cannot capture non-linear changes in SOC over time (Ogle et al 2010). Process-based models, such as DayCent (Parton 1996), can simulate non-linear changes in SOC over time in a spatially explicit manner, but are limited in geographical scope and do not have the ability to represent land use change (LUC) (Jain et al 2005, Lugato et al 2018). Global land surface models (LSMs) (i.e. the land component of Earth system models) are process-based models that can overcome some of these shortcomings by comprehensively simulating non-linear biogeochemical and biogeophysical processes in a spatially explicit manner for Earth's entire land surface (Bonan and Doney 2018). Recent advances in LSMs allow for transient historical LUC and, in some instances, explicit simulation of agricultural management practices over time at global scales.
LSMs can also simulate atmospheric forcings associated with changes in climate and concentrations of greenhouse gases (Pongratz et al 2018).
In this research, we conduct the first study of its kind to use an LSM to simulate the impact of different tillage practices on SOC at the global scale. We begin by evaluating the effects of historical, conventional intensive tillage (IT) practices on cumulative changes in SOC on croplands. Subsequently, we assess the future potential for SOC sequestration and climate change mitigation via adoption of NT practices on all croplands globally over the 21st century.

Model description
The Community Land Model (CLM) is the land surface component of the Community Earth System Model (CESM). In this study, we run land-only simulations using CLM version 5.0 (CLM5). CLM5 is capable of representing land surface biogeochemistry and biogeophysics for many land surface processes, with components for simulating LUC, dynamic vegetation and phenology, hydrology, human management activities, and ecosystem dynamics (Lawrence et al 2019). CLM5 allows for LUC over time, including changes in the distributions of crops, and simulates LUC based on Land Use Harmonization version 2 (LUH2) data. CLM5 soil biogeochemistry is based on the cascading decomposition approach used in the DayCent/CENTURY model (Parton 1996), with multiple distinct litter and soil pools for representation of SOC, as well as vertically resolved soil columns to a depth of 49 m (Koven et al 2013). As in DayCent, decomposition and accumulation within the various SOC pools in CLM5 are calculated on a discrete 30 min time-step, as a function of temperature, moisture, depth, and aeration within the soil.
The CLM5 crop sub-model (CLM5-CROP) is derived from the AgroIBIS LSM and is among the most comprehensive crop sub-models within major ESMs (Pongratz et al 2018). CLM5-CROP has time-varying crop distributions derived from LUH2 data (Hurtt et al 2020) for six crop types (maize, soy, cotton, wheat, rice, sugarcane) and dynamically simulates crop growth and phenology as four phenological growth stages based on crop-specific growing degree-days (GDDs), corresponding to planting, leaf emergence, grain fill, and grain harvest (see Lombardozzi et al 2020 for additional details). The crop sub-model simulates important management practices, including fertilization and irrigation, although it does not account for historical changes in crop breeding or other advances in agricultural management (Lombardozzi et al 2020). Additionally, all plant matter except for grain is returned to the soil column; grain C and N pools are transferred to the atmosphere over 1 year. Each crop type is allocated a separate soil column for irrigated and rain-fed crops, both of which are separate from the natural vegetation column; this precludes crop management practices from spilling over and indirectly impacting natural vegetation. Fertilization (N only) is applied in CLM5 by adding N directly to the soil mineral N pool and is prescribed in a spatially explicit fashion by crop type according to transient LUH2 data for fertilizer use globally (Lawrence et al 2016). Irrigation is applied to irrigated crop fractions at 6:00 am local time daily if soil moisture falls below a specified minimum value (Portmann et al 2010). More comprehensive descriptions of the CLM5-CROP model and associated processes can be found in the literature.

Tillage implementation
We simulated different tillage practices as a proportional change in decomposition rates of existing SOC litter and soil C pools within CLM5. Since CLM5 biogeochemistry and SOC pool structure are based on DayCent (Parton 1996), we simulated tillage by applying decomposition multipliers that had been calibrated and validated in previous DayCent studies simulating the influence of tillage on SOC pools in the top 20 cm of the soil profile (Hartman et al 2011, Parton et al 2015). Our implementation of multipliers in CLM5 model code was based on preliminary work on tillage in CLM4.5 (Levis et al 2014).
Tillage practices are variable in space and time, and more recent, industrialized cropping systems tend to use more IT practices (i.e. increased decomposition rates) compared to historical and subsistence tillage practices. Although spatially and temporally variable tillage practices would make our simulations more realistic (Prestele et al 2018, Porwollik et al 2019), this is not possible within the current CLM5 model setup. We therefore applied uniform multipliers at two levels of intensity (moderate, high) to capture the potential range of SOC changes due to subsistence and industrialized tillage, and to assess the sensitivity of the model to varying levels of enhanced decomposition due to IT. Values for decomposition multipliers varied by SOC pool in both the 'high' and 'moderate' implementations of tillage practices (table 1). Decomposition multiplier values for 'high' intensity tillage were based on DayCent version 4.5 simulations conducted for the U.S. Great Plains region and are intended to represent high intensity tillage systems in more recent, industrialized cropping systems (Hartman et al 2011). The decomposition multipliers for 'moderate' intensity tillage were derived from default multipliers for DayCent version 4.0, and are intended to represent the comparatively lower tillage intensity in historical, subsistence, and non-mechanized cropping systems (Metherell et al 1993, Manies et al 2000, Leite et al 2004, Chang et al 2013). The adoption of NT was simulated in an idealized manner by 'turning off' the 'high' and 'moderate' enhanced decomposition of SOC associated with IT. The practical implementation of 'NT' practices in CLM5 simply turns tillage multipliers 'off' by setting decomposition scalars equal to one.
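The scheme above can be sketched in a few lines: tillage scales the first-order decomposition rate constant of an SOC pool by a fixed multiplier, and NT simply sets that scalar back to one. The multiplier values below are hypothetical placeholders for illustration only, not the calibrated DayCent values in table 1.

```python
import math

# Tillage as multiplicative scaling of first-order SOC decomposition,
# in the spirit of the DayCent-derived multipliers described above.
# Multiplier values are illustrative placeholders, not table 1 values.
TILLAGE_MULTIPLIERS = {
    "no_till": 1.0,   # NT: decomposition scalar set to one
    "moderate": 1.3,  # hypothetical moderate-intensity value
    "high": 1.8,      # hypothetical high-intensity value
}

def decay_step(stock_gC, base_k_per_day, dt_days, tillage="no_till"):
    """Advance one SOC pool over dt_days of first-order decay,
    with tillage scaling the base decomposition rate constant."""
    k = base_k_per_day * TILLAGE_MULTIPLIERS[tillage]
    return stock_gC * math.exp(-k * dt_days)
```

With identical inputs, a higher multiplier always yields a lower remaining stock, which is how the IT treatments lose SOC relative to the control and how 'turning off' the multiplier allows stocks to recover.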
In all cases, we represented soil disturbance from tillage in the CLM as three discrete, sequential events, with each event corresponding to individual tillage management practices that are common across many cropping systems. These three events were intended to simulate 'primary tillage', 'cultivation', and 'planting and weeding'. Here 'primary tillage' refers to major disturbance of the soil and incorporation of crop residue conducted prior to planting in order to prepare the soil; 'cultivation' corresponds to lower intensity soil disturbances following 'primary tillage' with the goal of removing weedy vegetation and creating a uniform seedbed; 'planting and weeding' consists of a final event wherein crops are planted and additional, low-intensity clearing of weedy vegetation is performed.
Each of the three main tillage events occurs in sequence annually for each crop type in CLM5. Following crop planting in the model, the increase in decomposition rates for cropped soils in CLM5 is implemented for a period of 75 d, an interval which largely agrees with the literature on the time period over which enhanced decomposition from tillage is effective (Abdalla et al 2013, 2016) and is similar to the timeframe used in the DayCent tillage representation. Planting in CLM5 occurs based on GDD thresholds, and we implement tillage practices at the same GDD threshold as planting, which is the first GDD threshold available for the crop model. Although this is potentially unrealistic because tillage events generally occur before planting in most cropping systems, our implementation of IT in the CLM5 code is more robust to future changes in model structure related to planting date thresholds for CLM5-CROP. We limited enhanced decomposition due to tillage to the top 26 cm of each crop-specific soil column in the model, since the 'plow layer' is generally considered to be between 25 and 30 cm deep in most cropping systems. Additional model testing during development verified that the magnitude of SOC changes was not sensitive to the timing of tillage initiation, although we did not conduct formal sensitivity experiments in this regard.
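The timing and depth constraints above reduce to a simple gating rule: the tillage multiplier is active only within 75 days of the GDD-triggered planting event and only for soil layers above 26 cm. A minimal sketch of that rule (function and constant names are our own, not CLM5 identifiers):

```python
TILLAGE_WINDOW_DAYS = 75     # enhanced decomposition lasts 75 d
MAX_TILLAGE_DEPTH_M = 0.26   # limited to top 26 cm of the column

def tillage_scalar(days_since_planting, layer_depth_m, multiplier):
    """Decomposition scalar for one soil layer: the tillage
    multiplier applies only within 75 days of the GDD-triggered
    planting event and only above 26 cm depth; otherwise the
    scalar is one (no enhancement, equivalent to NT)."""
    in_window = 0 <= days_since_planting < TILLAGE_WINDOW_DAYS
    in_plow_layer = layer_depth_m <= MAX_TILLAGE_DEPTH_M
    return multiplier if (in_window and in_plow_layer) else 1.0
```

Outside the window, or below the plow layer, the scalar reverts to one, so deep SOC and off-season decomposition are unaffected by the tillage treatment.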

Data and experimental design
CLM simulations were conducted in offline mode at 1.9° × 2.5° resolution. All simulations used CLM5 component sets with active carbon and nitrogen biogeochemistry and prognostic crops (CLM5.0-BGC-CROP). Since the CLM was not fully coupled, all historical simulations for the 1850-2014 interval used prescribed atmospheric forcing from the Global Soil Wetness Project (GSWP) (Dirmeyer et al 1999). We ran two historical simulations with IT treatments at two different levels of intensity (high, moderate) based on DayCent-derived decomposition multipliers over the historical time period (1850-2014) (table 2), as described in section 2.2. A historical control simulation without tillage practices (i.e. CLM5 default settings) was run for the entire 1850-2014 historical interval in order to provide a reference for computing relative changes in SOC for the IT simulations. Simulations with IT treatments and the control were initiated following model stabilization (i.e. spinup) to 1850 land use distribution and equilibrium SOC conditions, with decomposition multipliers for IT simulations going into effect beginning in 1850. Historical IT treatment simulations applied enhanced decomposition to all croplands globally between 1850 and 2014. The land surface area in cropland in historical IT and control simulations evolved over time based on prescribed, historical LUC. Tillage practices in each simulation were implemented on all cropped areas globally, including land area in fruits, vegetables, and root crops, in order to assess the maximum potential SOC sequestration of NT adoption globally and to determine model sensitivity in this regard.
Future simulations were conducted by extending the two historical IT treatment and control simulations over the 2015-2100 time period, and two new future simulations were conducted to simulate NT adoption over the 2015-2100 interval. NT treatment simulations were created by branching new simulations from both the IT 'high' and 'moderate' intensity historical simulations, but with tillage decomposition multipliers 'turned off' (i.e. set equal to one). To understand the future mitigation potential of NT on current cropland, we assumed that global adoption of NT practices occurred on all cropland simultaneously beginning in 2015 and remained in place until 2100. All future simulations held crop area constant at 2015 distributions. We therefore assumed that NT adoption and continuing IT would occur exclusively on existing cropland, with no future cropland expansion, because of uncertainty about the geographic distribution of future LUC to crops, which would add complexity to model assumptions (Prestele et al 2016). Because the adoption of NT practices is relatively recent and its current distribution is relatively small (Smith et al 2014, Kassam et al 2019), we include the impacts of conversion to NT on SOC only in the future simulations. Changes in SOC due to differences in tillage practices between simulations were calculated by subtracting simulated quantities of SOC in treatment simulations from the corresponding quantity in the control simulation, which did not include tillage. CO2 equivalents for changes in SOC were calculated by multiplying the quantity of carbon by 3.67, based on IPCC guidelines (den Elzen et al 2013). Code and data for this study can be found at the following locations on GitHub: https://github.com/mwgraham/ctsm/tree/mwgraham_ctsm_tillage_residue_harv_branch; https://github.com/mwgraham/clm_tillage_project_data.
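The factor 3.67 used above is the molar mass ratio of CO2 to C (44/12), so the C-to-CO2e conversion reported throughout the results is a single multiplication:

```python
# CO2 equivalents from carbon mass: 3.67 is the molar mass ratio
# of CO2 (44 g/mol) to C (12 g/mol).
CO2_PER_C = 44.0 / 12.0

def gt_c_to_gt_co2e(gt_c):
    """Convert a carbon mass in Gt C to Gt CO2 equivalents."""
    return gt_c * CO2_PER_C
```

Applied to the historical loss range of 6.8-16.8 Gt C, this recovers the reported 25.0-61.7 Gt CO2e (to rounding).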

Historical impact of intensive tillage practices on soil carbon
Implementation of IT over the historical time period in our simulations resulted in cumulative carbon losses of 6.8-16.8 Gt C (25.0-61.7 Gt CO2e) relative to the control simulation. This loss of soil carbon over 165 years was equivalent to roughly one year of current fossil fuel emissions of 9.8 Gt C (circa 2015) (Le Quéré et al 2018). SOC losses strongly depended on the level of tillage intensity (figure 1). SOC stocks began declining shortly after implementation of tillage practices in 1850, and dropped continuously through the 20th century in association with increasing LUC to croplands during this time period (figures A1 and A2 (available online at stacks.iop.org/ERL/16/054055/mmedia)). Total historical losses due to IT are equal to approximately 5.1%-12.6% of estimated cumulative SOC losses due to all agricultural practices globally, which have been estimated elsewhere at 133 Gt C (Sanderman et al 2017). There has been virtually no quantification from field studies of the proportion of SOC losses attributable to tillage alone compared to other agricultural management practices following LUC to crops, as IT is usually confounded with other management practices in such studies (Chatskikh et al 2009). The overall literature does indicate that tillage may have lower impacts on SOC compared to management practices that alter biomass inputs during LUC to cropland (e.g. crop biomass harvest) (Don et al 2011, Virto et al 2012, Fujisaki et al 2015). However, the numbers from the 'high' simulation in our study are similar to those found in another LSM study using LPJ-GUESS for multiple crop management practices over the historical time period, which found that SOC losses from historical IT practices were 18 Gt C and accounted for 8% of cumulative losses over the historical time period (Pugh et al 2015).
Historical cumulative losses of SOC stocks per hectare were unevenly distributed geographically (figures 2(a) and (b)), with the highest SOC losses concentrated in North America, Europe, and Northeast China. The geographic distribution and magnitude of declines in SOC stocks are within the range (i.e. <20% of estimated SOC losses per hectare, except in North America where losses are >50%) of those reported in a spatially explicit analysis of changes in SOC due to historical agricultural practices using a data-driven approach (Sanderman et al 2017). For instance, Sanderman et al (2017) report losses >60 Mg ha−1 in Western Europe, while our results show 10-15 Mg ha−1. Similarly, temperate regions of South America with high shares of cropland (e.g. Southern Brazil, Uruguay, Argentina) had estimated losses of 30-50 Mg ha−1 for all management practices in the foregoing study, whereas our results indicate losses of 5-10 Mg ha−1 for tillage impacts alone. Areas with high per hectare losses accounted for a disproportionate share of total carbon losses: the top quintile of cropped areas (i.e. the 20% with the highest per hectare losses of SOC) accounted for >76% of the total cumulative SOC losses due to IT globally, regardless of tillage simulation (figure 2(c)). The geographical pattern of historical SOC losses appears to be driven by an interaction between initial SOC stocks, underlying background decomposition rates, and climate. Locations with large losses were generally in cool, moist temperate regions with high initial SOC and correspondingly low underlying decomposition rates in CLM5 (figure A3). This indicates that areas with high initial SOC may have more SOC to lose, and that increases in decomposition rates due to tillage may have greater impact on SOC in these environments.
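The top-quintile statistic above can be computed by ranking grid cells on per-hectare loss and summing the totals of the top 20%. A minimal sketch with made-up numbers (the function and data are illustrative, not the study's analysis code):

```python
def top_quintile_share(per_ha_loss, cell_total_loss):
    """Fraction of total SOC loss contributed by the 20% of
    cropped grid cells with the highest per-hectare losses.
    Inputs are parallel sequences: per-hectare loss (Mg C ha-1)
    and total loss per cell (e.g. Gt C)."""
    order = sorted(range(len(per_ha_loss)),
                   key=lambda i: per_ha_loss[i], reverse=True)
    n_top = max(1, len(order) // 5)  # top 20% of cells
    top_loss = sum(cell_total_loss[i] for i in order[:n_top])
    return top_loss / sum(cell_total_loss)
```

When a few cells combine high per-hectare losses with large cropped area, as in the temperate regions discussed above, this share can easily exceed the >76% reported for the IT simulations.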

Potential for soil carbon sequestration through conversion to 'no-till' practices
Relative to continuing IT practices, simulated global adoption of NT beginning in 2015 resulted in an accumulation of 6.6-14.4 Gt C (24.2-52.8 Gt CO2e) over the 21st century (2015-2100), or 0.08-0.17 Gt C yr−1 (0.28-0.62 Gt CO2e yr−1), under RCP8.5 forcings. The magnitude of increases in cumulative SOC stocks depended heavily on the pre-existing carbon losses from the corresponding historical IT simulations; larger historical SOC losses allowed for greater SOC gains under NT (figure 1). Global SOC stocks in NT simulations increased logarithmically, with diminishing returns over time, throughout the 21st century, but cumulative gains in SOC stocks from future conversion to NT (6.6-14.4 Gt C) did not fully compensate for the historical losses due to IT. This may be because the interval over which NT simulations were run (85 years) was much shorter than that for IT historical simulations (165 years), and simulating NT beyond the 21st century timeframe employed here could result in full recovery of historical SOC losses.
Cumulative increases in SOC stocks per hectare, computed as the difference between NT and corresponding IT simulations, exhibited considerable variability in their geographical distribution and roughly mirrored losses under historical IT practices (figure 3). The highest SOC sequestration per hectare occurred in temperate regions of North America and Eurasia, where increases in SOC stocks for some locations in the 'high' simulation exceeded 25 Mg C ha−1 (>92 Mg CO2e ha−1) (figure 3(b)). This was equivalent to an average annual rate of >0.30 Mg C ha−1 yr−1 (>1.1 Mg CO2e ha−1 yr−1) for cropped areas in the top 5% of grid cells (figure A4). The large cumulative increases in SOC stocks per hectare in these areas also represented a large proportion of cumulative global SOC sequestration from NT, closely paralleling proportional historical carbon losses by quintile (figures 2 and A5). Tropical regions of Africa and Asia had the smallest increases in SOC stocks, <1 Mg C ha−1 (<3.67 Mg CO2e ha−1) in the 'moderate' simulation, with correspondingly minimal annual rates of SOC storage (<0.01 Mg C ha−1 yr−1; <0.04 Mg CO2e ha−1 yr−1). Minimal gains in the tropics are likely due to a combination of low initial SOC levels and relatively high underlying decomposition rates. This geographic distribution of cumulative SOC accumulation per hectare suggests, first, that SOC gains due to NT are driven by the same factors as those for losses, namely modeled initial SOC stocks and the effects of parameter changes to background decomposition rates with tillage implementation. Further, the response to NT (i.e. modeled changes in decomposition rates) indicates that soils with more SOC to lose may also have the greatest capacity to regain lost SOC through improved management (Stewart et al 2008, Castellano et al 2015).
SOC sequestration rates attributable to NT for most locations in our study fall within the range of estimates from recent meta-analyses of SOC gains by climatic zone (0.06-0.54 Mg ha−1 yr−1; 0.22-1.98 Mg CO2e ha−1 yr−1) (Smith et al 2008, Ogle et al 2019). However, rates of SOC sequestration in the tropics found in our study (<0.01-0.05 Mg C ha−1 yr−1, depending on simulation) were well below those reported in the aforementioned meta-analyses (0.35-0.54 Mg ha−1 yr−1; 1.28-1.98 Mg CO2e ha−1 yr−1). It is therefore possible that SOC gains simulated by CLM are underestimated in tropical regions and that our implementation of tillage as an idealized change in decomposition rates is overly simplistic. Previous studies have shown that CLM5 biogeochemistry performs reasonably well when comparing observed versus modeled SOC stocks and SOC turnover times globally (Lawrence et al 2019). CLM5 simulates SOC stocks with a high degree of accuracy for most regions and performs well in the tropics, since its low scores for bias and spatial distribution are attributable to poor fit between modeled and observed SOC at very high latitudes with minimal cropland (Lawrence et al 2019). Similarly, Lawrence et al (2019) reported low error for modeled versus observed SOC turnover time in CLM5 globally, and modeled SOC turnover times in tropical regions showed considerable overlap with observed values. On the other hand, the degree of uncertainty and variability is high in meta-analyses of the effects of NT on SOC in the tropics, and many individual studies report minimal or no increases (Powlson et al 2016). This uncertainty may be higher in the tropics than in temperate regions because relatively few tropical studies have examined changes in SOC from NT adoption across varying edaphic and climatic conditions, or to depths below 30 cm in the soil profile (Ogle et al 2019). This implies that major field research is required to measure changes in SOC (to depths of ⩾1 m) under different tillage practices in order to reduce uncertainty and compare modeled versus observed data on the effects of tillage on SOC in the tropics.
With respect to cumulative changes in SOC stocks, we found very large increases for multiple locations in humid, temperate regions. Cumulative increases in SOC from NT in North America (>25 Mg C ha−1 in the 'high' simulation) over the 21st century surpass the most optimistic estimates of attainable cumulative increases in SOC stocks from field studies for these locations (i.e. 0-10 Mg ha−1) (Hollinger et al 2005, Syswerda et al 2011, Lal 2015). This may be because SOC stocks in these locations in North America increase logarithmically and never reach equilibrium over the 21st century (figure A6). As discussed above, CLM5 biogeochemistry for SOC stocks and turnover times performs well globally except at very high latitudes and fits well with observed values for most humid, temperate regions. This could indicate that our idealized implementation of tillage practices as changes in decomposition rates may be unrealistic and that more complex modeling may be needed to capture changes in SOC associated with NT practices. Unrealistic increases in SOC stocks from NT for cropped areas in North America may signal that the multipliers used to simulate enhanced SOC decomposition rates in the 'high' IT historical simulation are too high, and that the associated magnitude of historical losses in SOC due to IT practices may also be unrealistic. Since a disproportionate share of cumulative, global increases in SOC are concentrated in humid temperate areas, cumulative gains in simulated SOC from NT adoption may be overestimated in the 'high' simulation in this study.

Conclusions and recommendations
We used an idealized approach to modeling tillage practices, examining only changes in decomposition rates, and found that SOC storage may be overestimated in humid temperate regions and underestimated in the tropics compared to values reported in the literature. These discrepancies could be due to large uncertainties in field estimates of SOC sequestration under NT reported in the literature. There is additional uncertainty associated with the parameterization of underlying biogeochemical processes and SOC dynamics in the CLM, along with other LSMs and process-based models more generally, since representation of soil decomposition is relatively simplistic at present (Wieder et al 2018). Alternatively, this may indicate that our idealized implementation of tillage as a simple change in decomposition rates does not capture more complex interactions between different tillage practices and SOC stocks (Virto et al 2012, Stockmann et al 2013, Pittelkow et al 2015). The factors governing tillage impacts on SOC are complex, and our study did not consider possible impacts of tillage on soil erosion, changes in crop residue cover, and soil moisture (Davin et al 2014, Bagley et al 2015, Erb et al 2017). We did not investigate the potential impact of NT on N2O emissions, which other modeling studies have shown could offset gains in SOC under NT (Lugato et al 2018). Further, while comparing modeled versus observed values of SOC storage under NT provides a useful check on modeled SOC changes, we caution against strict one-to-one comparison between observed present-day values of SOC changes and those simulated here under RCP8.5, since simulated climatic conditions differ radically over the 21st century in RCP8.5. Nonetheless, our results are within the range of observational estimates for most locations under simulated future climate (table A1).
Modeled results indicated that SOC sequestration due to idealized NT adoption over the 21st century was equivalent to approximately one year of present-day fossil fuel emissions (6.6-14.4 Gt C), and that SOC sequestration was disproportionately concentrated in cool, humid temperate regions. Modeled annual rates of SOC sequestration, which we consider optimistic for the 'high' simulation (0.17 Gt C yr−1; 0.62 Gt CO2e yr−1) due to unrealistic gains in North America, are more than an order of magnitude less than '4 per 1000' program objectives. These initial results imply that major investments in additional management practices may be required to achieve program goals, and that the global capacity of NT practices to offset current emissions through SOC sequestration may be more limited than previously anticipated (Lal 2004, den Elzen et al 2013, Smith et al 2014, Minasny et al 2017). Despite using dramatically different methods, at the global scale these results are congruent with a growing body of evidence from field trials demonstrating that NT is unlikely to be a 'silver bullet' policy tool for increasing SOC on agricultural land in most locations (Powlson et al 2014, Baveye et al 2018). However, further work is needed to reconcile modeled versus observed changes in SOC due to NT adoption for some locations, particularly in tropical regions. LSMs and Earth system models are increasingly used to evaluate the global-scale effects of varying agricultural and other land management practices on climate change mitigation and adaptation. Our results provide the first global estimate, using an LSM, of how tillage practices may contribute to climate mitigation efforts and represent an important, if idealized, first step in integrating soil tillage as a management practice into an LSM and associated Earth system model.
More realistic simulation of tillage practices in CLM and ESMs more generally will require development and incorporation of spatially explicit datasets of tillage intensity that evolve over time, but this will require major changes to existing model infrastructure. Such spatially and temporally explicit management has been incorporated into CLM for crop fertilization and irrigation, but not for other crop management practices to this point. Future work aimed at improving our understanding of other processes associated with different tillage practices and their implementation into LSMs could help further refine these results to obtain more accurate predictions of the global SOC sequestration potential of NT adoption.

Data availability statement
The data that support the findings of this study are openly available at the following URL/DOI: https://github.com/mwgraham/clm_tillage_project_data. Data will be available from 31 March 2021.

Acknowledgments
The CESM project is supported primarily by the National Science Foundation (NSF). This material is based upon work supported by the National Center for Atmospheric Research (NCAR), which is a major facility sponsored by the NSF under Cooperative Agreement No. 1852977. Computing and data storage resources, including the Cheyenne supercomputer (doi: 10.5065/D6RX99HX), were provided by the Computational and Information Systems Laboratory (CISL) at NCAR. We thank all the scientists, software engineers, and administrators who contributed to the development of CESM2.
Financial support for this research was provided in part by U.S. Dept. of Agriculture-National Institute of Food and Agriculture Project #2015-67003-23485, EASM-3: Decadal Prediction of Sustainable Agricultural and Forest Management. This work was also supported in part by the U.S. Dept.

Author contributions
M G developed the research concepts, conducted modeling and analysis, and led writing on the publication. R Q T and D L provided training for modeling and analysis, feedback on writing and research concepts, and financial support. M O provided feedback on writing and development of research concepts, as well as financial support.