Uncertainty analysis of the Limits to Growth model: sensitivity is high, but trends are stable

Since publication, the Limits to Growth model has received both praise and criticism. One criticism concerns the model's sensitivity to input error. We have performed an uncertainty analysis to see in retrospect whether the model's sensitivity was of concern. The results show that the standard deviations of output variables are high. The general trends of the variables, however, are predictable, with very similarly shaped trend lines. Trajectories indicating a favourable future for humankind (i. e., without a severe decline in population and resources) stay in areas of low probability.

Uncertainty analysis is an important step in determining the reliability of a model. Models which are used to determine policies or guide decisions must be reliable to ensure sound choices are made. The Limits to Growth model by Donella Meadows and colleagues was one of the first computer models to investigate global issues of population growth and resource constraints. The model received much attention and criticism, sometimes being accused of being too sensitive to variations in input parameters. This paper studies the model's sensitivity to input error through an uncertainty analysis, and examines whether this sort of analysis could have affected the debate surrounding the model's reliability and usefulness. Results showed that, given the data used to calibrate the model, the output was susceptible to large variations, with the population variable returning a normalised standard deviation of 0.43. However, despite input error, the trends of the variables remain predictable.

Forrester's World Dynamics Model, which formed the basis of the book The Limits to Growth (Meadows et al. 1972), has received significant attention since the book's publication in 1972. The model is often referred to as the World3 (W3) model. It combines concepts from demography, economics, agriculture, and technology (Austin and Cottler 1977). The model's primary objective is to analyse the dynamic relationships between the world's human population and its resources, where the definition of resources is extended to man-made capital (Meadows et al. 1974). The model can be broken up into five main sectors: population, capital, agriculture, resources, and pollution. All sectors are connected via feedback loops, making the model dynamic over time.
The purpose of the model was to provide qualitative projections of the dynamic behaviour of the world system. This was due to the unpredictable nature of social systems and also the lack of data available to researchers at the time. It was argued that while imprecise, the knowledge gained about the behaviour of the system would still be valuable for policy makers. The authors thought that decisions regarding population control, energy consumption, and investment in new technologies would be better informed by studying the model (Meadows et al. 1974).
The parameters in the model were derived by the collection and analysis of a large amount of published data. The details of the data, data analysis process, and justifications for parameter selection were published in a technical book (Meadows et al. 1974).
The results of the model showed current trends of that time would, if continued, lead to a sharp deterioration of the global community's welfare. Resources would become scarce and the carrying capacity would decline, in turn bringing about a sharp reduction in the human population (Meadows et al. 1972). This scenario was dubbed the standard run or business as usual.
To test the behaviour of the model, parameters were changed to simulate different social behaviours, resource conditions, environmental policies, or technological advances. A set of ten scenarios were presented in the book The Limits to Growth (Meadows et al. 1972). While each change in parameters resulted in different output, the general shape of trajectories and outcomes closely resembled the standard run, unless drastic changes were made in a specific manner. An example of a drastic change was to limit the number of children per family to two and to set capital investment to equal capital depreciation.
The W3 model underwent severe scrutiny when it was first published. Bardi describes the reaction to the study as "harsh", with many critics dismissing it as flawed (Bardi 2011). The debate surrounding the model quickly turned from scientific to political (Schoijet 1999). Often the sensitivity of the model would be used as an argument as small changes to parameters could produce large changes in the output (Trainer 1999).
Several sensitivity analysis (SA) studies were performed on the W3 model. The team that developed the W3 model intuitively tested it by manually varying parameters and inspecting the changes in output. Many other researchers conducted their own tests in the same fashion as Meadows' team (see Castro 2012 and references therein). The contributions of Austin and Cottler (1977), Vermeulen and de Jongh (1976), and de Jongh (1978) are clear examples of methodical local SA.
Vermeulen and de Jongh showed that small changes in parameters could produce drastic differences to the output of the W3 model (Vermeulen and de Jongh 1976). Their study noted that "[b]y changing three parameters by 10 % each in 1975 the world population collapse predicted by the model is averted", thus questioning the relevance of the W3 results. Austin and Cottler argued that if a model is sensitive to small variations in input parameters then the output is of "little practical interest" (Austin and Cottler 1977, p. 16).
In their study, Vermeulen and de Jongh found that some of the most sensitive parameters were industrial capital output ratio (ICOR), average lifetime of industrial capital (ALIC), fraction of industrial output allocated to consumption (FIOAC), life expectancy normal (LEN), reproduction lifetime (RL), desired complete family size normal (DCFSN), land yield factor (LYF), land fraction harvested (LFH), and inherent land fertility (ILF) (Vermeulen and de Jongh 1976, p.30). This was achieved through local SA, that is, altering each parameter by a small percentage and comparing the percentage change of the main variables. These parameters were then changed in combination and by ten percent to examine plausible responses of the model. Figure 1 shows some of the results obtained by Vermeulen and de Jongh (1976).
The study by Austin and Cottler again employed local SA to determine the sensitive parameters (Austin and Cottler 1977). Once identified, the authors discussed the potential of these parameters being changed in the real world to help alleviate problems. More recent work along these lines has been performed by Danós et al. (2017).
While the results of previous studies provided increased insights into the accuracy and behaviour of the model, the model's true uncertainty was largely unknown. An uncertainty analysis was required to fully understand the model's susceptibility to input error.
A very common method for testing the uncertainty of a model is via Monte-Carlo analysis. With this method Probability Density Functions (PDFs) are established for each model input. The model is executed a large number of times. Each time the input PDFs are sampled and the outputs recorded. Once all executions are completed, statistical measurement regarding uncertainty can be derived by analysing the recorded outputs (Saltelli et al. 2008).
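As a concrete illustration of this procedure, the sketch below runs a Monte-Carlo loop over a deliberately simple one-parameter toy model (a logistic growth curve), not the W3 model itself; the growth constants and distribution parameters are invented for the example.

```cpp
#include <cmath>
#include <random>
#include <vector>

// Toy stand-in for a model run: logistic growth whose carrying
// capacity K is the uncertain input parameter (illustrative only).
double toy_model(double K) {
    double pop = 3.6e9;                       // 1970-ish world population
    for (int year = 0; year < 130; ++year)    // simulate 1970-2100
        pop += 0.03 * pop * (1.0 - pop / K);  // logistic growth step
    return pop;                               // population at end of run
}

struct Stats { double mean, sd; };

// Monte-Carlo uncertainty analysis: sample the input PDF, execute the
// model for each sample, then derive statistics from the recorded outputs.
Stats monte_carlo(int runs, double k_mean, double k_sd, unsigned seed) {
    std::mt19937 rng(seed);
    std::normal_distribution<double> k_pdf(k_mean, k_sd);
    std::vector<double> out;
    out.reserve(runs);
    for (int r = 0; r < runs; ++r)
        out.push_back(toy_model(k_pdf(rng)));
    double sum = 0.0;
    for (double v : out) sum += v;
    double mean = sum / runs;
    double ss = 0.0;
    for (double v : out) ss += (v - mean) * (v - mean);
    return {mean, std::sqrt(ss / runs)};
}
```

With one uncertain input the output spread maps directly onto the input spread; in the W3 analysis the same loop runs over all parameter PDFs simultaneously.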
A literature review, along with comments by Turner (2013), indicates that to date, an uncertainty analysis has not been performed on the W3 model. The purpose of this study is to undertake an uncertainty analysis of the W3 model to better determine its sensitivity to input error, and investigate the validity of past concerns about the model's sensitivity. In addition, we compare predicted with realised trajectories of output variables from 1970 to 2016.

Aim
The aim of this research is to evaluate the uncertainty of the W3 model due to input error. The results allow us to examine what could have been learnt by Meadows and his team in 1970 if the analysis had been conducted at that time. A major criticism of the model is its sensitivity to input changes, and so an uncertainty analysis is useful to inform modellers and users about the inherent characteristics of the model.

Method
The W3 model was taken from the book Dynamics of Growth in a Finite World (Meadows et al. 1974, p. 549). The code was rewritten in the C++ language. To ensure the model was replicated correctly, the standard run was executed and then plotted against the published results for the standard run (Meadows et al. 1974, p. 501), as shown in figure 2. The error between the results can be attributed to the low quality of the published plots and the extraction of this data into digital form. From this plot it was concluded that the model had been reproduced to an acceptable standard for the purpose of this paper.

FIGURE 1: Results obtained by Vermeulen and de Jongh (1976). Std run: standard run. Edit A: industrial capital output ratio (ICOR) and fraction of industrial output allocated to consumption (FIOAC) increased by ten percent, and average lifetime of industrial capital (ALIC) decreased by ten percent. Edit B: edit A plus reproductive lifetime (RL) decreased by ten percent, and desired complete family size normal (DCFSN) increased by ten percent. Edit C: ICOR and FIOAC decreased by ten percent, and ALIC increased by ten percent. Edit D: edit C plus RL increased by ten percent, and DCFSN decreased by ten percent.
A Monte-Carlo analysis was used to conduct the uncertainty analysis. The input Probability Density Functions (PDFs) were sampled and passed to the model one million times. Independence is assumed between all parameters. The PDFs for the W3 model parameters were derived through analysis of the data presented in Meadows et al. (1974), as this data was used to calibrate the model. Where possible, the standard deviation was calculated using the same data. However, some of the data needed to calibrate the W3 model were sparse or not available, making it difficult to calculate the standard deviation. For these cases a reasonable standard deviation was assigned to the parameter. 1 For parameters based on intuition or a single source of data, an SD of 15 to 20 percent of the parameter's nominal value was allocated. Parameters based on two or three sources received ten to 15 percent, and those based on more than three sources received five to ten percent.
An important aspect of the W3 model is its inclusion of some parameters as table functions, that is, a parameter depends on an input variable according to a predefined static transfer function. Tables are defined by a set of points on a graph. Linear interpolation is used to determine intermediate points. This presents a major challenge when conducting a global SA, as the tables need to be easily changed with each simulation run.
For tabular parameters, the standard deviation was calculated according to equation 1, given a set of calibration data points (x_i, y_i):

SD(x) = sqrt( Σ_i W_i(x) (y_i − T(x_i))² / Σ_i W_i(x) ), with W_i(x) = exp(−λ |x − x_i|)  (1)

Here, n is the number of data points, λ is the shape factor, and W_i(x) is the weighting factor for each data point. 2 The further away a data point is from x, the smaller its effect on the standard deviation at x. T is the table function. Figure 3 shows the calculated standard deviation for the table input ISOPC (indicated service output per capita).
To assign a PDF to a table parameter, a set of standard deviations SD was created, the i-th element of which corresponds to the i-th element of T. Figure 4 (p. 278) shows a hypothetical table parameter together with its standard deviations. The four bell-shaped curves show the probability density functions assigned to each element of T. In this example, T increases with x, and the standard deviation assigned to this table parameter also increases with x.
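A minimal implementation of this distance-weighted standard deviation might look as follows; the exact form of equation 1 is reconstructed from the description in the text, so treat the normalisation as an assumption of this sketch.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// One calibration data point (x, y) for a table function.
struct Point { double x, y; };

// Distance-weighted standard deviation of the calibration data around
// a table function at position x: the weight of each data point decays
// exponentially with its distance from x, so nearby points dominate.
// t_at_data holds the table's value at each data point's x position.
double table_sd(double x, double lambda,
                const std::vector<Point>& data,
                const std::vector<double>& t_at_data) {
    double wsum = 0.0, rss = 0.0;
    for (std::size_t i = 0; i < data.size(); ++i) {
        double w = std::exp(-lambda * std::abs(x - data[i].x));
        double r = data[i].y - t_at_data[i];   // residual at this point
        wsum += w;
        rss  += w * r * r;
    }
    return std::sqrt(rss / wsum);
}
```

Evaluating table_sd at each table node yields the per-element SD set used to build the PDFs below.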
To sample the table parameter's PDFs and generate a new table T_M, a number P is chosen at random from the interval (0, 1). If the i-th element of T is T_i, then the i-th element of T_M is defined as

T_M,i = C⁻¹(P; μ = T_i, σ = SD_i),  (2)

where C(·; μ, σ) is the cumulative distribution function of a normal distribution with mean μ and standard deviation σ.
An example T_M was generated with P = 0.2. The new table has been plotted in figure 4 as a dashed line. We can see that this new table is similar to the original table T but has been shifted slightly. As x increases, T_M deviates from T due to the increasing SD. For every simulation, a new T_M is generated.
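The sampling scheme can be sketched as below. The C++ standard library has no inverse normal CDF, so this sketch inverts the CDF by bisection; the single shared quantile P is what makes the whole table shift coherently rather than scatter point by point.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Standard normal CDF via the complementary error function.
double norm_cdf(double z) { return 0.5 * std::erfc(-z / std::sqrt(2.0)); }

// Inverse of the normal CDF with mean mu and SD sigma, found by
// bisection in standard units (sufficient for a sketch; a closed-form
// approximation would be faster).
double norm_inv(double p, double mu, double sigma) {
    double lo = -10.0, hi = 10.0;
    for (int it = 0; it < 100; ++it) {
        double mid = 0.5 * (lo + hi);
        if (norm_cdf(mid) < p) lo = mid; else hi = mid;
    }
    return mu + sigma * 0.5 * (lo + hi);
}

// Sample a perturbed table T_M from the original table T and its
// per-element standard deviations, using ONE quantile p for all
// elements so the whole table is shifted consistently.
std::vector<double> sample_table(const std::vector<double>& t,
                                 const std::vector<double>& sd,
                                 double p) {
    std::vector<double> tm(t.size());
    for (std::size_t i = 0; i < t.size(); ++i)
        tm[i] = norm_inv(p, t[i], sd[i]);
    return tm;
}
```

With p = 0.5 the sampled table reproduces T exactly; with p = 0.2 every element moves the same number of standard deviations below its nominal value, so elements with larger SDs move further, as figure 4 illustrates.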
1 The assigned standard deviation of these parameters is essentially arbitrary. The values were chosen to ensure variation in parameters without changing them by orders of magnitude. To illustrate the quandary, imagine trying to assign a standard deviation to a set of data containing only one data point.
2 The weighting function is an attempt to mathematically describe the SD of the table function; it is not meant to be a rigorous formulation. The shape factor λ sets how quickly the contribution of a data point decays, exponentially, with its distance from x. In the future we want to apply a suitable confidence measure for these generalised additive models.

Table 1 of the online supplement 3 summarises the standard deviations assigned to each parameter as a percentage of its nominal value. For W3 table parameters the average standard deviation is reported. The data for each parameter can be found at the indicated page number in Meadows et al. (1974).

Results
In this section the data from the study are presented in a variety of forms: trajectory lines, probability density maps, probability density functions, percentiles and averages, and standard deviations. Each form allows for a different understanding of the results.
The main variables examined from the W3 model were human population (POP), fraction remaining of non-renewable resources (NRFR), industrial output per capita (IOPC), food per capita (FPC), crude birth rate (CBR), crude death rate (CDR), and persistent pollution normalised with 1970 levels (PPOLX). The units for these variables respectively are: people, unit-less, US dollars per person-year, vegetable equivalent kilograms per person, births per 1,000 people-year, deaths per 1,000 people-year, and unit-less. These variables will be the main focus of this paper.

Trajectories
Firstly, to understand the variety of outcomes, the trajectories of 100 simulations were plotted. This allows for a clear representation of how each trajectory behaves. This is limited to a relatively small number of examples (100 of the 1 million simulations) as the plot becomes overly crowded with the addition of more simulation trajectories. Figure 5 shows the trajectory of the population in the W3 model for 100 runs. It can be seen that there is a wide variation of trajectories, however most follow a similar path to that of the standard run shown by the light green line. Some trajectories manage to reach a maximum in excess of ten billion people. This is approximately three billion greater than the maximum of the standard run. Some trajectories have very high values in the year 2100 compared to the standard run.
One trajectory stays low for all years, showing how the model can react dramatically for some permutations of the variables. Two trajectories can be seen very slowly increasing for the whole simulation. This would indicate a favourable and stable future for humanity for the period simulated, that is, no sudden fall in population in the 21st century.

Probability density maps
To overcome the crowding of the trajectory plot, probability density maps were produced. These maps show the concentration of trajectory lines; darker areas of the plot thus represent a higher probability of a trajectory passing through that point. Figure 6a shows the probability density of the population variable. Areas with a high density of lines in figure 5 correspond to darker areas in figure 6a. We can see that the probability is spread thinly after the year 2010, denoted by the lightening of the grey. The probability remains thinly spread for the first half of the 21st century. We can note a slight converging after about 2070, denoted by a darkening, as most trajectories fall to lower levels of around 3.5 billion. Figure 6b shows the probability density of the fraction remaining of non-renewable resources. It is evident that the probability is spread thinly between the years 2000 and 2030. There is a clear increase in probability density for the final decades of the simulation for low values of remaining resources. The darker regions follow the trajectory of the standard run denoted by the thin black line.

Probability density functions
To better understand the evolution of the probability density it was plotted for individual times, that is, the initial year (1970) and each subsequent 20 years. Figure 7a (p. 280) shows the progression of the population's PDF over time. We can see a very concentrated PDF in 1970. This is representative of the limited uncertainty of world population data which was used to initialise the model. The PDFs widen with time, accompanied by a fall in peak probability.
The year 2050 has the lowest peak and greatest distribution of probability, indicating the least amount of certainty for the population.
The certainty of the model begins to increase after 2050, as evidenced by the increasing peak probability. The PDFs maintain a log-normal shape at all times. The peak of each PDF loosely follows the standard run, indicated by the vertical dashed line. Figure 7b (p. 280) shows the PDF over time for the fraction remaining of non-renewable resources variable. Again, there is a concentrated PDF for the year 1970. As the model progresses through time, the certainty quickly diminishes, with the lowest peak in 2010. The peak begins to increase after 2010 as the fraction remaining of non-renewable resources variable converges to a low of around 0.2. Again the peak of the PDF follows the standard run.

Percentiles
A useful approach to analyse the results is with the use of percentiles. The percentiles chosen were the 5 th , 25 th , 50 th (median), 75 th , and 95 th . By examining the 5 th and 95 th percentiles the range of 90 percent of all trajectories can be easily determined. Likewise, the 25 th and 75 th percentiles show the range of 50 percent of all trajectories. Figure 8 (p. 281) shows the percentiles, mean, and standard run for six of the main variables of the W3 model. In plot A we can observe that the percentile lines begin very close together, showing a high certainty of the population value. The percentile lines then "fan out" as time progresses, indicating the increasing uncertainty. The lines reach a wide spread by 2030. At this point, 50 percent of all trajectories are contained within a range of 1.5 billion, 90 percent within 3.76 billion. Beyond 2030 the percentile lines remain approximately the same width apart. The percentile lines retain a similar trajectory to that of the standard run. For the other variables we see that the percentile lines also follow similar paths to that of the standard run.
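Per time step, a percentile is simply an order statistic of the Monte-Carlo sample at that step. The nearest-rank rule below is an assumption of this sketch; the paper does not state which interpolation rule was used.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Percentile of a sample by nearest rank: sort a copy and index into
// it. Applied per time step across all Monte-Carlo runs, this yields
// the percentile trajectories shown in the figures.
double percentile(std::vector<double> sample, double pct) {
    std::sort(sample.begin(), sample.end());
    std::size_t rank = static_cast<std::size_t>(
        pct / 100.0 * (sample.size() - 1) + 0.5);  // round to nearest rank
    return sample[rank];
}
```

The 5th and 95th percentile trajectories then bound 90 percent of all runs at every time step, as used in figure 8.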
Fraction remaining of non-renewable resources and industrial output per capita undergo an increase in uncertainty in the early decades, followed by a convergence in later years. The increase in certainty comes from the fall of non-renewable resources and the collapse in industrial output towards their limit of zero. The Seneca effect is clearly visible in the industrial output per capita variable. 4 The variables food per capita, crude birth rate, and crude death rate have a more consistent level of uncertainty, as indicated by the consistent spacing between percentile lines.
It is interesting to note that the average and median of the population remain well below the standard run. This suggests that population is more likely to stay at a level lower than that of the standard run.
Real world data will be discussed in section Real world comparison.

Standard deviation
To fully understand how the uncertainty of the output varies with time, the standard deviations (SDs) of the PDFs at each time interval were calculated. Figure 9 (p. 282) shows the SDs of the population (a), fraction remaining of non-renewable resources (b), and industrial output per capita (c). We can see that the population's SD gradually increases to a maximum of approximately 1.5 billion. This is a large value considering that the maximum population of the standard run is approximately 7 billion. It begins to decrease after 2070 as most trajectories fall to lower values.
The SD of the fraction remaining of non-renewable resources quickly climbs to a maximum of 0.18 by 2020. This is a large SD given the variable's bounds of 0 to 1. After 2020, the SD begins to fall as trajectories converge, as seen in figure 8b. Industrial output per capita likewise shows a very sharp increase in SD. It peaks at around 110 dollars per person in the year 2000, a significant value compared to the 300 dollars per person of the standard run at the same point in time.
Figure 10 (p. 282) shows the SD of each variable normalised by its 1970 standard run value. The normalised SDs of the variables food per capita, fraction remaining of non-renewable resources, and crude birth rate remain reasonably steady for the years simulated, reaching maxima of 0.25, 0.20, and 0.16 respectively. Crude death rate normalised SD is steady at approximately 0.2 for the first five decades, and then experiences a sharp spike reaching 0.41 in the year 2040. This period corresponds to the beginning of the population decline in the model. We can see from this that the severity of the decline varies strongly between simulation runs. The normalised SD of industrial output per capita quickly rises to a high of 0.54 around the turn of the millennium. It then declines to a level comparable to that of food per capita, fraction remaining of non-renewable resources, and crude birth rate. The normalised SD of population continually rises to a maximum of 0.43 and then declines slowly. The normalised SD of persistent pollution reaches a high of around 3.6 during the years 2030 to 2060. This is the most uncertain variable, both because the modelling of pollution is very difficult and because of the large increase in pollution from 1970 levels.
3 The supplementary material is available at www.oekom.de/publikationen/zeitschriften/gaia/supplementary-material/c-157.
4 The "Seneca effect" is a description proposed by Bardi (2017) for sudden and unexpected collapse. The term is named after the Roman philosopher Seneca, who wrote "Increases are of sluggish growth, but the way to ruin is rapid" (Lucius Annaeus Seneca, Moral Letters to Lucilius, XCI, 6).

Real world comparison

Figure 11 (p. 283) zooms into the comparison of the standard run with real world data from 1970 to 2017 as indicated in figure 8. The data for population (POP), deaths (CDR), and births (CBR) were obtained from the United Nations' (UN) Department of Economic and Social Affairs' (DESA) Population Division (UN DESA Population Division 2017). Industrial output per capita (IOPC) data were approximated using global GDP data and the population data. GDP data were taken from the UN DESA Statistics Division (2018). This was scaled to align with the 1970 standard run value, allowing for easy comparison of trends. Fraction remaining of non-renewable resources data were taken from the British Petroleum Energy Outlook (BP 2018). An exponential function was fitted to the data to estimate missing data (i. e., pre-1970 records). An upper (150 zettajoule) and a lower (60 zettajoule) estimate for total fuel reserves were used to mimic Turner's (2012) estimates. Food per capita data were derived from calories/capita/day data from the Food and Agricultural Organisation (FAO 2018). This too was scaled to align with the 1970 standard run value.
The real world data for industrial output per capita and food per capita align very well. However, industrial output per capita is now (in the last few years) beginning to exceed the standard run values. Real world birth rates, and even more so death rates, have been lower than predicted. These two deviations between predicted and realised trajectories, however, partially neutralise each other, resulting in the real world population now being slightly higher (approximately eleven percent) than predicted. The fraction of non-renewable resources remaining in the real world is well above that predicted by the standard run. In the W3 model, the dwindling resource pool is what prompts the downturn of the industrial output. As the real world data have not reached such a level by now, the W3 model would suggest that the decline in industrial output is yet to come, as is evident in the real world data of GDP per capita, or IOPC.
It should be noted that data agreement is not conclusive of model validity. Many different models could produce similar alignment. An ordinary exponential function (simply calibrated to fit real world data) would also align nicely with population and industrial output per capita development to date. However, it would offer little insight into the actual mechanics behind the data. The only way to begin to validate the standard run simulation is to actually observe a collapse (by which point it is too late to avoid it) in the real world. This is a very hard test to satisfy, as the data and variables used to calibrate the model have most likely changed as our society has progressed, for example, concerning efficiencies in material usage or the introduction of unconventional fossil fuels. Thus the collapse would most likely be different to that shown in the standard run.
Meadows and his team understood this issue very well. The W3 model was also used to examine other transitions into the future. The model has mechanisms that allow modifications of parameters to be made at a particular point in time. This was designed to enable the simulation of events like government policy changes or technological advancements. The standard run can only represent a society trapped in 1970s behaviour and technology.
An issue regarding the model is the exclusion of an explicit renewable resource sector. Because of this the model is nearly guaranteed to produce a "collapse". While the addition of a renewable energy or resource pool would allow for a greater chance of a sustainable future (one in which variables plateau), it would potentially still result in a "collapse". For example, the energy study of Dale et al. (2012), which investigated the futures of non-renewable and renewable energy production, showed a sharp downturn in total energy production after the year 2060, despite strong growth in the renewable energy sector.
Another issue is that energy and material resources are lumped together in a single variable. This limits the complexity of the possible scenarios. It is clear that many intricacies of the world have been consolidated into single variables in the W3 model. This causes many issues regarding model usability and structural accuracy and could cause larger errors than parameter uncertainty.
Other investigations have been made into the relationship of the W3 scenarios and real world data. Examples include Turner (2012) and Pasqualino (2015).

Knowledge gained
After performing the uncertainty analysis, our knowledge of the W3 model has greatly improved. While past local SA studies allowed sensitive parameters to be identified and then hypothetical scenarios tested, this uncertainty analysis identifies more accurately the model's sensitivity to input errors.
It is evident that the model as such is sensitive to input error; however the trends, that is, the shapes of the trajectories, are less so. Thus, while the exact numbers produced by the model may be of little practical use, the behaviours exhibited by the model would still be of interest.

Limitations
The results presented in this paper demonstrate the error sensitivity in the standard run scenario. The results are only relevant for a situation in which the parameters stay fixed to their initial values, that is, as they were in the first half of the 20th century. An addition to this study would be to allow "stabilising policies" to be implemented at a certain year in the simulation. The year in which the policies are implemented would be given a PDF to illustrate the uncertainty of when society would make a significant change to its behaviour. This addition could improve our understanding of uncertainties in the W3 model. This study only investigates uncertainty caused by model input errors. The question of the accuracy of the model itself is still open for debate. While the model captures many economic and ecological relationships in our world, it still omits or aggregates most aspects.
The removal of trajectories which drastically diverge from historic data, for example, the lowest trajectory in figure 5, would improve the accuracy of the results. However, as they contribute a low proportion of all simulation runs, their effect on the results is negligible, and thus it was deemed unnecessary to remove them for this study.

FIGURE 8: Graphs of percentiles, mean, and standard run over time for A population (POP) (people), B fraction remaining of non-renewable resources (NRFR) (unit-less), C industrial output per capita (IOPC) (1970 $US), D food per capita (FPC) (vegetable kg), E crude birth rate (CBR) (births per 1,000 people), and F crude death rate (CDR) (deaths per 1,000 people). Percentiles (5, 25, 50, 75, 95) (dotted lines), mean (dashed line), standard run (solid line), and real world data (red line). There are two red lines in plot B, representing an upper and a lower estimate for total fuel reserves.

Conclusion
The results show that the unpredictability of the model is noteworthy. The variables population (POP), industrial output per capita (IOPC), fraction remaining of non-renewable resources (NRFR), and normalised persistent pollution (PPOLX) produced normalised standard deviations of up to 0.43, 0.54, 0.20, and 3.6 respectively. The general trends of the variables are predictable, with more than 95 percent of simulation runs producing nearly identically shaped trend lines. This is demonstrated by the similar shape of each percentile line, and by the fact that these lines do not drastically diverge as time progresses. It is evident that trajectories can be found that indicate a favourable future for humanity, that is, no sudden downturn in population or access to resources, without policy intervention for the time period modelled. However, these trajectories reside in areas of low probability, and thus should not be considered likely outcomes.
For the authors of The Limits to Growth, a study such as this one might have proved useful in explaining the model's behaviour to its critics and demonstrating its main purpose, that is, allowing the study of possible trends, not producing precise predictions.
Author contributions: A.H. performed research and wrote the paper; B.S. and M.R. provided research direction and contributed to the writing. This research has been conducted with the support of the Australian Government Research Training Program Scholarship.