Hedging as an adaptive measure for climate change induced water shortage at the Pong reservoir in the Indus Basin Beas River, India

This study investigated the adaptive capacity of static and dynamic hedging operating policies to shore up the performance, i.e. reliability and vulnerability, of irrigation water supply from the Pong reservoir in India under climate change. The policies were developed using genetic algorithm optimisation and used to force reservoir simulations for different climate-change-perturbed inflow series, from which the performance indices were derived. For static hedging, the hedging fraction remains constant throughout the year, while for dynamic hedging this fraction varies monthly or seasonally. Results showed that static hedging was effective at tempering the system's vulnerability from its high of ≥ 60% to below 25%, while maintaining an acceptable volume-based reliability. Further simulations with dynamic hedging provided only modest improvements in these two indices. The significance of this study is its demonstration of the effectiveness of hedging as a climate change adaptation measure that limits water shortage impacts. It also demonstrates that simple static hedging can match more complex dynamic hedging policies.

Adebayo J. Adeloye

Introduction
Reservoirs are a major component of most water supply systems utilising river resources.
Their purpose is to regulate natural river flow fluctuations by storing excess water during high-flow periods for release during low-flow periods to meet domestic, industrial, agricultural and other demands served by the system. Planning reservoirs using historical runoff data observed at the reservoir site is the best option available to the analyst, but this could be problematic if the operational runoff situation differs radically from the planning situation, e.g. with predicted climate change that might further reduce the amount, and increase the variability, of reservoir inflows in various regions of the world (IPCC, 2007).
The realization that climate change will affect future inflow series, and hence the ability of reservoirs to meet their obligations, has led to the intensification of research efforts to assess these impacts as a precursor to the development of effective mitigation and adaptation strategies (see e.g. Nawaz and Adeloye, 2006; Fowler et al., 2003; Li et al., 2009; Adeloye et al., 2013). In general, most of these studies have reported deteriorating performance with climate change, e.g. lower reliability, increasing frequency and/or magnitude of water shortages, etc., although, as recently demonstrated by Soundharajan et al. (2016), there are huge uncertainties associated with both the magnitude and sign of these impacts. Such unsatisfactory situations call for concerted adaptation and/or mitigation efforts, which might require that either the facilities are expanded (e.g. building new reservoirs, development of other sources such as groundwater) or operational improvements are introduced for existing facilities. New builds for capacity expansion are often controversial, require long gestation periods and can have unwanted socio-environmental consequences. Devising improved operational practices, in contrast, is much quicker and has been proven to be effective in significantly curbing system vulnerability (Eum et al., 2011).

Most reservoirs are operated using rule curves, which guide the operator's decision on the quantity of water to release based on the total available water at the beginning of each month (Yin et al., 2015). Available water is the sum of the starting storage and the anticipated inflow during the month. While the former is known at the beginning of the month, the latter is unknown except through a forecast; the assumption made here is that the forecast monthly inflow equals the historic monthly runoff. Thus, rule curves depict target storage levels to be maintained in the reservoir if it is to meet the demand successfully.
They are normally determined using any of the available reservoir planning analysis tools (e.g. the sequent peak algorithm, SPA-see McMahon and Adeloye, 2005), forced with the historic runoff data record at the reservoir site. Once determined, they remain fixed and are used for guiding the operation of the reservoir.
A schematic illustration of basic rule curves is shown in Fig. 1a, in which the operator will attempt to meet the full monthly demand whenever the available water lies between the upper and lower rule curves. Rule curves are easy to deploy but may result in high vulnerability, or large single-period shortages, a problem that may be exacerbated by projected climate change. For example, Chiamsathit et al. (2014) reported vulnerability indices of 67% and 88% respectively for domestic and industrial water allocations at the Ubonratana reservoir in northeast Thailand when operated with basic rule curves such as those illustrated in Fig. 1a. To improve the effectiveness of rule curves in curbing excessive vulnerability, water rationing (or hedging) during normal operational periods is often carried out (Bhatia et al., 2018; Tu et al., 2008; Eum et al., 2011; Srinivasan and Kumar, 2018; Chang et al., 2019). A normal operational period is one in which there is sufficient water in storage to meet the full demand; when hedging, this full demand is deliberately left unmet by cutting back the supply. If the cut-back is moderate, i.e. ≤ 25% of the requirement, the impact on the socio-economic wellbeing of water users will be minimal (Fiering, 1982).
A single-stage, static hedging policy is illustrated in Fig. 1b. It shows the critical curve that delineates the hedging zone, and hence triggers the onset of hedging, and the "α" representing the static (or constant) fraction of the demand to supply. Thus, in comparison with the no-hedging rule curve illustrated in Fig. 1a, full demand satisfaction in Fig. 1b only occurs when the available water is above the critical curve. Whenever the available water is below the critical rule curve, the water is rationed by delivering at most only a fraction "α" of the full demand, i.e. I_t ≤ αD_t, where I_t is the supply, D_t is the demand and α (0 ≤ α ≤ 1) is the static rationing ratio.
The difference between the dynamic case in Fig. 1c and the static case in Fig. 1b is that in the dynamic case the rationing ratio is not constant but varies from one month to the next, i.e. I_t = α_m D_t, where m = 1, 2, ..., 12 and α_m (0 ≤ α_m ≤ 1) is the rationing ratio for month m. It is also possible to have a seasonally varying rationing fraction; indeed, both the monthly and seasonally varying options will be evaluated in the current work. Most previous hedging formulations (e.g. Peng et al., 2015) commence hedging in the region where the water availability situation is already dire, which runs counter to the notion that, to be effective, hedging should be saving water during normal operational periods, not aggravating the water stress. Additionally, the few previous studies on dynamic hedging have not attempted to study the performance of such hedging policies within the context of climatic change.
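To make the release logic concrete, the static and dynamic rules described above can be sketched as below. This is a minimal illustration only; the function name, the rule-curve ordinates and the rationing ratios are hypothetical, not values from the Pong study:

```python
def hedged_release(month, storage, inflow, demand, crc, alpha):
    """Single-stage hedging release decision for one month.

    month   : 1..12
    storage : storage at start of month (volume units)
    inflow  : forecast inflow during the month
    demand  : full demand D_t for the month
    crc     : dict month -> critical rule curve ordinate CRC_m
    alpha   : constant rationing ratio (static hedging) or a
              dict month -> ratio (dynamic hedging)
    """
    available = storage + inflow          # water available, WA_t
    a = alpha[month] if isinstance(alpha, dict) else alpha
    if available >= crc[month]:
        return min(demand, available)     # normal zone: attempt full demand
    return min(a * demand, available)     # hedging zone: ration to a*D_t

# Static hedging: one ratio for the whole year
r_static = hedged_release(6, storage=300.0, inflow=50.0, demand=100.0,
                          crc={m: 400.0 for m in range(1, 13)}, alpha=0.83)

# Dynamic hedging: month-specific ratios (illustrative values)
alpha_m = {m: (0.95 if 7 <= m <= 9 else 0.80) for m in range(1, 13)}
r_dynamic = hedged_release(6, storage=300.0, inflow=50.0, demand=100.0,
                           crc={m: 400.0 for m in range(1, 13)}, alpha=alpha_m)
```

The same function covers both policies because the only difference is whether the rationing ratio is looked up per month.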
Another approach that has been used for hedging is that implemented in the WEAP software tool (SEI, 2005), in which a buffer zone is created with the release rationed whenever water in storage is within this buffer zone. Although the WEAP approach is still based on reservoir storage zoning (as opposed to being SOP-based), the release is rationed based on the water available in the buffer zone at the start of the month. Consequently, unlike the development in the current work in which the release is rationed from the demand, the release during hedging will be totally unrelated to the water demand for the WEAP approach. By rationing using the water demand, users will be able to know a priori the amount of water to expect and hence plan to adapt accordingly. In contradistinction if rationed from the water available in storage at the start of the period, however, the water to expect will be subject to the vagaries of the inflow and hence unknown to users. A further issue with the WEAP approach is that the rationing factor is often selected arbitrarily or at best through calibration. Finally, the inflow during the month is usually ignored when allocating the water during hedging.
In the work reported herein, an attempt is made to bridge this knowledge gap in reservoir operation. In particular, the determination of the critical rule curves (CRC) that delineate the reservoir zones, and hence trigger when to hedge, as well as the associated hedging factors α and α_m, is achieved using genetic algorithm (GA) optimisation.
The aim of this work therefore is to investigate the adaptive capacity of reservoir release hedging for coping with climate change induced water shortages. The objectives are to: 6. Review the performance and make recommendations.
Objective 1 was accomplished as part of a previous study (see Soundharajan et al., 2016) and the resulting basic rule curves form the basis for developing the hedging-integrated enhancements of the policies.
In the following section, further details of the adopted methodology are given. This is then followed by the description of the case study, following which the results are presented and discussed. The final section contains the main conclusions and recommendations of the study.

Methodology
A flow chart of the adopted methodology is shown in Fig. 2. Further brief descriptions are provided below.

Genetic algorithms (GA) optimisation for rule curve development
A genetic algorithm (GA) is a random-search optimisation algorithm inspired by biological evolution that provides a robust method of searching for the optimum solution to complex problems (Michalewicz, 1992). In a GA, the solution set is represented by a population of strings, each comprising a number of blocks representing the individual decision variables of the problem. Strings are processed and combined according to their fitness (the objective function value evaluated using the components in the string) in order to generate new strings that have the best features of the two parent strings. Strings with the best fitness have the greatest chance of passing their features to future generations, similar to the process of natural selection.
To start the GA optimiser, the initial solutions (or strings) are randomly generated. Three fundamental operations are involved in manipulating strings and moving to a new generation: selection, crossover and mutation. The selection operation identifies the strings that are included in the reproduction process for developing the next generation of strings. There are a number of approaches to selection, all of which determine the probability of selection as a function of fitness. Because the initial solutions are randomly generated, and hence the outcome may not be reproduced exactly on repeated runs, the GA is normally repeated over several randomly generated initial solutions and either the mean of the best solutions or the most superior best solution is taken as the final solution. A detailed description of the GA will not be attempted in this paper; interested readers are referred to the fundamental treatise on the subject by Michalewicz (1992).
Despite the existence of a large number of traditional non-linear programming (NLP) techniques for solving this kind of constrained optimisation problem, a search-based optimiser such as the GA is preferred because of its ability to search for the solution from a population of points (not a single point), its use of the objective function information itself rather than the derivatives of the function, and its use of probabilistic (as opposed to deterministic) transition rules. GAs have been widely applied in reservoir operation studies for developing optimal rule curves, operating policies and hedging (Kim et al., 2006; Oliveira and Loucks, 1997; Wardlaw and Sharif, 1999; Sharif and Wardlaw, 2000; Ahmed and Sharma, 2005; Jothiprakash and Shanthi, 2006; Reddy and Kumar, 2006; Kangrang et al., 2008; Azari et al., 2018).
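The GA mechanics described above (random initial population, fitness-based selection, crossover, mutation, repetition to a final best solution) can be illustrated with a bare-bones real-coded GA. This is a generic sketch, not the optimiser configuration used in the study; the operators, parameter values and toy objective are illustrative:

```python
import random

def ga_minimise(fitness, n_vars, bounds, pop_size=40, n_gen=80,
                cx_rate=0.8, mut_rate=0.1, seed=0):
    """Bare-bones real-coded GA: tournament selection, arithmetic
    crossover and uniform mutation; minimises `fitness`."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(n_vars)]
           for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(n_gen):
        nxt = []
        while len(nxt) < pop_size:
            # Tournament selection: the fitter of two random strings wins
            p1 = min(rng.sample(pop, 2), key=fitness)
            p2 = min(rng.sample(pop, 2), key=fitness)
            if rng.random() < cx_rate:          # arithmetic crossover
                w = rng.random()
                child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
            else:
                child = list(p1)
            for i in range(n_vars):             # uniform mutation
                if rng.random() < mut_rate:
                    child[i] = rng.uniform(lo, hi)
            nxt.append(child)
        pop = nxt
        best = min(pop + [best], key=fitness)   # keep the best-so-far (elitism)
    return best

# Toy quadratic objective standing in for the sum-of-squared-deficits function
target = [0.8, 0.3, 0.6]
obj = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
sol = ga_minimise(obj, n_vars=3, bounds=(0.0, 1.0))
```

In the study itself, a string would encode the monthly CRC ordinates plus the rationing ratio(s), and the fitness would be the deficit objective evaluated through reservoir simulation.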

Optimisation of hedging rules: objective function and constraints formulation
Optimising the ordinates of the CRC for each month and the rationing ratio(s) (static or dynamic) with the GA requires specifying the objective function and constraint equations.
The objective function is the minimization of the sum of squares of the water deficit:

Minimise Z = Σ_{t=1}^{N} (D_t − I_t)²   (1a)

The constraints are as follows:

S_{t+1} = S_t + Q_t − I_t − E_t − ER_t;  0 ≤ S_{t+1} ≤ K_a   (1b)

WA_t = S_t + Q_t

I_t = D_t whenever WA_t ≥ CRC_m; otherwise I_t = αD_t (static) or α_m D_t (dynamic)

where S_t and S_{t+1} are the reservoir storage at the beginning and the end of time period t; Q_t is the inflow during time period t; D_t and I_t are respectively the demand and actual release during time period t; E_t is the net evaporation (evaporation − rainfall) from the reservoir surface during time period t; WA_t is the water available at the beginning of time period t; ER_t is the excess release during time period t; CRC_m is the ordinate of the critical rule curve for the month m; N is the number of time periods; K_a is the active storage capacity of the reservoir; and all other symbols are as defined previously.
All the variables defined above are in volumetric units, which means that the net evaporation, normally expressed as a depth of water, must also be converted to volumetric units. To achieve this in a way that is compatible with the linear reservoir mass balance Eq. (1b), the typically non-linear area-storage relationship was approximated using a linear function (Loucks et al., 1981; McMahon and Adeloye, 2005):

A_t = c + dS_t   (1c)

where A_t is the exposed reservoir surface area at the start of t, c and d are parameters, K_a is the active storage capacity of the reservoir and all other variables are as defined previously. Since the interest is in the active storage capacity, which excludes the dead storage, the parameter "c" is constrained to represent the exposed surface area at the top of the dead storage; thus, "d" is the slope of the linear approximation of the area-storage function in the active storage part of the area-storage relationship. With "c" known, the only parameter needing estimation in Eq. (1c) is the slope "d". The volumetric net evaporation during t, i.e. in interval [t, t+1], then becomes:

E_t = e_t (c + dS_t)   (1d)

where e_t is the net evaporation depth. Eq. (1d) is a linear function of the reservoir storage state S_t; consequently, it can be included in the mass balance Eq. (1b) without any difficulty.
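Eqs (1c) and (1d) amount to a constrained least-squares fit followed by a simple product. The sketch below implements them; the area-storage observations and the fixed intercept are invented for illustration (they are not the Pong data):

```python
def fit_linear_area_storage(storages, areas, c):
    """Least-squares slope d for A ≈ c + d*S over the active-storage
    range, with the intercept c fixed at the exposed area at the top
    of the dead storage (Eq. 1c)."""
    num = sum(s * (a - c) for s, a in zip(storages, areas))
    den = sum(s * s for s in storages)
    return num / den

def net_evap_volume(e_depth, storage, c, d):
    """Volumetric net evaporation E_t = e_t * (c + d*S_t)  (Eq. 1d).
    With areas in km2 and depths in m, the result is in Mm3
    (1 km2 x 1 m = 1 Mm3)."""
    return e_depth * (c + d * storage)

# Illustrative (not Pong) observations: active storage S in Mm3, area A in km2
S = [0.0, 1000.0, 3000.0, 5000.0, 7000.0]
A = [40.0, 90.0, 160.0, 230.0, 300.0]
c = 40.0                                  # assumed area at top of dead storage
d = fit_linear_area_storage(S, A, c)
E = net_evap_volume(0.05, 3000.0, c, d)   # 0.05 m net evaporation depth
```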

Simulation of Catchment hydrology
The simulation of catchment runoff response to weather forcing was achieved using HYSIM, a conceptual catchment hydrological model (Manley and WRA, 2006; Pilling and Jones, 1999). HYSIM is a time-continuous, conceptual rainfall-runoff model. The model has two sub-routines simulating, respectively, the river basin hydrology and the channel hydraulics. The hydrology is simulated using seven stores representative of land use and soil type. The full structure of the model is schematically illustrated in Fig. 3.
The seven natural stores into which the hydrology routine was conceptualised comprise interception storage, upper soil horizon, lower soil horizon, transitional groundwater store, groundwater store, snow storage and minor channel storage, all with associated hydrological parameters as detailed by Pilling and Jones (1999). The interception storage in the model denotes canopy storage of moisture and is determined by the vegetation type in the model.
Water stored in the interception compartment is ultimately lost by evaporation. The transitional groundwater store is conceptualised as an infinite linear reservoir, and serves to represent the first stage of groundwater storage. The store receives water from both the upper and lower soil horizons through the process of deep percolation when these horizons are at or above the field capacity. Water in the transitional groundwater store is constantly discharging to the permanent groundwater store also through deep percolation. The hydraulics routine routes the flow down the channel using a simple kinematic wave approach, also with associated parameters (Manley and WRA, 2006).
HYSIM inputs are precipitation, temperature and, where available, potential evaporation. The temperature is required for modelling snow melt and accumulation based on the empirical degree-day approach. Where estimates of the potential evaporation are unavailable a priori, the temperature is also used to estimate the evapotranspiration. Further details about the model are available in Manley and WRA (2006).
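The degree-day idea can be sketched generically as follows; the degree-day factor, base temperature and initial snow-water equivalent below are illustrative assumptions, not HYSIM's internal parameter values:

```python
def degree_day_melt(temps, ddf=4.0, t_base=0.0, swe0=100.0):
    """Generic degree-day snowmelt: melt each step is
    DDF * max(T - T_base, 0), capped by the remaining snowpack.

    temps : sequence of mean temperatures (deg C)
    ddf   : degree-day factor (mm of melt per degree-day), assumed
    t_base: threshold temperature below which no melt occurs
    swe0  : initial snow-water equivalent (mm), assumed
    """
    swe = swe0
    melts = []
    for t in temps:
        m = min(ddf * max(t - t_base, 0.0), swe)  # cannot melt more than exists
        swe -= m
        melts.append(m)
    return melts, swe

melts, swe_left = degree_day_melt([-2.0, 1.0, 3.0, 5.0])
```

A warming perturbation shifts every temperature upward, so melt starts earlier and the pack depletes faster, which is the mechanism behind the runoff responses discussed later.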

Assessment of climate change impacts on hydrology
Once satisfactorily calibrated, HYSIM was used to assess the effects of projected climate change on reservoir inflows. To derive future climate scenarios, the delta perturbation approach (see Vicuna et al., 2012) was used, with annual precipitation perturbed over [−10%, +10%; 5%] and temperature over [0°C, +5°C; 1°C], where for [x, z; y], x is the starting value, z is the terminal value and y is the increment.
The above delta perturbations were applied to historic precipitation and temperature data for the Beas and the perturbed series were used to force the calibrated HYSIM rainfall-runoff model for the catchment, to determine alternative reservoir inflow series resulting from the perturbed climate.
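The delta perturbation amounts to a simple scale-and-shift of the historic series, sketched below (the two-value series is a toy example; in the study the full historic monthly series would be perturbed):

```python
def perturb_series(precip, temp, dp_pct, dt_deg):
    """Delta-perturb climate inputs: scale precipitation by dp_pct
    percent and shift temperature by dt_deg degrees C."""
    f = 1.0 + dp_pct / 100.0
    return [p * f for p in precip], [t + dt_deg for t in temp]

# Full scenario grid: precipitation [-10, +10; 5] %, temperature [0, +5; 1] C
scenarios = [(dp, dt) for dp in range(-10, 11, 5) for dt in range(0, 6)]

# One scenario applied to a toy series (mm and deg C values are invented)
p_new, t_new = perturb_series([100.0, 80.0], [20.0, 25.0],
                              dp_pct=-10, dt_deg=2)
```

Each of the resulting perturbed (precipitation, temperature) pairs would then be routed through the calibrated HYSIM to give one alternative inflow series.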

Reservoir behaviour simulation and performance indices
Reservoir simulation used the reservoir mass balance equation shown in Eq. (1b), subject to the constraint posed by the rule curves, i.e.:

LRC_m ≤ S_{t+1} ≤ URC_m   (2)

In essence, the constraint in Eq. (2) limits the reservoir content to within the boundary of URC_m and LRC_m; consequently, when this constraint would be violated, I_t is increased (to shed water above URC_m) or reduced (to conserve storage above LRC_m). The performance indices were then evaluated as follows:

(a) Reliability:

R_t = N_s / N   (3)

R_v = Σ I_t / Σ D_t   (4)

where N_s is the total number of intervals out of N in which the demand D_t was fully met, R_t is the time-based reliability, R_v is the volume-based reliability and other terms are as defined previously.

(b) Resilience, φ:

φ = f_s / f_d   (5)

where f_s is the number of continuous sequences of failure periods and f_d is the total duration of the failures, i.e. f_d = N − N_s.

(c) Vulnerability, η:

η = max over failure periods of (D_t − I_t) / D_t   (6)

i.e. the maximum single-period water shortage expressed as a fraction of the demand.
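The rule-curve-constrained simulation and the reliability, resilience and vulnerability indices can be sketched together as below. This is a simplified illustration (net evaporation is passed in as a precomputed volume and the spill handling is schematic); all numbers are invented rather than Pong data:

```python
def simulate(inflows, demands, evap, S0, crc, urc, lrc, alpha):
    """Monthly reservoir simulation with single-stage static hedging.
    `crc`, `urc`, `lrc` are 12-ordinate monthly curves; volumes are in
    consistent units."""
    S, releases = S0, []
    for t, (q, d, e) in enumerate(zip(inflows, demands, evap)):
        m = t % 12
        wa = S + q - e                           # available water, WA_t
        target = d if wa >= crc[m] else alpha * d
        r = min(target, max(wa - lrc[m], 0.0))   # do not draw below the LRC
        S = wa - r
        if S > urc[m]:                           # spill any excess above URC
            S = urc[m]
        releases.append(r)
    return releases

def performance(releases, demands, tol=1e-9):
    """Time/volume reliability, resilience and vulnerability."""
    N = len(demands)
    fails = [r < d - tol for r, d in zip(releases, demands)]
    Ns = N - sum(fails)
    Rt = Ns / N                                  # time-based reliability
    Rv = sum(releases) / sum(demands)            # volume-based reliability
    fs = sum(1 for i, f in enumerate(fails)
             if f and (i == 0 or not fails[i - 1]))  # failure sequences
    fd = sum(fails)                              # total failure duration
    phi = fs / fd if fd else 1.0                 # resilience
    eta = max(((d - r) / d
               for r, d, f in zip(releases, demands, fails) if f),
              default=0.0)                       # vulnerability
    return Rt, Rv, phi, eta

# Six illustrative months: hedging engages once storage is drawn down
rel = simulate([30.0] * 6, [40.0] * 6, [0.0] * 6, S0=60.0,
               crc=[80.0] * 12, urc=[200.0] * 12, lrc=[0.0] * 12, alpha=0.8)
Rt, Rv, phi, eta = performance(rel, [40.0] * 6)
```

Note how hedging converts what would otherwise be one deep failure into several shallow ones: the time-based reliability falls, but the vulnerability is capped at 1 − α.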

As seen above, the volume-based reliability and vulnerability indices are directly linked to the water shortage and will thus form the focus of the subsequent discussions.

Case study and data
The Pong dam (and its reservoir) is located on the Beas River, which is one of the five major rivers of the Indus basin, India (see Fig. 4). Located at longitude 76°05′E and latitude 32°01′N, the reservoir drains a catchment area of 12,560 km², of which the permanent snow catchment is 780 km² (Jain et al., 2007). Active storage capacity of the reservoir is 7,290 Mm³; the dead storage is 1,280 Mm³. Apart from its use for generating hydropower, the Pong meets irrigation water demands of about 7,912 Mm³ annually, which, as illustrated in Fig. 5, is spread relatively uniformly throughout the year. Total irrigated area is 1.6 Mha, with rice, wheat and cotton being the major crops sown. The irrigation water first passes through the turbines for power generation before flowing downstream for diversion. For this reason, irrigation constitutes the primary purpose of the reservoir and hence the focus of this study.
Monsoon rainfall between July and September is a major source of inflow into the reservoir, apart from snow and glacier melt. Snow and glacier melt runoff in the Beas catchment was studied over 1990-2004 by Kumar et al. (2007), from which it is known that its contribution is about 35% of the annual flow of the Beas River at Pandoh Dam (upstream of Pong dam). Consequently, the ability of the Pong reservoir to perform its functions satisfactorily is susceptible to possible disturbance of these water sources by climate change.
For a system that is inextricably linked to the socio-economic well-being of its region, any significant deterioration in the performance of the Pong reservoir, or its ability to meet the demand, will have far-reaching consequences. This is why it is important to carry out a systematic assessment of the performance of the reservoir operation and to use the outcome to inform the development of appropriate decision making.
Monthly reservoir inflow and irrigation demand data from 1998 to 2008 were available for the study. The historic mean annual runoff (MAR) at the dam site is 8,570 Mm³ (annual coefficient of variation is 0.21) and the seasonal distribution of the annual runoff is shown in Fig. 5. The mean annual net evaporation depth is 0.493 m which, because it is positive, implies that evaporation exceeds rainfall on an annual basis. As noted by Nawaz et al. (1999), failure to accommodate net evaporation in the planning analysis for such a situation will lead to under-sizing of the reservoir, because the positive net evaporation is an additional demand that must be provided for. The seasonal pattern of the net evaporation is shown in Fig. 6a, which is a mixture of positive and negative values as expected. Fig. 6b shows the non-linear area-storage function for the dam site as observed, with the linear approximation superimposed.

HYSIM calibration
To accommodate the spatial variability within the catchment, the Beas catchment was divided into three sub-basins, namely the upper, middle and lower (see Fig. 4), based on consideration of altitude, spatial difference and available meteorological data. The calibration placed particular emphasis on reproducing the low-flow behaviour, which is more important for water resources planning than the high-flow periods. The estimated Nash-Sutcliffe efficiency indices (Nash & Sutcliffe, 1970) during calibration and validation were respectively 0.88 and 0.78, both of which lend further credence to the modelling skill of the calibrated HYSIM.

Climate change impacts on reservoir inflows
With HYSIM satisfactorily calibrated and validated, the model was used to assess impacts of changes in rainfall and temperature on the runoff. As noted earlier, the changes in annual rainfall considered were -10% to +10% in increments of 5%.
Similarly, the temperature changes considered were 0°C to +5°C in increments of 1°C. Table 1 summarises the percentage change in annual and seasonal runoff relative to the simulated historic runoff. As expected, increasing the rainfall causes the annual runoff to increase, while reducing the rainfall causes the runoff to decrease, for all the temperature scenarios. However, the simulation has also revealed a large influence of the melting glaciers on the runoff response. While increasing or decreasing the rainfall by the same amount resulted in a similar absolute change in the runoff when the temperature was unchanged, the situation is quite different when temperature increases are also considered. For example, as shown in Table 1, an increase in annual rainfall of 5% produced a 10.21% increase in the annual runoff if the temperature increased by 1°C; however, a similar decrease in rainfall with the 1°C temperature increase resulted in a decrease of only 1.6% in the annual runoff.
The annual runoff situation presented above masks the significant seasonal differences in the simulated runoff response of the Beas. As noted previously, the Beas hydrology is heavily influenced by the melting snow from the Himalayas, and what these results show is that runoff contributed by the melting snow partially compensates for the reduction in direct runoff caused by the combined effects of lower rainfall and higher (temperature-induced) evapotranspiration. Indeed, as the assumed temperature increase becomes higher, the effect of any reduction in the annual rainfall fully disappears, resulting in a net increase in the annual runoff: increasing the temperature by 2°C resulted in net increases in the annual runoff of 12.4% and 7% for 5% and 10% reductions, respectively, in the annual rainfall. As Table 1 clearly shows, both the post-monsoon and winter seasons, which do not benefit from the melting seasonal snow and its associated runoff, tended to be well-behaved in terms of the response, with reductions in rainfall producing significant reductions in the generated runoff. Indeed, for these two seasons, increasing the temperature slightly can worsen the runoff situation even when the rainfall has increased, as clearly revealed by the 2.4% reduction in the winter runoff for 1°C and 5% rises in the temperature and rainfall, respectively. These situations must result from the dominance of the evapotranspiration loss which, in the absence of additional water from melting snow, causes the runoff to decrease. As the temperature becomes higher, however, more and more of the glaciers will melt, causing the runoff to increase despite the increased evaporation expected with increasing temperature.
This is a short-term advantage that would be impossible to maintain if the projected temperature increase takes hold, because such an increase would result in the gradual disappearance of the glacier and/or less of the precipitation falling as snow. These issues were not considered in detail in our simulation, which has assumed, albeit unrealistically, that the glacier extent and seasonal snow accumulation would maintain the status quo.

Optimised Rule Curves
The optimised rule curves derived using the GA (see also Soundharajan et al., 2016) are shown in Fig. 8. Fig. 8(a) is the basic set of rule curves, in which no hedging is implemented. As remarked previously, these formed the basis of further GA optimisation to derive the hedging enhancements. When operating the reservoir with Fig. 8(a), attempts will be made to meet the full irrigation water demand whenever the available water lies within the URC and LRC confines. The space above the URC is meant for flood water and, as seen in the figure, the URC deepens during the monsoon period to leave space for the increased inflow and thus reduce the potential for flooding. For the low-inflow winter and pre-monsoon seasons, the reservoir is kept as close as possible to the maximum, thus ensuring that more water is available for meeting the demands. Fig. 8(b) shows the single-stage, static-rationing hedging-integrated rule curves. The critical hedging curve that triggers the water rationing lies everywhere between the URC and LRC, as expected, and allows the full demand to be supplied over a very wide range of water availability during the high-flow season. The range of water availability in which the full supply can be attempted is much narrower for the drier, post-monsoon periods. When water rationing takes place, only 17% of the full demand is cut back, leaving 83% of the full demand being attempted. This optimised hedging shortfall is modest, less than the 25% tolerable shortage suggested by Fiering (1982); however, what is important is its effect on the overall vulnerability of the reservoir system, which will be discussed in the next section.

The dynamically varying hedging policies are shown in Figs. 8(c) (monthly) and 8(d) (seasonally); their associated optimised supply limits (or rationing ratios) are shown in Tables 2 and 3, respectively. Unlike the static situation, for which the rationing ratio was constant, the dynamic rationing varies monthly (seasonally), reflecting the relative water abundance in the various months (seasons). Thus, as seen in Fig. 8(c) for example, the proportion of the demand supplied in the monsoon months was highest, almost approaching 100%. As the available water reduces, e.g. during the winter and pre-monsoon seasons, the proportion of the demand supplied attained its lowest value of < 80%. While the dynamic policy would seem more plausible in that it adjusts the hedging fraction to the reservoir inflow situation, it is a much more involved operating policy to develop [e.g. 13 variables for the static policy compared with 24 for the monthly dynamic policy] and use than the simple static policy.
Another feature of the dynamic scheme is that the optimised critical storage curves that trigger hedging have also responded to the reservoir inflow situation: during the low-inflow winter season, the curves lie below those for the static case, ensuring that more water is available for meeting the full demand and hence that occasions requiring >20% reductions will be few. On the other hand, during the high-inflow monsoon season, the critical curves for the dynamic policies are higher than those of static hedging, meaning that rationing will occur more frequently, albeit the cut-back amounts would be very small since the associated rationing ratios are close to unity.
The implication of this is that the dynamic policies should offer some improvement in performance over the static policy, but the question remains: by how much? The attractiveness of the dynamic policy therefore stems from its effect on system performance: a very significant improvement over the static policy would be required to justify preferring the former. Quantifying this improvement requires a simulation study of the reservoir, which is reported in the next section.

Reservoir performance
Although all five performance indices defined in Section 2.4 (see Eqs (3-7)) were evaluated in reservoir simulations forced alternately with the simulated historic and climate-change-perturbed inflow sequences, only the results for the two reliability measures (time-based and volume-based) and the vulnerability are discussed, because they are directly linked to the water shortage as noted earlier. The vulnerability is important because of the acknowledged effect of hedging on the index: if hedging is to be adjudged effective, then it must impact the vulnerability significantly. Additionally, apart from the fact that the volume-based reliability is directly linked to water shortage, the two reliability measures are routinely used in the evaluation of water resources systems and are therefore widely recognised.

Consequently, their inclusion for discussion here not only recognises their popularity but also emphasises the nature of the trade-offs between them, although Adeloye et al. (2017) recently presented an approach that harmonises the two reliability measures, effectively removing the need for such trade-offs in reservoir performance evaluation. A further attraction of the two reliability measures stems from their efficiency: for example, the results of extensive Monte-Carlo simulations recently reported by Soundharajan et al. (2016) showed that both the time-based and volume-based reliability exhibit the least variability of all the indices of reservoir performance. The volume-based reliability index shown in Fig. 10 confirms its generally higher value when compared with its time-based counterpart in Fig. 9. Additionally, it is also clear from Fig. 10 that, unlike the time-based reliability, the volume-based reliability is not as drastically affected by hedging. For example, for the simulated baseline reservoir inflow sequence, the volume-based reliability only changed from 92% (no hedging) to 89% (with static hedging). The R_v situation improved with the dynamic policies, but this improvement was too marginal to justify the complexity associated with the development and deployment of the dynamic (seasonally or monthly varying) policies. Thus, although hedging has increased the number of occasions on which system failure occurs, its effect on the ability of the system to meet the total period demand is minimal. This is because the amount by which the water supply is cut back by hedging has been optimised to be so small that, cumulatively over the entire period, the total water shortage is insignificant relative to the total period demand.
The fact that the volume-based reliability values for the different climate-change temperature and precipitation perturbations considered are indistinguishable is an indication that hedging would protect the performance of the system in meeting the total period demand, even with projected climate change. Thus, hedging can be said to represent an effective adaptive measure against climate change effects on large-scale water resources facilities.
The vulnerability (or maximum single-period water shortage) is shown in Fig. 11. As noted earlier, the primary reason for deploying hedging is to limit single-period shortages and hence temper the vulnerability of water resources systems. Also shown on the plots is the horizontal line for a vulnerability of 25% which, as remarked previously, represents the tolerable shortage limit for most water users. With no hedging, the vulnerability is high (approximately 60%) under existing conditions and intensifies to about 65% when the catchment becomes drier due to the projected reduction in precipitation under climate change. The vulnerability is tempered for wetter conditions but, even for the most benign of these, i.e. the projected 10% rise in precipitation, the recorded vulnerability was still above 47%, much higher than the 25% tolerable vulnerability threshold suggested by Fiering (1982).
The dramatic effect of hedging on the vulnerability can be seen in the evaluations for both the static and the dynamic policies. For example, for the static policy, the most striking aspect is that hedging has virtually eliminated the huge vulnerability associated with the driest climate change scenario; indeed, for all the scenarios, the vulnerability has been reduced to below the 25% threshold, from a peak of almost 65% with no hedging. This is remarkable given that a mere 17% of the demand was withheld during hedging. As was the case with the other performance indices, the vulnerability was slightly lower when the dynamic (monthly and seasonally varying) policies were deployed. This reduction in vulnerability does not justify the extra effort of developing and deploying the dynamic policies.


Conclusions
This study has developed optimised static and dynamic zone-based hedging policies for the Pong reservoir on the Beas River, India.

Highlights

- Climate change will cause deterioration in reservoir vulnerability.
- Operational changes with hedging will improve the vulnerability situation.
- Hedging rule curves with constant rationing are as effective as those with varying rationing.
- Hedging is a viable adaptive measure for climate change induced water shortage.