A Water Allocation Model (WAM)

Summary

This note discusses various aspects of forecasting how water from Deep Creek Lake should be allocated equitably among its various demands and stakeholders. A methodology named WAM, for Water Allocation Model, was developed, and historical data were analyzed with it. Not surprisingly, the historical records reflect actual consumption, since all stakeholders had already been taken into account.

1. Introduction

In a separate section of this website, a methodology was developed that can analyze the operations of the Deep Creek Hydro Project without a priori knowledge of groundwater flows and rainfall, taking the project as it was actually operated by Brookfield during a given year.

No tool has yet been developed to optimize the allocation of the Deep Creek Hydro Project's water releases to its stakeholders. This report examines WAM's potential as a forecasting tool, in a heuristic sense, by conducting “gedanken” (thought) experiments to assess feasibility. No methodology is going to be perfect and satisfy everyone all the time, but it is entirely possible to have a time-varying prediction of water releases that all stakeholders should find adequate for planning their respective activities.

Basically what we’re looking at is the following:

  1. Brookfield, the operator of the hydro-electric facility, would like to maximize the water release from the lake at times when electricity rates are favorable (meaning highest) and/or whenever their contract with PJM, the allocator of Deep Creek Hydro’s power production, demands it.
  2. People around the lake would like to be able to use their boats from say May 1 through September 30, perhaps October 31. Few people leave their boats in the water after these dates because of the deterioration of weather conditions.
  3. The people in Friendsville would like to be able to schedule white water releases at least several times a week from say June 1 through September 30 and perhaps more on certain holidays.
  4. To enhance the survival of the brown and rainbow trout population in the Youghiogheny River, the fishermen would like the water temperature to stay below 25 °C at the Sang Run bridge at all times (more on this topic elsewhere).
  5. ASCI, the white-water venue on top of Marsh Mountain, would like to extract water from the lake when needed.
  6. The Wisp golf courses would like to extract water, when needed, to keep the two golf courses green.
  7. Thousand Acres would like to withdraw water from the lake to irrigate its golf course when needed.
  8. All of this has to happen with whatever rainfall there is during the year in the Deep Creek Watershed.

Thinking about this, one observes that there are a number of conflicting demands. What is good for one set of stakeholders may be undesirable for another. But all depend on the amount and timing of the rainfall.

So, to me, the purpose of a water budget is to develop and exercise a methodology that optimizes the allocation of the waters of Deep Creek Lake among its various stakeholders in the best way possible. Because stakeholder demands are unsynchronized, there is no guarantee that any given demand can be met satisfactorily.

Because not everyone is going to be satisfied at every instance, the methodology must be fair, on average, to all parties. The answer to “What’s fair?” can surely be debated.

This also means that forecasting both supply and demand is going to be important: supply in the sense that water comes into the lake via direct rainfall and groundwater flows, while demand means the allocation of water from the lake to the various stakeholders. Supply is set by nature; demand is man-made.

2. Analysis

Here is an outline of the process of water allocation as I visualize it:

  1. Obtain the current lake level reading from the existing gage. (NOTE: There should really be a redundant water-level gage at another location, to be used when the primary gage is out of service for whatever reason, which happens occasionally!)
  2. Compute the available volume of water based on the current value of the lower rule band. The current volume is based on the stage-storage relationship previously defined as a result of bathymetric work performed in 2012. See another of my reports and a rule-of-thumb analysis.
  3. Compute the future daily water needs.
    • Compute the white-water needs defined by a ‘suggested’ schedule in the Permit.
    • Estimate the amount of evaporation and remove it from the available water volume. Note that this can only be grossly estimated, although, with a little experimental effort, a more accurate relationship can be developed for the lake with historical data and made adaptive with future data.
    • Estimate a TER schedule that may be required. This can be done in a simple manner, using long-range forecasts, or statistically, based on historical data. A recent paper, “Brown trout thermal niche and climate change: expected changes in the distribution of cold-water fish in central Spain,” suggests a much simpler and more accurate approach for forecasting a TER need. I strongly believe that TERs are not necessary.
  4. If there is still volume available, then Brookfield is allowed to generate power, with whatever schedule is dictated by their PJM contract or their own internal procedures.
  5. The algorithm should be revisited every day. This would automatically incorporate any water coming into the lake via rain and/or groundwater.
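
The daily cycle outlined above can be sketched in code. The sketch below is in Python, and the stage-storage table, rule-band level, and demand figures are hypothetical placeholders, not actual Deep Creek Lake numbers:

```python
# Sketch of the daily allocation steps above. All numbers here are
# hypothetical placeholders, not actual Deep Creek Lake values.

def interpolate(x, table):
    """Linear interpolation in a table of (x, y) pairs sorted by x."""
    if x <= table[0][0]:
        return table[0][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return table[-1][1]

# Hypothetical stage-storage table: (lake level in ft, stored volume in acre-ft)
STAGE_STORAGE = [(2455.0, 0.0), (2458.0, 8000.0), (2462.0, 25000.0)]

def available_volume(level_ft, rule_band_ft):
    """Usable volume today: storage above the lower rule band."""
    return max(0.0, interpolate(level_ft, STAGE_STORAGE)
                  - interpolate(rule_band_ft, STAGE_STORAGE))

def daily_allocation(level_ft, rule_band_ft, whitewater_af, evap_af, ter_af):
    """Subtract committed needs; whatever remains may go to power generation."""
    vol = available_volume(level_ft, rule_band_ft)
    committed = whitewater_af + evap_af + ter_af
    return {"available": vol, "committed": committed,
            "power": max(0.0, vol - committed)}
```

Rerunning `daily_allocation` every morning with the fresh gage reading automatically folds rain and groundwater inflows into the available volume, which is the point of step 5.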

All of this seems rational, but item (3) is the kicker. Even if one can do these items to a decent approximation for a couple of weeks into the future, the most likely result is a shortfall of water, especially during the early part of the season when there is still a lot of demand to be satisfied. This approach basically assumes that water will be coming into the lake during the forecasting period, based on a combination of weather forecasts and historical data. Water comes in because it rains, and water comes in from groundwater flows and creeks. We just don’t know exactly how much and when, but one can make a reasonable approximation and learn about its performance. Unfortunately, this cannot be tested exactly with historical data.

While certain stakeholders are working tirelessly to have the waters at the Southern end of the lake dredged to gain significant additional water depth, chances for funding such an expensive proposition are slim, at least at this time.

The proposed water allocation methodology described in this report, based on the water budget approach described elsewhere, is simple and adaptive to changing climatology and water depths.

So the problem becomes one of defining a forecasting strategy for water coming into the lake.

3. Approach

The first tenet of the proposed approach is that Condition 19 of the white-water schedule in the current MDE permit with Brookfield [2] is done away with entirely as a mandatory condition. Instead, it should only be used as guidance.

Secondly, the TER protocol should be done away with; if not, its methodology needs to be revamped completely, because the current output produces far too many errors for such an inherently simple problem. Given current analytical tools, instrumentation, and studies in the literature, a much more reliable procedure can be developed, so that the basic TER notion could still largely prevail. I have argued elsewhere that a slow release, once a day on forecasted warm days, would be far more effective in preserving the fish population in the Youghiogheny River.

To reinforce this notion, the purpose of both the white-water and TER releases must be reexamined in terms of their practical and economic implications for the town of Friendsville, the County, and the State.

The proposed approach, WAM, hinges on the facts that:

  1. The current water level in the lake is measured by Brookfield, as it is today
  2. A lower rule band (maybe some fine tuning?) is set that cannot be violated under ANY circumstance
  3. A few conditions that must be met (white water, rule band, TER(?))
  4. A set of desirable outcomes (water levels for recreation; discretionary power)

can be used to determine the timing and duration of all releases on a daily basis.
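
The split between MUST conditions and desirable outcomes can be made concrete. A minimal sketch in Python, where the rule-band level and the scoring weights are assumptions for illustration only:

```python
# Sketch of separating hard (MUST) conditions from desirable outcomes.
# The rule-band level and scoring weights are illustrative assumptions.

RULE_BAND_FT = 2458.0  # hypothetical lower rule band

def hard_constraints_ok(level_after_release_ft, mandatory_releases_made):
    """MUST conditions: the rule band is never violated and all
    mandatory (white-water, TER) releases have been made."""
    return level_after_release_ft >= RULE_BAND_FT and mandatory_releases_made

def score_outcome(level_ft, power_hours):
    """Soft score in [0, 1]: reward recreational lake level and
    discretionary power generation equally."""
    recreation = min(1.0, max(0.0, (level_ft - RULE_BAND_FT) / 4.0))
    power = min(1.0, power_hours / 8.0)
    return 0.5 * recreation + 0.5 * power
```

A candidate release schedule would first be filtered by `hard_constraints_ok` and only then ranked by `score_outcome`; the weights are exactly where the “What’s fair?” debate would be settled.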

Note that rain, creek flows, ground water flows and lake surface evaporation [3] are automatically accounted for by just considering the water level on the day of an assessment, and these assessments should be done daily.

The MUST conditions will probably relate to certain mandatory white-water releases, such as for Memorial Day, July 4th, the “Friendsville Upper Yough Annual Team Race”, “Gauley Week” and TER(?) releases. As stated earlier, a TER release should be considered provisional upon a new TER model and protocol, because the existing one has too many false forecasts. A new TER model is being investigated [4].

All other white-water releases and discretionary events are subject to the predictions made with WAM. TER releases are also subject to the WAM as long as the expected exceedance of 25˚C is less than a certain amount and duration (to be specified).

I visualize the WAM operating nearly autonomously, meaning it runs on a continuous basis, activated automatically every day at, say, 4 am (this time is arbitrary and perhaps needs more thought). WAM would generate for the Deep Creek Hydro operator a daily listing of possible release schedules (certain and tentative releases).

The autonomous method of operation has largely been demonstrated. The uncertainties still lie in the forecasting assumptions and certain development details that will be explained as everything unfolds here.

4. Data Acquisition

To make a WAM analysis possible, there are various data needs, all accessible via the Internet. The following should be downloaded from various (to be designated) data sources from the Internet prior to the analysis:

Current weather conditions and short- and long-term forecasts; the specific location (and alternative location(s)) from which these are obtained is yet to be determined. If this fails, historical data can be used to make the forecasts.

River flows at Oakland and at other locations (to be determined, if necessary).

The river water temperature measurements at the Sang Run bridge and at other locations (to be determined, if necessary).

The next step is the execution of a serialized set of calculations (at 5 am). These applications are not expected to take more than a few minutes of computer time, and downloading the required data from other sources can certainly be done in less than one hour:

  1. Analyze the downloaded data from the Internet and convert them for use by subsequent programs/scripts. This includes considering mandatory releases and other proposed releases.
  2. Compute the expected temperature at the Sang Run river bridge for today and forecasts for the next week or two. Set appropriate flags to define if a release is to occur and what the chances are for releases in the next few days, or perhaps even weeks.
  3. Compute the number of days that the current stored water can be used to satisfy proposed releases.
  4. Issue the appropriate notifications (phone messages, emails, text messages, etc.; all can be easily automated).
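
Step 3 of this sequence, counting how many days the stored water covers the proposed releases, can be sketched as follows in Python (the demand figures in the example are hypothetical):

```python
# Sketch of step 3: walk forward through the proposed daily demands
# until the available storage runs out. All volumes are in acre-ft.

def days_of_storage(available_af, daily_demands_af):
    """Number of consecutive days the current storage can satisfy
    the proposed daily releases."""
    remaining = available_af
    days = 0
    for demand in daily_demands_af:
        if remaining < demand:
            break
        remaining -= demand
        days += 1
    return days
```

For example, 1,000 acre-ft of available storage against four proposed 300 acre-ft release days covers only the first three of them.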

The WAM will alert the Brookfield operator daily whether to conduct a release at a certain time and for a certain duration or specify a “no release.” One of the questions to be investigated is: “How far into the future can we make the forecasts?”

A simple schematic of the overall process is shown in Figure 1:

Figure 1. Simple Schematic of the Predictive WAM.

The WAM should be able to provide a forecast much like a forecast of cloudy conditions and rain, meaning it does so with a certain probability. The probabilistic approach may take a few years to optimize, but good estimates based on past performance should already be doable, and be a consistent improvement over current conditions.

It’s important that most of the methodology will be accessible to third parties so that studies can be conducted to optimize the forecasting capabilities.

5. Development Aspects

To develop and subsequently validate a methodology requires observations of various parameters.

The most important set of data are those that ensure that the methodology works as expected. Furthermore, if we can’t validate against past records, then no matter how ‘sexy’ the methodology is, it’s useless.

Ideally, one should validate the methodology with data that is not ‘contaminated’ by existing protocols for water releases, such as white-water, power generation, excessive rain, or TER releases. We don’t have that flexibility.

In the meantime, for development purposes, we need the same detailed level of lake-level and power-generation data for the years 2011 through 2016 as is being collected now. Fortunately, I recently acquired, courtesy of Jeff Leeks, detailed (mostly 10-minute interval) lake levels and generator status as ‘scraped’ from the Deep Creek Hydro website. This data is made available on the deepcreekscience.com website under the “DataVault -> Lake Levels” menu item. An almost complete record set of similar data, via a similar ‘scraping’ process, was obtained by the author of this paper for just 2012, and similar data are also actively being collected today.

Given that all data are contaminated by a history of releases, and hence lake-level changes, one has to devise an artificial data set that can be used for testing purposes. With the availability of detailed records, one could roll back some of the releases that occurred one or two weeks ahead of a given analysis day by pretending to reverse the flow during that period, and hence get something akin to a ‘proper’ starting lake level.

For example, suppose we’re on May 6 and are doing our forecasts. We know the lake level on that day at the time of the analysis (perhaps taking an average of the last hour). Suppose we’re after a prediction period of two weeks. All releases that occurred during those two weeks could be rolled back into increased lake levels on the days the releases were made, and those lake levels used as part of the available storage.
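
This rollback idea can be sketched in Python: each recorded release is converted back into an equivalent rise in lake level. A constant surface area is assumed here for simplicity, and the figure used is a hypothetical placeholder; in practice the area varies with stage:

```python
# Sketch of rolling recorded releases back into an 'uncontaminated'
# starting lake level. The constant surface area is a hypothetical
# placeholder; in reality the area varies with lake level.

LAKE_AREA_ACRES = 3900.0  # assumed constant surface area

def rolled_back_level(observed_level_ft, releases_af):
    """Add each release back as an equivalent lake-level rise
    (acre-ft divided by acres gives feet)."""
    return observed_level_ft + sum(releases_af) / LAKE_AREA_ACRES
```

With a proper stage-area relationship substituted for the constant, the same one-line accounting applies.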

The error that is introduced is the actual change in the rate at which groundwater and creeks flow into the lake. Groundwater flows depend not only on what is in the ground, but also on lake levels themselves. Fortunately, there are only a few creeks flowing into the lake, all with relatively small flow rates. This error should hence be relatively small.

Whether this will be necessary is not certain at this time, but it is a way to generate a, more or less, truer set of operating conditions.

Weather conditions are another set of important parameters. Short-term and long-term forecasts are expected to play a role in the envisioned methodology, especially concerning TER releases. Whether such records are available historically is uncertain. WeatherUnderground does not store past forecasts but does produce forecasts for any latitude/longitude combination. This can be tapped into for Deep Creek Lake.

There are various weather stations around the lake from which real-time ambient winds and temperatures can be obtained, but solar radiation/cloud cover, which is expected to be an important parameter for better TER determinations, is not measured anywhere nearby, although it is a relatively easy measurement to make. We need solar radiation measurements!

6. The Plan

Since we have 10-minute interval data for lake levels and generator status for a six-year period, these can be used to develop the methodology. In other words, the development of WAM can get started now.

Although we have “generator status” information in the form of generators ON or OFF, it says nothing about whether both turbines are operating or whether they are operating with the wicket gates less than fully open. This is further complicated by not knowing, with reasonable precision, the discharge rate at full-power operation.

However, this net situation can be determined by looking at the USGS river flow gages. A release shows up in these gages as a spike. The height of the spike is related to the net effect of one or two turbines operating and the setting of the wicket gates, which is all we need to determine the total amount of water being released. This has been examined in a separate report [5].
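
The spike idea reduces to a small calculation: integrate the gaged flow above its baseline over the duration of the spike. A sketch in Python, using a synthetic flow series rather than actual gage data:

```python
# Sketch of estimating the released volume from a USGS gage record:
# integrate the flow above baseline over the spike. The series below
# is synthetic, not actual gage data.

def release_volume_cuft(series_cfs, baseline_cfs, dt_seconds):
    """Total released volume in cubic feet: flow above baseline (cfs)
    summed over readings spaced dt_seconds apart."""
    return sum(max(0.0, q - baseline_cfs) for q in series_cfs) * dt_seconds

# Synthetic 15-minute readings: baseline 100 cfs with a two-reading spike
spike = [100.0, 600.0, 600.0, 100.0]
volume = release_volume_cuft(spike, 100.0, 900.0)  # cubic feet released
```

The baseline itself can be taken from the flow just before the spike, which sidesteps the unknown turbine/wicket-gate configuration entirely.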

Several approaches come to mind for weather forecasts.

  1. Use one or more weather forecasting services to define the amount of rain, evaporation, possible TERs
  2. Develop a groundwater recharge model
  3. Use historical data in some form for TERs, rain, recharge, evaporation

Let’s consider each of these possibilities.

7. Weather Forecasts

One important aspect of WAM is getting reliable forecast data. Around Deep Creek Lake there are a number of weather stations that all report their data to WeatherUnderground. On another website I’ve listed the stations that report. WeatherUnderground uses a proprietary methodology called BestForecast, which is tailor-made for the location one desires a forecast for. Hence, using Deep Creek Lake coordinates, their methodology will consider the weather data archived at WeatherUnderground by the ten Deep Creek Lake stations.

Using R, the forecast is easily retrieved:

library(jsonlite)

# URL with forecasts for the Deep Creek Lake area (coordinates: 39.5117N, 79.3156W)
url <- "https://api.darksky.net/forecast/9779d36aa95b45df62a1cfd2a884e65d/39.5117,-79.3156"

# Read the URL and convert the JSON response to a data frame
document <- fromJSON(txt = url)
document

# Save the forecast to disk, then read it back for later manipulation
write_json(document, "../results/todays_forecast.txt")
doc <- read_json("../results/todays_forecast.txt", simplifyVector = TRUE)
doc

First, the URL is specified for the location from which forecast data are to be extracted. The data is read in JSON format and saved to disk, from which it can be read back and manipulated. For example, part of the forecast for rain comes out as:

  precipIntensityMaxTime precipProbability precipType temperatureHigh  
1             1510239600              0.43       rain           45.23  
2             1510293600              0.28       snow           25.48  
3             1510444800              0.17       snow           31.97  
4             1510509600              0.23       snow           40.52  
5             1510563600              0.32       rain           40.98  
6             1510668000              0.22       rain           39.30  
7             1510768800              0.29       rain           38.20  
8             1510866000              0.44       rain           42.56  

Note that the forecast includes a whole lot more data, including an hourly forecast of a variety of parameters for the next day.

This data source requires a bit more investigation on how to incorporate the extracted data into a usable precipitation forecast. The point is that there is a reliable source of weather forecast data. See here for a simple model.
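
One plausible way to turn a forecast row like those above into a usable number is a probability-weighted direct-rain inflow volume. A sketch in Python; the lake surface area is a hypothetical constant, and the inputs mirror the `precipProbability` column shown above:

```python
# Sketch of converting a daily precipitation forecast into an expected
# direct-rain inflow volume on the lake surface. The lake area is a
# hypothetical constant; in reality it varies with stage.

LAKE_AREA_SQFT = 3900.0 * 43560.0  # assumed area of 3900 acres, in square feet

def expected_rain_inflow_cuft(precip_probability, precip_inches):
    """Probability-weighted rain depth (ft) times lake surface area."""
    return precip_probability * (precip_inches / 12.0) * LAKE_AREA_SQFT
```

Summing this quantity over the forecast horizon gives a first, crude estimate of the direct-rain term in the daily water budget.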

8. Ground Water Flows

Direct precipitation on the lake is only one source of water for the lake. The other, and probably the larger of the two, is inflow from groundwater, whether from streams, springs at the bottom of the lake, or direct inflow along the shoreline. That this kind of recharge occurs can be seen easily when the turbines have been running for a while and are then turned off during a period with no recent rain. Figure 2 demonstrates this groundwater recharge: when the turbines stop operating, one observes a rise in the lake level, due to water from groundwater discharge.

Figure 2 is a snapshot of a period of operation in 2014 (Jan 1-May 15). At the bottom of the plot is shown when the generators are operating. During the early part of this period one can clearly see the recovery taking place after generator operation, although only partly, because there is only a finite volume of water stored in the ground. Subsequently, one may observe increases of the water level, probably caused by rain (a future version of this chart will also show rainfall), as well as decreases because of generator operation and replenishment by recharge.

Figure 2. A Demonstration of the Occurrence of Discharge/Recharge.

It should be obvious that lake recharge cannot go on forever in the absence of rain. Hence the rate of recharge is a function of the number of prior dry days and the rain intensity prior to the start of the dry period. Given that we have a lot of data, one can probably, with some effort, develop a fairly accurate recharge model. Given the available manpower, only a simple model will be used here, one that is elaborated upon in this section of the website.
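
The simple recharge model alluded to can be sketched as an exponential drawdown: groundwater inflow scales with the rain that fell before the dry spell and decays with each consecutive dry day. The coefficients below are made-up placeholders that would have to be fitted to the 10-minute lake-level data:

```python
import math

# Sketch of a simple groundwater-recharge model: inflow scales with the
# rain before the dry spell and decays exponentially with dry days.
# The coefficients k and scale are placeholders to be fitted to data.

def recharge_af_per_day(prior_rain_inches, dry_days, k=0.15, scale=120.0):
    """Estimated groundwater inflow (acre-ft/day) after dry_days
    rain-free days following prior_rain_inches of rain."""
    return scale * prior_rain_inches * math.exp(-k * dry_days)
```

The decay constant `k` is exactly what the recovery curves visible in Figure 2 would let one estimate.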

9. Implementation

As of today, October 6, 2017, the plan is to implement the considerations described above. This is not a trivial exercise, although not that difficult either. The difficulty is mostly in defining some kind of data set that can be used for testing purposes and that is relatively ‘uncontaminated.’

10. List of References

  1. P. Versteegen, “Morgan’s Water Budget Model,” DCL219, April 2, 2017.
  2. Current Permit (01-Jun-2011) - BROOKFIELD POWER PINEY & DEEP CREEK LLC - GA1992S009(08), MDE, http://mde.maryland.gov/programs/Water/Water_Supply/Pages/DeepCreekLakePermitVersions.aspx
  3. P. Versteegen, “Evaporation from Deep Creek Lake,” DCL043, December 12, 2013.
  4. P. Versteegen, “A Youghiogheny River Temperature Model,” DCL221 (under development).
  5. P. Versteegen, “Processing the USGS River Gage Data,” DCL222, April 19, 2017.

Author: PLV
First Published: 9/7/2017
Last Updated: 11/9/2017
Script Collection: NONE
Ref:
FILE: 2017-04-08-A Water Allocation Model (WAM).md
Adapted for this website: 9/21/2017


© 2017 PLV Some Rights Reserved - To comment on any of my work please email me at: moc.hcetsnes@etep