In this brief note I describe the analyses performed to determine the water depth of Deep Creek Lake at the locations of boat slips, and report specifically the percentages of boat slips that would fall dry at lake levels of 2458 ft, 2457 ft, and 2456 ft. This is an important issue for people at the southern end of the lake, since those areas are generally shallower (a reflection of the land contour that existed before the lake was created; Deep Creek flowed from the southern end to where the dam currently stands). A couple of different methodologies were used. The results are inconclusive.
Since the bathymetric data were collected densely over specific transects (see, for example, the analyses shown here and here), it was found in the past that not all of the more than 600,000 data points were necessary to describe the lake bottom adequately. Hence the percentages were calculated with several different subsets of the bathymetric points.
The starting points are the bathymetric data obtained by DNR in 2012, the boat slips digitized from Google Earth imagery in 2012 (3,377 boat slips), and the 2462 ft shoreline contour obtained more recently, in 2017. All of the bathymetry points for which water depths were collected can be found here; the boat slip coordinates can be found here, while the shoreline contour is provided here. All coordinates are in the State Plane Coordinate System.
“R” was used to perform the analyses. As it turned out, this was a relatively simple problem to solve. After the data were read in, the bathymetry points were merged with the shoreline contour, the latter assigned zero depth to conform with the DNR bathymetric data, which are referenced to the 2462 ft ASL elevation, also the height of the spillway. Then the ‘interpp’ function from the akima package was called in the following manner:
# Akima interpolation, evaluated at the points of interest;
# interpp() returns a list, with the interpolated depths in component $z
library(akima)
z.slip <- interpp(x, y, z, x.slip, y.slip, duplicate="strip")$z
where x, y, and z are the eastings, northings, and depth values of the bathymetry, and x.slip and y.slip are the coordinates of the digitized boat slips. The result, stored in z.slip, contains the computed depth at each boat slip. The percentages of boat slips shallower than the 2458 ft, 2457 ft, and 2456 ft lake levels were computed with the following three expressions:
n.slips <- length(z.slip)   # total number of boat slips
(z.2458 <- 100.0 * length(z.slip[z.slip < (2462-2458)]) / n.slips)
(z.2457 <- 100.0 * length(z.slip[z.slip < (2462-2457)]) / n.slips)
(z.2456 <- 100.0 * length(z.slip[z.slip < (2462-2456)]) / n.slips)
Each expression counts the slips whose interpolated depth (referenced to the 2462 ft full-pool level) is less than the drawdown to the indicated lake level, and converts that count to a percentage; the outer parentheses simply print the result.
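As a toy illustration of the counting logic, using made-up depths rather than the survey data:

```r
# five hypothetical slip depths (ft), referenced to the 2462 ft full pool
z.slip  <- c(1.5, 3.0, 4.5, 6.0, 8.0)
n.slips <- length(z.slip)

# slips with less than 2462 - 2458 = 4 ft of water at full pool would sit
# above the 2458 ft lake level; here that is 2 of 5 slips, i.e. 40 percent
(z.2458 <- 100.0 * length(z.slip[z.slip < (2462 - 2458)]) / n.slips)
```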
Three sets of bathymetric data were used: 1) every 100th data point; 2) every 20th data point; and 3) all data points. This choice is based on experience gained in creating bathymetric maps: not all points were generally needed to produce a decent map. The percentages of boat slips that would fall dry at the indicated lake levels are shown in the tables below.
The first run used the Akima interpolation algorithm with the ‘linear=TRUE’ option, which produces essentially a TIN-based bivariate linear interpolation of the data. The results are as follows:
Table 1. Percentage of boat slips that would fall dry at the indicated lake level, using the Akima bi-linear algorithm.

| Elevation (ft) | every 100th point | every 20th point | all points |
|---|---|---|---|
| 2458 | 10.9 | 2.2 | 1.7 |
| 2457 | 20.1 | 9.8 | 8.3 |
| 2456 | 37.5 | 25.6 | 23.7 |
While a linear TIN is effective for representing surfaces for visualization, it is generally considered the least accurate of the available methods, particularly in this simplest, most common linear form. Other interpolation algorithms, Kriging and Loess among them, were attempted in my previous bathymetric work, with mixed results. Akima’s bivariate cubic spline interpolation for irregularly spaced data (‘linear=FALSE’), also TIN-based, often comes out on top, at least for the type of data I usually work with.
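The cubic spline variant differs from the earlier call only in the ‘linear=FALSE’ argument. A minimal, self-contained sketch, using synthetic data rather than the lake bathymetry, looks like this:

```r
library(akima)

# synthetic irregular "bathymetry" for illustration only
set.seed(1)
x <- runif(200)
y <- runif(200)
z <- sin(pi * x) * cos(pi * y)

# evaluate Akima's cubic spline at two interior query points
p <- interpp(x, y, z, xo = c(0.3, 0.6), yo = c(0.4, 0.7),
             linear = FALSE, duplicate = "strip")
p$z   # interpolated values at the two query points
```

As noted below, on the full data set this spline variant did not always run to completion for me.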
The results with Akima’s cubic spline method are summarized below:
Table 2. Percentage of boat slips that would fall dry at the indicated lake level, using the Akima cubic spline algorithm.

| Elevation (ft) | every 100th point | every 20th point | all points |
|---|---|---|---|
| 2458 | 22.3 | 3.0 | x |
| 2457 | 31.2 | 8.8 | x |
| 2456 | 45.4 | 22.3 | x |
x: the run would not complete; the interpolation terminated with an error.
Another algorithm that I have used in the past is the MBA method, which returns points on a surface approximated from a bivariate scatter of points using multilevel B-splines. See this article for a more detailed analysis of its behavior. The results with the MBA algorithm are summarized below:
Table 3. Percentage of boat slips that would fall dry at the indicated lake level, using the MBA algorithm.

| Elevation (ft) | every 100th point | every 20th point | all points |
|---|---|---|---|
| 2458 | 39.4 | 9.7 | 0.3 |
| 2457 | 53.4 | 21.3 | 3.0 |
| 2456 | 64.4 | 32.5 | 9.2 |
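The MBA runs follow the pattern below; this is a hedged sketch with synthetic data (not the lake bathymetry), using the ‘mba.points’ function of the MBA package, which takes an n-by-3 matrix of scattered observations and a matrix of evaluation coordinates:

```r
library(MBA)

# synthetic scattered data, for illustration only
set.seed(2)
xyz <- cbind(x = runif(300), y = runif(300), z = rnorm(300))

# two points at which to evaluate the multilevel B-spline surface
xy.est <- cbind(c(0.50, 0.25), c(0.50, 0.75))

est <- mba.points(xyz, xy.est)
est$xyz.est[, 3]   # interpolated z values at the two points
```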
As can be noted, the results are sensitive to the algorithm that creates the surface of the lake bottom. Based on personal experience, the Akima cubic spline approach is the more faithful one. The MBA method produces much nicer graphs, but does a lot of smoothing. Clearly, more work is required to have greater confidence in the numbers generated, and that work would, in all likelihood, involve additional depth measurements.