# The Evolution of a Bathymetric Contour Map with a Test Problem

### Summary

Developing a nice bathymetric contour map requires an understanding of a number of technologies. This section of the website describes what it takes to generate such maps.

### Introduction

The “R” system often provides many ways of doing the same thing, and the sheer number of packages and functions makes it difficult to find the ones you need. In this section I document how I went about generating such maps by creating a simple test geometry and exploring what can be done with R to create bathymetric maps.

For completeness: much of the basic work was done in the fall of 2012, and those results can be found on the deepcreekanswers.com website; here I retrace my steps.

Because I recently obtained a better shoreline boundary condition I felt compelled to redo that work with better data, and hopefully also with better insight into the capabilities of “R”. This, and other documentation, is the result of that work.

### The Data

I need two data items to demonstrate various functionalities:

1. A contour that defines the region within which bathymetric data points are available
2. A set of bathymetric data

Here is what I have created for the test problem. First the contour of a pretend cove:

```r
# Definition of the polygon to approximate the shape of a cove
x <- c(47,44,33,20,19,30,16,4,16,34,45,52,60,60,47)
y <- c(0,13,20,20,24,30,37,42,45,40,28,20,25,0,0)
z <- rep(0, length(x))   # depth is zero along the shoreline
```

Here x and y are the coordinates of a shoreline point and z is its water depth, which here is zero, meaning it is the shoreline. Even if you don’t know R, this formulation should be understandable (part of the beauty of R). A plot of this data is made by calling the plot function, with the result shown in Figure 1 (type="b" means that both the points and lines are plotted):

```r
plot(x, y, main="Simple polygon", type="b")
```

This is the simplest form. Other parameters can be specified to make it look prettier.
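For instance, a slightly dressed-up version of the same call could look like this (the labels, colors, and aspect ratio are illustrative choices of mine, not part of the original script):

```r
# A prettier version of the same plot (illustrative parameter choices)
plot(x, y, main="Simple polygon", type="b",
     xlab="Easting (ft)", ylab="Northing (ft)",
     col="blue", pch=16,
     asp=1)   # asp=1 keeps the x and y axes to the same scale
```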

Figure 1. The Outline of the ‘Test Cove’.

Next I have defined a set of bathymetry points, x/y/z values. The depths are specified as positive numbers; a depth of 7 means the water is 7 ft deep (for Deep Creek Lake this would be with respect to the height of the spillway, which is at 2462 ft ASL):

```r
# Definition of irregularly spaced -z- values inside the contour
xx <- c(48,44,33,25,22,30,16,12,16,34,46,52,58,58,32,52)
yy <- c(1,14,22,22,22,28,38,42,43,36,21,18,20,4,30,12)
zz <- c(5, 6, 6, 4, 7, 5, 4, 3, 5, 6, 8, 9, 6, 3, 5, 9)
```

The plot in Figure 2 shows both data sets:

```r
plot(x, y, type="b", main="Simple polygon with irregularly spaced values (2-D).")
points(xx, yy, pch=19, col="red")
```

The circles are the boundary points, while the red dots are water depth measurements.

Figure 2. Shoreline and Bathymetric Data.

Next I compute from the data given above some variables that will be used later on: the extent (bounding box) of the problem; a single object holding all of the data for the interpolation; the grid over which the interpolated points are to be generated; and the ‘mask’ used to hide extraneous interpolation results outside the shoreline.

```r
# Definition of the bounding box, grid, cove outline and mask

# Define the extent
xmin <- min(x)
xmax <- max(x)
ymin <- min(y)
ymax <- max(y)
xbox <- c(xmin, xmax, xmax, xmin)
ybox <- c(ymin, ymin, ymax, ymax)

# Define the data set for interpolation (add the boundaries = 0.0)
xxx <- c(x, xx)
yyy <- c(y, yy)
zzz <- c(z, zz)

# Define the grid
dxy <- 0.2
nx <- (xmax-xmin)/dxy
ny <- (ymax-ymin)/dxy
x0 <- seq(xmin, xmax, length = nx)
y0 <- seq(ymin, ymax, length = ny)

# Define the mask that removes extraneous materials from the desired plot.
# Make the bounding box a closed polygon data item and put in a dataframe
xxpoly <- c(xbox, xbox[1])
yypoly <- c(ybox, ybox[1])
q1 <- data.frame(xxpoly, yypoly)

# Make the cove boundary a closed polygon data item and put in a dataframe
xpolycove <- c(x, x[1])
ypolycove <- c(y, y[1])
q2 <- data.frame(xpolycove, ypolycove)

q3 <- hole_in_frame(q1, q2)
# To use q3: polygon(q3[[1]], q3[[2]], col="grey") # Any color
```
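The function hole_in_frame is my own helper, documented in a separate writeup. For readers without that writeup, here is a minimal sketch of one way such a helper can be built, using the even-odd fill rule of base R’s polypath function; this is an illustration, not necessarily the original implementation:

```r
# Sketch of a 'hole_in_frame'-style helper (assumed implementation):
# concatenate the outer frame and the inner outline as NA-separated rings;
# drawing them as one path with the even-odd rule leaves a hole in the middle.
hole_in_frame_sketch <- function(frame, hole) {
  list(c(frame[[1]], NA, hole[[1]]),
       c(frame[[2]], NA, hole[[2]]))
}
# q3 <- hole_in_frame_sketch(q1, q2)
# polypath(q3[[1]], q3[[2]], col="grey", rule="evenodd")
```

Note that with this sketch the mask would be drawn with polypath() rather than polygon(), since polypath() explicitly supports holes.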

Once this is set up one can call the interp function from the ‘akima’ package. It returns a matrix of interpolated values that can then be processed by the ‘image.plot’ function and put onto a plot.

Also placed on the plot are the cove outline and the locations of the data points that give depths. Following this, the contour function is called to compute and place the contours on the plot. The block of code finishes by putting a title on the graph.

```r
# Do the Akima interpolation/smoothing

# Interpolate
akima <- interp(xxx, yyy, zzz, xo=x0, yo=y0, duplicate="strip")
# Plot interpolation
image.plot(akima)
polygon(x, y, lwd=4)
# Show data points used
points(xx, yy, pch=19, col="white")
# Make contours
contour(akima, add=TRUE, labcex=0.7, drawlabels=TRUE)
title("Figure 3. Akima and image.plot and contour")
```

The way the ‘akima’ function works, like many other interpolating functions, is that interpolation is done over the ‘convex hull’ of the data: the smallest convex region that contains all of the points (like tightly wrapping a string or rubber band around the data region). The result is Figure 3.
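The hull itself can be inspected with base R’s chull() function, which returns the indices of the points that lie on the convex hull; overlaying it shows exactly which region the interpolator fills (a sketch added for illustration, not part of the original script):

```r
# Outline the convex hull of all input points (illustration)
hull <- chull(xxx, yyy)       # indices of the hull vertices, in order
polygon(xxx[hull], yyy[hull], border="blue", lty=2)
```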

Figure 3. The Output Produced by the Akima Interpolation Method.

The dark blue areas outside the cove boundary are artifacts of the interpolation method. Almost all such methods have these artifacts. They must be deleted from the plot to produce a more pleasing image.

The way I approached this is by creating an opaque mask with a hole cut in the middle that matches the geometry of the outside contour (shoreline). A separate writeup can be found here. All one has to add is the following line to the code block that produced Figure 3. The result of this application is Figure 4.

```r
polygon(q3[[1]], q3[[2]], col="grey") # Any color
```

The jagged edges of the contours are very typical Akima behavior. They reflect the requirement that the contours pass through the existing data points.

Figure 4. Final Clean Map with Akima Interpolation.

To accomplish all of the above requires the following libraries:

```r
library(fields)  # For image.plot
library(akima)   # For interp
library(rgeos)   # For setdiff()
library(MBA)     # For MBA interpolation
library(sp)      # For 'polygon' and spatial representation
```

There are other interpolation techniques besides Akima, but I have generally found it to give the most realistic results. Still, much is in the eye of the beholder, and testing against a subset of data that was not used in the creation of the map is definitely a ‘must.’ That topic belongs in a “Validation” discussion.
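As a taste of what such a validation could look like, one can withhold a few depth points, interpolate without them using interpp() (the pointwise companion of interp in the ‘akima’ package), and compare the predictions to the withheld values. This is only a sketch; the indices are arbitrary and this is not the author’s validation procedure:

```r
# Simple hold-out check (illustrative)
hold <- c(3, 8, 12)                       # arbitrary depth points to withhold
keep <- setdiff(seq_along(xx), hold)
est  <- interpp(c(x, xx[keep]), c(y, yy[keep]), c(z, zz[keep]),
                xo = xx[hold], yo = yy[hold])   # interpolate at held-out points
err  <- est$z - zz[hold]                  # residuals at the withheld points
```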

Another interpolation scheme for irregularly spaced data that I’ve used at times, and that is available in one of the “R” libraries, is the MBA scheme. Figures 5 and 6 show the raw results and the masked results. Definitely a much more pleasing representation. More accurate? Don’t know! That’s a topic to be addressed in a “Validation” report.

Figure 5. The Output Produced by the MBA Interpolation Method.

The next figure, Figure 6, is the same as Figure 5, but the area outside the cove contour has been masked out.

Figure 6. Final Clean Map with MBA Interpolation.

The code for the MBA interpolation results is shown below.
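The mba object used in this block is assumed to be produced beforehand by mba.surf() from the ‘MBA’ package; a sketch of that call on the test data, reusing the grid sizes nx and ny defined earlier, could look like this:

```r
# MBA (multilevel B-spline) interpolation of the test data (assumed call)
mba <- mba.surf(cbind(xxx, yyy, zzz), no.X = nx, no.Y = ny)$xyz.est
```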

```r
# MBA Interpolation - clean version
image.plot(mba, xlim = extendrange(xbox), ylim = extendrange(ybox),
           lwd=0.05, lty=3, xlab="Easting (ft)", ylab="Northing (ft)")
polygon(x, y, lwd=4)
# Show data points used
points(xx, yy, pch=19, col="white")
# Make contours
contour(mba, add=TRUE, labcex=0.7, drawlabels=TRUE)     # contours
title("Figure 6. MBA and clean image.plot and contours")
# Overlay the two polygons, subtracting the cove from the bounding box
# plot(setdiff(p1, p2), poly.args = list(col = "grey"), add=TRUE)
polygon(q3[[1]], q3[[2]], col="grey")  # Uses results of "hole_in_frame"
```

All this seems pretty easy, but it took some time to get to this point.

### Extracting the Contours

Requests have been made to extract the contours in digital form so that they can be displayed by other software products.

It took a little research to find out how to do that, and the result is a bit obscure.

R is an object-oriented language: an object is a data structure with attributes and methods that act on those attributes, and these data structures are called classes. A contributed set of functions (generally in the form of a package) often defines its own classes, so moving from one package to another requires translations.

The results of an interpolation are stored in a particular class, and we want to extract the contour lines from it. Fortunately the ‘maptools’ package provides a number of such translators, one of which handles lines.

The following code makes the lines available in a commonly used data structure called a ‘data-frame.’

```r
library(maptools)
x2 <- ContourLines2SLDF(contourLines(mba))   # Requires 'maptools'
library(broom)
df <- tidy(x2)    # Requires 'broom'
```

For our example problem here, using the MBA output, the result is the following data-frame content:

```
> str(df)
'data.frame':	6389 obs. of  6 variables:
 $ long : num  15.2 15.2 15.2 15.2 15.3 ...
 $ lat  : num  36.3 36.3 36.5 36.5 36.6 ...
 $ order: int  1 2 3 4 5 6 7 8 9 10 ...
 $ piece: Factor w/ 13 levels "1","2","3","4",..: 1 1 1 1 1 1 1 1 1 1 ...
 $ group: Factor w/ 55 levels "C_1.1","C_1.2",..: 1 1 1 1 1 1 1 1 1 1 ...
 $ id   : chr  "C_1" "C_1" "C_1" "C_1" ...
```

What we have is a string of long/lat coordinates (6389 pairs) organized by ‘order,’ ‘piece,’ ‘group,’ and ‘id.’

The ‘id’ identifies a given contour, which is made up of a number of pieces; each piece is given a group name, with its points ordered as defined by ‘order.’
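For example, the points of a single piece can be pulled out by filtering on ‘group’ (a sketch; the group name "C_1.1" is taken from the str() output above):

```r
# Extract the first piece of contour "C_1" and put its points in drawing order
piece1 <- df[df$group == "C_1.1", ]
piece1 <- piece1[order(piece1$order), ]
head(piece1[, c("long", "lat")])
```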

Contact me if you need this kind of information. It’ll take a little more work to put this kind of information in a more user-friendly format.

### Areas Within A Contour

Another way to extract contours, and also to calculate the areas within them, is with the package “splancs.” Using the MBA contours one can accomplish this as follows:

```r
x5 <- contourLines(mba)  # Defines the contours

library(splancs)  # Needed for the function 'areapl' below

carea <- vector()   # initialize
clevl <- vector()   # initialize
nc <- length(x5)    # Number of contour sets

for(i in 1:nc){
  m1 <- x5[[i]]           # Extract a contour data set
  clevl[i] <- m1$level    # Extract the 'level' it belongs to
  carea[i] <- with(m1, areapl(cbind(x, y)))   # Compute the enclosed area
}
dflevel <- data.frame(clevl, carea)  # Convert to a dataframe

# Sum the values for each level; levels are defined in the contour plot
aggregate(carea ~ clevl, data=dflevel, FUN=sum)
```

For this test problem the numbers it generates are as follows:

```
   clevl       carea
1      0  26.3213570
2      1 135.9901358
3      2 303.0234406
4      3 638.4058422
5      4 462.9957992
6      5 274.1022499
7      6 174.1646273
8      7 100.8513590
9      8  49.7847157
10     9  20.6734844
11    10   0.2831378
```

These are the total areas enclosed by the contours at each of the plot’s depth levels. They are shown as a histogram in Figure 7.

Figure 7. Histogram of Areas Contained Inside Contours.
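One way such a figure can be produced from the aggregated areas is with a bar chart (a sketch; the axis labels are my own, not necessarily those of the original figure):

```r
# Bar chart of total area per depth level (illustrative)
alev <- aggregate(carea ~ clevl, data=dflevel, FUN=sum)
barplot(alev$carea, names.arg=alev$clevl,
        xlab="Depth level (ft)", ylab="Area (sq ft)",
        main="Area contained inside each contour level")
```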

NOTE: The script that contains all of the code can be downloaded from here.

PLV - First Published: 9/1/2017; Revised: 9/12/2017

Bathymetry, R, Levels, Contours
© 2017 PLV Some Rights Reserved - To comment on any of my work please email me at: moc.hcetsnes@etep