HTML Notebooks

For many years I’ve been working with R, a computer programming environment that caught my interest around 2011. It was not until DNR collected bathymetric data for Deep Creek Lake in 2011, and people expressed interest in how water depth varied over the season at their boat slip locations, that I became interested in analyzing the DNR data and learned how to make bathymetric maps. I had explored enough of R to understand that it could be used to generate such maps.

This notion has extended to all of my work, to the point that all of the coding that I do is in R.

Overview

R is an open-source computer language originally developed for statistical analysis, but it has become a favorite language in many academic and private-enterprise applications, not only in statistics. Here is the main link for R. The language is well suited to processing large amounts of data, and it has outstanding capabilities for producing publication-quality graphs. R is extensible, free, and supported by a large community all over the world.

All of my work is done on a Mac running the latest version of macOS, currently (March 2018) Version 10.13.3. R is easy to use and well supported on this operating system. R offers similar capabilities on other operating systems; adapting the work there is left to the ‘student,’ though the scripts should usually run without changes.

I have started to document all of my computational efforts using Markdown, a lightweight markup language for plain text that can be transformed into web pages. This whole website is written in Markdown.

There are quite a few software packages that implement the Markdown syntax, some completely, some partially, and some with extensions. I generally use the Markdown that is part of Jekyll, but for reporting computational work I have found R Markdown to be the version I like to work with most. With R Markdown I can add chunks of R code that are executed when I produce an HTML page, so the code included on the website is ‘working’ code. This is extremely useful: I can come back to things later, modify a script, and the script and its results are directly embedded in the web page.
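As a small illustration of the idea, a minimal R Markdown file might look like the following. The title, file name, and chunk contents are hypothetical; only the YAML header and the ` ```{r} ` chunk syntax are standard R Markdown:

````markdown
---
title: "Lake Level Summary"
output: html_document
---

Water levels were read from a CSV file and are summarized below.

```{r levels, echo=TRUE}
# Hypothetical example: read lake-level data and summarize one column.
# When the page is knit, this chunk runs and its output is embedded
# in the resulting HTML.
levels <- read.csv("lake_levels.csv")
summary(levels$depth)
```
````

When the file is knit, both the code and its printed output appear in the finished page, which is what makes the published scripts ‘working’ code.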

As I noted above, my various reports are written in R Markdown from within RStudio. RStudio is an integrated development environment for R, while R Markdown is a simple formatting syntax for authoring HTML, PDF, and MS Word documents. Go here for more details on using R Markdown. R Markdown is oriented toward the R language and contains additional functionality for working with R itself. The combination of RStudio and R Markdown allows one to execute scripts as part of the documentation via “knitr,” an R package.
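The same knitting that RStudio performs with its Knit button can also be driven from the R console. The file name below is hypothetical, but `rmarkdown::render()` is the standard entry point; it uses knitr to execute the chunks and then converts the result to HTML:

```r
# Hypothetical file name; render() executes the R chunks via knitr
# and writes the finished HTML page next to the source file.
rmarkdown::render("lake_level_report.Rmd", output_format = "html_document")
```

This is convenient for re-generating a whole set of report pages in one scripted pass rather than knitting each file by hand.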

To integrate the reports written in R Markdown with this website, I have to use a slightly different approach for publishing the HTML documents generated this way, because the R Markdown/RStudio/knitr combination produces ‘finished’ HTML pages, whereas the Jekyll (now Hugo) scheme for building this website is based on ordinary Markdown files.

On this page I therefore provide links to these R Markdown/RStudio-generated HTML pages, each with a very brief summary of what it is about. Here they are (not necessarily complete as of this writing, March 3, 2018):

  1. “Working with Graphics”, First generated: 4/3/2018.

    This note discusses how graphics can be used to analyze various elements of the problem at hand. Here I show how an estimate can be derived for the nominal flow rate from the hydroelectric plant. The result will be used in WAM to determine the amount of allocation that is available.

  2. “Real Time Automated Analysis, Part 3”, First generated: 3/13/2018.

    Part 3 deals with defining the stakeholders and their requirements and the various approaches that one can use to predict the way they can be satisfied by the number of releases identified in Part 2.

  3. “Real Time Automated Analysis, Part 2”, First generated: 3/12/2018.

    Part 2 deals with transforming the lake-level data read in Part 1 and predicting how many hours of releases may be available.

  4. “Real Time Automated Analysis, Part 1”, First generated: 12/9/2016; Last Updated: 3/12/2018.

    This note describes an operational system for managing the waters of Deep Creek Lake via Deep Creek Hydro while attempting to satisfy all ‘stakeholders.’ Part 1 is about retrieving lake level data from the Deep Creek Hydro website.

  5. “Creating Bathymetric Maps of Deep Creek Lake,” Pete Versteegen, First documented: 12/9/2016; Last Updated: 3/3/2018.

    This report describes how DNR’s bathymetric measurements of 2011 were analyzed and converted to maps of Deep Creek Lake and its coves containing contours of constant depths, commonly called bathymetric maps. This covers how I did the individual coves.

  6. “The scripts used to generate the various bathymetric maps,” Pete Versteegen, First documented: 12/9/2016; Last Updated: 3/3/2018.


PLV
First Published: 3/12/2018