OilTankVision in Jeff's Basement

Raspberry Pi, Azure, and an Oil Tank from the 1970s

I’ve had a problem at home that I’ve been dealing with for years: an old oil tank that came with the house that I just can’t get rid of. It sits in my basement with pipes that run through the foundation, and it is larger than my basement door. Literally, the tank was installed BEFORE the house was finished being built. We regularly have biodiesel fuel delivered in the late autumn, winter, and early spring months to heat the house. This artifact from the 1970s irks me to no end, as I need to monitor an analog gauge to know when to re-order oil for my home.

But… I am a software developer! I know the cloud, and I think I know IoT. I wanted to build a device to make this monstrosity’s management simpler. I had the original idea for this in the summer of 2017 and wrote an article about my initial experiments using the LEADTOOLS OCR product to analyze the tank’s gauge and report the ‘real volume’ of the tank. Those initial experiments worked, but the setup required too many moving pieces to get the whole thing up and running. I needed a simpler and smaller approach…

The Plan

I had proposed a more interactive, pair-programming session with my good friend Suz Hinton for DevIntersection Las Vegas 2018. We had NO idea what we could code together, with her expertise in IoT (Internet of Things), Node, JavaScript, and the Cloud combined with my expertise in all things C#. My mind wandered back to this concept, this problem that I have been dealing with, and we hatched a plot. Suz would write the code for the Pi to take a picture and upload it to Azure, and I would write an Azure Function in C# to analyze the photo. Our friend bravecobra fortunately put together this flow chart to document how the processing works:

Oil Tank Vision Flow chart
Processing of data from photo capture through data analysis

Here is the quick version of how this would work:

  1. On a scheduled basis, using a Linux cron job, the Raspberry Pi would take a photo of the gauge
  2. The Pi would upload the photo to an Azure blob storage container
  3. An Azure Function would be triggered by the upload of the photo and send the photo to Azure Cognitive Services for analysis
  4. The results of the photo analysis would report the number in the window of the gauge, and we would store that in an Azure Table Storage record for future analysis

Here’s the fun part… you can watch us talk through and build this initial set of services. Our pair-programming session is available below. I’m the one in the hat 🙂

Building the initial OilTankVision services while on-stage at DevIntersection with Suz Hinton

We committed our source code on GitHub so that it could be tinkered with and completed for an initial deployment when I returned home. Within the next week or two, I got the Pi updated with the latest version of Raspbian Linux during one of my streams and deployed the code to take pictures. That script is in the PiScript folder of our GitHub repository, and a snippet is included here:

What I really like about this code is that it clearly defines the callbacks as separate functions, and they’re not embedded as anonymous functions like clumsy Jeff would have written.

Configuring the Raspberry Pi and deploying the photo script

I placed the script in the /usr/local/bin folder on the Pi and tested the photo capabilities of the device with my webcam attached to it. We settled on configuring a crontab entry that would schedule the photos to be captured once every 4 hours. I jammed a spare USB-power LED light into one of the extra USB ports on the Pi and things started looking GOOD. I put the Pi, the camera, and the light on a shelf on top of the tank as shown in the picture at the top of this post. The only modification I had to make was to wrap some cardboard around the back of the gauge to force the camera to focus there instead of on the wall behind the tank.
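For reference, the crontab entry looks something like the following — the script name and log path are illustrative:

```shell
# m h dom mon dow  command — capture a gauge photo every 4 hours
0 */4 * * *  node /usr/local/bin/takephoto.js >> /var/log/oiltank.log 2>&1
```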

Configuring the Cloud

With the Pi configured, I deployed my C# Azure functions so that they could start processing the images that would be uploaded. During this process, I learned a few things:

  • Azure Functions are simple applications that execute based on several configurable triggers. In my C#-driven case, my function was a static method that should run when a new photo arrives.
  • Azure Functions can be configured with a “consumption plan”. This means you are charged only for each time they run.
  • Azure Functions can be triggered by the introduction of a new blob in a container. Neat idea, as the upload of a file to Azure Blob Storage (basically an FTP server with more bells and whistles) will trigger a function to execute.
  • You must specify the connection to the blob storage container in your C# code in order for it to trigger properly. Without this connection information, your function does not know what to monitor and start processing.

I finished configuring and deploying the C# code to Azure a few days later and recapped everything on this stream:

Completing the deployment of the Azure Functions to support OilTankVision

My code to support this function, which processes the image from the Pi, looks like this:
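This is a hedged sketch of the function’s shape — the function name, property names, and the weather helper are illustrative (see the repository for the real code); the attributes and the SendToRecognizeTextApi task are as described below:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

// Illustrative record shape; the real OilTankReading is in the repository
public class OilTankReading
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public int Gallons { get; set; }
    public double OutdoorTemperature { get; set; }
}

public static class GaugeFunctions
{
    [FunctionName("GaugeReadingAnalysis")]          // name is illustrative
    [StorageAccount("OilTankStorage")]
    [return: Table("OilTankReadings", Connection = "OilTankStorage")]
    public static async Task<OilTankReading> Run(
        [BlobTrigger("gauges/{name}", Connection = "OilTankStorage")] Stream photo,
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing new gauge photo: {name}");

        // Kick off both lookups at once: OCR the gauge photo, and
        // fetch the current outdoor weather
        var ocrTask = SendToRecognizeTextApi(photo);
        var weatherTask = GetCurrentWeather();
        await Task.WhenAll(ocrTask, weatherTask);

        // Merge the two results into a single table record
        return new OilTankReading
        {
            PartitionKey = "gauge",
            RowKey = name,
            Gallons = ocrTask.Result,
            OutdoorTemperature = weatherTask.Result
        };
    }

    // Stubs for the sketch — the real implementations call the Azure
    // Cognitive Services Text Recognition API and weatherbit.io
    private static Task<int> SendToRecognizeTextApi(Stream photo)
        => Task.FromResult(0);
    private static Task<double> GetCurrentWeather()
        => Task.FromResult(0.0);
}
```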

Lots going on in this one, and I’m not going to jump into the subroutines. I’ll leave that as an exercise for you, dear reader.

The attributes at the top of this C# function declare to the Azure Functions runtime that the OilTankReading object returned from the method will be stored in Table Storage, in a table called “OilTankReadings”, using the connection configured in the Azure Function settings called “OilTankStorage”. The FunctionName attribute gives this function a more readable name in the Azure console, and the StorageAccount attribute… well, as far as I can tell from my experimentation, it doesn’t really do anything.

Remember I mentioned the arrival of a photo would trigger this function? That’s the BlobTrigger argument that you see: it receives an object from the gauges container (think of that as the directory the file is uploaded to), captures the name of the file using the {name} notation, and passes it in the name parameter to the Run function.

I have two tasks that run simultaneously: one called “SendToRecognizeTextApi” that analyzes the value using the Azure Cognitive Services Text Recognition service, and a second that fetches the current outdoor weather around my home from weatherbit.io. When these two tasks complete, I merge the two values into a single result object. But first…

Some tricky geometry to get an “absolute value”

Yeah, measuring the oil level in a tank with a float is NOT an exact science. Using image recognition technology isn’t an exact science either, and it may not always work… but it tells me EXACTLY where it detects the number in the photo. With a little geometry and some math, I can get a more precise value than just “150 gallons” in the tank.

Picture of gauge with indicators for height and position of digits
By measuring the height of the digit, the height of the window, and the top of the digit, we can approximate the volume reported

I know that the digits in my gauge photos are exactly 50px in height (the photo above is resized) and that the window they appear in happens to be 100px in height, so I can use the position of the digit detected by Azure to calculate a correction factor to apply to the digits reported. The measurement needle for this gauge is in the center, so if a digit is at the top of the window, I would add 5 to the value, because the higher values are lower in the gauge. Similarly, if the digit was at the bottom, I would subtract 5 from the reported value. You can see the 160 value peeking into the window in the photo above.
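The correction boils down to linear interpolation between those two extremes. Here is a sketch of the math — written in JavaScript here for brevity (the real version lives in the C# function), with the constants taken from the measurements above and the 10-gallon spacing implied by the 150 and 160 markings:

```javascript
// Correct the reported gallons based on where the detected digit sits
// in the gauge window. Constants match my gauge photos: digits are 50px
// tall, the window is 100px tall, and adjacent gauge markings (150,
// 160, ...) are 10 gallons apart.
const WINDOW_HEIGHT = 100;
const WINDOW_CENTER = WINDOW_HEIGHT / 2;
const DIGIT_HEIGHT = 50;
const GALLONS_PER_MARK = 10;

function correctedGallons(reportedGallons, digitTopY) {
  // Center of the detected digit, measured from the top of the window
  const digitCenterY = digitTopY + DIGIT_HEIGHT / 2;
  // How far a digit's center can drift from the needle line at the
  // window's center: 25px of drift maps to half a mark (5 gallons)
  const maxDrift = WINDOW_CENTER - DIGIT_HEIGHT / 2;
  const offset = (WINDOW_CENTER - digitCenterY) / maxDrift;
  return reportedGallons + offset * (GALLONS_PER_MARK / 2);
}
```

A digit sitting flush at the top of the window yields +5, flush at the bottom yields -5, and a digit centered on the needle yields no correction.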

Reporting… PowerBI to the rescue!

Great, now I’ve got a bunch of data sitting in an Azure Table storage account. How should I present it? Write an ASP.NET application? Build a Xamarin application for my iPhone? Nope nope nope… PowerBI!

I really enjoy using PowerBI to build reports and dashboards across all kinds of strangely shaped data, and I knew it would be up to the task. I created a new report, added my Azure Table account, and immediately saw my data available for querying. So simple… but then the real fun began.

I created a table with a hyperlink that takes me right back to the image that generated the readings, and added a trend part to show the current volume in the tank with a trend line. Finally, I added a fairly standard line graph to show the full history of tank readings. Not bad, and it shows me what I need on my desktop:

My uber-simple PowerBI dashboard for readings from my oil tank

Here’s the part I LOVE about PowerBI… I added a Mobile view that I could use on my phone. Same dashboard, same data, same widgets… just arranged nicely for a phone and readable with the PowerBI app. I arranged the widgets and published the dashboard to the PowerBI service and now I can get this data when I want it on my phone:

Mobile view of the same dashboard

The only thing left to configure was the refresh schedule: the online PowerBI dashboard needed to be told when to refresh its data from Azure Table storage. I added scheduled refresh entries in the configuration of the OilTank data source on PowerBI.com for every four hours, to coincide with when the photos are captured:

Refresh settings for the OilTank PowerBI dashboard. Note that the refresh times are limited
to x:00 or x:30 intervals.

Summary

There you have it… I can now see the “exact” volume of fuel left in my home heating tank, and I have the data on my phone when I want to peek in at it.

BONUS: With the photo links in the table of readings on my dashboard, I can click through to any photo that was captured of the tank.

I think this is just the start of hacking and analyzing this data. I can generate notifications when the tank gets low or an unexpected amount of fuel is used in a given timeframe. I could activate and deactivate the Raspberry Pi’s photo capture process based on the outdoor temperature. I could forecast the amount of fuel that I will need for the week ahead and automate an email to my fuel delivery service requesting a delivery.

What do you think? Are there features that I could add to this? Am I missing some analysis? Comment below and let me know what you think. I’ll tinker more with this project on my Twitch stream at twitch.tv/csharpfritz and we can talk about possible features. Download the source code for OilTankVision and send a pull request with your ideas. I’m always happy to review and comment on stream.

Stay warm this winter! I know I will, because I now ALWAYS know how much heating fuel I have left.