Internet of Trees – Soil Saturation Monitor Using Particle, Azure, and Power BI


The Internet of Things is all about employing physical sensors on internet-connected devices to gather information about the physical world, typically storing that information in the cloud for archival or post-processing (e.g., aggregation or machine learning).  Some of the more innovative projects in this space solve very interesting problems (the Nest Learning Thermostat) or connect something typically disconnected from the internet (the RainMachine Wi-Fi Sprinkler System), resulting in an overall improvement in efficiency, productivity, and / or utilization.

In this project, we will use a Particle Photon, powered by solar energy with a battery backup cell, to push soil saturation data for six tree zones into the Azure cloud for visualization in Power BI.  The goal is to monitor the availability of water for these trees and eventually optimize the watering pattern, using the collected data to ensure optimal growth.

The project will familiarize you with the following IoT-related concepts:

  • Powering a device using solar power and battery backup
  • Best practices for designing testable hardware interfaces
  • Code tricks for reducing power utilization to a minimum
  • Limiting current draw from active circuitry
  • Building a resilient weather-proof enclosure
  • Storing sensor data into the cloud for archiving and processing
  • Visualizing data in Power BI

Building out systems like this has become much simpler thanks to breakthroughs in small connected prototyping devices that build on tried-and-true specifications (Arduino).  For example, take a look at some of the innovative devices offered by Particle.  They specialize in creating Wi-Fi / GSM connected, Arduino-compatible microcontrollers that can be programmed through a cloud interface.  Not only that, the products they produce are extremely resilient.  In fact, I have 3 devices running in my home that have been operational without any need for maintenance over a 6 month period.

These resilient, web-programmable devices can be augmented with cloud connectivity.  Using three lines of code and a custom library, we can connect our Particle device to the Azure cloud for storage in a Mobile Service.  These Mobile Services may be deployed in a single click and offer immediately scalable database infrastructure, a REST API, identity management, and the ability to provide push notifications to mobile devices.  Essentially, a full production backend in a box, except you leave the management up to the provider, allowing the developer to focus on code.  Finally, we bring in visualization through Power BI, a set of tools created specifically for aggregating large datasets on-prem or in the cloud.






1.) Powering Particle Photon using Solar Power and Battery Backup

Disclaimer: I have no idea how electricity actually works, I’m a software developer first =)

To power the Photon device using solar power, I opted for the solution employed in Dan Fein’s Particle Photon Weather Station project.  It’s pretty straightforward: connect your SparkFun Sunny Buddy to an appropriate solar cell (Volts * Amps = Watts) and a battery source (larger capacity = better ability to run without sun).
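For example, with illustrative numbers rather than the exact parts used here: a panel rated at 6 V and 0.5 A delivers roughly 6 V * 0.5 A = 3 W in full sun, which should be more than enough for the Sunny Buddy to trickle-charge a small LiPo between readings.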

2.) Best practices for designing testable hardware interfaces

When building circuitry, especially circuitry designed with permanent placement in mind, it is imperative to create a testable interface.  What this means is that I can test the circuit outside of its place of intended operation.  To do this, I used the Pi Cobbler to create a plug-and-play interface.



This allowed me to create a test environment for the completed project.  Notice that I am connecting the interface to two probes, one in a cup of moist soil and another in a cup of dry soil, and monitoring the readings as they are sent into an Azure Mobile Service.


3.) Code tricks for reducing power utilization to a minimum

When testing the device, it became clear that I could extend its life when solar power is unavailable (nighttime, cloudy days, etc.) by employing certain programming tricks.  The first was taking readings at a modest interval and putting the device to sleep in between.  Since we are dealing with soil readings, I found a reading every 30 minutes was sufficient.  In the meantime, I put the device to sleep using System.sleep(SLEEP_MODE_DEEP, ReportAndSleepIntervalInSeconds); which allows the device to operate at a mere 200 uA while in the sleep state.  I also discovered that a good bit of power is used in making web requests.  In my initial code, I was sending one reading per zone, but found that by consolidating all of the data into a single request, I could obtain much longer operation times from the battery.
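To make the pattern concrete, here is a minimal sketch of the approach; the pin assignments, event name, and payload format are illustrative rather than the exact code running on my monitor:

//Illustrative sketch: one consolidated publish per wake, then deep sleep
const int ReportAndSleepIntervalInSeconds = 30 * 60; //one reading every 30 minutes
const int ZonePins[6] = { A0, A1, A2, A3, A4, A5 };  //one analog input per tree zone

void setup()
{
    //nothing to configure for this example
}

void loop()
{
    //Build one consolidated JSON payload instead of one web request per zone
    String payload = "{";
    for (int i = 0; i < 6; i++)
    {
        payload += "\"zone" + String(i + 1) + "\":" + String(analogRead(ZonePins[i]));
        if (i < 5) payload += ",";
    }
    payload += "}";

    Spark.publish("soil", payload); //a single request carries all six zones
    delay(2000);                    //give the publish time to flush

    //Deep sleep (~200 uA) until the next reading; the Photon resets on wake
    System.sleep(SLEEP_MODE_DEEP, ReportAndSleepIntervalInSeconds);
}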

4.) Limiting current draw from active circuitry

  It makes sense to disable active circuitry (the soil moisture monitors) while not in use.  By supplying voltage to the soil moisture monitors from pin D7 on the Photon, I can turn this active circuitry on and off for readings in code, without any additional hardware.

//Turn Saturation Sensors On
digitalWrite(D7, HIGH);
//Turn Saturation Sensors Off
digitalWrite(D7, LOW);
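
Putting those calls into context, a complete read cycle might look like the following sketch; the signal pin, settling delay, and serial output are illustrative:

const int SensorPowerPin = D7;  //supplies power to the soil moisture monitors
const int SensorSignalPin = A0; //analog output of one sensor (illustrative)

void setup()
{
    Serial.begin(9600);
    pinMode(SensorPowerPin, OUTPUT);
    digitalWrite(SensorPowerPin, LOW); //keep the sensors off by default
}

void loop()
{
    digitalWrite(SensorPowerPin, HIGH); //Turn Saturation Sensors On
    delay(100);                         //short settling time before sampling
    int saturation = analogRead(SensorSignalPin);
    digitalWrite(SensorPowerPin, LOW);  //Turn Saturation Sensors Off

    Serial.println(saturation);
    delay(5000); //in the real project this would be the sleep interval
}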

5.) Building a resilient weather-proof enclosure

I found that the local electronics store in town stocks gasket-sealed polycarbonate enclosures, which are perfect for outdoor electronics projects.  Make sure that your enclosure utilizes a gasket seal and that any wiring into / out of the enclosure can be sufficiently closed off.  In my design, I was able to run a ribbon cable out of the enclosure by sanding down a portion of the outer lip.


6.) Storing sensor data into the cloud for archiving and processing

For this, I leverage the AzureMobileService library for publishing data from the Particle to an Azure Mobile Service.  Simply create a new Mobile Service, then under the Data section in the Azure Portal for your service, click to add a new table named “Data” and leave all settings at their defaults.  On the Photon side, import the “AzureMobileService” library, then add your Mobile Service name and Application Key to the MYSERVICE and MYKEY variables in your code.
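If you are curious what the library is doing under the hood, an insert into the “Data” table boils down to an HTTP POST against the Mobile Service’s table endpoint with the Application Key in the X-ZUMO-APPLICATION header.  The following is a rough sketch using the HttpClient library; the hostname, key, and field names are placeholders, and this is not the AzureMobileService library’s actual API:

#include "HttpClient/HttpClient.h"

//Placeholders: substitute your Mobile Service hostname and Application Key
String MobileServiceHost = "";
char MobileServiceKey[40] = "MYKEY";

HttpClient http;

void postReading(const char* json)
{
    http_header_t headers[] = {
        { "X-ZUMO-APPLICATION", MobileServiceKey }, //Application Key header
        { "Content-Type", "application/json" },
        { NULL, NULL }                              //always terminate with NULL
    };

    http_request_t request;
    http_response_t response;
    request.hostname = MobileServiceHost;
    request.port = 80;
    request.path = "/tables/Data"; //insert a row into the "Data" table
    request.body = json;, response, headers);
}

void setup()
{
}

void loop()
{
    postReading("{ \"zone1\": \"1850\", \"zone2\": \"2300\" }"); //example reading
    delay(30 * 60 * 1000); //one consolidated insert every 30 minutes
}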


7.) Visualizing data in Power BI

Once we get the data into Microsoft Azure, it becomes very simple to create a visualization by leveraging Power BI.  To be honest, I found the tool to be very intuitive, but you may wish to consult the Getting Started documentation.  After connecting to my data source, I simply dragged and dropped components until my data looked appropriate.  The whole process took around 15 minutes with no prior training on the tool.  The best part: not only is my data available for viewing in the Power BI web dashboard, there are also Windows Store, iOS, and Android apps available for perusing the data as well!



Creating this system was really interesting, mainly due to my curiosity around the sub-problems above.  Using the code tricks mentioned earlier, I have been able to run the soil monitor system for over a month without recharging!  In addition, I now have a cross-platform way of viewing my data across devices that took less than 30 minutes to set up.

The data itself is interesting.  We had two recent floods that I was able to view as spikes within my dashboard, which let me know when to resume watering my trees through my RainMachine sprinkler system.  However, I have noticed that readings do change as temperature changes.  This is due to temperature coefficient effects and could be explored further.

I want to stress that while the data being gathered in this domain may not be your cup of tea, the techniques may prove valuable for other projects.  I recently assisted three teams who placed at the recent TAMUHack hackathon using techniques outlined in this post (storing data in an Azure Mobile Service, visualization in Power BI).

Hopefully this post has inspired a few hackers out there!  Let me know what ideas you might have in the comments.  Until next time, Happy Hacking!



Windows 10 IoT Core Breathalyzer


Windows 10 IoT Core was created to build powerful solutions on low-powered devices, with the potential to bridge the physical world into the nearly endless power of the cloud.  It offers a world of synergistic opportunity within the Microsoft ecosystem, which includes technologies such as Visual Studio, .NET, and Microsoft Azure.  All of this is available on a variety of devices including the ever-popular Raspberry Pi 2, a $35 computer with 1 GB RAM, a quad-core processor, a 4-port USB hub, and HDMI out.  And get this: all of this comes packed on a machine about the size of a credit card and a little over 1.5 inches thick.  This little computer also boasts a GPIO (General Purpose Input / Output) header, which allows you to take readings from the physical world and even drive physical devices that create actions in the real world.

Okay, you get the picture: big opportunity, small package, can perceive and affect the physical world…  How can this change your life, or the world for that matter?

That all depends, what is the problem that you wish to solve?

In this case, I decided to pay attention to a problem that is encountered very often, especially on college campuses and in the realm of night-life entertainment.  Have you ever known someone to have too much to drink, to the point where it might pose a problem for themselves or others?  We have kiosks that display arcade games and such, but that really just passes the time.  What if we had an internet-connected device that could track your alcohol consumption?  We could identify those who are at risk, or on their way to becoming a risk to themselves and/or others, and intervene.

In this project, we will look at creating an internet-connected breathalyzer with cloud-based reporting and logging.  From a technical perspective, this project will introduce us to using an analog sensor for detecting alcohol and a standalone LCD screen for prompting a user with instructions.  We will tie information gathered from the sensor to an external interface that logs results and displays them locally, while also storing them in an Azure Mobile Service.

You may have heard that Windows 10 IoT Core runs Universal Windows Platform apps, meaning that our code should potentially run on any device that supports the Windows 10 Core APIs, including Windows Mobile, Xbox One, Windows 10 desktop, and potentially HoloLens.   As such, it should be able to support UWP controls offered by third parties.  We are going to include a charting package from Syncfusion to prove that this is in fact possible.

Note: Accuracy of this device is not guaranteed.  Any replication of this device should be considered a novelty and not a substitute for more scientifically accurate or legally acceptable methods of measurement.

Software Prerequisites:



Circuit Layout:
In this picture, the screen is located to the far left, the analog sensor (orange circle thing) in the center, followed by the MCP3208 ADC chip.


RPi2 Pinout:
The pinout diagram will assist in connecting the components of the project


Connecting the ADC:

By design, the GPIO pins on the RPi2 are digital-only, but we can read from analog sensors using the MCP3208 analog-to-digital converter chip. We need to connect the following pins (left side is the MCP3208, right side is the Raspberry Pi):

  • DGND -> GND
  • CS/SHDN -> SPI1 CS0
  • DIN -> SPI1 MOSI
  • CLK -> SPI1 SCLK
  • AGND -> GND
  • VREF -> 5V
  • VDD -> 5V

Connecting the MQ2 sensor to the ADC :


With the ADC wired up, we can now connect our analog sensor to one of the channels on the ADC to convert its analog signal into a digital signal that can be read by the RPi2:

  • Signal (A0) -> CH0 (ADC)
  • – -> GND (Pi)
  • + -> 5V (Pi)

Connecting the 1.8″ TFT Color Display Module:


This piece is optional and the code will work regardless of whether this device is connected.  We need to connect the following pins (left side is 1.8″ TFT Color Display Module, right side is Raspberry Pi):

  • VCC -> 5V
  • GND -> GND
  • SCL -> SPI0 SCLK
  • SDA -> SPI0 MOSI
  • RS/DC -> GPIO 12
  • RES -> GPIO 16
  • CS -> SPI0 CS0



The full project and code are contained as a sample within the Microsoft iot-devices project on GitHub.  You may wish to modify the member variables within MainPage.xaml.cs.


As mentioned, this project is intended for novelty purposes only, as it simply measures alcohol concentration in the air sampled by the MQ2 sensor.  This does not equate to the popular Blood Alcohol Content measurement.  For more information on how to implement that measure, take a look at this research from nootropicdesign.

Modifying this project to support completely different scenarios:

Technically, this project could be modified to report any analog reading over some predefined ambient threshold.  By simply swapping out the MQ2 sensor for, say, a light sensor or soil saturation sensor, you could monitor sunlight and saturation in a garden.  An audio sensor would allow you to determine if city ordinance audio levels are being violated in a controlled area.  You could even put in a pressure sensor and build a strongman game like you see at carnivals.

Publishing Data to Azure Event Hubs from Particle Core using Webhooks

Spark, creators of the most excellent Spark Core device, are now known as Particle!  Please keep in mind as a reader that the following information applies to the Spark Core device as well as the Particle Core.

In a previous post I discussed sending messages from a Spark Core device to Azure Event Hubs by means of an Azure Mobile Service proxy.  That solution raises a variety of security concerns and is a bit cumbersome to implement.  It was then revealed to me that Spark (now Particle) offers a WebHooks service that can trigger a request to a remote endpoint.  I had the pleasure of working with David Middlecamp from Particle’s engineering team to create an extension for enabling Azure Event Hubs through this service.  In this post, I will guide you through setting up a Particle WebHook capable of sending data to an Azure Event Hub.



  1. Create an Azure Event Hub and configure a Shared Access Policy with the Send permission enabled.
  2. Install the Particle CLI tool on your operating system of choice (you may skip the portion on enabling DFU-Util).
  3. Create a new file named webhook.json with a structure similar to the example shown after this list (see also the Particle Webhooks documentation).
Note: There is a max send size of 255 characters from the Particle Core device, so please keep this in mind when naming variables!  Also, the "azure_sas_token" is very important, as it is used server-side by the WebHooks service to appropriately forward to your Event Hub REST API.
  4. Launch the Particle CLI tool (open a CMD prompt on Windows), type "particle login", and log in with your credentials.
  5. Navigate to the folder containing webhook.json and type "particle webhook create webhook.json".
  6. Verify your webhook was created with "particle webhook list".
  7. Now, in the Particle Web IDE, you can send data to your Azure Event Hub using Spark.publish("NAME_OF_YOUR_EVENT", payload);
  8. Verify your data is sending appropriately from the Particle CLI tool by running "particle subscribe mine".
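For reference, a webhook.json for this scenario looks roughly like the following.  Treat this as an illustrative skeleton only: the event name, namespace, hub name, and policy values are placeholders, and the exact field layout (particularly "azure_sas_token") should be checked against the Particle Webhooks documentation for the Azure extension.

{
  "event": "NAME_OF_YOUR_EVENT",
  "url": "",
  "requestType": "POST",
  "azure_sas_token": {
    "key_name": "YOUR_SEND_POLICY_NAME",
    "key": "YOUR_SEND_POLICY_KEY"
  },
  "json": {
    "data": "{{SPARK_EVENT_VALUE}}"
  },
  "mydevices": true
}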

Conclusion: The Particle Core is an excellent device for microcontroller prototyping, especially where web connectivity is required.  I have found my Particle Core devices to be extremely resilient, remaining 100% operational after running non-stop for months at a time.  The device is even smart enough to reconnect if the network goes out.  As a result, I believe this device to be an excellent contender in the Internet of Things space.  Furthermore, now that the device supports Azure Event Hubs via WebHooks, one can relatively easily craft a scenario involving up to 1 million messages coming in per second for processing via Microsoft Azure.  For a full working example, check out this implementation of Particle WebHooks in the open-source ConnectTheDots project from MSOpenTech.

Streaming Xbox One Games to Windows 10 Preview – Tutorial

Earlier today, a variety of exciting announcements were made by Microsoft at this year’s E3 gaming conference regarding Xbox One.  These include backwards compatibility with Xbox 360, playing Xbox One games on Oculus Rift, and streaming games to Windows 10 devices!  I was unaware that this was already available until it was alluded to in a tweet from XboxQwik.  Having the Windows 10 Preview bits already installed, I decided to explore and wound up figuring out how to enable Xbox One streaming to your Windows 10 device!




Please keep in mind, Xbox One to Windows 10 streaming is currently in preview and subject to change.  For more information, check the official Xbox streaming FAQ.  It is very exciting to see the direction the Xbox team has taken with integrating into Windows 10.  I can’t wait to see the final product when Windows 10 is officially released in late July!  Feel free to post your findings and experience in the comments!

Happy Hacking and Game On!

RPi + WinPhone + MS Band + Azure + Excel + Audio-Controlled LEDs = Hot Tub Time Machine From the Future

Hot Tub Time Machine from The Future – Music Entertainment System

The Internet of Things and Houston weather have one thing very much in common: they are sooooo hot right now!  Inspired by this, I have been thinking a lot about outdoor projects that interact with the cloud, for example my recent Spark Core powered hot tub monitor.  This trend is only just now beginning to take off, with plenty of exciting projects forming in the space, including Rachio’s IoT sprinkler system and this most excellent homebrew soil monitor running on an Intel Edison.  These examples highlight how we can operate on data to produce interactions and inferences which apply to the physical world.  This, I believe, is the core of IoT’s ability to change our lives in the future.

I propose that if the Internet of Things is the future, then projects which incorporate it bring the future to those “things” involved.  Deriving from my personal passion for music and entertaining, I decided to explore how IoT could assist in amplifying those passions.  As a software developer, there is no better feeling than creatively applying our talent to produce extensions of our interests which serve to enhance our experience.   Today’s project combines an array of seemingly disparate technologies to produce a voice-controlled music entertainment system, paired with flashing lights and a good old cloud-enabled Excel report for analyzing playback data.   I call it, “Hot Tub Time Machine From the Future”.



All code with instructions on use and configuration can be found in the MusicNet repository.  Simply follow the instructions in the Readme and deploy the Windows Phone project to your device.

How it works:

We leverage the PiMusicBox project to turn the Raspberry Pi into a network-enabled jukebox.  This project is amazing and allows for playback from a variety of sources including YouTube, Spotify, SoundCloud, etc., in addition to SMB shares and local files.  After installing and configuring PiMusicBox, simply plug in some speakers and anyone on your home network can access the device by IP or using the “musicbox.local” hostname.  We then modify the Last.FM Scrobbler plugin on the PiMusicBox to push the playback result into an Azure Mobile Service table.  We can then connect to this data source via Excel and provide a variety of visualizations by using a pivot table over Artist and TrackName.

The Windows Phone app connects to the Mopidy service running on PiMusicBox to allow API-level access for controlling things like Pause, Play, Next Track, etc.  Using the speech API on the phone, we define a series of voice commands that can launch the app from Cortana and speak to the PiMusicBox through the aforementioned Mopidy service.  As a result, this just works from the Microsoft Band with no modification needed, because the Band supports Cortana out of the box!

Finally, the blinking lights connect into the audio controller and are mounted.  Make sure to place the LED audio controller within reasonable proximity to the speaker system connected to the Pi.


Voice-controlled music playback with blinky lights and Azure-powered Excel reporting is awesome!  Now the idea is how to take it further!  What if we took the result of the current playing track and displayed it along with current listener satisfaction on a projector of sorts allowing for dissatisfied listeners to upvote or downvote in real-time?  What if Cortana controlled the hot tub itself?  What if the music genre changed depending on the temperature of the hot tub?  What if the “Hot Tub Time Machine” knew what the best music was for the mood based on weather data?  Feel free to leave your suggestions in the comments.  Until next time, Happy Hacking!


Spark Core + DS18B20 + Azure Event Hubs = IoT Hot Tub / Pool Monitor



Summer is here in Houston and there is no better time to get in the water to cool off or warm up.  I am particularly fond of the latter, especially with a good group of friends, food, and refreshments.  The problem is, it can be hard to tell when the hot tub is ready without actually getting in or looking at a temperature gauge.

Enter the Spark Core, a Wi-Fi enabled microcontroller that can be programmed remotely from a web-based IDE.  I absolutely love this device for its ease of programming, portability, and resilience.  Let me reiterate on resilience: I have successfully run my Spark Core devices for weeks at a time with zero downtime.  I think these things are pretty much impervious to breakdown so far.  To give a visual output of the temperature, I added a Spark Button device to glow a specific color in relation to the current reading.

I have had a lot of fun getting this device to work with the ConnectTheDots project from MSOpenTech.  This project allows for connecting a variety of sensors, either directly or through a gateway, into Azure Event Hubs for real-time Streaming Analytics processing.  The results are then displayed in an Azure web portal.  I absolutely love this project for its ability to walk through a serious end-to-end IoT solution, and I have enjoyed both contributing and delivering IoT Dev Camps based on the project.

Build your Own!

The hot tub monitor solution leverages my previous post on “Sending messages to Azure Event Hub with Spark over AMS API Proxy“.  All code has been contributed to the ConnectTheDots project.  To replicate, simply follow the instructions in the Spark+DS18B20 documentation and insert the probe into your pool or hot tub!  The only modification I made was to use a portable 5V battery source, which could be greatly improved by using a 5V solar cell.


He’s heating up!


He’s on FIRE!








You may be wondering, is this really useful?  I actually ended up using the device over the weekend, and was happy to see the ring turn red, indicating to guests that the hot tub was in fact ready!  This no doubt prompted a lot of questions and spawned some discussion on improving the hot tub experience further.  Stay tuned for an update on what we plan to implement.  A few hints: it involves a Microsoft Band, a Windows Phone app, a Raspberry Pi, and an audio-responsive LED driver!


Microsoft IoT DevCamps Announced!

Microsoft has recently announced a series of Microsoft IoT DevCamps taking place across the United States through May and June, with more being planned (stay tuned here for updates, or check the official announcement).


Date       Speaker           Locale

5/12/15    Paul DeCarlo      Chicago
5/15/15    Stacey Mulcahy    New York
5/29/15    Bret Stateham     Sunnyvale


We just wrapped up the first event in Chicago, Illinois with an excellent group of industry professionals and IoT enthusiasts.  The content is very exciting, as it leverages connecting the popular Raspberry Pi 2 + Arduino + WeatherShield and the .NET Gadgeteer to an Azure Event Hub for real-time processing and visualization of data through Streaming Analytics and an Azure Websites front-end.  This is a truly hands-on lab where we outfit attendees with the hardware and Azure services to walk through a complete end-to-end IoT solution in the cloud.  The lab content comes from the ConnectTheDots project from MSOpenTech.



Introducing IoT


Hands-On Development


Raspberry Pi + Gadgeteer Kits


Rasp Pi + Arduino



Dots Dots Dots Dots Dots Dots Dots Dots Dots Dots Dots Dots Dots Dots Dots Dots Evvvvvveryboddddddy!

Discover how to use Microsoft Azure services as a full, end-to-end IoT solution.  We’ll put Event Hubs, Streaming Analytics, and Websites to the task of presenting data from a variety of hardware devices.

Sending messages to Azure Event Hub with Spark over AMS API Proxy

In this article, I will describe how to publish data from a Spark Core to an Azure Event Hub for real-time processing using Azure Mobile Services as a message proxy.

Spark OS is a distributed operating system for the Internet of Things that brings the power of the cloud to low-cost connected hardware.  Spark provides an online IDE for programming a Wi-Fi enabled Arduino-like device known as the Spark Core.  Azure Event Hubs is a highly scalable publish-subscribe ingestor that can intake millions of events per second, so that you can process and analyze the massive amounts of data produced by your connected devices and applications. Once collected into Event Hubs, you can transform and store data using any real-time analytics provider or with batching/storage adapters.

To begin, I took the approach of using the Event Hubs REST API’s Send Event operation.  This seemed straightforward: simply create a request with the appropriate request headers over HTTP, as both HTTP and HTTPS are mentioned as supported in the documentation.  However, when sending this request over HTTP with the necessary “Authorization” header, I received “Transport security is required to protect the security token”.  This poses a bit of a problem, as the lightweight Spark device is unable to perform the computations necessary to send SSL requests.

Azure Mobile Services to the rescue!

I first created a new Azure Mobile Service with a JavaScript backend:

1- Create Mobile Service

1.5 - Create Mobile Service

Next, I created a new API within the service named “temp”:

2 - Create API


Finally, create an Azure Service Bus with an Event Hub by following the instructions in Hypernephelist’s “Sending data to Azure Event Hubs from Node.JS using the REST API“.


The idea being that I could successfully send data to the Mobile Service API as documented in Brian Sherwin’s “Wiring Up the Spark Core To Azure”, then forward this data to the Event Hub using the information provided in Hypernephelist’s “Sending data to Azure Event Hubs from Node.JS using the REST API“.  Essentially, we are creating a proxy via Azure Mobile Services to get data from the Spark into an Azure Event Hub.


Let’s begin by building the Event Hub proxy in the “temp” API.  This API will require custom Node.JS packages that can be installed by following Redbit’s “Using Custom NodeJS Modules with Azure Mobile Services“.  Follow the instructions and be sure to run an npm install for https, crypto, and moment, as these are required to generate the SAS key for sending data through the Event Hub REST service.

The actual API code is below (with heavy reliance on Hypernephelist’s example).  You will need to modify this by editing it in the API editor within the Azure portal, or by modifying it on disk after cloning per Redbit’s instructions.  Be sure to edit the namespace, hubname, my_key_name, and my_key variables with the appropriate values from your Azure Event Hub.

3 - Azure API Editor

/************************Begin AMS Code**********************/

var https = require('https');
var crypto = require('crypto');
var moment = require('moment'); = function(request, response) {

    function sendTemperature(payload) {
        // Event Hubs parameters
        var namespace = 'EVENTHUBNAMESPACE';
        var hubname = 'EVENTHUBNAME';

        // Shared access key (from Event Hub configuration)
        var my_key_name = 'KEYNAME';
        var my_key = 'KEY';

        // Payload to send
        //payload = "{ \"temp\": \"100\", \"hmdt\": \"78\", \"subject\": \"wthr\", \"dspl\": \"test\"," + "\"time\": " + "\"" + new Date().toISOString() + "\" }";

        // Full Event Hub publisher URI
        var my_uri = 'https://' + namespace + '' + '/' + hubname + '/messages';

        // Create a SAS token (see the article referenced above)
        function create_sas_token(uri, key_name, key) {
            // Token expires in one hour
            var expiry = moment().add(1, 'hours').unix();

            var string_to_sign = encodeURIComponent(uri) + '\n' + expiry;
            var hmac = crypto.createHmac('sha256', key);
            hmac.update(string_to_sign);
            var signature = hmac.digest('base64');
            var token = 'SharedAccessSignature sr=' + encodeURIComponent(uri) + '&sig=' + encodeURIComponent(signature) + '&se=' + expiry + '&skn=' + key_name;

            return token;
        }

        var my_sas = create_sas_token(my_uri, my_key_name, my_key);

        // Send the request to the Event Hub
        var options = {
            hostname: namespace + '',
            port: 443,
            path: '/' + hubname + '/messages',
            method: 'POST',
            headers: {
                'Authorization': my_sas,
                'Content-Length': payload.length,
                'Content-Type': 'application/atom+xml;type=entry;charset=utf-8'
            }
        };

        var req = https.request(options, function(res) {
            //console.log("statusCode: ", res.statusCode);
            //console.log("headers: ", res.headers);

            res.on('data', function(d) {
                //console.log(d.toString());
            });
        });

        req.on('error', function(e) {
            console.error(e);
        });

        // Write the payload and complete the request
        req.write(payload);
        req.end();
    }

    // Forward the JSON body posted by the Spark to the Event Hub
    sendTemperature(JSON.stringify(request.body));
    response.send(200, { result: 'sent' });
};

/************************End AMS Code**********************/

Finally, we need to set up the Spark Core with the appropriate code to push data to the API in our Mobile Service. I leveraged the HttpClient library, as it has great logging features for debugging and is a bit easier to wield than Spark’s lightweight TCPClient. I also import SparkTime.h to generate the timestamp for messages from the Spark itself. Simply flash this code to your Spark device, taking care to appropriately modify the AzureMobileService, AzureMobileServiceAPI, AzureMobileServiceKey, and deviceName variables. Note that the payload sent in this particular example corresponds to the expected payload in the Connect the Dots project from MSOpenTech. This implies that there will soon be support for the Spark Core in this amazing project!

4 - Spark Editor

/************************Begin Spark Code******************/

// This #include statement was automatically added by the Spark IDE.
#include "HttpClient/HttpClient.h"

// This #include statement was automatically added by the Spark IDE.
#include "SparkTime/SparkTime.h"

String AzureMobileService = ""; // your Mobile Service hostname
String AzureMobileServiceAPI = "APINAME";
char AzureMobileServiceKey[40] = "MOBILESERVICEKEY";
char deviceName[40] = "SparkCore";

UDP UDPClient;
SparkTime rtc;
HttpClient http;

void setup()
{
    Serial.begin(9600);
    rtc.begin(&UDPClient, "");
    rtc.setTimeZone(-5); // gmt offset
}

void loop()
{
    unsigned long currentTime =;
    String timeNowString = rtc.ISODateUTCString(currentTime);

    char payload[120];
    snprintf(payload, sizeof(payload), "{ \"temp\": \"76\", \"hmdt\": \"32\", \"subject\": \"wthr\", \"dspl\": \"%s\", \"time\": \"%s\" }", deviceName, timeNowString.c_str());

    http_header_t headers[] = {
        { "X-ZUMO-APPLICATION", AzureMobileServiceKey },
        { "Cache-Control", "no-cache" },
        { NULL, NULL } // NOTE: Always terminate headers with NULL
    };

    http_request_t request;
    http_response_t response;
    request.hostname = AzureMobileService;
    request.port = 80;
    request.path = "/api/" + AzureMobileServiceAPI;
    request.body = payload;, response, headers);

    Serial.print("Application>\tResponse status: ");
    Serial.println(response.status);

    Serial.print("Application>\tHTTP Response Body: ");
    Serial.println(response.body);

    delay(10000); // wait before sending the next reading
}

/************************End Spark Code********************/

Voila! I am able to verify my Spark is appropriately forwarding data to my ConnectTheDots portal!

5 - CTD Portal

We can also verify / debug by connecting to our Spark Core over serial and monitoring the output of HttpClient:

6 - Putty
I absolutely love developing on the Spark device due to its simplicity to update and its convenient online IDE. Now, with the power of Azure, we can analyze data coming from one of these devices in real time!

You can find the latest code included in this project at the DXHacker/SparkEventHub repo on GitHub.

Training Kinect4NES to Control Mike Tyson’s Punch-Out!

Kinect4NES @ HackRice – First Time Player Knocks out Glass Joe

In a previous post, I talked about how to create an interface to send controller commands to an NES based on interaction with the Kinect v2.  The idea was successful, but I received a bit of feedback that the control was less than optimal, and a suggestion that it would likely work well with a game like Mike Tyson’s Punch-Out.

This posed an interesting challenge: could I create a control mechanism that would allow me to play Mike Tyson’s Punch-Out using Kinect4NES with enough stability to accurately beat the first couple of characters?

Let’s first look at how control was achieved in the first iteration of Kinect4NES.  There are essentially two ways of reacting to input on the Kinect: a heuristic-based approach using relatively inexpensive positional comparisons of tracked joints, or gesture-based tracking (either discrete or continuous).  For my initial proof of concept, I used the following heuristic-based approach:


Taken from CalcController(Body body) in MainWindow.xaml.cs

* DPad from Calc

var dpadLeft = ((leftWrist.Position.Y > mid.Position.Y - 0.20) && (leftWrist.Position.X < mid.Position.X - 0.5));
var dpadRight = ((rightWrist.Position.Y > mid.Position.Y - 0.20) && (rightWrist.Position.X > mid.Position.X + 0.5));
var dpadUp = ((leftWrist.Position.Y > head.Position.Y) || (rightWrist.Position.Y > head.Position.Y));
var dpadDown = ((spineBase.Position.Y - knee.Position.Y) < 0.10);
var start = ((head.Position.Y < shoulder.Position.Y));


As you can see, this is a basic approach that just compares current joint positions; if a condition is satisfied, it activates that controller input.

Ideally, we would like natural body movements to drive our interaction with Mike Tyson’s Punch-Out.  To begin, we need to familiarize ourselves with the way the game is controlled by the NES controller.  I was lucky enough to come across a copy of the game at a local flea market around the time this project idea was forming in my head, the same one where I had found boxed NES controllers a couple of weeks earlier.  I found an online manual which described the various game inputs and used these as a basis for defining my gestures.


-)  : Dodge to right
(-  : Dodge to left
DOWN: Once: block
      Twice rapidly: ducking

--- Left body blow (B + UP = Punch to left face)
|    -- Right body blow (A + UP = Punch to right face)
|    |
B    A

(When Mac is knocked down, press rapidly and he'll get up)

SELECT: If pressed between rounds, Doc's encouraging
        advice can increase Mac's stamina
START:  Uppercut (If the number of stars is 1 or more)


Take note of how some of these inputs are button combinations or rapid presses.  We will revisit later how I optimized the mechanism to account for these cases.

To begin creating the gestures, I started a new solution using the Visual Gesture Builder Preview included in the Kinect v2 SDK, creating a series of Discrete Gesture projects for each of the behaviors identified in the Punch-Out manual.


For each of these projects, I had my brother perform the chosen gesture with approximately 20 positive cases (gestures that should be considered as performed successfully) and 5 or so negatives (gestures that should not be considered performed successfully).  For example, for the Uppercut, he would perform 20 uppercuts with the right hand for positive cases and a few regular left and right punches for negative cases.  This way, we won’t accidentally register an uppercut when a regular left or right punch is thrown.

After obtaining a successful recording, we add the clip to the appropriate project in our Visual Gesture Builder solution.  Here we meticulously tag the key frames to indicate the frames where a successful gesture is performed.  As a result, areas that are not tagged are considered negative cases.


We then perform a build of the project, which uses the AdaBoost algorithm to learn the intended positions of the joints and create a state machine for determining a successful gesture.  Each project outputs a .gba file; these are composed into a .gbd when building the solution.


We repeat this for all of our projects and then verify the .gbd with “File => Live Preview” in Visual Gesture Builder.  This allows us to see the signal generated by our current pose for all produced gesture projects, which is very handy for determining whether a given gesture creates interference with another.  In the image below, you can see a very clear signal generated by the uppercut pose.


With the recorded gestures verified, I looked at the sample code used in the “Visual Studio Gesture Builder – Preview” project included in the Kinect SDK browser.



From here, I incorporated the relevant bits into GestureDetector.cs.  In my original implementation, I iterated through all recorded gestures and employed a switch to perform the button press when one was detected.  This proved to be inefficient and created inconsistent button presses.  I improved this significantly in my second update by using a dictionary to hold a series of Actions (anonymous functions that return void) and a parallel foreach, allowing me to eliminate the cyclomatic complexity of the previous switch while processing all potential gestures in parallel.  I also created a Press method for simulating presses, which allowed me to send in any combination of buttons to perform behaviors like HeadBlow_Right (UP + A).  I also implemented a Hold method to make it possible to perform the duck behavior (press down, hold down).  In the final tweak, I implemented a RapidPress method for the Recover gesture.  This allowed me to reproduce a well-known tip in Punch-Out where you can regain health in between matches by rapidly pressing Select.

This was a rather interesting programming exercise; imagine coding at 2 in the morning with the goal of optimizing code for the purpose of knocking out Glass Joe in a stable, repeatable manner.  The end result wound up working well enough that a ‘seasoned’ player can actually TKO the first two characters with relative regularity.  In the video at the top of this post, the player had actually never used the Kinect4NES and TKO’d Glass Joe on his first try.  As a result, I am satisfied with this experiment.  It was certainly a fun project that allowed me to become more familiar with programming for the Kinect while also having the joy of merging modern technology with the classic NES.  For those interested in replicating, you can find the source code on GitHub. If you have any ideas for future games that you would like to see controlled with Kinect4NES, please let me know in the comments!

Porting Open Source Libraries to Windows for IoT (mincore)

Microsoft is bringing Windows to a new class of small devices. Riding the crest of the “Internet of Things” movement, Microsoft is looking to capitalize on devices and sensor capabilities of popular development boards. Recently, members of the Windows Developer Program for IoT have been able to gain access to a build of Windows which supports the Intel Galileo chipset.

Bringing Windows to small devices is a huge feat that opens the door to many development opportunities.  Of course, this means a lot of existing code can be brought over to aid in creating IoT solutions.  This post aims to identify the specifics of compiling two open source libraries for this new version of Windows.

The libraries in question are apache-qpid-proton, a lightweight messaging framework for sending AMQP messages, and OpenSSL, an open-source library for implementing the Secure Sockets Layer protocols.

Why these two libraries?  They were necessary for creating a Win32 application capable of sending AMQPS messages up to an Azure Event Hub as part of Galileo device support in the super awesome Connect the Dots project from MSOpenTech. More importantly, we get to encounter two rather distinct compilation exercises.  Apache Qpid can send AMQP (without the S) messages on its own, but Azure requires these to be sent over SSL, so we need to compile Apache Qpid against OpenSSL to get AMQPS support.  In addition, Apache Qpid gives us a Visual Studio solution to work with, while OpenSSL is built using a makefile in combination with Perl and Python scripts for producing the makefile itself.  This post will explain these scenarios and the changes required to target Windows for IoT through the Visual Studio project for Apache Qpid and the makefile for OpenSSL.

Let’s begin by looking at the default property configuration for the Apache Qpid Visual Studio Project:


Normally, when compiling a Win32 application for a desktop PC, we compile against Win32 libraries contained in C:\Windows\System32.  When targeting the Intel Galileo board, you will notice that the default Intel Galileo Wiring App template contained in the Windows Developer Program for IoT MSI links against a single library, mincore.lib.  Jer’s blog goes into the best known detail on what this is. Long story short, we need to compile against mincore.lib in order to obtain code capable of running on the Galileo, as the mappings for System and Win32 functions are completely different in Windows for IoT and are contained in this particular lib.  This sets the basis for rules #1 and #2.


1. Remove all references to System32 libs and replace them with a reference to mincore.lib



2. For all references removed in step 1, add these libraries to the IgnoreDefaultLibraries collection.  This ensures that the linker will not attempt to link against them, as we want to link against mincore only.  Note: I have added compatible OpenSSL binaries to Additional Dependencies to enable OpenSSL support.


In addition, we need to consider the hardware present on the Galileo board itself.  Intel outfits the board with an Intel® Quark™ SoC X1000 application processor: a 32-bit, single-core, single-thread, Intel® Pentium® processor instruction set architecture (ISA)-compatible part operating at speeds up to 400 MHz.  This processor does not support enhanced instruction sets such as SSE, SSE2, AVX, or AVX2.  This sets the basis for rule #3.


3.  Ensure all code is compiled with the /arch:IA32 compiler flag



You can now build Apache-Qpid-Proton to target Windows for IoT on the Intel Galileo; however, in order to be useful, we need to compile it against OpenSSL, as Azure Event Hubs require that we send messages using AMQPS.  Without OpenSSL support, we can only send AMQP messages, which will be ignored by the Azure Event Hub.

There is an excellent article on compiling Apache-Qpid Proton against OpenSSL for Windows @

I don’t want to reproduce the content there, so let’s talk about the changes necessary to target Windows on the Galileo board.

In step A.3, the author describes the process for compiling the OpenSSL dynamic link libraries using “nmake -f ms\ntdll.mak install”.  Nmake is Microsoft’s build tool for building makefiles.  To use the tool, you can access it within a Visual Studio command prompt, from its actual location in C:\Program Files (x86)\Microsoft Visual Studio X.X\VC\bin, or call C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\Tools\vsvars32.bat in a standard command prompt to make the path to nmake available in your current shell.  The problem is that the default makefile is configured to build against Win32, i.e. Win32 on the desktop.

Let’s take what we learned above and apply it to the ntdll makefile:

Inside the untouched ntdll.mak you will see the following:

# Set your compiler options
APP_CFLAG= /Zi /Fd$(TMP_D)/app
LIB_CFLAG= /Zi /Fd$(TMP_D)/lib -D_WINDLL
APP_EX_OBJ=setargv.obj $(OBJ_D)\applink.obj /implib:$(TMP_D)\junk.lib
# add extra libraries to this define, for solaris -lsocket -lnsl would
# be added
EX_LIBS=ws2_32.lib gdi32.lib advapi32.lib crypt32.lib user32.lib

# The OpenSSL directory

LFLAGS=/nologo /subsystem:console /opt:ref /debug


We essentially have a section of the makefile which outlines compiler flags and linker flags.  Here we can apply the rules from above to create a makefile that will produce a Win32 compatible library which targets the Intel Galileo.

Applying Rule #1 we remove the libs mentioned in EX_LIBS and replace with mincore.lib

Applying Rule #2 we take the libs that were in EX_LIBS and add them to the linker flags (LFLAGS) as /NODEFAULTLIB:NAMEOFLIBRARY

Applying Rule #3 we add /arch:IA32 to each compiler flag (*CFLAG)


This yields the following changes:

# Set your compiler options
APP_CFLAG= /arch:IA32 /Zi /Fd$(TMP_D)/app
LIB_CFLAG= /arch:IA32 /Zi /Fd$(TMP_D)/lib -D_WINDLL
APP_EX_OBJ=setargv.obj $(OBJ_D)\applink.obj /implib:$(TMP_D)\junk.lib
# add extra libraries to this define, for solaris -lsocket -lnsl would
# be added
EX_LIBS=mincore.lib

# The OpenSSL directory

LFLAGS=/NODEFAULTLIB:kernel32.lib /NODEFAULTLIB:ws2_32.lib /NODEFAULTLIB:gdi32.lib /NODEFAULTLIB:advapi32.lib /NODEFAULTLIB:crypt32.lib /NODEFAULTLIB:user32.lib /nologo /subsystem:console /opt:ref /debug

I have posted the complete changes made to ntdll.mak, compatible with Windows for IoT on the Intel Galileo.


We can now build the OpenSSL libraries, but you will notice that you receive a variety of errors.  This is due to missing functions in mincore.lib that were available in the original System32 DLLs.

For example:

Creating library out32dll\libeay32.lib and object out32dll\libeay32.exp
cryptlib.obj : error LNK2019: unresolved external symbol __imp__DeregisterEventSource@4 referenced in function _OPENSSL_showfatal
cryptlib.obj : error LNK2019: unresolved external symbol __imp__RegisterEventSourceA@8 referenced in function _OPENSSL_showfatal
cryptlib.obj : error LNK2019: unresolved external symbol __imp__ReportEventA@36 referenced in function _OPENSSL_showfatal
cryptlib.obj : error LNK2019: unresolved external symbol __imp__GetProcessWindowStation@0 referenced in function _OPENSSL_isservice
cryptlib.obj : error LNK2019: unresolved external symbol __imp__GetUserObjectInformationW@20 referenced in function _OPENSSL_isservice
cryptlib.obj : error LNK2019: unresolved external symbol __imp__MessageBoxA@16 referenced in function _OPENSSL_showfatal
cryptlib.obj : error LNK2019: unresolved external symbol __imp__GetDesktopWindow@0 referenced in function _OPENSSL_isservice
rand_win.obj : error LNK2019: unresolved external symbol __imp__CreateCompatibleBitmap@12 referenced in function _readscreen
rand_win.obj : error LNK2019: unresolved external symbol __imp__DeleteObject@4 referenced in function _readscreen
rand_win.obj : error LNK2019: unresolved external symbol __imp__GetDeviceCaps@8 referenced in function _readscreen
rand_win.obj : error LNK2019: unresolved external symbol __imp__GetDIBits@28 referenced in function _readscreen
rand_win.obj : error LNK2019: unresolved external symbol __imp__GetObjectA@12 referenced in function _readscreen
rand_win.obj : error LNK2019: unresolved external symbol __imp__GetDC@4 referenced in function _readscreen
rand_win.obj : error LNK2019: unresolved external symbol __imp__ReleaseDC@8 referenced in function _readscreen

You will notice that these errors actually make sense.  Recall that Windows for IoT (mincore) is stripped down to approximately 171 MB.  As a result, many unnecessary functions are removed, such as GetProcessWindowStation and MessageBoxA shown above (there isn’t a GUI available on the stripped-down mincore).  We now need to modify the source (as safely as possible) to resolve these externals.  In my case, I simply commented out the calls to the missing functions where necessary.  Of course, this may have unintended side effects, but since most of the missing calls deal with the GUI, you are probably okay.
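Schematically, the edits look like the following; this is an illustrative pattern under the assumptions above, not the literal OpenSSL source:

#include <stdio.h>

/* Illustrative pattern (not the literal OpenSSL source) for handling a
   GUI-only call that has no equivalent in mincore.lib */
void show_fatal(const char *msg)
{
#if 0 /* MessageBoxA is unavailable when linking against mincore.lib */
    MessageBoxA(NULL, msg, "OpenSSL: FATAL", MB_OK);
#endif
    fprintf(stderr, "%s\n", msg); /* fall back to stderr on Windows for IoT */
}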

Continue this until the only errors you receive are in creating e_capi.obj.

Now run nmake -i -f ms\ntdll.mak install (-i will ignore compilation errors, namely the ones coming from e_capi).

CAPI is one of the engines used by OpenSSL, and it is probably important, but I could not get around the compilation errors without essentially breaking it completely, so I left it out.  This will still produce a valid libeay32.dll and ssleay32.dll.  You can verify this by copying these DLLs along with the created openssl.exe and noting that it runs on the Galileo!  (Note: you can resolve the error mentioned by copying the produced openssl.cnf to the directory mentioned.)



To truly compile Apache-Qpid-Proton with OpenSSL support, you would continue forward from step B of the article referenced above.

Upon recreating and opening the Apache-Qpid-Proton Visual Studio solution, you would need to modify all of the proton projects using Rules #1 through #3 as defined above.

Of course, if you wish to obtain the precompiled binaries and see an example of using Apache-Qpid-Proton with OpenSSL support in a Galileo Wiring app, you may refer to this pull request in the Connect the Dots project by MSOpenTech.


Happy Hacking!  Here’s to great ideas and developments on Windows for IoT!