Hello and welcome back to this two-part blog series revisiting our Sensors to Servers demonstration (part 1 can be found here). In this instalment we will look at how we modified the demo for cloud hosting and ran it concurrently at Mobile World Congress and Embedded World 2016.
Before retiring Sensors to Servers we decided to give it one last hurrah to show off some of ARM’s latest products, notably mbed OS and mbed Device Connector. This year’s Mobile World Congress (MWC) in Barcelona and Embedded World (EW) in Nuremberg were the perfect stages, as these two major trade shows happened to fall in the same week. We came up with the idea of displaying the live data from both shows, at both shows. Whereas all previous deployments used a local ARMv8-A based 64-bit server, to make this work we had to move the entire back end of the demo into the “cloud” and update our sensor node software to work with this topology.
System diagram
You can see in the diagram that each show required an Internet-connected router. If you recall, the sensor nodes can communicate with the server using either 6LoWPAN or Ethernet. Large trade show floors tend to be hostile RF environments, which led us to choose Ethernet to guarantee a stable and reliable connection. We connected all of the sensor nodes and the camera feed (see below) to the router. To view the visualisations, all we needed was a web browser running on an Internet-connected ARM Cortex-A based client device.
Approximately a year ago ARM announced its plans for the next generation of mbed, mbed OS. mbed OS (v3.0) is superseding mbed “Classic” as our Internet of Things (IoT) embedded operating system for ARM Cortex-M based microcontrollers. mbed OS went into its beta release phase in August last year. We immediately got our hands on the yotta build tools and started playing around with the new software. At the time there was no integrated development environment (IDE) support for the tools, so we downloaded and customised a “clean” version of the open-source Eclipse IDE to manage the project, edit source files and run the yotta commands to update modules, build and so on.
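For anyone who has not tried yotta, a typical command-line session looks something like the sketch below; the target and module names are illustrative examples rather than the exact ones from our project.

```bash
# Select a build target (the FRDM-K64F with GCC is used here purely as an example)
yotta target frdm-k64f-gcc

# Pull in a dependency declared in module.json, e.g. the core mbed drivers
yotta install mbed-drivers

# Fetch updated versions of all dependencies, then build the project
yotta update
yotta build
```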
Once we were familiar with the OS and tools we quickly turned our attention to porting our sensor node application software from mbed “Classic” to mbed OS. The main difference to be aware of when porting between the two versions is that mbed OS has an event-driven architecture rather than a multi-threaded real-time operating system (RTOS). This design supports the highly integrated security and power management functions, allowing developers to pull in a variety of different communication stacks while the OS keeps the application secure and power efficient. Luckily for us we had already written our original node software in an event-driven manner, so the port was fairly straightforward.
We were able to use the same sensor libraries as we had previously; some were imported from GitHub repositories and some were custom written. The built-in scheduler in mbed OS (MINAR) allowed us to post periodic callbacks to read and update sensor data where appropriate. Two versions of the node software were written: one for 6LoWPAN communications and one for Ethernet communications. We took a modular approach to integrating the sensor libraries, allowing us to choose which sensors were active in any one node, so the software project was as flexible as the hardware.
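To give a flavour of the event-driven style, here is a minimal mbed OS 3.x sketch of the kind of periodic callback we are describing. The read_sensors function and the five-second period are illustrative placeholders, not our production node code.

```cpp
// Minimal sketch: schedule a periodic sensor read with the MINAR scheduler.
#include "mbed-drivers/mbed.h"

static void read_sensors(void) {
    // In the real node this would sample each active sensor and
    // update the corresponding resources exposed to the server.
    printf("reading sensors\r\n");
}

void app_start(int argc, char *argv[]) {
    (void)argc; (void)argv;
    // app_start() replaces the classic main() loop: we post a callback
    // and let MINAR run it every five seconds.
    minar::Scheduler::postCallback(read_sensors)
        .period(minar::milliseconds(5000));
}
```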
Next, we turned our attention to the server and how the sensor data should be collected and handled in the cloud. For this, mbed has two offerings: mbed Device Server and mbed Device Connector. To make the distinction, mbed Device Server is the middleware that connects your IoT devices to your web applications, while mbed Device Connector is a cloud-hosted service comprising mbed Device Server and a developer console. This allows your mbed Enabled IoT devices to connect to the cloud without the need to build your own infrastructure.
ARM booths at MWC (left) and EW (right) 2016
To move from our local server to the cloud we first had to choose a third-party cloud service. We chose the Microsoft Azure cloud computing platform. I would love to give a technical reason why we chose Azure but, being honest, it was recommended to us by ARM’s IT department, who had used it for previous projects; frankly, any one of our cloud partners would have been suitable.
Previously, we had written an application which used Device Server’s REST APIs to filter the received sensor updates and post them into a SQLite database. The original application was written in Java. With the updated version of Device Server and the switch to the cloud we decided to move from Java to Node.js. This did mean rewriting our application, but Node.js made the REST APIs much easier to handle and it was only a few hours’ work. To test the demo, some of my colleagues took a bunch of sensor nodes home and plugged them into their home LANs. A tweak of their firewall settings and they were away. Now we were ready to plug our sensor nodes into any Internet-connected router anywhere in the world and our application would receive the updates.
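To illustrate the general shape of that application, here is a minimal Node.js sketch that accepts resource notifications pushed from mbed Device Connector and stores them in SQLite. The notification field names follow Device Connector’s documented format, but the route, port and table schema are assumptions for the sketch rather than our actual code.

```javascript
// Minimal sketch of a webhook receiver for mbed Device Connector
// notifications, persisting decoded values to SQLite.
const express = require('express');
const sqlite3 = require('sqlite3');

const db = new sqlite3.Database('sensors.db');
db.run('CREATE TABLE IF NOT EXISTS readings (' +
       'endpoint TEXT, resource TEXT, value TEXT, ' +
       'ts DATETIME DEFAULT CURRENT_TIMESTAMP)');

const app = express();
app.use(express.json());

// Device Connector PUTs batches of resource notifications to the
// callback URL registered against our access key.
app.put('/notification', (req, res) => {
  const notifications = (req.body && req.body.notifications) || [];
  notifications.forEach((n) => {
    // Payloads arrive base64-encoded; decode before storing.
    const value = Buffer.from(n.payload, 'base64').toString('utf8');
    db.run('INSERT INTO readings (endpoint, resource, value) VALUES (?, ?, ?)',
           [n.ep, n.path, value]);
  });
  res.sendStatus(204);
});

app.listen(8080);
```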
Apart from the odd tweak, how we visualised the collected data was largely unchanged from the original version of the demo. However, to contextualise the data from the two different locations we added a small camera feed on one of the three scrolling pages. One interesting note here is how we displayed the visualisations at Embedded World: the design of the booth left us with only a very small compartment to hide away our equipment. Where we would normally have used a Google Chromebook, we instead used the ASUS Chromebit CS10, powered by an ARM Cortex-A17 and ARM Mali-T760 based system on chip (SoC) from Rockchip. This small HDMI stick running Chrome OS was perfect for hiding away behind the monitor while giving us the same functionality as a clamshell Chromebook.
Data captured during Embedded World 2016 (click to enlarge)
Sensors to Servers demo station during Embedded World 2016 setup
Once every minute a still image was captured at each show and displayed on the corresponding visualisation. The camera feed came courtesy of a quad-core ARM Cortex-A7 powered Raspberry Pi 2 and Raspberry Pi Camera Module. The Raspberry Pi was running the Raspbian Jessie Lite Linux-based operating system. A small bash script was written and scheduled to run once every minute by a cron job. The bash script captured the image using the raspistill command-line tool and uploaded it to the cloud server over HTTP via the curl command-line tool. By compressing the images down to only a few hundred kilobytes we minimised the upload time, and we also saved a copy of each image on the Raspberry Pi’s memory card to create a time-lapse video.
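As a rough illustration of that script, the sketch below captures a still with raspistill, archives it locally and uploads it with curl. The upload URL, file paths and quality setting are placeholders for the sketch, not the exact values we used.

```bash
#!/bin/bash
# Illustrative per-minute capture-and-upload job (run from cron).

TIMESTAMP=$(date +%Y%m%d-%H%M%S)
ARCHIVE_DIR=/home/pi/timelapse
IMAGE="${ARCHIVE_DIR}/${TIMESTAMP}.jpg"
UPLOAD_URL="https://example.com/upload"   # placeholder endpoint

mkdir -p "${ARCHIVE_DIR}"

# Capture a compressed still (lower JPEG quality keeps it to a few hundred kB).
raspistill -o "${IMAGE}" -q 20

# Push the image to the cloud server over HTTP.
curl -s -X POST -F "image=@${IMAGE}" "${UPLOAD_URL}"
```

Scheduling it once a minute is then a single crontab entry along the lines of `* * * * * /home/pi/capture.sh`.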
That’s it for these two Sensors to Servers revisited blogs; I hope you’ve enjoyed reading them. If you are visiting a large trade show in the future, be sure to call by the ARM booth and see what great demos ARM is showcasing; our friendly and knowledgeable engineers will be very happy to give you a demonstration and answer any questions you may have about ARM and our technology. Thanks for reading!