
Internet of Things


Are you curious to know more about the exciting new Windows 10 IoT platform for Embedded ARM Devices? Join Daniel Lang (CTO, Toradex Inc.) as he demystifies Windows 10 IoT for you in his speaker session at the upcoming ARM TechCon 2015, one of the world’s leading platforms for system developers working on the ARM® architecture.

Windows 10 IoT marks the first time in history that the Windows OS will be used simultaneously across the entire range of computing platforms, from small mobile phones to household PCs, servers and even devices like Xbox and HoloLens. Join the 50-minute technical session conducted by Daniel to understand Windows 10 IoT and learn about its advantages and limitations. You will also learn about the developer experience and how you can gain access to low-level interfaces like GPIOs, I2C and SPI. Get your passes now!

An update from Phill Smith, Senior Marketing Manager with ARM's IoTBU, fresh from the floor of Maker Faire New York:

I am now on my way back to Cambridge after visiting the 2015 edition of Maker Faire New York.  ARM sponsored a post set-up, pre-exhibition reception for the Makers on Friday night. This was a great chance to meet some of the most innovative people around and chat about what they're building and what their challenges are. I'm particularly interested in wearable devices just now, where power consumption is a recurring theme. We had some demos running the new mbed OS on hand - this has been designed specifically to work hand-in-hand with the hardware to maximise power efficiency, so watch this space!


I spent Saturday on the show floor, and it is clear 3D printing is rapidly improving, with things of all shapes and sizes coming off the vast range of printers and kits now available.  Couple 3D printing with easy-to-use electronics like Arduino, mbed, Raspberry Pi or Beagle boards, and easy access to large-scale manufacturing in China, and we are going to see some really innovative products emerging from today’s Makers and the start-ups they create.


Some of the interesting ideas that stood out to me at the show were:


A tiny quadcopter made by Seeed and bitcraze.io called Crazyflie 2.0, running on an STM32F405 with an AR marker tag attached (similar to the tag used in the Magic Mirrors demo from the ARM Mali team), allowing the drone's position to be tracked using a simple webcam.  This is a really innovative approach to short-range point-to-point navigation, which could unlock a whole range of applications.


SatNOGS is a network of open source satellite tracking ground stations.  The rig on show consisted of 3D printed cogs driven by a DC motor fitted with a magnet sensor for extreme precision, rather than stepper motors.  The helical antennas were suspended by plastic plumbing pipe while the tripod was constructed of metal tube.  The whole thing is driven by a Beagle Board connected to the cloud over Ethernet to receive tracking coordinates.


Ever arrived home and just not had the energy to make dinner?  Well, the Food by Print: Chef E One could be the answer.  The prototype consists of a series of tubes that can hold almost any ingredient, from slices of bread to peanut butter, jelly and pasta, which are all assembled on the moving plate below. Today’s prototype makes sandwiches, but development for deep-fat frying and boiling water is under way.  A peek at the back of the machine reveals a Raspberry Pi!


That's just some of the crazy, brilliant stuff going on at Maker Faire. There was an incredible buzz all weekend, and it is impossible to come away from this show without a tremendous sense of excitement. My colleague Mark Woods will be posting his summary soon, so watch for his picks from the show floor, and his big takeaways from the event.

At the beginning of September Freescale announced their key contribution of FNET to ARM® mbed™ OS. Just as the Offspark acquisition allowed mbed OS to provide a core mbed TLS stack under the Apache 2.0 license, this contribution will allow us to integrate a core network stack under the Apache 2.0 license as part of mbed OS. This contribution is a perfect example of the collaboration model we envision for mbed OS, where partners and the community can join their efforts to make mbed OS an even better platform.


The FNET software will serve as the basis for the new TCP/IP stack in mbed OS and will enhance the platform’s existing connectivity capabilities. What this really means for mbed OS developers is the availability of a solid network stack that can be used both for IPv4 and IPv6 environments. The FNET stack is designed for constrained embedded environments and the way it is structured fits well with the core design philosophy for mbed OS.


So what will happen now? This contribution is just the starting point. In the following months, FNET will be used as the basis for a new IP stack module for mbed OS. When that initial work has finished, mbed OS will switch from using the lwIP stack to using our new network module based on FNET. The new stack will complement our existing 6LoWPAN and Thread technologies, without interruption to our standards support. Tighter integration and combination between the stacks is something to look at on the road ahead.

Check this post out on DIY Audio.  Would you believe that an ARM Cortex-M4 or Cortex-M7 MCU can do audio DSP?  This is using DSP Concepts' graphical user interface.  Why write DSP code when Audio Weaver has done all the optimization for you?  The modularized architecture lets you design any system you want.  Your work is graphically self-documented, and you can achieve your goal in 10% of the time.


This DIY example is a sophisticated speaker cross-over design done very elegantly and simply.


Lots more audio applications can be achieved.  Imagine sophisticated audio effects.  How about playback enhancements?  How about microphone enhancements?  How about medical acoustical applications?  With an exceptional development environment, you are not bogged down by the code.  Unleash your imagination and creativity.


It's that time again

The Nordic Global Tech Tour is about to hit the road again. This time it is all about the goodies inside our nRF52 Series for Bluetooth Smart and what they enable you to do. The Nordic Global Tech Tour (GTT) is always an exciting and fun time for us here at Nordic. It marks the completion of years of hard work on a new IC family and its supporting software, and we’re ready to take it on the road to spread the good news. It is eagerly anticipated by the staff engineers from R&D that we take with us, as it is a unique opportunity for them to engage directly with developers using, or considering using, our devices.

Developers love it

We first embarked on our GTTs with the launch of the nRF51 Series some 3 years ago. It was a bold step sending so many of our highly valuable and hard-working R&D engineers on the road to deliver a masterclass in the nRF51 and its benefits to you, our developers. It’s a lot of cost, a lot of logistics and a lot of planning. But it’s also a lot of fun.

But my, it was well worth it: every location was overbooked, and we received an incredible response from attendees saying this sort of high-value, marketing-free technical training was exactly what you wanted. We continued to run local Tech Tours after the first GTT, expanding the topics each time to cover different demands.


Getting you up and running on the nRF52

The GTT this time around is of course about our new nRF52 Series Bluetooth Smart SoC and getting you familiar with its features so that you are well placed to get up and running and doing great things with it as soon as possible.

The nRF52 Series, simply put, is the most powerful, capable Bluetooth Smart SoC out there. It has a Cortex-M4F at its heart taking Bluetooth Smart application possibilities to a completely new level. But we have gone to great lengths to make the nRF52 and its internal features as easy to use as possible and also to make them perfectly suited to the goals of ultra-low power operation. Our design engineers will explain the philosophy of using the nRF52 for maximum performance as needed and also how to consume the minimum possible power.

Some important features will get special attention, such as the advanced automatic power-control schemes and how to take advantage of them. The nRF52 peripherals have their own dedicated on-chip regulators and clock sources, which allows exactly the peripherals required to be energized while all non-required peripherals and the CPU remain disabled. You will learn how to set up a communication interface (the I²C module) to read data from an external source and store it in SRAM at periodic intervals without CPU intervention. This specific example relates to all peripherals on the device, as they follow the same common design philosophy. You will then see it used in conjunction with the Programmable Peripheral Interconnect (PPI) and EasyDMA to link peripherals to each other and to memory, enabling ultra-low power operation and task completion without the CPU being involved.
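To make the event-to-task idea concrete, here is a toy Python model of the concept described above: a hardware "event" is wired directly to a peripheral "task", so samples land in memory without any CPU polling loop. Every name here is invented for the sketch; a real design would use the nRF52 SDK and actual PPI channel registers.

```python
# Toy model of PPI-style event/task wiring: a timer event triggers an I2C
# read whose result is stored straight to 'SRAM', with no CPU polling loop.
# All classes and names are invented for illustration only.

class PPIChannel:
    """Connects an event source to task callbacks, bypassing the 'CPU'."""
    def __init__(self):
        self.tasks = []

    def connect(self, task):
        self.tasks.append(task)

    def fire(self):
        for task in self.tasks:
            task()

class I2CSensor:
    """Pretend I2C peripheral that returns a new sample on each read."""
    def __init__(self):
        self._sample = 0

    def read(self):
        self._sample += 1
        return self._sample

def build_sampler():
    sram = []                    # samples land here via 'EasyDMA'
    timer_event = PPIChannel()   # periodic timer compare event
    sensor = I2CSensor()
    # Wire the timer event to an I2C read whose result goes to SRAM.
    timer_event.connect(lambda: sram.append(sensor.read()))
    return timer_event, sram

timer_event, sram = build_sampler()
for _ in range(3):               # three timer ticks, no CPU involvement
    timer_event.fire()
print(sram)                      # -> [1, 2, 3]
```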



Bringing NFC to Bluetooth pairing

There are some very special features that make the nRF52 stand out from the Bluetooth Smart crowd. One special case is the on-chip NFC tag support. We believe this feature is going to be the de-facto standard for Bluetooth pairing in IoT: it just makes the whole process simpler and safer. Our engineers will run through this feature and demonstrate its simplicity in operation and how easy it is to add NFC touch-to-pair to your design.



Software support and real applications

At Nordic we know it isn’t just about great ICs; we recognize the value of good, reliable supporting software, and lots of it, to help you achieve your aims. The nRF52 SDK will be covered and used extensively in support of demonstrating the nRF52’s great hardware features. The SDK covers everything from low-level peripheral drivers all the way up to full-blown Bluetooth Smart applications.

SoftDevices are a Nordic concept: fully encapsulated protocol stacks with associated APIs that run in their own reserved memory space, ensuring complete logical and physical separation from application code. During a walkthrough of a Bluetooth Smart application we will show you how to configure and use a SoftDevice, specifically the S132. The S132 is a perfect choice for this as it supports concurrency for all four Bluetooth Smart roles: Central, Peripheral, Broadcaster and Observer. Once you become familiar with the S132 in this session, you’ll find it an easy switch to our ANT and combined ANT/Bluetooth Smart SoftDevices.


Have your say

In addition to getting a day crammed full of great information, examples and tips, you get a unique opportunity to talk to our R&D people, give them your thoughts and ask them questions. We’re always willing to listen and always value what you guys tell us. Who knows? Maybe your input could influence a new chip design? Wouldn’t that be cool?

The Nordic Global Tech Tour kicks off in Boston, US on October 5th 2015 and runs through 30 separate locations until December 10th in Osaka. It is free to attend and we’ll even feed you and supply you with beverages! All you need to do is register at the link below.

We can’t wait to see you there!

Click on the link below to register for this year's nRF52 Global Tech Tour



This is the third and final instalment of the Sensors to Servers demonstration blog. Today we’ll talk about the server side of the story, and the visualisation we provided to client devices.



One of the catalysts for building this demonstration was the availability of real, ARMv8-A 64-bit server silicon. For our first deployment of the demo at the Consumer Electronics Show® 2015 in Las Vegas, we were able to get an Applied Micro® X-C1 development kit, a board featuring a real, production-ready ARMv8-A 64-bit server on a chip (the APM883208 X-Gene® SoC). What was particularly impressive about this platform was just how easy it was to install our choice of Linux distribution and Java. We were ready to deploy our software within an hour of powering up the server. For anyone who has worked to get a Linux distribution running on an ARM powered system over the past several years, this is a significant mark of progress. Linaro and other initiatives have really made a difference.



We chose to run Ubuntu® Linux 14.04. Following some relatively straightforward instructions to boot and install it, we used Ubuntu’s built-in package manager to install Java… and that was that. The mbed Device Server is a Java application. We wrote an additional Java app that used Device Server’s RESTful API to subscribe to all inbound notifications. This app then logged them to a local SQLite database.
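The logging step is simple enough to sketch. The demo app itself was written in Java, but the shape of the idea - take each inbound notification payload and persist it to a local SQLite table - looks like this in Python; the payload fields here are assumptions for illustration, not the actual mbed Device Server notification schema.

```python
import json
import sqlite3

# Sketch of the notification logger: each JSON notification from the
# Device Server's REST subscription is stored in a local SQLite table.
# The payload field names are invented for this example.

def make_db():
    db = sqlite3.connect(":memory:")
    db.execute(
        "CREATE TABLE readings (endpoint TEXT, resource TEXT, value REAL, ts TEXT)"
    )
    return db

def log_notification(db, payload):
    n = json.loads(payload)
    db.execute(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        (n["endpoint"], n["resource"], float(n["value"]), n["timestamp"]),
    )
    db.commit()

db = make_db()
log_notification(
    db,
    '{"endpoint": "node-1", "resource": "/temp", '
    '"value": "21.5", "timestamp": "2015-01-06T10:00:00Z"}',
)
rows = db.execute("SELECT endpoint, value FROM readings").fetchall()
print(rows)   # -> [('node-1', 21.5)]
```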

To really bring the data to life, we built a series of rich web pages with JavaScript. Our custom Java app used the Jetty framework to host these pages. More details on the pages can be found in the next section.


Whilst our server ran on a local network on the booth, it could just as easily have been deployed remotely – as we demonstrated, the requisite software is available for this application and many others. Whilst our server was a development kit, real rack-mountable ARM powered servers are now coming to market. Connecting this demonstration to a cloud-hosted mbed Device Server and client application would have been just as straightforward.



We chose to implement our data visualisation as a series of rich web pages. Any ARM powered smartphone, tablet or other client device (Chromebook, smart TV, etc.) would thus be able to display the live data from the booth. The ubiquity of these devices and the accessibility they give to anything that is online is incredible – we didn’t need to develop any special software on the client side to provide an extremely rich experience of the data from our sensors.



We developed some of our own custom visualisations, but also used Flot and Heatmap.js, two freely available JavaScript libraries, to present the data. Heatmaps were drawn on top of floor plans of the booth, with ‘hot’ representing more frequent activations of the various presence sensors. Most other data was presented either ‘weather map’ style on top of the booth floor plan or as a graph drawn by Flot. We also highlighted some specific statistics, such as the noisiest sensor station or the most frequently used door.  A sample of the visualisation is provided below.
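As a rough sketch of the heatmap data path: each presence sensor has known coordinates on the booth floor plan, and its activation count becomes the 'heat' at that point. Heatmap.js-style libraries consume lists of {x, y, value} points, so the server-side transformation can be as small as this (the sensor names, coordinates and counts below are invented):

```python
# Turn presence-sensor activation counts into the {x, y, value} point list
# a heatmap library can render over the booth floor plan. All data invented.

def to_heatmap_points(sensor_positions, activation_counts):
    """sensor_positions: {name: (x, y)}; activation_counts: {name: count}."""
    return [
        {"x": x, "y": y, "value": activation_counts.get(name, 0)}
        for name, (x, y) in sensor_positions.items()
    ]

positions = {"door-1": (120, 40), "demo-3": (300, 220)}
counts = {"door-1": 57, "demo-3": 14}
points = to_heatmap_points(positions, counts)
print(points)
# -> [{'x': 120, 'y': 40, 'value': 57}, {'x': 300, 'y': 220, 'value': 14}]
```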




That concludes our blog series on the Sensors to Servers demonstration – an end-to-end IoT system. We hope that providing some of the detail of how this system was built was informative, but also that it might help inspire or advance your own projects. An instrumented trade show booth is an interesting demonstration, but the real world applications of this technology are the real story. As an engineer I also found a real change here in the difficulty of developing a system like this compared to when I started as a young intern many years ago. It is easier to obtain development platforms suitable for your application, easier to deploy software to these platforms, and easier to connect this software to distributed networks and services. All this allows much more freedom to focus on the applications of this technology, and has widened the audience of makers, developers and entrepreneurs who can access it. Exciting times! Thanks for reading, and watch for future updates on this and other ARM demos.

SAN FRANCISCO—If you’re looking for something fun and exciting to do this weekend, take the Qualcomm challenge at the TechCrunch Disrupt Hackathon.

The hackathon is a 24-hour contest open to innovators who can devise solutions using various resources and means in one spot: Pier 70 here. (Check out some of the winners from 2014).

Qualcomm's challenge offers hackathon innovators the opportunity to create something that includes a smartphone-class processing capability.

The challenge is this: Build a multimedia-rich IoT application with the DragonBoard 410c from Arrow Electronics, featuring the Qualcomm Snapdragon 410 processor. Three prizes will be awarded ($3,000, $1,500, $500) for the teams that make the most of the multimedia capabilities, which include PC-class graphics support, 1080p HD video playback and capture with H.264 (AVC), and advanced image processing.

Here is a link to information about the Qualcomm challenge. And watch this space in the future for the winners and their innovations.


Related stories:


Qualcomm announces Snapdragon 410 based on 64-bit ARM Cortex-A53 & details on AnandTech's Q&A with ARM's Cortex-A53 Lead Architect.

That Just Happened (Aug. 13): Learning the Alphabet; Qualcomm's New Snapdragon; De Geus on Security

Qualcomm and the sixth sense


Is it possible to design a smart wearable with an 8-week battery life? To find out, we went and built one. This week at Dreamforce we're demonstrating mbed OS running on a Cortex-M3 with a real-life 8-week battery life. This shows the benefit of designing mbed OS from the ground up for energy efficiency, matched with the latest in low-power Cortex-M silicon. This is all part of a new ARM mbed wearables reference design. Stay tuned, we'll be providing more details at TechCon.

Over the last few years, we have been building an experimental smart wearable to explore new concepts and technologies that could contribute to the internet of things. We set ourselves some stretch goals: it would last months on a battery, connect and interact with all your devices seamlessly, enable new forms of trusted interactions and ultimately aim to fade into the background. The watch design we’ve come up with is pretty interesting in itself, but the experiment is working too; some of the concepts and technologies it has helped us develop are already making their way into mbed OS this year, some will appear in products over the next year or so, and some ideas will never leave the bench but will almost certainly teach us something!


It has given us the opportunity to get first-hand experience of the realities of building complete and complex physical products - the mechanical design, electronics, software and taking it all through the production process. We’ve now taken a complete watch design from concept through to manufacturing a few hundred working units ourselves, and learned a huge amount. Ultimately, we think these learnings will be interesting and helpful to others.



Live demonstration of the wearables reference design integrating heart-rate measurement with Salesforce.com at Dreamforce. Come by this week and see it for yourself.

Hardware interoperability is a long-standing issue. Ideally, different computer and microcontroller boards expose their internals in the way that fits their capabilities best. On the other hand, peripherals should ideally be reusable between platforms. There have always been some platforms ahead of the others in terms of interoperability (think Raspberry Pi, or platforms with "shields" in general), but that still just meant reusability in a very narrow sense.


Recently Seeed Studio, the hardware creator platform based in Shenzhen, took another stab at the problem with Grove - but from the other side: the peripherals instead of the platform. Grove is based on a standardized 4-pin connector, standardized PCB sizes for the peripherals, and adapter boards & shields for all the most popular platforms (and beyond). This comprehensive approach makes Grove immensely extensible on both sides - the platforms and the add-ons. They have even made a Grove-focused remix of the BeagleBone - the BeagleBone Green, with two onboard I2C Grove connectors.


Working at VIA, I thought it would be interesting to bring Grove to our embedded ARM boards as well. This would potentially allow enterprise and industrial users to take advantage of the quickly expanding Grove peripheral offering. A simple Grove break-out board was easy enough to put together and get manufactured by Seeed Studio - the GroveHat. It was great to see it working on the first try:




To demo the system, I've added an LCD with RGB backlight and a system-monitoring script that displays the current CPU and memory usage of the system. I can imagine such a simple tool being helpful, for example, on headless systems that need to display simple metrics without a full screen.
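The display side of such a script is just formatting: the metrics have to fit the two 16-character rows of a typical Grove 16x2 RGB LCD. A rough Python sketch of that step (the real script would read the figures from /proc and write the rows to the LCD over I2C; the layout here is my own choice):

```python
# Format CPU and memory usage into the two fixed-width rows of a 16x2 LCD.
# The layout is illustrative; a real script would source the numbers from
# /proc/stat and /proc/meminfo and push the rows to the display over I2C.

def lcd_lines(cpu_pct, mem_pct, width=16):
    top = f"CPU: {cpu_pct:5.1f}%".ljust(width)[:width]
    bottom = f"MEM: {mem_pct:5.1f}%".ljust(width)[:width]
    return top, bottom

top, bottom = lcd_lines(23.4, 61.0)
print(repr(top))      # -> 'CPU:  23.4%     '
print(repr(bottom))   # -> 'MEM:  61.0%     '
```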



This break-out board makes digital and I2C peripherals available to the system, both under Linux and Android. The latter is enabled by the VIA Smart ETK, and as far as I know, this is the first (and so far only) Android system that can use Grove! Besides this very simple version, I'm already thinking about an expanded version that includes a serial connector and hopefully pulse-width modulation (but that's for the future).


In the meantime, I'm trying more interesting use cases with the compatible Grove peripherals, checking out if this would work well enough for industrial IoT and connected devices.


For a much more detailed write-up, please check this blog post! Also find GroveHat on Tindie, the indie hardware marketplace.

Zach Shelby

ARM mbed at Dreamforce

Posted by Zach Shelby Sep 15, 2015

Salesforce kicked off their conference today with the announcement of their new Salesforce IoT Cloud, Powered by Thunder. This new service will allow “billions of events from devices, sensors, applications and more from the Internet of Things (IoT) to Salesforce--enabling companies to unlock insights from the connected world.” ARM is one of Salesforce’s launch partners, enabling seamless and secure integration of devices with the Salesforce IoT Cloud using mbed. We’re excited how this will enable new innovative uses of IoT in enterprise and customer experience solutions.

Connecting devices to Salesforce using mbed

ARM will also be showcasing some of our latest IoT demos working with Salesforce in the Developer Zone in Moscone West (level 2). The ARM mbed IoT Device Platform makes it easy to connect IoT devices to Salesforce. At the ARM booth we will show you how a barista can track how much coffee they serve each day, and how your doctor can track and be notified of your health risks remotely.

ARM mbed Workshops

The ARM mbed™ team will be leading four hands-on workshops during the Dreamforce conference. If you want to learn how to connect mbed devices into Salesforce with mbed Device Server, these workshops shouldn’t be missed. In each workshop you will use a BLE-based heart-rate sensor device, connect it to the mbed Device Server service, and then connect that service into Salesforce.

IoT Zone:  Connecting ARM mbed devices into Salesforce with mbed Device Server

Location: Moscone West, Internet of Things Lab #1

  • Tuesday, September 15 at 3pm & 5pm
  • Wednesday, September 16 at 11am
  • Thursday, September 17 at 3pm



Welcome back, to the second instalment of the Sensors to Servers demonstration blog. In this entry, we’ll talk in more detail about the sensor nodes developed for the demonstration. This will include some details on the development board we selected, along with the specific sensors we integrated.


Sensors: Platform

As previously noted, for Sensors to Servers we built the nodes using the existing mbed platform and online tool chain. One of the great things about mbed is the wide variety of development platforms to choose from - a list that is constantly growing. Developing a demonstration gave us the luxury of relaxing some of the constraints that might otherwise have applied - cost and power became less important than ensuring sufficient interfaces and rapid development. So, we could afford to choose a board with more I/O, processing and memory than strictly required in exchange for simplifying our development. With this in mind, we looked for a device that could support both Ethernet and the 802.15.4 6LoWPAN radios we had chosen. These radios were available on daughter boards with an Arduino footprint, so a development platform with Arduino headers would be perfect. A Cortex-M3 or Cortex-M4 with plenty of memory would ensure we didn't bump up against these constraints during development, while plenty of I/O interfaces would ensure we could get all our sensors connected. With these requirements in mind, we chose the Freescale™ K64F. This features a Cortex®-M4 running at 120MHz, a built-in Ethernet port, and the expansion port footprint we desired. It also features an on-board accelerometer, so we had one less sensor to incorporate.


Sensors: What to measure

Applying the criteria of what could be measured and what was likely to be interesting in a trade-show context, we created a short list of things to measure on the booth:

  • Temperature
  • Ambient Noise
  • Passage through doors
  • Passage on/off booth
  • ‘Presence’ on different parts of the booth
  • Height of people passing through doors

This list is by no means exhaustive, and there were other candidates, but these were considered the most achievable and probably the most interesting.

Sensors: Temperature

Selecting a temperature sensor was straightforward, as many discrete sensors are available and mbed already has open-source libraries or examples for many of them. We acquired some Arduino-footprint prototype boards to serve as our ‘sensor shields’ and mounted all the external sensor modules on these. We used the RHT03 sensor, simply polling it once per minute and reporting the value to the server.


Sensors: Microphone

There are many microphone modules on the market as well. We selected one with a built in amplifier, the Maxim Integrated MAX9814. This has adjustable gain, and a simple analogue output with a range of about 2 Volts peak-to-peak.



We didn’t need any complex audio processing, just a very simple estimate of how ‘noisy’ a given station was. As our sensor node was doing several other things concurrently, we couldn’t be sampling the audio line continuously, and we also needed to avoid flooding the server with too much data. We decided to report a ‘noise level’ back to the server every 10 seconds. We experimented with different ways of generating that value, but settled on averaging the maximum peak-to-peak voltages measured on 50ms samples taken every 150ms. This meant our sensor was “listening” about one third of the time, leaving plenty of time for other processing whilst still giving a reasonable flavour for ambient noise over the course of the day.
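The averaging scheme above is easy to sketch with synthetic data in place of a live microphone: each 50 ms window contributes its peak-to-peak value, and the reported level is the average over the windows collected in a reporting period. This is an illustrative Python model, not the node firmware; the ADC values are made up.

```python
# Noise-level estimate: average the peak-to-peak amplitude of short sampled
# windows, as described above. Windows here are synthetic ADC readings.

def window_peak_to_peak(samples):
    return max(samples) - min(samples)

def noise_level(windows):
    """Average peak-to-peak amplitude across the sampled windows."""
    peaks = [window_peak_to_peak(w) for w in windows]
    return sum(peaks) / len(peaks)

# Three fake 50 ms windows of ADC readings (arbitrary units).
windows = [[0, 20], [0, 40], [0, 30]]
print(noise_level(windows))   # -> 30.0
```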


Sensors: Door Trips

Passage through the door of the booth was implemented with ‘trip’ sensors. At first we experimented with some infrared devices that worked without reflectors, but these didn’t function well at the range we were targeting and were more difficult to program for, as they generate an analogue voltage that requires some interpretation. We switched to a laser break-beam sensor - one which didn’t require a reflector, but instead has a lens that detects the dot. This also resulted in much simpler software, as the output line of the sensor is simply pulled to ground when the laser is tripped.
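Counting passages from such a line amounts to counting falling edges, with a short debounce so one person doesn't register several times. A minimal sketch of that logic, assuming an active-low line sampled with millisecond timestamps (the timestamps and debounce interval below are invented):

```python
# Count door 'trips' from a break-beam output line: the line is pulled low
# while the beam is broken, so we count falling edges, ignoring retriggers
# closer together than a debounce interval. Data is synthetic.

def count_trips(samples, debounce_ms=300):
    """samples: list of (timestamp_ms, level) pairs; level 1 = beam intact."""
    trips, last_trip, prev = 0, None, 1
    for t, level in samples:
        if prev == 1 and level == 0:          # falling edge: beam broken
            if last_trip is None or t - last_trip >= debounce_ms:
                trips += 1
                last_trip = t
        prev = level
    return trips

# One person (edges at 100 and 250 ms merge), then a second person at 900 ms.
samples = [(0, 1), (100, 0), (150, 0), (200, 1), (250, 0), (800, 1), (900, 0)]
print(count_trips(samples))   # -> 2
```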



At the 2015 Embedded World conference in Nuremberg, in addition to doors, the ARM booth was laid out in such a way that there were four ‘corridors’ leading onto the booth. We decided to deploy sensors here as well, which would give us a good idea of overall footfall onto the booth. We used an LED retroreflective photoelectric sensor with a range of up to 7.5 m. This required a reflector, but achieved the distance necessary to instrument the booth corridors.



Both these detection methods would be prone to some error – if two people walked next to each other, for instance, or if someone lingered in the beam path. These scenarios would be more likely in the longer range scenario. But on the whole they proved accurate enough, and we got some good data from the events where they were deployed.

Sensors: Presence

‘Presence’, in the context of this demonstration, was treated as an indication of whether people were in a particular region of the booth or not – in front of a specific demo, in a given meeting room, or at a table or desk. We wanted to use this to create a ‘heatmap’ of which areas of the booth were most popular. We measured presence with three different sensors.


The simplest was a PIR motion sensor. This was suitable for the small meeting rooms, or constrained areas, where the sensor was unlikely to pick up spurious results. This is a simple sensor with a single output line that goes high when motion is detected. We had mixed results with this sensor, possibly due to the calibration requirements or the range and coverage zone of the sensor.



Demo stations were more difficult for a PIR, as background movement not focused on the demo would be more likely to cause false positives. So, we used a Maxbotix® MB1014 ultrasonic proximity sensor. This sensor gives a simple proximity alarm signal when an object enters a detection range of about 1.5 metres, in a fairly narrow cone. This allowed us to tell whether somebody was standing in front of a demo station with a much better degree of accuracy.



Finally, we had several tables at some of our booths, and we wanted to detect when somebody was sitting at the table. To do so, we elected to use the on-board accelerometer of the K64F board as a movement sensor. This device is very sensitive, so gave a good indication of when there was activity in the vicinity of the table.

Sensors: Height

The final instrumentation was measuring the height of people walking through a specific doorway or arch on the booth. We again used an ultrasonic device for this, but this time the MaxBotix MB1010. This works in a range-finding mode, with a fairly narrow detection cone.



The sensor was mounted directly above the door, pointed down. Given a known mount height, this made it a simple matter of subtraction to find the height of a subject passing through. The software was a bit more difficult than that though. It was necessary to tune the sampling rate, and set start and end conditions for each measurement. Our sample rate was 50 milliseconds, so each time someone walked through the doors, the sensor needed to watch for the maximum height (actually the minimum range) of the samples, and detect when the person had passed. Multiple people walking through the door in close succession was a likely scenario, so a fair bit of testing was required to tune the algorithm sensitivity for detecting when a person had passed. If it was too sensitive, one person would register as many, but if it wasn’t sensitive enough many people would register as one. This led to some rather comical scenes with four of our engineering interns trooping into and out of our office over the course of an afternoon’s testing.
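The core of that measurement loop can be sketched as follows: a passage starts when the range drops below the empty-door reading, the minimum range seen is tracked, and the height is the mount height minus that minimum once the person has passed. This is a simplified Python model of the approach described above; the mount height, margin and range values are invented, and the real firmware's sensitivity tuning is omitted.

```python
# Height measurement from a downward-pointing ultrasonic sensor above a
# door: height = mount height minus the minimum range seen during a
# passage. Units are cm; all values are synthetic.

def measure_heights(ranges, mount_height=250, empty_margin=10):
    empty = mount_height - empty_margin   # anything closer means a person
    heights, min_range, in_passage = [], None, False
    for r in ranges:
        if r < empty:                     # someone is under the sensor
            in_passage = True
            min_range = r if min_range is None else min(min_range, r)
        elif in_passage:                  # passage just ended
            heights.append(mount_height - min_range)
            min_range, in_passage = None, False
    return heights

# Two people walking through: 175 cm then 160 cm tall.
ranges = [248, 120, 75, 80, 249, 247, 95, 90, 250]
print(measure_heights(ranges))   # -> [175, 160]
```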



That concludes the second episode of the Sensors to Servers blog, where we've covered platform and sensor selection. In the third, we’ll talk about the server we used for the demo and the visualisation application we developed. Stay tuned!

They have different interests. One teaches dance. Another is a helicopter pilot-in-training. One is a software engineer. Another is a carpenter. Still another, a studio artist. But they came together with a shared purpose: to try to improve conditions in Third World countries.


Meet William Weatherholtz and team, who just won the Inveneo solar-powered Micro-Data Center Design Challenge (Inveneo Launches ARM Micro-Data Center Design Challenge 2015 Bruce Baikie) for their micro weather station design. The team’s winning entry is an object lesson in how creative methodology, a diverse team and carefully considered component selection just might help transform developing societies.


“I have a soft spot for Third World countries and I’m really interested in finding ways to improve conditions there,” Weatherholtz said in an interview. “I felt like this was a project that played to my strengths and my desire to educate.”


The design criteria for weather stations are unique: How do you deal with rain, rust, long-term durability, a lack of power sources, and little critters that like to gnaw on things in the wild? Weatherholtz (pictured to the far right of the nearby photo) and his team (pictured, L-R: Garrett Johnson, Victoria Johnson, Kelly Weatherholtz; not pictured: Joshua Wickern, Bradley Weatherholtz, Landon Weatherholtz) embraced a methodology that included Edward de Bono's Six Thinking Hats philosophy, an approach designed to improve team perspective and collaboration during projects. This was of particular interest because the seven team members were dispersed across the country.


“Everyone was assigned a different perspective,” said Weatherholtz, a mechatronics engineer. “So for example, someone was assigned an aspect of the design that only considered price; someone else would focus on aesthetics, and so forth.”


The team rotated through these different considerations and perspectives and then amalgamated different parts of the design into the one they liked.


The team started by identifying the customer needs and translating those into engineering characteristics:


  • What type of battery was required?
  • How much back-up power would be needed? (The team initially targeted five days of back-up power, but ended up at 2.5-3 days; more on that shortly.)
  • What other design considerations might be unique for a developing country?
  • What were the environmental needs of the device casing?


Here’s a look at how the team tackled some of the design considerations.



Battery

This was an extremely critical component that needed to be as reliable as possible. Additionally, the team had to understand how much power they could pack into a small space. Should they push the limits for longer back-up power capability and accept the consequences? And what type of batteries could be shipped internationally?


“We tried to pick a battery with a very high energy density and moderate size, but the battery is still pretty heavy and large,” Weatherholtz said. “Adding another battery would mean another cubic foot of space and an extra 60 pounds in the design.”


At the end of the day, two and a half days backup capability seemed good enough for most applications, he said. That meant the battery could recharge in four hours with sufficient power, and most places get at least five hours of good sunlight, he added. The team ended up selecting an absorbent glass mat (AGM) battery—essentially a golf cart battery—that doesn’t spill, tip or have vulnerable components inside.
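The sizing trade-off above can be sketched as a quick back-of-envelope calculation. Only the 2.5-day backup target, the four-hour recharge and the five or so hours of good sunlight come from the article; the station's average load and system voltage are assumed values for illustration.

```python
# Back-of-envelope battery and panel sizing for the weather station.
# The 2.5-day backup target, 4 h recharge and ~5 h of good sun come from
# the article; the load and system voltage are illustrative assumptions.

LOAD_W = 5.0                  # assumed average draw of SBC + sensors + radio
BACKUP_DAYS = 2.5             # target runtime with no sun at all
SYSTEM_V = 12.0               # typical AGM (golf-cart) battery voltage

energy_wh = LOAD_W * 24 * BACKUP_DAYS      # energy needed to ride out the outage
capacity_ah = energy_wh / SYSTEM_V         # nameplate amp-hours required

# Panel sizing: refill a fully drained battery in ~4 hours of good sun,
# while still carrying the load at the same time.
RECHARGE_H = 4.0
panel_w = energy_wh / RECHARGE_H + LOAD_W

print(round(capacity_ah, 1), round(panel_w, 1))   # 25.0 Ah, 80.0 W
```

Doubling the backup target roughly doubles both numbers, which is exactly the "another cubic foot and an extra 60 pounds" trade-off Weatherholtz describes.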


Solar technology

This was one of those developing-country considerations, where ready reliable power sources are hard to find, if not non-existent. Even though it was the team’s first time working with solar, adopting the technology was key. “It’s a fantastic solution in a remote data center application because a data center is a static structure,” Weatherholtz said. “It allowed us to take advantage of that big fusion generator in the sky.”



Enclosure

The team considered plastic but wanted the system to be able to take a pounding. So they settled on aluminium, a reliable material, which conveniently could serve as a sizable heat sink. They designed to a worst-case scenario of 50 °C ambient temperature with direct sunlight, no humidity, and no moving air.

“One of our main design criteria was to make the enclosure—and enclosed electronics—reliable.  For us, that meant it needed to be completely sealed with no moving parts,” Weatherholtz said.


Single board computer

The contest criteria specified the SBC. As a designer and engineer, Weatherholtz said he doesn’t really like being shoehorned into a solution, but, that said, “the Banana Pi boards were hard to beat,” he acknowledged. The Banana Pi, based on ARM Cortex-A7 with Mali-400 GPU and running open source software, is designed to be inexpensive, small and flexible.

The technology was “robust, open source and low power,” he said. “When you’re dealing with IoT applications and micro data centers, you don’t have a lot of power and you can’t have a lot of heat, so ARM is best.”


One challenge is that boards such as this typically have two sources of heat: the RAM and the processor. The team undertook considerable thermodynamic analysis and determined that getting rid of that heat was key. The Banana Pi boards were ideal, Weatherholtz said, because both heat sources sit on the bottom face, so the team was able to direct the heat in the optimal direction. Had the CPU and RAM been on top, it would have been much more challenging to get the heat out, he added.

What was his biggest lesson?


Weatherholtz and team spent a total of 150 engineering hours on the project; competitors in the challenge used ARM-based solutions to create their “micro-board chassis” designs. The team will share the $10,000 prize, and the winning design will be built and deployed in the developing world.


“I really can’t overstate the importance of thermal analysis in projects like these,” Weatherholtz said. “If heat doesn’t have a good way to escape, it’s going to build up and cause high temperatures that make your electronics fail, or at least fail prematurely.”


He added:


“For us, making a low thermal-resistance path out of the case was a main design consideration.  We identified where the heat was being generated (see image right), and then got it out.  Everything centered on that.  Where we placed components, what we placed them on, how we connected them to what they were placed on… everything.”
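The "low thermal-resistance path" idea reduces to a series-resistance calculation, analogous to Ohm's law. The worked numbers below are made up for illustration; only the 50 °C worst-case ambient comes from the article.

```python
# Illustrative thermal budget for the sealed aluminium enclosure. Only the
# 50 °C worst-case ambient is from the article; the dissipated power and
# thermal resistances are assumed values showing the delta-T = P * R idea.

P_W = 3.0          # assumed heat from CPU + RAM on the board's underside
R_PAD = 2.0        # thermal pad, board underside to enclosure wall (°C/W)
R_CASE = 1.5       # enclosure wall to still outside air (°C/W)
AMBIENT_C = 50.0   # worst case: direct sun, no humidity, no moving air

# Thermal resistances in series add, just like electrical resistors, so
# lowering either one lowers the temperature the electronics sit at.
delta_t = P_W * (R_PAD + R_CASE)
component_temp = AMBIENT_C + delta_t
print(component_temp)   # 60.5 °C at the hot face
```

This is why component placement mattered so much: every interface between the heat source and the outside air adds another resistance term to the sum.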

Related stories:

Off-Grid Server Ecosystem Primer (PDF)


mbed OS Beta is here!

Posted by Zach Shelby Sep 8, 2015

Last October we announced that ARM and our partners were creating a software ecosystem and new operating system for Internet-connected devices with standards support, low power and security as key goals. We believe the Internet of Things is all about innovators finding new ways to solve problems in our cities, enterprises and homes. Over the past year we have worked closely with leading OEMs, silicon, infrastructure and cloud players to really understand what developers need to take new innovations from idea to deployment.

All of us involved are excited to announce that ARM® mbed™ OS is now available for Beta testing. Now that we have released mbed OS Beta and a new mbed.com web page, we will be releasing mbed Device Connector Beta, mbed Client Beta, mbed Device Server 2.5, and mbed TLS 2.1.0 over the next few weeks. We have a lot in store for the future and are driving our roadmaps based on what developers need to create incredible IoT solutions. Let us know what you think so far!

Zach Shelby


mbed OS (Beta)

mbed OS [link] is an open source embedded operating system designed specifically for the “things” in the IoT. It accelerates the process of moving from initial idea to deployed product by providing a core operating system, robust security foundations, standards-based communication capabilities, and drivers for sensors, I/O devices and connectivity. The current release is considered beta quality, as mbed OS is still under development. Beta testers welcome!

mbed Device Connector (Beta)

mbed Device Connector [link] is a new online service that lets developers connect IoT devices to the cloud without having to build the infrastructure themselves, while providing the security, simplicity and capacity required to quickly prototype, test and prove IoT applications at scale.


mbed Device Server (2.5 release)

mbed Device Server [link] is middleware that connects Internet of Things (IoT) devices to web applications, allowing our Cloud Partners to deploy mbed Enabled services. It enables efficient and secure communication and device management for quickly developing and deploying enterprise applications based on open standards.


mbed TLS (2.1.0 release)

mbed TLS [link] makes it easy for developers to include cryptographic and TLS/DTLS capabilities in their products. mbed TLS will now be delivered under the Apache 2.0 license.

mbed Client (Beta)

mbed Client [link] helps non-mbed OS based devices connect to mbed Device Connector and mbed Device Server, and build web applications using cloud services provided by ARM partners. It implements a subset of mbed OS functionality and is optimized for constrained networks and devices: it uses the Constrained Application Protocol (CoAP) for energy-efficient communication, supports communication security using mbed TLS, and provides device management via OMA LWM2M. All of this functionality is exposed through easy-to-use, high-level C++ APIs.


As an engineer in ARM's Applied Engineering Team, my job is to create proof-of-concepts with the latest ARM-based technologies. This sounds like an incredibly fun remit - and it is! - but the biggest challenge is choosing which ARM technology to focus on. It was thus fairly exciting to flip this on its head as we were designing Sensors to Servers, and to strive to show the breadth of ARM technologies involved in an Internet of Things (IoT) system.


IoT refers to the connection of all kinds of objects, big and small, static or mobile, to the Internet and all the services available there. If you've been in the industry for some time, you'll recognise this as an evolution of the embedded and distributed M2M systems that have been building over the course of many years. Ease of development is one theme we explored quite thoroughly in building this demonstration, and the changes here made the biggest impression on me personally, compared with when I got my start building not dissimilar systems almost two decades ago.

The Sensors to Servers demo is an end-to-end demonstration touching several of the ARM technologies driving IoT. This includes ARM Cortex® processors on the embedded sensor or device side with the Cortex-M processor family, but also in the data centre with one of the first commercial ARMv8-A 64-bit Cortex-A server-specific System on a Chip (SoC), and in the client devices we used to present and interact with our model IoT environment. On the software side, the ARM mbed IoT Device Platform gave us the core functionality for standards based communication in our system, so that we could focus our development effort on the device and application functionality.


This blog is the first in a series on the Sensors to Servers demonstration. This first episode will present the basic topology of the demonstration and discuss the connectivity standards used.


Sensors 2 Servers Demonstration – What is it?

The demonstration is a network of connected sensors, reporting to an ARM-powered server. The server then presents this sensor data as a series of rich web page visualisations which can be viewed on your ARM-powered mobile, tablet, Smart TV, or any gadget with a good web browser. You might recognise this as a sensor network - which is exactly what it is, only with an engaging visual front end for the purpose of attracting attention at trade shows, as opposed to a practical application such as closed-loop climate control or interactive media control. We deployed this system to instrument our booths at several industry trade shows, including the Consumer Electronics Show® (CES), Embedded World (EW) and Mobile World Congress™ (MWC).






For connectivity in our trade show booth environment, we elected to use a mixture of wireless and wired communications. For wireless, we selected a 2.4 GHz 802.15.4 6LoWPAN solution that had been used in previous ARM mbed demonstrations. We also planned to have about half of our sensors connected via wired Ethernet. This gave us a mixed topology demonstrating some of the breadth of mbed connectivity support.



Along with the main sensor nodes, on some occasions we also integrated an additional sensor which used ARM’s new Cordio® Radio IP for Bluetooth communications. These were bridged into the rest of the system via an Android Smartphone, analogous to how a real world wearable sensor might be integrated to such an environment.


The physical medium is only one part of the communications story. As mentioned before, to implement the crucial link between the sensors and the server, we leveraged the ARM mbed IoT platform. Utilising the existing mbed 2.0 environment, with some of the additional capabilities that will be baked into mbed OS (arriving very soon now), we were able to rapidly establish connectivity between our sensors and an instance of the mbed Device Server running on the server. This allowed us to focus on the development of our sensor capabilities on the embedded side, and on the collation and presentation of the data on the server side.



That concludes this first instalment of the Sensors to Servers blog. Next time we’ll talk more about the sensors we developed for the demonstration. Thanks for reading!

The Internet of Things is one of the most exciting new platforms for app development, especially as more and more people interact with connected devices every day. But it also poses a host of challenges for developers, as they must wrestle with the complex task of maintaining a backend with a whole new set of constraints. Many IoT devices also need to be personalized and paired with a mobile companion app. Cognizant of this, the Parse team is striving to make it simpler.


At F8 this year, Parse for IoT was announced: an official new line of SDKs for connected devices, starting with an SDK targeted at the Arduino Yún (ATmega32U4). Now, Parse has shared that they are expanding the lineup with four new SDKs built with Atmel, Broadcom, Intel and TI. This will make it easier than ever to use Parse with more types of hardware and a broader range of connected devices. For example, you can build an app for the Atmel | SMART SAM D21 and WINC1500, and connect it to the Parse cloud in minutes, with nothing more than a few lines of code.
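To give a feel for what "a few lines of code" means, here is a minimal sketch of saving one sensor reading via Parse's classic REST API, which the device SDKs wrap. The class name, field names and credentials are placeholders, and the snippet only builds the request rather than sending it.

```python
# Hypothetical sketch of saving an object through Parse's classic REST API
# (the historical api.parse.com endpoint). Credentials, class and field
# names are placeholders; no network request is actually made here.
import json

APP_ID = "YOUR_APP_ID"            # placeholder credentials
REST_KEY = "YOUR_REST_API_KEY"

def build_save_request(class_name, fields):
    """Build the HTTP pieces for saving one object to the Parse cloud."""
    return {
        "url": "https://api.parse.com/1/classes/" + class_name,
        "method": "POST",
        "headers": {
            "X-Parse-Application-Id": APP_ID,
            "X-Parse-REST-API-Key": REST_KEY,
            "Content-Type": "application/json",
        },
        "body": json.dumps(fields),
    }

req = build_save_request("SensorReading", {"temperature": 22.5, "device": "yun-01"})
```

On a real device the SDK handles this plumbing for you; the point is that one connected reading is a single small HTTP POST.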


“We’ve been excited to see the creative and innovative things our developer community has built since we first launched Parse for IoT at F8. Already, hundreds of apps for connected devices have been created with the new SDKs,” explains Parse software engineer Damian Kowalewski. “Our tools have been used to build exciting and diverse products like a farm-to-table growing system that lets farmers remotely control their equipment with an app (Freight Farms); a smart wireless HiFi system that syncs music, lighting and more (Musaic); and even a smart BBQ smoker that can sense when meat is perfectly done (Trignis). Here at Parse, we had fun building a connected car and a one-click order button. And we’ve heard that our SDKs are even being used as teaching tools in several college courses.”


As to what’s ahead, that lies in the hands and minds of Makers. From a garage hacker’s weekend project to a production-ready connected product manufactured at scale, Parse can power them all. Ready to get started? You can download the new SDKs and access QuickStart guides here.
