(This is a wrap-up of my thoughts from day 3 of Embedded World 2017. For highlights from days 1 and 2, check out EW17 day 1 blog: IoT security and Lego cities and EW17 day 2 blog: All about automotive)
One of the things that has struck me walking around the halls of Nuremberg Messe this week is that the classic definition of ‘embedded computing’ needs updating, as the market has evolved significantly over the last few years. Small 16- or 32-bit MCUs that perform primarily control functions will always have their place, of course, but the level of connectivity and intelligence that modern MCUs are capable of has opened up a wide range of entirely new applications.

First, though, a note on the evolution that ARM itself has undergone over the past three decades. Anyone walking around the ARM booth will have noticed the chip layout gallery on one of the exterior walls. To-scale layouts of the ARM1 and Juno chip designs were displayed side by side, showing the explosion in complexity between 1985 and 2013, the years in which the two chips were respectively designed.
Three chip design layouts, all at the same scale: ARM1 on the left, Cortex-M0+ in the middle, Juno on the right
Considering both chips have a similar physical footprint (ARM1: 50mm², Juno: 65mm²), the up-close difference in complexity is striking, as is the gap in transistor count (25,000 vs. 500,000,000): Moore’s Law in action. For comparison, a to-scale version of the ARM Cortex-M0+ was included as well; it is roughly 1,000 times smaller in area than either design, yet still packs in 50,000 transistors.
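A quick back-of-the-envelope check (my own arithmetic, not from the gallery placards) shows how closely the two designs track the law, assuming a doubling every two years over the 28 years between them:

$$\frac{500{,}000{,}000}{25{,}000} = 20{,}000\times \qquad \text{vs.} \qquad 2^{28/2} = 2^{14} = 16{,}384\times$$

The observed ratio comes out slightly ahead of the prediction, but it is in exactly the right ballpark.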
Moving on from ARM history to the new areas that could have a huge influence on the industry over the coming years: two of the trends I was able to catch during the show were embedded vision and heterogeneous multi-processing systems.
In a panel discussion on "Embedded vision - the next big thing?", ARM's Richard York explained that embedded vision is similar to computer vision, but with the lower power consumption and smaller silicon area needed for constrained environments. One of the biggest and most exciting applications will be ADAS and autonomous driving, but embedded vision has the potential to affect everything from industrial factories and retail supermarkets to home security and crop monitoring. The technology is still in a development phase, so there is a lot for the industry as a whole to learn about embedded vision through collaboration. That collaboration benefits the entire ecosystem, and the ARM architecture is a key enabling factor in much of what is going on here.
"Embedded vision can have an almost infinite list of applications" Richard York, ARM
Stepping from embedded vision to a new way of designing systems: heterogeneous multi-processing (HMP) systems have been gaining traction due to the increased demands on the power and compute profiles of embedded systems. In his talk, "Optimizing ARM Cortex-A and Cortex-M based heterogeneous multiprocessor systems for Internet of "Intelligent" Things", Kinjal Dave explained that modern compute systems must deliver ever higher performance for complex workloads while also responding in real time and staying within tight energy budgets. To meet these conflicting requirements, modern system designers rely heavily on building heterogeneous compute systems.
Heterogeneous compute is fundamentally about using the right processor optimized for a set of tasks.
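As a sketch of what that division of labour can look like in practice, here is a hedged example of the Linux side of an HMP system (say, a Cortex-A core) handing work to a Cortex-M core over an RPMsg serial endpoint. The device node name and the "READ_SENSORS" command are hypothetical; the exact transport depends on the SoC and kernel configuration.

```python
import serial  # pyserial; pip install pyserial

# The rpmsg-tty device node name is an assumption (i.MX 7 style endpoint);
# it varies with the SoC, firmware and kernel configuration.
link = serial.Serial("/dev/ttyRPMSG30", timeout=1.0)

# "READ_SENSORS" is a hypothetical command the Cortex-M firmware would implement.
link.write(b"READ_SENSORS\n")
reply = link.readline()
print("Cortex-M replied:", reply.decode(errors="replace").strip())
link.close()
```

The point of the split is efficiency: the power-hungry A-class core can sleep while the M-class core handles the always-on, real-time work, waking the bigger core only when heavier compute is needed.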
Applications such as wearable devices, smart home assistants and even small robots like the one Toradex was demoing are all examples of HMP systems today. As the concept gains in popularity, the need for efficient software development increases, a topic addressed by Stefano Cadario in his talk "Efficient software development with heterogeneous devices". You can learn more about this emerging approach to system design in Kinjal's blog, Understand why system architects are implementing HMP systems for embedded applications.
And with that, Embedded World is over for another year! Please share your own impressions of the event in the comments section: what caught your attention?