The Rest of the Story: Sensor Integration and Processor Selection

By John Blyler

Tighter integration of sensors via hubs, along with changes in the way ARM processors are selected, encourages IP-based ecosystems.

During the recent ARM TechCon 2013, I interviewed ARM’s Will Tu, Director of Embedded, and Diya Soubra, CPU Product Manager, about the challenges that Internet of Things (IoT) developers face in dealing with the inherent analog nature of sensors and the resulting data algorithms for applications like contextual awareness and predictive analysis. We also talked about the innovation emerging from the evolution of connected sensors, among many other things. You can read the main interview at: “Sensors and Algorithms Challenge IoT Developers.”

One of our discussions that didn't make it into that article concerned the trend toward tighter integration of sensor systems. That discussion, in turn, led to an important observation about processor selection in today’s designs. Let’s begin with the sensor integration discussion.

Typically, sensor developers are great at making MEMS devices. That is what they do best and really all that most of them want to do.  But a few would like to develop more vertically oriented systems. That’s one area where unique intellectual property (IP) can be created.

Such companies are moving toward a sensor-hub architecture to achieve denser integration and more vertically oriented systems. In their TechCon presentations, Steve Scheirey, VP of Software Development for Hillcrest Labs, and ARM’s Diya Soubra defined a sensor hub as a combination of a low-power MCU and embedded software that provides aggregated access to multiple sensors for use in various applications. In terms of functionality, a sensor hub might perform computations for a 9-axis sensor array, aggregating the sensor drivers and running algorithms for a limited set of activities. The output would feed real-time information to a power-sensitive Application Processor (AP) – such as in a smartphone – or to any application that requires low power consumption, e.g., a smart watch.


Typically, the MCU in most sensor hubs might be an ARM Cortex-M0 through Cortex-M4, depending on the performance needs. In the automotive space, a Cortex-A series processor with a graphics processing unit (GPU) might even be required, explained Tu (see the Ford Sync In-Car Communications & Entertainment System). The GPU offers a more generic way to perform hardware acceleration or digital signal processing.

Coincidentally, the selection of the processing system – be it an MCU or a microprocessor with a hardware accelerator or GPU – has changed over the last decade or so. Back when process geometries were measured in 1 or 2 microns, the resulting 8-bit processor occupied a significant portion of the die, observed Tu. But today, at 65nm nodes and below, more and more memory has been integrated onto the die. Thanks to a sharp drop in the price of memory, today’s chips have gone from being dominated by the CPU to being a vehicle for memory.

“That is why processor selection has moved away from being bit-width specific (8, 16 or 32 bits) to being performance based within a supporting ecosystem,” said Tu. “Many studies have shown that the IP ecosystem is a key factor in making a processor selection, because companies don’t have the time or resources to create their own System-on-Chip (SoC) ecosystem.” Collaboration and agnostic interface standards become critical factors in this approach.