Sensor hubs have gained traction among design teams in recent years as the marriage between evolving (and power-sensitive) hardware and smart software algorithms blossoms.
To me, it’s an interesting phenomenon in system design, which is almost always about tradeoffs: reduce your footprint, and you’re probably raising cost (at least initially); cut your power, and you’re likely giving up some performance; and so on.
But in sensor hubs, we’re seeing lower system power without sacrificing performance, and that combination is delivering added functionality and value to end devices and systems.
Our smart phones know where we are (or should be) better thanks to sensor fusion, and they’re sleeping more intelligently (and waking up more accurately) based on the sensor data they receive. With advances in low-power MCUs and efficient sensor processing software, sensor hub solutions are now enabling a new generation of wearables, too.
Why is this a win-win?
“The whole idea of a sensor hub is, on the one hand, you have a tiny MCU with an ARM Cortex series low-power processor. On the other hand, you have these big application processors with multithreading and tasking for applications,” said Roy Illingworth, director of systems engineering with Hillcrest Labs. “The apps processors need to have a lot of power, and if you have sensors attached, then you keep getting interrupts all the time. That means your apps processor is prevented from sleeping or going idle.”
The sensor hub sits between the sensors and the apps processor, managing the interrupts so the apps processor is awake only when it needs to be. At the same time, the sensor hub runs software optimized to handle some complex sensor fusion tasks itself.
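To make that division of labor concrete, here’s a minimal sketch of hub firmware that buffers samples in a FIFO and pulses a wake line to the apps processor only when a batch is full or something interesting happens. It assumes a generic MCU SDK; the calls (hub_read_accel, ap_assert_wake_irq, and so on) are hypothetical placeholders, not any vendor’s actual API.

```c
#include <stdint.h>
#include <stdbool.h>

#define BATCH_SIZE 64            /* samples buffered before waking the AP */

typedef struct { int16_t x, y, z; } sample_t;

static sample_t fifo[BATCH_SIZE];
static volatile uint32_t fifo_count = 0;

/* Hypothetical board-support calls; substitute your MCU SDK's equivalents. */
extern sample_t hub_read_accel(void);         /* read one sample over I2C/SPI */
extern bool     motion_of_interest(sample_t); /* cheap on-hub event detector  */
extern void     ap_assert_wake_irq(void);     /* pulse the wake line to the AP */
extern void     mcu_sleep_until_irq(void);    /* low-power wait for sensor IRQ */

/* Sensor data-ready interrupt: fires at the sensor's output rate, e.g. 100 Hz.
 * The apps processor stays asleep; only this tiny MCU wakes per sample. */
void sensor_drdy_isr(void)
{
    sample_t s = hub_read_accel();
    fifo[fifo_count++] = s;

    /* Wake the big core only for a full batch or a significant event. */
    if (fifo_count == BATCH_SIZE || motion_of_interest(s)) {
        ap_assert_wake_irq();   /* AP drains the FIFO, then sleeps again */
        fifo_count = 0;
    }
}

int main(void)
{
    for (;;)
        mcu_sleep_until_irq();  /* the hub itself idles between interrupts */
}
```

In a real design, the batch size and the event detector would be tuned against the system’s power budget and latency requirements.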
Take, for example, one of the holy-grail functions: pedestrian dead reckoning. In this use case, you’re inside a building where GPS is useless, but you still need fine-grained location. Sensor fusion combines accelerometer, gyroscope, and magnetometer data to estimate the phone’s orientation; a pedestrian navigation algorithm then turns that orientation, together with detected steps, into a running position estimate, as sketched below.
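A production PDR pipeline is far more sophisticated (typically a Kalman filter plus trained step models), but its shape can be sketched. The filter weight, step threshold, and stride length below are illustrative assumptions, not tuned values:

```c
#include <math.h>

#define DT          0.01f   /* 100 Hz sample period, assumed       */
#define ALPHA       0.98f   /* complementary-filter gyro weight    */
#define STEP_THRESH 11.0f   /* m/s^2, illustrative step threshold  */
#define STRIDE_M    0.7f    /* assumed stride length in meters     */

typedef struct { float x, y, heading; } pdr_state_t;

/* Fuse the yaw-rate gyro with the magnetometer heading: the gyro is
 * smooth but drifts; the magnetometer is absolute but noisy. */
static float fuse_heading(float heading, float gyro_z, float mag_heading)
{
    float predicted = heading + gyro_z * DT;
    return ALPHA * predicted + (1.0f - ALPHA) * mag_heading;
}

/* Crude step detector: a rising crossing of the acceleration-magnitude
 * threshold counts as one step. */
static int step_detected(float ax, float ay, float az)
{
    static int above = 0;
    float mag = sqrtf(ax * ax + ay * ay + az * az);
    int step = (!above && mag > STEP_THRESH);
    above = (mag > STEP_THRESH);
    return step;
}

/* One fusion update per sensor sample. */
void pdr_update(pdr_state_t *s,
                float ax, float ay, float az,   /* accelerometer          */
                float gyro_z,                   /* yaw rate, rad/s        */
                float mag_heading)              /* tilt-compensated, rad  */
{
    s->heading = fuse_heading(s->heading, gyro_z, mag_heading);
    if (step_detected(ax, ay, az)) {
        s->x += STRIDE_M * cosf(s->heading);    /* dead-reckon forward */
        s->y += STRIDE_M * sinf(s->heading);
    }
}
```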
There are design considerations to be aware of, according to Illingworth. Chief among them is the sensor hub SoC’s MCU.
“You need to account for the fact that you now have an MCU where you didn’t have one before,” he said in an interview. “There are fundamental changes to the architecture. You have to connect sensors into the MCU and connect the controller itself into the apps processor. You have to think about interconnect.”
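In other words, the hub typically sits on two interconnects: it acts as a bus master toward the sensors (commonly I2C or SPI) and as a peripheral toward the apps processor. A skeletal illustration, with made-up register addresses and placeholder HAL calls:

```c
#include <stdint.h>

/* Illustrative addresses; check your sensor's datasheet. */
#define ACCEL_I2C_ADDR   0x19
#define ACCEL_REG_OUT_X  0x28

/* Generic MCU HAL calls; the names are placeholders for your SDK's API. */
extern int i2c_read(uint8_t addr, uint8_t reg, uint8_t *buf, uint32_t len);
extern int spi_slave_write(const uint8_t *buf, uint32_t len);

/* Hub-side path: pull raw samples from the sensor over I2C... */
int read_accel_raw(int16_t out[3])
{
    uint8_t raw[6];
    if (i2c_read(ACCEL_I2C_ADDR, ACCEL_REG_OUT_X, raw, sizeof raw) != 0)
        return -1;
    for (int i = 0; i < 3; i++)          /* little-endian 16-bit axes */
        out[i] = (int16_t)(raw[2 * i] | (raw[2 * i + 1] << 8));
    return 0;
}

/* ...then publish processed results to the apps processor, to which the
 * hub itself appears as an SPI peripheral. */
int publish_to_ap(const int16_t fused[3])
{
    return spi_slave_write((const uint8_t *)fused, 3 * sizeof fused[0]);
}
```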
Designers also need to think about the kinds of applications they’re targeting, because they’ll have to identify the necessary drivers and understand how those function within, say, an Android application.
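On Android, sensor hub output ultimately surfaces through the platform’s standard sensor framework; from native code, that’s the NDK’s ASensorManager API. A minimal consumer might look something like this sketch (error handling omitted for brevity):

```c
#include <android/looper.h>
#include <android/sensor.h>
#include <stdio.h>

#define LOOPER_ID 1
#define RATE_USEC 20000   /* request ~50 Hz */

void read_accel_events(void)
{
    ASensorManager *mgr = ASensorManager_getInstance();
    const ASensor *accel =
        ASensorManager_getDefaultSensor(mgr, ASENSOR_TYPE_ACCELEROMETER);

    ALooper *looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);
    ASensorEventQueue *queue =
        ASensorManager_createEventQueue(mgr, looper, LOOPER_ID, NULL, NULL);

    ASensorEventQueue_enableSensor(queue, accel);
    ASensorEventQueue_setEventRate(queue, accel, RATE_USEC);

    /* Block until events arrive, then drain the queue. */
    while (ALooper_pollOnce(-1, NULL, NULL, NULL) == LOOPER_ID) {
        ASensorEvent ev;
        while (ASensorEventQueue_getEvents(queue, &ev, 1) > 0)
            printf("accel: %f %f %f\n",
                   ev.acceleration.x, ev.acceleration.y, ev.acceleration.z);
    }

    ASensorEventQueue_disableSensor(queue, accel);
    ASensorManager_destroyEventQueue(mgr, queue);
}
```

Whether those events come from a discrete hub or from the apps processor itself is invisible at this layer, which is exactly the point.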
“There’s lots to do with testing, quality assurance with new sensor hubs,” he said. Calibration is another consideration, Illingworth noted, and it too is a non-trivial challenge: because MEMS sensors are tiny machines, they’re affected by temperature and other environmental factors.
“We found a lot of issues related to the magnetometer,” Illingworth said. “There can be a lot of magnetic interference. Smart phones, speakers, even headsets have magnets. Those affect the magnetometer. You have to constantly calibrate for different variations.”
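That constant offset from nearby magnets is the classic “hard-iron” error, and one common, if simplistic, correction is to track the field’s minimum and maximum along each axis as the device rotates and subtract the midpoint. A toy version of the idea, for illustration only:

```c
#include <float.h>

typedef struct {
    float min[3], max[3];
} mag_cal_t;

void mag_cal_init(mag_cal_t *c)
{
    for (int i = 0; i < 3; i++) {
        c->min[i] =  FLT_MAX;
        c->max[i] = -FLT_MAX;
    }
}

/* Feed every raw magnetometer sample; as the device tumbles, the
 * min/max envelope traces out the offset of the field sphere. */
void mag_cal_update(mag_cal_t *c, const float raw[3])
{
    for (int i = 0; i < 3; i++) {
        if (raw[i] < c->min[i]) c->min[i] = raw[i];
        if (raw[i] > c->max[i]) c->max[i] = raw[i];
    }
}

/* Hard-iron correction: subtract the midpoint of the observed range.
 * Real calibrators also fit soft-iron (scale/skew) terms and re-run
 * continuously, since interference changes with temperature and with
 * whatever magnets happen to be nearby. */
void mag_cal_apply(const mag_cal_t *c, const float raw[3], float out[3])
{
    for (int i = 0; i < 3; i++)
        out[i] = raw[i] - 0.5f * (c->min[i] + c->max[i]);
}
```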
There’s no doubt that the rise of sensor hub architectures and algorithms is adding huge value to end devices, and more and more design teams are considering them.
For an excellent overview, check out Diya Soubra’s sensor hub slide show (Sensor Fusion, Sensor Hubs and the Future of Smartphone Intelligence) and the Hillcrest Labs Company Overview page. And make sure you download the White Paper: Cortex-M7 in Sensor Fusion.