The evolution of vision (the eye) is considered one of the most significant events in the history of life on Earth. Around 540 million years ago, during the Cambrian period, a sudden burst of evolutionary activity resulted in the appearance of a wide variety of new species. Many of these species were characterized by the development of an eye, which allowed them to perceive and interact with their environment in a more sophisticated way.
Similarly, the integration of vision into Internet of Things (IoT) devices will revolutionize how these devices perceive and interact with the world. With the ability to see and interpret their surroundings, IoT devices will drive a Cambrian explosion of IoT use cases that were not economically viable before. Many use cases are much simpler to implement with vision. For example, there is no need for a car presence sensor under each parking space: a single camera sees the entire street and, with the proper AI model, identifies empty parking spaces.
The following table maps use cases to locations. Many of the same use cases apply across all categories, but their deployment and management differ.
Vision-related applications for the smart home:
Vision-related applications for the smart building or office:
Vision-related applications for smart retail or shop:
Vision-related applications for the smart factory:
Vision-related applications for the smart city:
The Machine Learning (ML) Neural Networks used for vision use cases are similar across these verticals:
The latest reference design, the Corstone-320, is the best means to capture all these IoT vision market opportunities. The reference design integrates IP, software and prototyping platforms to reduce complexity for the SoC designer and the software developer. The Corstone package includes a collection of system IP, and a subsystem designed to integrate the following Arm IP:
The Corstone package includes technical reference manuals, configuration and rendering scripts, plus the verification reports. ASIC developers then build an SoC around the subsystem to meet specific segment requirements, or use the package to explore Arm's compute design intent before starting their custom design.
The vision use case is the most complex of the Edge AI use cases. Since most of the IP components are optional and the design is configurable and modifiable, the simpler use cases are easily addressed too.
The package includes a Fixed Virtual Platform (FVP), which models the subsystem. Software developers use the FVP to accelerate development by removing the need for hardware before starting application development. The FVP is available at https://developer.arm.com/downloads/-/arm-ecosystem-fvps.
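As a sketch of that workflow, an FVP is launched from the command line. The binary name FVP_Corstone_SSE-320 and the application file name below are assumptions to check against your installation; the flags shown (-a to load an application, --stat, --list-params) are standard Arm FVP options.

```shell
# Sketch: run a bare-metal application image on the Corstone-320 FVP.
# FVP_Corstone_SSE-320 and build/vision_demo.axf are assumed names;
# adjust them to your installation and build output.
FVP_Corstone_SSE-320 -a build/vision_demo.axf --stat

# Inspect the model's configurable parameters before overriding any with -C:
FVP_Corstone_SSE-320 --list-params
```

Because the FVP is instruction-accurate rather than cycle-accurate, it suits functional software bring-up rather than performance measurement.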
The reference design software includes firmware, drivers for all the IP, middleware, RTOS and cloud integrations, ML models and reference applications. Software developers select the components required for their specific segment and build the IoT stack for that device using the development tools of their choice.
Three different software packages are available to developers:
Option 1: Arm ML embedded evaluation kit. https://review.mlplatform.org/plugins/gitiles/ml/ethos-u/ml-embedded-evaluation-kit
Option 2: CMSIS-Pack based Machine Learning Examples, for building functional ML-enabled applications. https://review.mlplatform.org/c/ml/ethos-u/ml-embedded-evaluation-kit/+/12569
Option 3: IoT Reference Integration, a full device stack with firmware update, middleware, a reference ML application and cloud connectivity. The open-source applications demonstrate keyword spotting, speech recognition and object recognition use cases. https://github.com/FreeRTOS/iot-reference-arm-corstone3xx
Refer to the Corstone-320 software blog post, which goes into technical detail on each of these components.
Overall, the Arm Corstone has been designed for:
Below is a high-level description of the key IP that the subsystem integrates:
Cortex-M85:
The highest performing Cortex-M processor with Arm Helium technology provides the natural upgrade path for Cortex-M based applications that require significantly higher performance and increased security.
In addition to Arm TrustZone technology, Cortex-M85 integrates the new pointer authentication and branch target identification (PACBTI) architectural extension to mitigate against return-oriented programming (ROP) and jump-oriented programming (JOP) security exploit attacks.
Advantages of using Arm Cortex-M processors in low-cost, low-power IoT vision devices:
The Arm Cortex-M85 is a great choice for such cameras because it offers a combination of energy efficiency, high performance, security, and flexibility.
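For illustration only (this is not Arm library code), the hot loop of such vision workloads is an int8 multiply-accumulate. Helium (the M-Profile Vector Extension) on Cortex-M85, and optimized kernel libraries such as CMSIS-NN, vectorize exactly this pattern, which is why the core delivers a large ML uplift over earlier Cortex-M processors:

```c
#include <stdint.h>
#include <stddef.h>

/* Sketch of the int8 dot-product at the heart of quantized neural-network
 * inference (convolution and fully-connected layers reduce to this).
 * A Helium-enabled compiler or a hand-optimized kernel processes several
 * of these lanes per cycle; the scalar C below shows the arithmetic. */
int32_t int8_dot_product(const int8_t *a, const int8_t *b, size_t n)
{
    int32_t acc = 0;                    /* 32-bit accumulator avoids overflow */
    for (size_t i = 0; i < n; i++) {
        acc += (int32_t)a[i] * (int32_t)b[i];
    }
    return acc;
}
```

Keeping weights and activations in int8 with a 32-bit accumulator is the standard quantization scheme these devices use, cutting memory footprint by 4x versus float32.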
Ethos-U85:
3rd generation NPU from Arm
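To run on the NPU, a quantized model is compiled ahead of time with Arm's open-source Vela compiler. As a hedged sketch (the ethos-u85-256 configuration string and model.tflite file name are assumptions to check against your Vela version):

```shell
# Install Arm's open-source Vela compiler (PyPI package: ethos-u-vela).
pip install ethos-u-vela

# Compile a quantized .tflite model for an assumed Ethos-U85 configuration
# (256 MACs/cycle); run "vela --help" to list the configurations your
# installed version actually supports.
vela --accelerator-config ethos-u85-256 model.tflite
```

Operators Vela can map run on the NPU; any remainder falls back to the Cortex-M CPU, so the same model file serves both.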
Mali-C55:
DMA-350:
The integration of vision into IoT devices represents a significant opportunity for innovation. By allowing these devices to perceive and interpret their surroundings in a more sophisticated manner, vision unlocks a wide range of new applications and capabilities that were previously not economically viable. Just as the evolution of the eye triggered the Cambrian explosion in species, the integration of vision into IoT devices has the potential to drive a similar explosion of innovation and evolution in IoT devices.
The Corstone-320 reference design for low-cost, low-power Intelligent IoT Vision is the easiest means to develop devices for these subsegments as the combination of integrated software and hardware dramatically reduces the complexity of SoC design and accelerates software development.
Finally, Arm has the largest ecosystem of AI partners, supplying competitive ML models and software to meet the diversity of IoT vision use case requirements, from the most powerful high-end devices to battery-operated ambient applications.
Browse partners by industry, location and product: https://www.arm.com/partners/ai-ecosystem-catalog