I have a question about how to make the Ethos-U NPU work on an Arm Cortex-A + Cortex-M processor. So far I have found the ethos-u-linux-driver-stack and ethos-u-core-software repositories on https://git.mlplatform.org/.
1. I understand that ethos-u-linux-driver-stack contains the Ethos-U kernel driver. Should it be integrated into the Linux OS running on the Cortex-A, or into the software running on the Cortex-M? I am not clear about which core it needs to run on.
2. For ethos-u-core-software, how do I run it? I didn't find detailed steps for running it. Does it run on the NPU or on one of the cores?
3. Besides the above two repos, are any other repos necessary to make the Ethos-U NPU work on an Arm Cortex-A + Cortex-M processor?
Thanks in advance for your suggestions.
Arm has analyzed the most common AI networks in the embedded space and tried to map their operators to the Arm Ethos-U. How well this mapping works for you depends on which networks you want to run.
The software stack for Arm Ethos-U has been designed to fall back to the Cortex-M for operators that are not supported by the NPU. Running TensorFlow Lite Micro (TFLu) on the Cortex-A and dispatching custom operators to the Cortex-M and the NPU could be possible, but it is nothing we have planned to implement. In the Linux Driver Stack for Ethos-U we have provided an example of how a Linux user space process can dispatch inferences to an Arm Ethos-U subsystem.
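As a toy illustration of the fallback principle described above (this is not the real Vela or driver implementation; the operator names and the supported set below are assumptions chosen for illustration), the partitioning decision can be thought of as:

```python
# Toy sketch of operator fallback: operators the NPU supports are grouped
# for Ethos-U dispatch, everything else stays on the Cortex-M CPU.
# In the real stack this decision is made offline by the Vela compiler,
# not at run time; the supported set here is purely illustrative.
NPU_SUPPORTED = {"CONV_2D", "DEPTHWISE_CONV_2D", "ADD", "RELU"}

def partition(ops):
    """Split a model's operator list into (npu_ops, cpu_fallback_ops)."""
    npu, cpu = [], []
    for op in ops:
        (npu if op in NPU_SUPPORTED else cpu).append(op)
    return npu, cpu

# Example: SOFTMAX is not in the (assumed) supported set, so it would
# fall back to the Cortex-M while the other operators run on the NPU.
npu_ops, cpu_ops = partition(["CONV_2D", "RELU", "SOFTMAX", "ADD"])
```

In practice you would run your .tflite network through the Vela compiler, which reports which operators it could map to the NPU and leaves the rest to TFLu on the Cortex-M.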