Hi,
In the ARM ML Zoo (https://github.com/ARM-software/ML-zoo), only the KWS models have example TensorFlow Lite for Microcontrollers code, at https://github.com/ARM-software/ML-examples/tree/main/tflu-kws-cortex-m/kws_cortex_m
For the noise suppression model (https://github.com/ARM-software/ML-zoo/tree/master/models/noise_suppression/RNNoise/tflite_int8) there is no TensorFlow Lite for Microcontrollers example code.
The same applies to the speech recognition model using wav2letter, github.com/.../tflite_pruned_int8
How did ARM benchmark the performance of these models on ARM Cortex-M, Cortex-A, etc.?
Is the inferencing code for microcontrollers using TensorFlow Lite Micro available for public access?
Thanks,
I think using TensorFlow on a Raspberry Pi is easier and more popular.
Thanks tepalia02. Yes, running the TF/Python code on a Raspberry Pi would be much more straightforward and is suitable for Cortex-A. But I intend to deploy the models (speech recognition or noise suppression) to ARM Cortex-M series (M4 or M7) microcontrollers, so that approach would not work. That is why I was checking whether the inferencing code using TFLite Micro is available for public access, similar to the KWS code that ARM has provided on GitHub.
It looks like, to date, TensorFlow Lite supports the ARM Cortex-M3. I could not find any update about the M4 or M7 on TensorFlow's website. Let's see if anyone from ARM can say something definitive.
tepalia02, TensorFlow Lite Micro can run on the ARM Cortex-M4 and M7, e.g. on the Arduino Nano 33 BLE Sense, and even the ARM ML Zoo KWS models with their TFLite-Micro code are demonstrated to run on the Cortex-M7 and M4. A partial list of supported boards is available at www.tensorflow.org/.../microcontrollers
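In case it helps while waiting for an official example, here is a rough sketch of what TFLite Micro inferencing code typically looks like on a Cortex-M4/M7. This is not ARM's code from the ML Zoo: the model array name (g_model_data), the arena size, and the operator list are placeholders you would adjust for your own converted .tflite model (e.g. the RNNoise or wav2letter int8 models).

#include <cstdint>
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Model converted to a C array (e.g. with xxd -i model.tflite); the symbol name is a placeholder.
extern const unsigned char g_model_data[];

// Scratch memory for tensors; the size depends on the model, so tune it for your target.
constexpr int kTensorArenaSize = 64 * 1024;
alignas(16) static uint8_t tensor_arena[kTensorArenaSize];

void RunInference() {
  const tflite::Model* model = tflite::GetModel(g_model_data);

  // Register only the operators the model actually uses (inspect the .tflite to find them).
  static tflite::MicroMutableOpResolver<4> resolver;
  resolver.AddFullyConnected();
  resolver.AddConv2D();
  resolver.AddReshape();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                              kTensorArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) {
    return;  // Arena too small or an unsupported operator.
  }

  TfLiteTensor* input = interpreter.input(0);
  // Fill input->data.int8 with quantized audio features here.

  if (interpreter.Invoke() == kTfLiteOk) {
    TfLiteTensor* output = interpreter.output(0);
    // Read output->data.int8 and dequantize using output->params if needed.
  }
}

Note that older TFLite Micro releases also require an error reporter argument in the MicroInterpreter constructor, and for Cortex-M you would normally build with the CMSIS-NN optimized kernels to get reasonable performance.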