Hello everyone, I am Emmanuel Katto. I'm working on a mobile app that uses AI for image recognition, and I'm struggling to optimize the model deployment for low power consumption. The app's power draw is quite high, especially while the AI model is running, and it's hurting the user experience.
Can anyone share best practices for optimizing mobile AI model deployment for low power consumption?
I'm currently using TensorFlow Lite and OpenVINO on Android, but I'm open to exploring other options.
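For context, here is a simplified sketch of roughly how I'm running inference today with the TensorFlow Lite Interpreter API on Android. The model file name, input shape, and class count below are placeholders rather than my actual app code:

```kotlin
import android.content.res.AssetManager
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Memory-map the bundled .tflite model from the app's assets folder.
fun loadModel(assets: AssetManager, path: String): MappedByteBuffer =
    assets.openFd(path).use { fd ->
        FileInputStream(fd.fileDescriptor).use { stream ->
            stream.channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
        }
    }

// Plain CPU inference. "image_model.tflite", the 224x224x3 input, and the
// 1000-class output are placeholders, not my real model or preprocessing.
fun classify(assets: AssetManager): FloatArray {
    val interpreter = Interpreter(loadModel(assets, "image_model.tflite"))
    val input = Array(1) { Array(224) { Array(224) { FloatArray(3) } } }
    val output = Array(1) { FloatArray(1000) }
    interpreter.run(input, output)
    interpreter.close()
    return output[0]
}
```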
Are there any Arm-specific features or tools that can help with AI model optimization and deployment?
Please let me know.
Thanks!
Emmanuel Katto
Hi Emmanuel,
You may find this recent announcement and the associated learning paths very useful and informative:
https://newsroom.arm.com/blog/kleidiai-integration-mediapipe
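In the meantime, two general TensorFlow Lite settings are worth checking regardless of KleidiAI: the XNNPACK CPU path (which, as I understand it, is where the KleidiAI kernels from that announcement land) and the NNAPI delegate on devices with an NPU or DSP. Below is only a rough sketch using the standard Interpreter.Options API; the thread count and the choice between the two options are things to tune for your model and target devices, not recommendations:

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.nio.MappedByteBuffer

// Option 1: stay on the CPU, but with XNNPACK-backed kernels (the CPU path
// the KleidiAI integration builds on) and a modest thread count, since more
// threads rarely helps battery life.
fun cpuInterpreter(model: MappedByteBuffer): Interpreter {
    val options = Interpreter.Options()
        .setUseXNNPACK(true)
        .setNumThreads(2)
    return Interpreter(model, options)
}

// Option 2: let NNAPI offload supported ops to an NPU/DSP/GPU where the
// device has one, which often lowers CPU power draw noticeably.
fun nnapiInterpreter(model: MappedByteBuffer): Pair<Interpreter, NnApiDelegate> {
    val delegate = NnApiDelegate()
    val options = Interpreter.Options().addDelegate(delegate)
    return Interpreter(model, options) to delegate // close the delegate after the interpreter
}
```

Beyond runtime settings, post-training int8 quantization at model-conversion time usually gives the biggest power and latency win, since integer kernels are much cheaper than float32 on mobile CPUs.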
Thanks,
Ronan Synnott