We are very pleased to announce the launch of a machine learning how-to guide – Deploying a quantized TensorFlow Lite MobileNet V1 model.
The guide provides an end-to-end solution for using the Arm NN SDK. It walks you through creating a program that takes a .tflite model file and real images and produces usable labels. The resulting program is complete and can be used "in the wild". The guide also fills a gap left by other guides, with sections on pre-processing images for the quantized model and interpreting the output labels.
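As a taste of the output-interpretation step, here is a minimal sketch in plain Python of turning a quantized output tensor into a human-readable label. The quantization parameters and the three-class label list below are made up for illustration; a real quantized model carries its own output scale and zero point in the .tflite file, and MobileNet V1 has over a thousand classes:

```python
# Hypothetical quantization parameters for the output tensor; in practice
# these are read from the model's tensor metadata.
OUTPUT_SCALE = 1.0 / 256
OUTPUT_ZERO_POINT = 0

def dequantize(q_values):
    """Map quantized uint8 scores back to floating-point scores."""
    return [OUTPUT_SCALE * (v - OUTPUT_ZERO_POINT) for v in q_values]

def top_label(q_output, labels):
    """Return the label with the highest dequantized score."""
    scores = dequantize(q_output)
    best = max(range(len(scores)), key=scores.__getitem__)
    return labels[best], scores[best]

# Toy example: three classes instead of MobileNet's 1001.
labels = ["cat", "dog", "bird"]
label, score = top_label([12, 200, 64], labels)
# label == "dog", score == 200/256 == 0.78125
```

The guide itself covers the full pipeline, including loading the label file that ships with the model.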
The focus on quantization also sets the guide apart. Quantized models typically run inference faster, at the cost of a marginal loss in accuracy. They also have a smaller memory footprint than their floating-point counterparts, making them ideal to ship with mobile applications.
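To make the trade-off concrete, here is a minimal sketch of affine (asymmetric) quantization, the scheme quantized TFLite models use, where a real value is recovered as scale * (q - zero_point). The weight values and the uint8 range below are illustrative, not taken from the guide:

```python
def choose_qparams(xmin, xmax, qmin=0, qmax=255):
    """Pick a scale and zero point so [xmin, xmax] maps onto [qmin, qmax]."""
    scale = (xmax - xmin) / (qmax - qmin)
    zero_point = round(qmin - xmin / scale)
    return scale, zero_point

def quantize(xs, scale, zero_point, qmin=0, qmax=255):
    """Map float values to clamped uint8 codes."""
    return [min(qmax, max(qmin, round(x / scale) + zero_point)) for x in xs]

def dequantize(qs, scale, zero_point):
    """Recover approximate float values from uint8 codes."""
    return [scale * (q - zero_point) for q in qs]

# Toy weights: each uint8 code takes 1 byte instead of 4 for float32,
# which is where the ~4x size reduction comes from.
weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
scale, zp = choose_qparams(min(weights), max(weights))
q = quantize(weights, scale, zp)
recovered = dequantize(q, scale, zp)
# recovered values are close to, but not exactly, the originals -
# that small rounding error is the "marginal accuracy loss".
```

The same idea applies to activations, which is why the guide's pre-processing section feeds the model raw 0-255 pixel values rather than normalized floats.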
The guide is a great place to start for mobile app developers who are eager to use and test Arm NN for image classification on mobile devices, and for those who want a minimal codebase to play with and embed in their apps as quickly as possible.
[CTAToken URL = "https://developer.arm.com/solutions/machine-learning-on-arm/developer-material/how-to-guides/deploying-a-quantized-tensorflow-lite-mobilenet-v1-model-using-the-arm-nn-sdk/single-page" target="_blank" text="View guide" class="green"]