How do I prepare a tflite model for the Vela compiler?

First of all, I'm not sure if this is the right place to post this; if not, please forgive me.

I want to try the Ethos-U NPU. (developer.arm.com/.../The-Vela-compiler)

Therefore I need to use the Vela compiler. (pypi.org/.../)

I used the example from the following page to produce a tflite model: (www.tensorflow.org/.../post_training_integer_quant)

Then, when I compile the tflite model with the Vela compiler, I see these warnings:

 Warning: PACK 'sequential/reshape/Reshape/shape' is not supported on the NPU. Placing on CPU instead
  - Input(s) and Output tensors must not be dynamic
   Op has dynamic tensor(s): sequential/reshape/strided_slice2
 Warning: STRIDED_SLICE 'sequential/reshape/strided_slice2' is not supported on the NPU. Placing on CPU instead
  - Input(s) and Output tensors must not be dynamic
   Op has dynamic tensor(s): sequential/reshape/strided_slice2


I'm not sure what ''dynamic'' means for this compiler.

This code comes from part of the example mentioned above:

model = tf.keras.Sequential([
        tf.keras.layers.InputLayer(input_shape=(28, 28)),
        tf.keras.layers.Reshape(target_shape=(28, 28, 1)),
        tf.keras.layers.Conv2D(filters=12, kernel_size=(3, 3), activation='relu'),
        tf.keras.layers.MaxPool2D(pool_size=(2, 2)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10)
])

I'm not sure whether this is expected for layers like Reshape, or whether it's a problem with my code.
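For reference, here is the conversion step I'm using, following the tutorial. This is only a sketch: the random representative dataset stands in for the tutorial's MNIST images, and pinning `batch_size=1` on the InputLayer is my own experiment (an assumption) to see whether it makes the reshape's shape fully static so Vela can place it on the NPU:

```python
import numpy as np
import tensorflow as tf

# Same layers as above; batch_size=1 is my experiment to avoid a
# dynamic (None) batch dimension feeding the Reshape shape calculation.
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(28, 28), batch_size=1),
    tf.keras.layers.Reshape(target_shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(filters=12, kernel_size=(3, 3), activation='relu'),
    tf.keras.layers.MaxPool2D(pool_size=(2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

def representative_dataset():
    # Random data just for illustration; the tutorial uses real MNIST images.
    for _ in range(100):
        yield [np.random.rand(1, 28, 28).astype(np.float32)]

# Full-integer quantization, as in the post_training_integer_quant tutorial.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()

with open('model_int8.tflite', 'wb') as f:
    f.write(tflite_model)
```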

  • Hi everyone,

    About a year ago, I started working on mobile machine learning and although I don't claim to be an expert by any means, I wanted to share with you all a technical guide to train a custom model with TensorFlow Lite and then build an Android app that uses that same model. The post includes 2 parts:

    This post is about using the TensorFlow framework to evaluate and train a model for inference on mobile - that includes optimizing and converting it for the TFLite mobile framework.

    In the 2nd post, I go over how to add the model that we've trained to your app - this includes pre and post processing and using the TFLite framework.

    Let me know what you think and I hope this helps you start getting into mobile ML

    www.targetpayandbenefits.com

