
How to implement a fused ReLU activation for a fully connected layer?

Dear all,

I am trying to convert a TFLite model to an Arm NN model. The TensorFlow Lite implementation has a mechanism called fusedActivation, which fuses the ReLU activation function into a fully connected (FC) layer. However, I don't see any parameter in the Fully-connected Layer Functions that sets up such fusion in the Arm NN library. Does Arm NN support this mechanism natively?
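For context, here is a minimal sketch of what I currently expect to write in Arm NN, expressing the TFLite fused activation as a separate activation layer connected after the FC layer. The layer names and descriptor settings are just placeholders, weight/bias tensors are omitted for brevity, and the exact `AddFullyConnectedLayer` overload may differ between Arm NN versions:

```cpp
#include <armnn/ArmNN.hpp>

int main()
{
    using namespace armnn;

    INetworkPtr network = INetwork::Create();

    IConnectableLayer* input = network->AddInputLayer(0, "input");

    FullyConnectedDescriptor fcDesc;
    fcDesc.m_BiasEnabled = false;
    // Note: FullyConnectedDescriptor has no fused-activation field,
    // which is what prompted this question. Weight tensors are not
    // shown here; in recent Arm NN versions they are supplied as
    // constant-layer inputs to the FC layer.
    IConnectableLayer* fc = network->AddFullyConnectedLayer(fcDesc, "fc");

    // Express the TFLite fusedActivation as an explicit ReLU layer instead.
    ActivationDescriptor reluDesc;
    reluDesc.m_Function = ActivationFunction::ReLu;
    IConnectableLayer* relu = network->AddActivationLayer(reluDesc, "relu");

    IConnectableLayer* output = network->AddOutputLayer(0, "output");

    input->GetOutputSlot(0).Connect(fc->GetInputSlot(0));
    fc->GetOutputSlot(0).Connect(relu->GetInputSlot(0));
    relu->GetOutputSlot(0).Connect(output->GetInputSlot(0));

    return 0;
}
```

I'm hoping the backend optimizer can fuse this FC + ReLU pattern internally at optimization time, but I couldn't confirm that from the documentation.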

Many thanks!