I am trying to write a piece of code for a Cortex-M4 with an exact, known runtime, independent of the input.
My current bottleneck is that the duration of a division (udiv) depends on the operands and is therefore variable. Is there a way to ensure that my division takes the same number of cycles for every input?
Note: I am trying to do this with as little overhead as possible, due to rather extreme execution-time constraints.
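For reference, the only portable workaround I know of is a fixed-iteration, branch-free restoring division, sketched below (the name ct_udiv32 and the code are just illustrative, not tested on target; you would still need to inspect the generated assembly to confirm the compiler has not reintroduced data-dependent branches). At 32 iterations it costs far more than the 2-12 cycles udiv takes on the M4, which is exactly why I am hoping there is something cheaper:

```c
#include <stdint.h>

/* Branch-free restoring division: always runs 32 iterations with the
 * same instruction sequence, so the cycle count does not depend on the
 * operand values. den == 0 yields 0xFFFFFFFF rather than trapping,
 * unlike a hardware divide-by-zero with DIV_0_TRP set. */
static uint32_t ct_udiv32(uint32_t num, uint32_t den)
{
    uint32_t quot = 0;
    uint64_t rem  = 0;

    for (int i = 31; i >= 0; i--) {
        /* bring down the next numerator bit */
        rem = (rem << 1) | ((num >> i) & 1u);

        uint64_t diff = rem - den;
        uint64_t ge   = 1u ^ (diff >> 63);     /* 1 iff rem >= den */
        uint64_t mask = (uint64_t)0 - ge;      /* all-ones iff rem >= den */

        /* conditional subtract via masking, no branch */
        rem   = (diff & mask) | (rem & ~mask);
        quot |= (uint32_t)ge << i;
    }
    return quot;
}
```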