How can I achieve fixed timing using UDIV?

I am trying to write a piece of code for a Cortex-M4 with an exactly known runtime, independent of the input.

Currently my bottleneck is that the duration of a division (UDIV) depends on the input and is therefore variable in execution time. Is there a way to ensure that my division takes the same number of cycles for every input?

Note: I am trying to write this with as little overhead as possible due to rather extreme execution-time constraints.

  • From the doc:

    Division operations use early termination to minimize the number of cycles required based
    on the number of leading ones and zeroes in the input operands.

    Maybe you can try to adjust those leading zeroes before the division and then right-shift the result.

    Or you can measure the cycles and pad with NOPs to reach the WCET.
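    The normalization idea can be sketched as follows. This is only a sketch, not a verified constant-time routine: `udiv_fixed` is a hypothetical helper name, `__builtin_clz` assumes a GCC/Clang toolchain, and whether the cycle count is truly fixed also depends on the divisor, so verify on your part with the DWT cycle counter.

    ```c
    #include <stdint.h>

    /* Shift the dividend left so its top bit is set: UDIV then always sees
     * the maximum number of significant dividend bits, defeating early
     * termination on that operand. Undo the shift afterwards; the identity
     * ((n << k) / d) >> k == n / d holds for unsigned integer division. */
    static inline uint32_t udiv_fixed(uint32_t n, uint32_t d)
    {
        /* OR with 1 so the argument is never zero: clz(0) is undefined in C. */
        uint32_t k = __builtin_clz(n | 1u);
        return (uint32_t)((n << k) / d) >> k;
    }
    ```

    Remaining caveat: the quoted passage says early termination looks at both operands, so if the divisor's magnitude varies you may still see some cycle variation; measuring with DWT->CYCCNT and padding to the worst case is the robust fallback.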

