On ARM7, a hardware timer is 32 bits wide and unsigned int is 32 bits too; on a 16-bit MCU, a hardware timer is 16 bits wide and unsigned int is 16 bits too.
So,
t0 = timer_val; while ( ( timer_val - t0 ) < delay_val );
provides a simple and correct delay.
What I'm not sure about is whether a cast is needed:
t0 = timer_val; while ( (unsigned int)( timer_val - t0 ) < delay_val );
Just noticed that, due to integral promotion, with
unsigned short t0, t1; the expression ( t1 - t0 ) is a signed int.
It seems that, with
unsigned int t0, t1; the expression ( t1 - t0 ) is still an unsigned int.
I noticed this because I use unsigned short to represent a 16-bit timer_val on my x86 PC for testing/simulation purposes, and the result was not what I expected.
It doesn't matter if the ticks parameter is small.
Comparing (a - b) < 0 works as long as the timer doesn't run so fast that the difference overflows and changes sign before you have time to check the result of the subtraction.
Obviously, if the timer ticks very fast and the function is called with a very short delay, more time than intended will pass before the function manages to check the result of the subtraction. If the timer ticks every 50 ns (20 MHz) and I request a delay of 1, I will not be able to call the function and have it perform the first iteration of the loop within 50 ns. So there is a minimum delay inherent in the function call itself, and there is a limit to how often the loop can perform the check.
You can try, but you might need to change quite a few lines of code to get it to compile, depending on how much older your current lib is than the one you want to upgrade to. I had similar issues when I tried to mix an example running old CMSIS with my own implementation using a newer version. I lost a couple of days before I fixed it.