I would like to be able to write code that implements short delays (a few microseconds), but where the generated code automatically adapts to the speed of the processor, i.e. as the processor clock speed is changed, the delay remains constant. I don't suppose there is a predefined macro constant available (that Keil hasn't told us about) that makes the cycle time available to the source code? I would guess this is quite a common problem. uVision seems to know all about time, so would it be difficult for a predefined constant to be provided?
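For illustration, here is a minimal sketch of the kind of thing I have in mind. Since no such predefined constant seems to exist, it assumes the clock frequency is supplied manually as a project-level define; CPU_CLOCK_HZ, CYCLES_PER_LOOP and DELAY_US are names I have made up, not anything Keil provides, and the cycles-per-iteration figure would have to be checked against the generated code or measured on hardware.

    /* Clock frequency in Hz - ideally this would come from the tools,
       but for now it has to be set by hand (e.g. -DCPU_CLOCK_HZ=24000000UL). */
    #ifndef CPU_CLOCK_HZ
    #define CPU_CLOCK_HZ 24000000UL
    #endif

    /* Approximate CPU cycles consumed per loop iteration; this depends on
       the compiler, optimisation level and core, so it must be verified. */
    #define CYCLES_PER_LOOP 4UL

    /* Busy-wait for roughly 'us' microseconds, derived at compile time
       from the clock frequency above. */
    #define DELAY_US(us)                                                     \
        do {                                                                 \
            volatile unsigned long _n =                                      \
                ((CPU_CLOCK_HZ / 1000000UL) * (us)) / CYCLES_PER_LOOP;       \
            while (_n--) { /* spin */ }                                      \
        } while (0)

If the tools could provide the clock (or cycle time) as a predefined constant, the only hand-maintained number left would be the cycles-per-loop figure, which is exactly why a built-in definition would be so useful.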