I would like to be able to write code that implements short delays (a few microseconds), but where the generated code automatically adapts to the speed of the processor; that is, as the processor clock speed changes, the delay remains constant. I don't suppose there is a predefined macro constant available (that Keil hasn't told us about) that makes the cycle time available to the source code? I guess this is quite a common problem. uVision seems to know all about timing, so would it be difficult for a predefined constant to be provided?
This may help you out ... in "Delays.h" have the following:
// NOTE: "Crystal" as used below is best defined in some other header unique to your project.

#include <intrins.h>                                   /* for _nop_() */

unsigned char Delay8Plus2nCycles (unsigned char LoopCount);

#define Crystal             18432000L                  /* Crystal speed           */
#define ClockRes            (Crystal / 12)             /* Clock ticks per second  */
#define NanoSecondsPerCycle (1000000000L / ClockRes)
#define uSecToInstCycles(x) ((1000L * (x)) / NanoSecondsPerCycle)

#define uSecDelay(InstCycles) \
    (((InstCycles) & 1)  ? _nop_() : 0), \
    (((InstCycles) & 2)  ? _nop_(), _nop_() : 0), \
    (((InstCycles) & ~7) ? Delay8Plus2nCycles((((InstCycles) / 4) - 2) * 2) \
                         : (((InstCycles) & 4) ? _nop_(), _nop_(), _nop_(), _nop_() : 0))
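The way the macro breaks the count down, as I read it: the two low bits of the cycle count are burned off with inline _nop_() calls (one and two at a time), anything of eight cycles or more is handed to Delay8Plus2nCycles() with the loop count scaled so the multiple-of-four portion comes out exact, and for counts below eight a remaining four cycles are covered by four _nop_() calls. Assuming the 8-cycles-plus-2-per-iteration cost the function name implies, for a count n >= 8 the loop contributes 8 + 2 * ((n/4 - 2) * 2) = 4 * (n/4) cycles, so adding the bit-0 and bit-1 nops gives exactly n cycles.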
unsigned char Delay8Plus2nCycles (unsigned char LoopCount)
{
    while (--LoopCount);
    return (LoopCount);
}
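The name encodes the timing assumption: roughly 8 machine cycles of call/return and parameter-passing overhead plus 2 cycles per pass of the DJNZ loop. That figure depends on your compiler version, memory model and optimization level, so it is worth checking the generated listing before trusting the delay to be cycle-exact.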
Then in your code:

#include "Delays.h"

uSecDelay (uSecToInstCycles(15));    /* ~15 us delay */
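To sanity-check the arithmetic for that call, assuming the 18.432 MHz crystal above and the standard 12-clock machine cycle: ClockRes = 1,536,000, NanoSecondsPerCycle = 651 (integer division), and uSecToInstCycles(15) = 15000 / 651 = 23 machine cycles, i.e. just under 15 us. Hand-expanded into statements for readability (the real macro is a single comma expression), uSecDelay(23) works out to:

_nop_();                     /* bit 0 of 23 set: 1 cycle                 */
_nop_(); _nop_();            /* bit 1 of 23 set: 2 cycles                */
Delay8Plus2nCycles(6);       /* 23 & ~7 != 0: ((23/4) - 2) * 2 = 6,
                                so 8 + 2*6 = 20 cycles if the loop really
                                costs 8 + 2n as the name implies          */
                             /* total: 1 + 2 + 20 = 23 cycles, ~15 us     */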