Delay-timing code puzzles me. For example:

void delay()
{
    int x = 20000;
    do {
        x = x - 1;
    } while (x > 1);
}

How does this produce a delay, and how long does it last? Thanks in advance!
Delay routines should never be written in C. Or, to elaborate: delay routines that rely on a processor busy-loop and require precision based on counting instruction cycles are at some degree of risk, because the compiler, optimization settings, and CPU clock all change how many cycles the loop actually takes. Feel free to write delay routines that read a hardware timer in C.
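As a minimal sketch of the timer-based approach, here is a millisecond delay for an 8051-class part built with Keil C51. This is an assumption about your target, not something stated in the question: the SFR names (TMOD, TH0, TL0, TR0, TF0) come from the standard reg51.h header, and the reload value assumes a 12 MHz crystal with the classic 12-clock core, i.e. one machine cycle per microsecond. Adjust the header, register names, and reload value for your actual chip and clock.

#include <reg51.h>   /* Keil C51 SFR declarations; swap for your toolchain's header */

void delay_ms(unsigned int ms)
{
    while (ms--) {
        TMOD = (TMOD & 0xF0) | 0x01;   /* Timer 0, 16-bit mode 1                 */
        TH0  = 0xFC;                   /* reload 65536 - 1000 = 0xFC18, so the   */
        TL0  = 0x18;                   /* timer overflows after ~1000 cycles      */
        TF0  = 0;                      /* clear the overflow flag                 */
        TR0  = 1;                      /* start Timer 0                           */
        while (!TF0)                   /* wait for overflow: ~1 ms at 12 MHz      */
            ;
        TR0  = 0;                      /* stop the timer before the next pass     */
    }
}

The loop still "waits", but the duration is set by the hardware timer rather than by how the compiler happens to translate a decrement loop, so it stays correct across optimization levels.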