Delay-time coding puzzles me. For example:

void delay()
{
    int x = 20000;
    do { x = x - 1; } while (x > 1);
}

How does this produce a delay, and how long does it take? Thanks in advance!
Examine the compiler's assembly output and count the instruction cycles per loop iteration.
Remembering, of course, to take your clock frequency and clock divider into account...