Delay time coding puzzles me. For example:

    void delay()
    {
        int x = 20000;
        do { x = x - 1; } while (x > 1);
    }

How should I understand this delay, and how long does it take? Sincere thanks in advance!
Liya,

How long that delay loop takes to execute depends on a great number of things. For instance, could it be interrupted and have to wait for an ISR to execute? Also, what chip are you using? What is your clock speed?

Assuming that you have a "standard" 8051-type chip (like an 87C51FA, for example), you should know that every assembly instruction takes 12 clock cycles to execute. So, if you were running at 16 MHz, every instruction would take 750 ns.

So... after you write your delay function, you can look at the generated assembly and count the instructions, both within the loop (for the per-pass time) and in the setup of the loop variables (the fixed overhead). Then you can figure out how many passes need to execute to achieve a given delay time.

Hope that helps.
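As a back-of-envelope illustration of that arithmetic (the per-pass instruction count below is a made-up placeholder, not real compiler output; take the true number from your own listing):

    delay ~ passes * instructions_per_pass * time_per_instruction

    e.g., if the compiled loop body came to 7 instructions per pass at
    750 ns each (16 MHz), then roughly 20000 passes would give
    20000 * 7 * 750 ns = 105 ms, plus a few instructions of one-time setup.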
every assembly instruction takes 12 clock cycles to execute.

That is not correct: "every assembly instruction takes a multiple of 12 clock cycles to execute". As to what that multiple is for the various instructions of a "standard" '51, see chapter 2 of "the bible": http://www.semiconductors.philips.com/acrobat/various/80C51_FAM_PROG_GUIDE_1.pdf For "unique" derivatives such as the SiLabs parts, refer to the datasheet/user manual.

ALSO: delay routines should never be written in C; their timing may change when the compiler is revised.

Erik
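For instance, here is a minimal hand-written sketch along those lines: a classic nested DJNZ countdown, assuming a standard 12-clock '51 at 16 MHz and that R0/R1 in the active register bank are free to clobber (the register choice and count values here are arbitrary illustrations). Machine-cycle costs are from the instruction-set table in the guide above.

    DELAY:  MOV  R1,#200        ; 1 machine cycle  (outer count)
    OUTER:  MOV  R0,#250        ; 1 machine cycle  (inner count)
    INNER:  DJNZ R0,INNER       ; 2 machine cycles, runs 250 times = 500
            DJNZ R1,OUTER       ; 2 machine cycles
            RET                 ; 2 machine cycles

    ; Total: 1 + 200*(1 + 500 + 2) + 2 = 100,603 machine cycles.
    ; At 16 MHz one machine cycle is 12/16 MHz = 750 ns, so ~75.5 ms.
    ; No compiler revision can change this timing, though an interrupt
    ; firing mid-delay will still stretch the real elapsed time.

How you link it into a C project (calling convention, register usage) depends on your toolchain, so check your compiler manual before dropping it in.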
Erik,

Good catch... Sorry if I led anyone astray. As a matter of fact, the loop presented would almost certainly generate a DJNZ, which would leave someone pulling out their hair.
leave someone pulling out their hair.

Not me, I don't have any :)

Erik