This discussion has been locked.
You can no longer post new replies to this discussion. If you have a question, you can start a new discussion.


The coding of delay loops puzzles me. For example:

void delay()
{
    int x = 20000;
    do { x = x - 1; } while (x > 1);
}

How does this produce a delay, and how long does it last?
Thanks in advance!

Parents
  • Liya,

    How long that delay loop takes to execute depends on a great number of things. For instance, could it be interrupted and have to wait for an ISR to execute? Also, what chip are you using, and what is your clock speed?

    Assuming you have a "standard" 8051-type chip (like an 87C51FA, for example), one machine cycle takes 12 oscillator clocks, and most instructions take one or two machine cycles. So, if you were running at 16MHz, each machine cycle would take 750ns. After you write your delay function, look at the assembly the compiler generates and count the machine cycles both within the loop (the time for each iteration) and in the code that sets up the loop variables (the fixed overhead). Then you can figure out how many iterations are needed to achieve a given delay time.
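    The arithmetic can be sketched in plain C. The figures below are illustrative assumptions only (a 16MHz crystal, a 12-clock 8051, and a loop body costing 2 machine cycles, e.g. a single DJNZ); take the real numbers from your chip's datasheet and your compiler's assembly listing:

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* Assumed figures -- replace with values for your chip/compiler. */
        double osc_hz = 16e6;                      /* 16 MHz crystal                 */
        double machine_cycle_s = 12.0 / osc_hz;    /* 12 clocks -> 750 ns per cycle  */
        int cycles_per_iteration = 2;              /* e.g. one DJNZ = 2 machine cycles */
        double target_delay_s = 10e-3;             /* we want a 10 ms delay          */

        /* Time for one trip around the loop, then the iteration count. */
        double iteration_s = cycles_per_iteration * machine_cycle_s;
        long loops = (long)(target_delay_s / iteration_s + 0.5);  /* round to nearest */

        printf("machine cycle: %g ns\n", machine_cycle_s * 1e9);
        printf("loops needed : %ld\n", loops);
        return 0;
    }
    ```

    With those assumed numbers each iteration takes 1.5us, so a 10ms delay needs about 6667 iterations. Remember to add the fixed setup overhead, and note that an optimizing compiler may delete an empty loop entirely, which is another reason to inspect the generated assembly.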

    Hope that helps.

