This delay-time code puzzles me.
For example:

void delay()
{
    int x = 20000;
    do { x = x - 1; } while (x > 1);
}

How does this produce a delay, and how long does it take?
Sincere thanks in advance!

  • Alright... given my screw-up in the previous post, I thought I'd take a crack at actually creating a program with this particular delay routine in it. I made a main() function with an infinite loop that just called Delay() and then incremented another dummy variable (a rough C sketch of that harness follows the listing below). In my project, this generated an LCALL to Delay, which takes 24 Clocks. The actual assembly produced for the function in my case was as follows:

    MOV     R7,#020H	-	12 Clocks
    MOV     R6,#04EH	-	12 Clocks
    
    ?C0003:
    
    MOV     A,R7		-	12 Clocks
    DEC     R7		-	12 Clocks
    JNZ     ?C0008		-	24 Clocks
    DEC     R6		-	12 Clocks
    
    ?C0008:
    
    SETB    C		-	12 Clocks
    MOV     A,R7		-	12 Clocks
    SUBB    A,#01H		-	12 Clocks
    MOV     A,R6		-	12 Clocks
    XRL     A,#080H		-	12 Clocks
    SUBB    A,#080H		-	12 Clocks
    JNC     ?C0003		-	24 Clocks
    RET     		-	24 Clocks
    

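    To make the setup concrete, here is roughly the test program I described above, written as a quick sketch. The dummy variable, its type, and the volatile qualifier are my own choices (added so the write isn't optimized away), and your compiler may well generate different code for it:

    /* delay() exactly as posted in the original question */
    void delay(void)
    {
        int x = 20000;
        do { x = x - 1; } while (x > 1);
    }

    /* Minimal harness: call delay() forever and bump a dummy variable. */
    void main(void)
    {
        volatile unsigned int dummy = 0;

        while (1)
        {
            delay();
            dummy++;
        }
    }
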
    So... what I would call the "fixed overhead" of this function is the LCALL to get there (24 Clocks), the RET to return (24 Clocks), and the two MOVs that preload x into R6/R7 (12 Clocks each), for a total of 72 Clocks.

    Next, we know that the loop body is going to execute n-1, or 19999, times. Also, the JNZ skips the DEC R6 except on the pass where R7 has counted down to zero and wraps, so the DEC R6 executes only every 256th time through the loop. Since 19999 / 256 ≈ 78.12, 78 passes through the loop will take 156 Clocks each, while the remaining 19921 passes will take 144 Clocks each.
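
    If you'd rather not hand-count those passes, a quick host-side C sketch (ordinary desktop C, not 8051 code) that mimics the R7/R6 countdown gives the same 78/19921 split; the fixed 19999 iteration count here simply stands in for the compiled x > 1 test:

    #include <stdio.h>

    int main(void)
    {
        unsigned char r7 = 0x20, r6 = 0x4E;  /* x = 20000 split into low/high bytes */
        long slow = 0, fast = 0;             /* passes with / without the DEC R6 */
        long i;

        for (i = 0; i < 19999; i++)          /* loop body runs n-1 = 19999 times */
        {
            if (r7 == 0) { r6--; slow++; }   /* JNZ falls through: 156-Clock pass */
            else         { fast++; }         /* JNZ taken:         144-Clock pass */
            r7--;                            /* DEC R7 (wraps 0 -> 0xFF) */
        }
        printf("slow = %ld, fast = %ld\n", slow, fast);  /* prints 78 and 19921 */
        return 0;
    }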

    Total execution time would then be:

    72 + (78 x 156) + (19921 x 144) = 2,880,864 Clocks

    Assuming a standard 8051 with a 16MHz clock, this Delay function call would take 180.054ms to execute.
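
    As a last sanity check on the arithmetic, the same kind of host-side sketch can turn the Clock total into a time; the 16 MHz figure is my assumption from above, so plug in your own crystal frequency:

    #include <stdio.h>

    int main(void)
    {
        unsigned long clocks = 72UL + 78UL * 156UL + 19921UL * 144UL;  /* 2,880,864 */
        double ms = (double)clocks / 16000000.0 * 1000.0;              /* divide by 16 MHz, scale to ms */

        printf("%lu Clocks = %.3f ms\n", clocks, ms);                  /* prints 180.054 ms */
        return 0;
    }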

    Using the previously mentioned method of checking the "sec" field in Keil's debug gives me an execution time of 180.05405ms. Figuring out where the .05us difference came from is left as an exercise for the student since I've spent far too much time on this.

    Anyhow Liya, this is all likely useless for you since so many variables in your design could affect the timing. Nonetheless, this is the rigorous method of doing things and it confirms the MUCH simpler method of using Keil's built-in debug, so I'd go that route instead.
