This delay-timing code puzzles me. For example:

void delay()
{
    int x = 20000;
    do {
        x = x - 1;
    } while (x > 1);
}

How should I understand the delay it produces, and how long does it actually take? Thanks in advance!
Alright... given my screw-up in the previous post, I thought I'd take a crack at actually creating a program with this particular delay routine in it. I made a main() function with an infinite loop that just called Delay() and then incremented another dummy variable. In my project, this generated an LCALL to Delay, which takes 24 clocks. The actual assembly produced for the function in my case was as follows:
        MOV  R7,#020H   - 12 clocks
        MOV  R6,#04EH   - 12 clocks
?C0003: MOV  A,R7       - 12 clocks
        DEC  R7         - 12 clocks
        JNZ  ?C0008     - 24 clocks
        DEC  R6         - 12 clocks
?C0008: SETB C          - 12 clocks
        MOV  A,R7       - 12 clocks
        SUBB A,#01H     - 12 clocks
        MOV  A,R6       - 12 clocks
        XRL  A,#080H    - 12 clocks
        SUBB A,#080H    - 12 clocks
        JNC  ?C0003     - 24 clocks
        RET             - 24 clocks
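To put a rough number on it: counting the clocks above, one normal pass through the loop costs 144 clocks (eight 12-clock instructions plus JNZ and JNC at 24 each), with an extra 12 clocks for DEC R6 whenever the low byte rolls over. The little host-side C sketch below just does that arithmetic; the 12 MHz crystal (12 clocks = 1 us) is my assumption, not something stated in this thread.

/* Back-of-the-envelope estimate of the delay, worked out from the
 * listing above.  Host-side C, purely to do the arithmetic.
 * Assumption (not stated in the thread): a 12 MHz crystal on a
 * classic 12-clock-per-machine-cycle 8051, so 12 clocks = 1 us.
 */
#include <stdio.h>

int main(void)
{
    /* One pass with no borrow: 8 one-cycle instructions (12 clocks each)
       plus JNZ and JNC (24 clocks each) = 144 clocks.                     */
    const long per_pass     = 8 * 12 + 2 * 24;
    /* DEC R6 (12 clocks) runs only when R7 was 0x00 before DEC R7;
       that happens 0x4E = 78 times as x counts down from 20000.           */
    const long borrow_extra = 78 * 12;
    /* LCALL (24) + the two MOVs that load R6:R7 (12 + 12) + RET (24).     */
    const long overhead     = 24 + 12 + 12 + 24;
    /* do { x = x - 1; } while (x > 1); runs 19999 times for x = 20000.    */
    const long passes       = 20000 - 1;

    long   clocks = passes * per_pass + borrow_extra + overhead;
    double ms     = clocks / 12.0 / 1000.0;   /* 12 clocks per us at 12 MHz */

    printf("%ld clocks  ->  about %.0f ms\n", clocks, ms);
    return 0;
}

That works out to roughly 2.88 million clocks, i.e. about 240 ms at 12 MHz (or about 260 ms at 11.0592 MHz). As always with compiled delay loops, the only reliable answer is to read the generated listing for your own project, or measure it on hardware, since a different compiler, optimization level, or memory model will produce different code.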