How do I calculate an exact delay without using a timer?
For example, if I want a 10 ms delay, I need to achieve it using a for loop, not with the help of the built-in timer.
Is there a formula or calculation for working out the delay?
I don't know how to get it. I am using a P89C51RD2HBP microcontroller.
Currently I am using a rough delay chosen for the worst case.
Can anybody explain how to produce an accurate delay using the loop method?
There have been a number of threads in the forum that go into detail on the subject. I'd suggest searching the Knowledgebase for old threads.
In short, you can count instruction execution cycles and execute code that takes a known amount of time in a loop. (To be precise, you also need to count the time it takes for the loop overhead.)
This sort of code is best done in assembly, so that you keep control over the exact sequence of instructions and therefore over the timing. If you write a timing loop in C, you cannot be sure that the compiler will always generate exactly the same code from release to release, or from one optimization level to another.
This sort of code is best done in assembly
I disagree
This sort of code can only be done in assembly.
I have, on two occasions, had to 'rescue' something where a C delay loop changed its timing "unexplainably".
Erik
"This sort of code can only be done in assembly."
Or even:
This sort of code can only be reliably done in assembly.
Thanks, "Mr Language". I agree.
However, my problem with including 'reliably' is that it seems to be one of those cases where someone will "prove" by "testing" that it is 'reliable' and then happily sit back until the crash happens.
No, gentlemen, I am not saying that you are stupid, just that everyone reads what he wants to read, which is human.