Hi,
I am running a simulation for an ARM7 LPC2368 microcontroller using the Keil uVision 4 (UV4) IDE. I have written a delay function using a timer, and it works perfectly on hardware. However, when I simulate it, the delay takes more time than expected.
For example, if I set the timer to run for 500 ms, it takes about 4 s of real (wall-clock) time in the simulator. So every time the program hits a delay function, it takes far longer to execute than it should.
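I can't post the exact routine, but a stripped-down version of the idea (not my actual code, and assuming Timer0 with a 12 MHz PCLK; adjust the prescaler for your clock setup) looks like this:

```c
/* Minimal sketch of a Timer0-based busy-wait delay for the LPC2368.
 * Assumes PCLK for Timer0 is 12 MHz, so a prescale of 12000 gives a
 * 1 ms tick. Register names are from Keil's LPC23xx.h header. */
#include <LPC23xx.h>

void delay_ms(unsigned int ms)
{
    T0TCR = 0x02;        /* hold Timer0 counter and prescaler in reset */
    T0PR  = 11999;       /* 12 MHz / (11999 + 1) = 1 kHz tick          */
    T0TCR = 0x01;        /* release reset and enable Timer0            */

    while (T0TC < ms)    /* busy-wait until 'ms' ticks have elapsed    */
        ;

    T0TCR = 0x00;        /* stop Timer0                                */
}
```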
How do I solve this timing issue?
TIA.
For those more inquisitive than assuming in nature, check out the ANSI standards.
A computation involving unsigned operands can never overflow, because a result that cannot be represented by the resulting unsigned integer type is reduced modulo the number that is one greater than the largest value that can be represented by the resulting type.
This is taken from C99.
It is safe to do what was illustrated above. But it's one of those things that can be easily forgotten for those who don't come across such methods very often, so a few explanatory comments would be advisable in real-life source code.
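The snippet being referred to isn't reproduced here, but a short sketch of the general pattern, with the kind of explanatory comments suggested above (the variable names are invented for the example):

```c
/* Illustrates the C99 rule quoted above: arithmetic on unsigned types
 * cannot overflow; results are reduced modulo (max value + 1). */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int u = UINT_MAX;
    u = u + 1;                          /* wraps to 0, well defined      */

    unsigned int start = UINT_MAX - 5;  /* e.g. a tick counter near wrap */
    unsigned int now   = 10;            /* counter has since wrapped     */
    unsigned int elapsed = now - start; /* still 16, thanks to modulo
                                           arithmetic, even across wrap  */

    printf("u = %u, elapsed = %u\n", u, elapsed);
    return 0;
}
```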
Ralph,
thanks for the heads-up.