
Timer Simulation

Hi,

I am running a simulation for an ARM7 LPC2368 microcontroller using the Keil uVision 4 IDE. I have written a delay function that uses a timer, and it works perfectly on hardware. But when I run it in the simulator, it takes much longer than expected.

Say I set the timer to run for 500 ms; in the simulator it then takes about 4 seconds of real time.
So every time the code reaches a delay function, it takes longer to execute than it should.

How do I solve this timing issue?

TIA.

  • Doesn't matter...

    He's doing a delta measurement: if the unsigned variable is the same width as the unsigned counter (uint16_t or uint32_t), the unsigned arithmetic will hide the rollover. The case to worry about is when a single delay exceeds the rollover period of the timer, in which case you'd be smart enough to decimate the delays so as not to hit that issue, i.e. if the counter rolls over every 10 seconds, you don't ask to wait 11 seconds; you ask to wait 1 second 11 times.
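
    A minimal sketch of that idea, assuming a free-running 32-bit counter read through a hypothetical timer_now() (on the LPC2368 that would be something like reading T0TC) and an assumed 1 MHz tick rate; the names and constants here are illustrations, not the original poster's code:

    #include <stdint.h>

    extern uint32_t timer_now(void);   /* hypothetical: read the free-running counter */

    #define TICKS_PER_MS  1000u        /* assumed 1 MHz timer clock */
    #define MAX_CHUNK_MS  5000u        /* keep each wait well below the rollover period */

    /* Unsigned subtraction hides the rollover as long as the delta
       variable has the same width as the counter itself. */
    static void delay_chunk(uint32_t ms)
    {
        uint32_t start = timer_now();
        while ((uint32_t)(timer_now() - start) < ms * TICKS_PER_MS)
        {
            /* busy-wait */
        }
    }

    /* Decimate long delays so no single wait exceeds the rollover period. */
    void delay_ms(uint32_t ms)
    {
        while (ms > MAX_CHUNK_MS)
        {
            delay_chunk(MAX_CHUNK_MS);
            ms -= MAX_CHUNK_MS;
        }
        delay_chunk(ms);
    }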

  • For those more inquisitive than assuming in nature, check out the ANSI standards.

    
    A computation involving unsigned operands can never overflow,
    because a result that cannot be represented by the resulting unsigned integer
    type is reduced modulo the number that is one greater than the largest value
    that can be represented by the resulting type.
    
    

    This is taken from C99.

    It is safe to do what was illustrated above. But it's one of those things that can be easily forgotten for those who don't come across such methods very often, so a few explanatory comments would be advisable in real-life source code.
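
    For example, here is a small sketch of the wraparound case; the counter values are made up purely to illustrate it, and the comments are the kind worth keeping in real code:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* Illustrative values only: a 16-bit counter sampled just before
           and just after it wraps from 0xFFFF back to 0x0000. */
        uint16_t start = 0xFFF0u;   /* sample taken before the rollover */
        uint16_t now   = 0x0010u;   /* sample taken after the rollover  */

        /* Unsigned arithmetic is reduced modulo 2^16, so the subtraction
           still yields the true elapsed count even though 'now' is
           numerically smaller than 'start'. */
        uint16_t elapsed = (uint16_t)(now - start);

        printf("elapsed = %u ticks\n", elapsed);   /* prints 32 */
        return 0;
    }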