
Timer Simulation

Hi,

I am simulating an ARM7 LPC2368 microcontroller using the Keil UV4 IDE. I have written a delay function using a timer, and it works perfectly on hardware. When I try to run it in the simulator, it takes more time than expected.

Say I set the timer to run for 500 ms; it then runs for about 4 seconds in real time.
So every time a delay function is encountered, it takes longer to execute than it should.

How do I solve this timing issue?

TIA.

  • For those more inquisitive than assuming in nature, check out the ANSI standards.

    
    A computation involving unsigned operands can never overflow,
    because a result that cannot be represented by the resulting unsigned integer
    type is reduced modulo the number that is one greater than the largest value
    that can be represented by the resulting type.
    
    

    This is taken from C99.

    It is safe to do what was illustrated above. But it's one of those things that can be easily forgotten for those who don't come across such methods very often, so a few explanatory comments would be advisable in real-life source code.
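
    As a minimal sketch of the kind of code this property makes safe, assuming a free-running 32-bit tick counter (the get_ticks() helper here is hypothetical, standing in for a read of a timer count register, e.g. T0TC on the LPC2368), the elapsed-time comparison below stays correct even when the counter wraps around:

    
    #include <stdint.h>
    
    /* Hypothetical helper: returns the current value of a free-running
       32-bit tick counter (e.g. a hardware timer count register). */
    extern uint32_t get_ticks(void);
    
    /* Busy-wait for 'ticks' timer ticks. The subtraction relies on the
       C99 rule quoted above: unsigned arithmetic is reduced modulo 2^32,
       so (now - start) yields the correct elapsed count even if the
       counter wraps past 0xFFFFFFFF during the wait. */
    void delay_ticks(uint32_t ticks)
    {
        uint32_t start = get_ticks();
    
        while ((get_ticks() - start) < ticks)
        {
            /* spin */
        }
    }
    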

