
Timer Simulation

Hi,

I am running a simulation for an ARM7 LPC2368 microcontroller in the Keil µVision 4 IDE. I wrote a delay function using a timer, and it works perfectly on hardware. When I run it in the simulator, however, it takes more time than expected.

For example, if I set the timer to run for 500 ms, it runs for 4 seconds in real time. So every time the code hits a delay function, it takes far longer to execute than it should.
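
For reference, here is a minimal sketch of the kind of polled, timer-based delay described above, written for Timer0 of the LPC2368. The register names come from the standard LPC23xx device header, but the peripheral clock value and the overall structure are assumptions, since the actual code was not posted:

    #include <stdint.h>
    #include <LPC23xx.h>                /* Keil/NXP device header */

    #define PCLK_HZ 12000000UL          /* assumed peripheral clock; board-specific */

    /* Busy-wait for 'ms' milliseconds using Timer0. */
    static void delay_ms(uint32_t ms)
    {
        T0TCR = 0x02;                   /* hold the counter in reset */
        T0PR  = (PCLK_HZ / 1000UL) - 1; /* prescale so TC ticks once per ms */
        T0TCR = 0x01;                   /* release reset, start counting */
        while (T0TC < ms)               /* poll until 'ms' ticks have elapsed */
            ;
        T0TCR = 0x00;                   /* stop the timer */
    }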

How do I solve this timing issue?

TIA.

  • Most people are comfortable with adding unsigned numbers. When the addition overflows, we simply react to the 'carry' bit and add one to the next higher word.

    But I think too many people forget that subtraction of unsigned numbers works similarly - it's just that we talk about a 'borrow' bit when there is an underflow.

    In the end, it would be hard for an 8-bit processor to work with 16-bit or 32-bit numbers if these two properties didn't hold true. A lot of assembler-day knowledge gets forgotten when people move to higher-level languages.
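
    To make the carry/borrow symmetry concrete, here is a small C sketch (the helper names are mine, purely for illustration) that builds a 32-bit add and subtract out of 16-bit halves, propagating the carry and borrow by hand the way an 8-bit CPU chains word-sized operations:

        #include <stdint.h>
        #include <stdio.h>

        static uint32_t add32(uint16_t a_hi, uint16_t a_lo,
                              uint16_t b_hi, uint16_t b_lo)
        {
            uint16_t lo = (uint16_t)(a_lo + b_lo);
            /* If the low word wrapped around, the sum came out smaller
             * than an operand: carry one into the high word. */
            uint16_t carry = (lo < a_lo) ? 1u : 0u;
            uint16_t hi = (uint16_t)(a_hi + b_hi + carry);
            return ((uint32_t)hi << 16) | lo;
        }

        static uint32_t sub32(uint16_t a_hi, uint16_t a_lo,
                              uint16_t b_hi, uint16_t b_lo)
        {
            uint16_t lo = (uint16_t)(a_lo - b_lo);
            /* Same idea in reverse: if the subtrahend was larger, the
             * low word underflowed, so borrow one from the high word. */
            uint16_t borrow = (a_lo < b_lo) ? 1u : 0u;
            uint16_t hi = (uint16_t)(a_hi - b_hi - borrow);
            return ((uint32_t)hi << 16) | lo;
        }

        int main(void)
        {
            /* 0x0001FFFF + 0x00000001 = 0x00020000: carry out of the low word. */
            printf("%08lX\n", (unsigned long)add32(0x0001u, 0xFFFFu, 0x0000u, 0x0001u));
            /* 0x00020000 - 0x00000001 = 0x0001FFFF: borrow into the low word. */
            printf("%08lX\n", (unsigned long)sub32(0x0002u, 0x0000u, 0x0000u, 0x0001u));
            return 0;
        }

    Incidentally, the same wrap-around behaviour is what makes interval checks like (uint32_t)(now - start) >= timeout safe across the rollover of a free-running counter, which is exactly the situation a timer-based delay function runs into.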
