Hi,
I am simulating an ARM7 LPC2368 microcontroller using the Keil uVision 4 IDE. I wrote a delay function using a timer, and it works perfectly on hardware. But when I try to run it in the simulator, it takes more time than expected.
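For reference, the delay looks roughly like this (a simplified sketch assuming Timer0 and a 12 MHz peripheral clock; the register names come from Keil's LPC23xx header):

    #include <LPC23xx.H>             /* Keil register definitions for the LPC2368 */

    /* Busy-wait delay on Timer0. Assumes Timer0's PCLK is 12 MHz;
       the prescaler turns that into 1 ms ticks. Adjust T0PR if your
       clock configuration differs. */
    void delay_ms(unsigned int ms)
    {
        T0TCR = 0x02;                /* reset and hold the counter */
        T0PR  = 11999;               /* 12 MHz / (11999 + 1) = 1 kHz tick */
        T0TCR = 0x01;                /* release reset, start counting */
        while (T0TC < ms)            /* poll until 'ms' ticks have elapsed */
            ;
        T0TCR = 0x00;                /* stop the timer */
    }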
For example, if I set the timer to run for 500 ms, it runs for about 4 s of real time. So every time the code hits a delay function, it takes far longer to execute than it should.
How do I solve this timing issue?
TIA.
Most people are comfortable with adding unsigned numbers: when the addition overflows, we just react to the 'carry' bit and add one to the next higher word.
But I think too many people forget that subtraction of unsigned numbers works similarly - it's just that we talk about a 'borrow' bit when there is an underflow.
But in the end, it would be hard for an 8-bit processor to work with 16-bit or 32-bit numbers if these two properties didn't hold true. A lot of knowledge from the assembler days gets forgotten when people move to higher-level languages.
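A quick C illustration of the same idea, building a 32-bit add and subtract out of 16-bit halves with an explicit carry and borrow (the helper names here are made up for the example):

    #include <stdint.h>
    #include <stdio.h>

    /* 32-bit addition done as two 16-bit adds, carrying by hand --
       the same scheme an 8-bit CPU uses byte by byte. */
    static uint32_t add32_via16(uint32_t a, uint32_t b)
    {
        uint16_t lo = (uint16_t)(a + b);              /* low half, may wrap */
        unsigned carry = lo < (uint16_t)a;            /* wrap-around => carry out */
        uint16_t hi = (uint16_t)((a >> 16) + (b >> 16) + carry);
        return ((uint32_t)hi << 16) | lo;
    }

    /* 32-bit subtraction as two 16-bit subtracts with a borrow. */
    static uint32_t sub32_via16(uint32_t a, uint32_t b)
    {
        uint16_t lo = (uint16_t)(a - b);              /* low half, may wrap */
        unsigned borrow = (uint16_t)a < (uint16_t)b;  /* underflow => borrow in */
        uint16_t hi = (uint16_t)((a >> 16) - (b >> 16) - borrow);
        return ((uint32_t)hi << 16) | lo;
    }

    int main(void)
    {
        printf("%08lX\n", (unsigned long)add32_via16(0x0001FFFFu, 1u)); /* 00020000 */
        printf("%08lX\n", (unsigned long)sub32_via16(0x00020000u, 1u)); /* 0001FFFF */
        return 0;
    }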
A lot of knowledge from the assembler days gets forgotten when people move to higher-level languages.
True. And it's made worse by many new candidates not having any assembler background to begin with.