Hi, I am using an LPC2378 microcontroller running at CCLK = 68.57 MHz. I want a delay of 1 us. Please tell me if this function is correct or not:
#define MICROSECONDS(x) ((x)*15)

__asm void delayhk(unsigned int a_microseconds)
{
loop
    MOVS R1, R0
    SUB  R0, R0, #1
    BNE  loop
    BX   LR
}

Usage: delayhk(MICROSECONDS(10));
I copied this code from another thread, but there they were using 72 MHz. If I keep this code without changing my system clock settings, it will generate a lag of 0.291545 us. I am a student; please suggest an alternative. Thanks.
1) Is it meaningful to use an interrupt for a 1 us delay, unless the ISR requires way less than 1 us?
2) What happens if your interrupt doesn't turn off the timer but tries to service the 1MHz interrupts as fast as possible? How much CPU time will you have for your main program?
3) Is an interrupt the only way you can use a timer? Don't you think the main program can just poll the timer counters? This has already been covered many times on this and other forums.
One important thing: if you feel like experimenting, you should always think first. Write down what you _think_ will happen. Compare with what really happens. Spend some time pondering any differences found. Is there anything you can learn from them? Just experimenting without first spending some time thinking about possible outcomes will not take you very far. Trial and error isn't a very fast-converging algorithm. Algorithms that interpolate/extrapolate and then look at the resulting error term converge very much faster.
One thing teachers hope for when giving out school assignments is that the students will not just push forward to something to turn in, but that they keep their eyes open and try to learn as much as possible on the route from receiving the assignment to turning in a solution. A student who also documents this route can hope for much better grades.
Sorry to say it, but you are asking too many questions without giving a single answer. Please try to help if you can; don't try to confuse me. Thanks.
I already have told you how you can poll a timer for delays.
I have already told you the difference between a single interrupt and having the timer produce continuous interrupts. You do realize that an ISR need not just acknowledge an interrupt - it can also turn off the interrupt source?
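The one-shot idea above can be sketched like this. Note that the register names are host-side stand-ins modelled on the LPC2378's Timer0 registers (T0TCR, T0IR); on the real chip they would be the volatile memory-mapped definitions from the vendor header, and the function would be installed as the VIC handler - this is only an illustration of the "ISR disables its own source" pattern, not target-ready code.

```c
#include <stdint.h>

/* Stand-ins for the memory-mapped Timer0 control and interrupt
   registers (on target: the vendor header's T0TCR / T0IR). */
static volatile uint32_t T0TCR;
static volatile uint32_t T0IR;

/* Flag the foreground code polls to know the delay has expired. */
static volatile int delay_done;

/* One-shot timer ISR: instead of servicing a continuous interrupt
   stream, it stops the counter and acknowledges the match flag,
   so exactly one interrupt fires per requested delay. */
static void timer0_isr(void)
{
    T0TCR = 0;          /* stop the counter - no further match events */
    T0IR  = 1;          /* acknowledge the MR0 match interrupt flag   */
    delay_done = 1;     /* signal the waiting foreground code         */
}
```

The foreground code would start the timer with the match register set to the desired delay, then wait (or do other work) until `delay_done` becomes nonzero.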
But for extremely short delays (like 1 us or 2 us) I would probably create dedicated functions that don't take any parameters.
That's what I am asking for help with: creating a dedicated function for a 68.57 MHz CCLK that generates 1 us delays.
If you need 1 us, just concatenate enough instructions until you have consumed enough cycles. You know the CPU frequency of your processor core, so you know the number of nop instructions (or similar) that you need.
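A minimal sketch of that idea, assuming roughly one cycle per NOP so that 1 us at 68.57 MHz needs about 69 cycles. The exact count must be calibrated on the target: flash wait states, MAM configuration and the compiler's call overhead all shift the real timing, so treat the 64-NOP body plus call overhead as a starting point, not a verified figure.

```c
/* Burn cycles with unrolled NOPs to avoid loop overhead.
   GCC-style inline assembly; a Keil/ARM toolchain would use
   __nop() or an __asm block instead. */
#define NOP()  __asm__ volatile ("nop")
#define NOP8() do { NOP(); NOP(); NOP(); NOP(); \
                    NOP(); NOP(); NOP(); NOP(); } while (0)

/* ~64 NOPs + call/return overhead ~ 69 cycles at CCLK = 68.57 MHz.
   Calibrate the count on real hardware (e.g. against a scope). */
static inline void delay_1us(void)
{
    NOP8(); NOP8(); NOP8(); NOP8();
    NOP8(); NOP8(); NOP8(); NOP8();
}
```

Making it `static inline` with no parameters removes the argument-setup and loop bookkeeping that make parameterized delays inaccurate at this scale.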
Just notice that a delay based on CPU cycles will always fail when you reconfigure the PLL to run the processor at a different speed. And if you change to an ARM processor with a different core, the instruction timings may change.
But depending on core speed, it is possible to create delays down to the us range with a polled timer too. All the code needs to do is pick up the current value of a free-running timer, then loop until the difference between the current and start values is large enough. The error you get is up to one granularity of the timer, plus maybe two instruction times, plus the entry/exit time of the function. Of course, you can inline short delays to remove any errors from entry/exit code.
If the peripherals run at 10 MHz, then the granularity error from polling the timer will be no more than 0.1 us, which might be acceptable. And such a polled delay can span very long times with the same resolution, so you can just as well delay for 60 seconds with similar errors.
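The polled delay described above can be sketched as follows. The `now` function pointer stands in for reading the free-running counter (on the LPC2378 this would be Timer0's T0TC, assuming the timer has been configured free-running from a known PCLK); the simulated counter at the bottom is only a host-side stand-in to demonstrate the logic. Unsigned subtraction keeps the comparison correct even when the counter wraps around.

```c
#include <stdint.h>

/* Spin until `ticks` counts have elapsed on a free-running 32-bit
   up-counter. `now` abstracts the timer-count register read (on
   target: return T0TC, or read it directly). The unsigned
   difference (now() - start) is wrap-around safe. */
static void delay_ticks(uint32_t (*now)(void), uint32_t ticks)
{
    uint32_t start = now();
    while ((uint32_t)(now() - start) < ticks)
        ;                       /* spin until enough ticks elapsed */
}

/* Host-side stand-in for the hardware counter: advances one tick
   per read, starting near the wrap point to exercise rollover. */
static uint32_t sim_count = 0xFFFFFFF0u;
static uint32_t sim_now(void) { return sim_count++; }
```

With a 10 MHz PCLK feeding the timer, `delay_ticks(read_t0tc, 10)` would give the 1 us delay discussed above, with at most one timer tick (0.1 us) of granularity error.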
The advantage of a free-running timer for delays is that for longer delays (where you can accept slightly worse jitter) you can run a mini-loop that performs actions (scanning the keyboard, processing serial data, ...) while you poll the timer. Just the same as when you wait for an interrupt handler to set a flag that enough time has passed.
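That mini-loop pattern looks like the plain polled delay with a work callback added. Both `now` and `background` are illustrative names, not from any vendor API: `now` stands in for reading the free-running counter, and `background` for whatever servicing the main program needs (keyboard scan, UART handling, ...).

```c
#include <stdint.h>

/* Wait out a long delay while doing useful work each pass instead
   of spinning empty. Worst-case extra jitter on the delay is one
   full `background()` pass. */
static void delay_with_work(uint32_t (*now)(void), uint32_t ticks,
                            void (*background)(void))
{
    uint32_t start = now();
    while ((uint32_t)(now() - start) < ticks)
        background();           /* service peripherals while waiting */
}

/* Host-side stand-ins to demonstrate the logic. */
static uint32_t tick_count = 0;
static uint32_t fake_now(void) { return tick_count++; }

static int work_done = 0;
static void fake_work(void) { work_done++; }
```

This is the polled counterpart of an interrupt handler setting a "time is up" flag: the main loop keeps running its work either way, and only the wait condition differs.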
But the big question is: why are you still stuck? A bit of your own work should have gotten you past these delay problems already.