I want a 10 sec delay for the system. How do I implement it in Keil C? Kindly help me.
For example, consider a road signal which has three aspects: when the aspect changes from green to red it goes through the yellow aspect, and there is a delay of 5 to 10 sec. How do I implement that? I implemented it using for loops, but is there any way to implement it using hardware timer concepts?
You can use the loop technique or you can use timers. Check this link for generating delays: www.dnatechindia.com/.../ If you want code for a traffic light controller, then here is tested code: www.dnatechindia.com/.../Traffic-Light-Controller.html
I am angry. Very angry. The link you are spreading is misleading. It is wrong. It is so wrong that the OP might one day have somebody killed because he adopted your false and wrong methods. A small clue, Mr. Shah: writing delay functions in C is almost never going to perform the way you need it to, especially if the delay required is very short. OP: use interrupt-driven delays or assembly. Read the damn processor manual, for a change!
C busy-loop delays should be synchronized by some form of hardware - for example, constantly checking a hardware timer and not breaking out of the loop until x ticks have passed.
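As a minimal sketch of that idea for an 8051 in Keil C (register names come from REGX52.H; the reload value assumes a 12 MHz crystal, i.e. a 1 us machine cycle, and the function name delay_ms is purely illustrative):

#include <REGX52.H>

/* Busy-wait for roughly 'ms' milliseconds by letting Timer 0 time each
   1 ms slice in hardware instead of counting instructions.
   65536 - 1000 = 0xFC18 gives ~1 ms per overflow at 12 MHz. */
void delay_ms(unsigned int ms)
{
    TMOD = (TMOD & 0xF0) | 0x01;   /* Timer 0, 16-bit mode 1; Timer 1 untouched */
    while (ms--) {
        TH0 = 0xFC;                /* reload for ~1 ms */
        TL0 = 0x18;
        TF0 = 0;                   /* clear the overflow flag */
        TR0 = 1;                   /* start Timer 0 */
        while (TF0 == 0)           /* don't break out until the tick has passed */
            ;
        TR0 = 0;                   /* stop Timer 0 */
    }
}

A 10-second pause would then be delay_ms(10000), at the cost of the CPU doing nothing else while it waits.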
The posted link "computes" the delay based on the clock speed - but the processor clock isn't the only variable. The compiler version, optimization level, availability of processor registers, etc. can also affect a delay. An adaptive delay should at least try to figure out for itself the number of loops in a ms - similar to the old Turbo Pascal delay() function (which later resulted in quite a lot of failed Turbo Pascal programs, since computers became so fast that the adaptive analysis overflowed...)
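Purely to illustrate that calibration idea (it is not a substitute for using the timer directly): measure how many passes of the wait loop fit into one known Timer 0 period, then scale. A rough sketch, again assuming a 12 MHz crystal and using made-up names calibrate() and soft_delay_ms(); the result is still only approximate, because the calibration loop and the delay loop may not compile to identical code:

#include <REGX52.H>

static unsigned long loops_per_50ms;

/* Count how many loop passes fit into one 50 ms Timer 0 period.
   65536 - 50000 = 0x3CB0 gives ~50 ms at 12 MHz. */
void calibrate(void)
{
    unsigned long n = 0;

    TMOD = (TMOD & 0xF0) | 0x01;   /* Timer 0, 16-bit mode 1 */
    TH0 = 0x3C;
    TL0 = 0xB0;
    TF0 = 0;
    TR0 = 1;
    while (TF0 == 0)               /* count passes until the timer overflows */
        n++;
    TR0 = 0;
    loops_per_50ms = n;
}

/* Busy-wait roughly 'ms' milliseconds using the measured loop count. */
void soft_delay_ms(unsigned int ms)
{
    unsigned long n = (loops_per_50ms * ms) / 50;

    while (n--)
        ;
}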
Any processor that can't supply hardware for keeping track of the time should be thrown away.
Another important thing: a busy loop without feedback, i.e. one that just counts cycles (assembler or C), only has a lower bound on the delay. With a high interrupt load, the true delay may become very much longer. A CPU running with 50% interrupt load will make all such delays twice as long... A robust program should try to be reliable even at high loads.
Check the user guide for the processor that you are using and learn about the timer/counter system.
That's what it's there for.
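For illustration, a sketch of what an interrupt-driven ~10-second delay could look like on an 8051 (register names from REGX52.H; the reload value and overflow count assume a 12 MHz crystal, so Timer 0 overflows every ~50 ms and 200 overflows make ~10 s; timer0_ISR and the tick counter are illustrative names):

#include <REGX52.H>

#define RELOAD_H 0x3C              /* 65536 - 50000 = 0x3CB0 -> ~50 ms per    */
#define RELOAD_L 0xB0              /* overflow at 12 MHz (1 us machine cycle) */

static volatile unsigned char ticks_50ms = 0;

/* Timer 0 overflow ISR: reload the timer and count 50 ms ticks.
   Mode 1 does not auto-reload, so the reload is done here; the few cycles of
   interrupt latency add a small drift that this sketch ignores. */
void timer0_ISR(void) interrupt 1
{
    TH0 = RELOAD_H;
    TL0 = RELOAD_L;
    ticks_50ms++;
    if (ticks_50ms >= 200) {       /* 200 * 50 ms = 10 s */
        ticks_50ms = 0;
        P0 = P0 ^ 0xFF;            /* toggle the port every ~10 s */
    }
}

void main(void)
{
    TMOD = (TMOD & 0xF0) | 0x01;   /* Timer 0, 16-bit mode 1; Timer 1 untouched */
    TH0 = RELOAD_H;
    TL0 = RELOAD_L;
    ET0 = 1;                       /* enable Timer 0 interrupt */
    EA  = 1;                       /* global interrupt enable */
    TR0 = 1;                       /* start Timer 0 */
    while (1)
        ;                          /* CPU stays free for other work here */
}

Unlike a plain busy loop, the 10 s interval here is paced by the hardware timer, so it stays roughly correct even if main() is later given real work to do.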
Please comment on the following program for an approx. 10 sec delay.
#include <REGX52.H>
#include <stdio.h>

static unsigned long overflow_count = 0;

/* Timer 1 overflow ISR: count overflows, toggle P0 after 0x60000 of them */
void timer1_ISR (void) interrupt 3
{
    overflow_count++;
    if (overflow_count == 0x60000) {
        P0 = P0 ^ 0xFF;
        overflow_count = 0;
    }
}

void main (void)
{
    TMOD = (TMOD & 0x0F) | 0x20;   /* Timer 1, mode 2 (8-bit auto-reload) */
    TH1 = 0xFF;                    /* reload value */
    TL1 = TH1;
    ET1 = 1;                       /* enable Timer 1 interrupt */
    TR1 = 1;                       /* start Timer 1 */
    EA = 1;                        /* global interrupt enable */
    while (1);
}
thanks, vijay
What about this program for a 10 sec delay without interrupts?
#include <stdio.h>
#include <REG54.H>

#define S 10                       /* select your time in seconds */

void delay(void);

void main(void)
{
    int i;

    while (1) {
        P0 = ~P0;                      /* toggle the port every S seconds */
        for (i = 0; i < 20*S; i++)     /* 20 x 50 ms = 1 s per 20 calls */
            delay();
    }
}

void delay(void)                       /* 50 ms delay subroutine */
{
    TMOD &= 0xF0;              // Clear all T0 bits (T1 left unchanged)
    TMOD |= 0x01;              // Set required T0 bits (T1 left unchanged)
    ET0 = 0;                   // No interrupts

    /* Values for 50 ms delay:
       (50 ms / 1000 ms) * 1000000 = 50000
       65536 - 50000 = 15536 = 0x3CB0 -> TH0 = 0x3C, TL0 = 0xB0 */
    TH0 = 0x3C;                // Timer 0 initial value (high byte)
    TL0 = 0xB0;                // Timer 0 initial value (low byte)

    TF0 = 0;                   // Clear overflow flag
    TR0 = 1;                   // Start Timer 0
    while (TF0 == 0);          // Loop until Timer 0 overflows (TF0 == 1)
    TR0 = 0;                   // Stop Timer 0
}