Hi,
I found a mistake in the delays I was using: after the start bit only a half-bit delay should be given, and after each data bit a full-bit delay, but I am giving a full-bit delay everywhere. I also doubt whether my timer delay generation is buggy. To be exact, I loaded TH0 = 0xFD to get a 9600 baud rate. For the half-bit delay I might fall back on conventional software loops.
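Here is roughly what I have in mind for generating the two delays (a sketch only, assuming an 11.0592 MHz crystal; one bit at 9600 baud is about 104 us = 96 machine cycles, so a 16-bit Timer 0 should be reloaded to overflow once per bit rather than with 0xFD, which overflows every 3 machine cycles and is really the hardware-UART baud reload value). The helper names are just placeholders:

    #include <reg51.h>

    /* One bit at 9600 baud with an 11.0592 MHz crystal:
       ~104 us = 96 machine cycles; half a bit = 48 cycles. */
    #define FULL_BIT_RELOAD  (65536U - 96U)  /* 0xFFA0 */
    #define HALF_BIT_RELOAD  (65536U - 48U)  /* 0xFFD0 */

    static void timer0_delay(unsigned int reload)
    {
        TMOD = (TMOD & 0xF0) | 0x01;  /* Timer 0, 16-bit mode 1 */
        TH0  = reload >> 8;
        TL0  = reload & 0xFF;
        TF0  = 0;
        TR0  = 1;
        while (!TF0)
            ;                         /* busy-wait for overflow */
        TR0 = 0;
        TF0 = 0;                      /* call overhead is ignored here */
    }

    #define full_bit_delay()  timer0_delay(FULL_BIT_RELOAD)
    #define half_bit_delay()  timer0_delay(HALF_BIT_RELOAD)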
Did you mean to post this as a Reply in some other thread?
It makes no sense on its own.
Of course it makes a difference whether you are on the transmitter or the receiver side.
When transmitting, you can work in full-bit delays.
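For example, a transmit routine can be as simple as the following sketch, assuming an 8N1 frame, the TX line on P3.1, and a full_bit_delay() helper like the one sketched in the first post (the pin and helper names are assumptions):

    sbit TX_PIN = P3^1;              /* assumed TX pin */

    static void soft_uart_tx(unsigned char c)
    {
        unsigned char i;

        TX_PIN = 0;                  /* start bit */
        full_bit_delay();
        for (i = 0; i < 8; i++) {    /* 8 data bits, LSB first */
            TX_PIN = (c & 0x01) ? 1 : 0;
            full_bit_delay();
            c >>= 1;
        }
        TX_PIN = 1;                  /* stop bit */
        full_bit_delay();
    }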
When receiving, you must poll the line much faster to pin down the exact position of the start bit - or use interrupt-driven detection of the start bit.
Then you need to decide where to sample your received data bits: either by using a suitable delay, or by having interrupt-driven code measure all detected data-line changes with suitable time resolution.
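A polled receiver along those lines might look like this sketch, assuming the RX line on P3.0 and the same hypothetical delay helpers as above - half a bit after the start edge, then a full bit between samples:

    sbit RX_PIN = P3^0;              /* assumed RX pin */

    static unsigned char soft_uart_rx(void)
    {
        unsigned char i, c = 0;

        while (RX_PIN)
            ;                        /* wait for start bit's falling edge */
        half_bit_delay();            /* move to the middle of the start bit */
        if (RX_PIN)
            return 0;                /* line bounced back high: noise; a real
                                        driver should flag this separately */

        for (i = 0; i < 8; i++) {
            full_bit_delay();        /* middle of data bit i */
            c >>= 1;
            if (RX_PIN)
                c |= 0x80;           /* LSB arrives first */
        }
        full_bit_delay();            /* land in the stop bit; caller may
                                        also verify RX_PIN == 1 here */
        return c;
    }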
But I did mention this in an earlier post.
Another thing - a good UART samples the data signal multiple times within every bit cell and takes a majority vote to decide whether the bit is high or low. An interrupt-driven receiver may, on the other hand, suffer a large number of interrupts if the signal is noisy.
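A rough sketch of such a majority vote, reusing the hypothetical timer0_delay() helper from the first post and taking three samples spread across the bit cell (the caller enters at the bit boundary and must still add the final quarter-bit delay to reach the next boundary):

    #define QUARTER_BIT_RELOAD (65536U - 24U)   /* 24 machine cycles */

    static bit sample_bit_majority(void)
    {
        unsigned char votes = 0;

        timer0_delay(QUARTER_BIT_RELOAD);       /* sample at ~25% */
        if (RX_PIN) votes++;
        timer0_delay(QUARTER_BIT_RELOAD);       /* sample at ~50% */
        if (RX_PIN) votes++;
        timer0_delay(QUARTER_BIT_RELOAD);       /* sample at ~75% */
        if (RX_PIN) votes++;

        return (votes >= 2);                    /* two out of three decide */
    }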