Hi, I tried communication between two uCs. The steps are: 1) uC1 sends a request (1 byte) to uC2. 2) On receiving this byte, uC2 sends three bytes of data back to uC1. 3) uC1 waits for the data reception to complete and stores the data in an array.
All three steps work (I am able to display the received bytes correctly using LEDs), but the problem is that I need to add about a 1 sec delay before each request is sent from uC1.
(I clear a flag before sending the request; this flag (a global bit) is set to one by the UART ISR when 3 bytes have been received. After sending the request, the main program waits for this bit to go high and then proceeds. Without the additional delay, this bit never seems to change.)
Is there anything wrong with my method? Why does this delay become necessary?
But you do need a delay, since it takes more time to send 3 characters than it takes to send 1 character. So the processor sending the single character has to spend some time waiting until all 3 characters have been received before starting a new request.
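To put a number on that waiting time: with standard 8N1 framing each byte occupies 10 bit times on the wire (start + 8 data + stop). The thread never states a baud rate, so 9600 below is an assumption; the point is that three bytes take only a few milliseconds, nowhere near a full second.

```c
/* With 8N1 framing, each byte is 10 bits on the wire.
   Returns the transmission time of one byte in milliseconds.
   The baud rate is an assumed example; the thread does not state one. */
double byte_time_ms(double baud)
{
    return 1000.0 * 10.0 / baud;
}
```

At 9600 baud this gives roughly 1.04 ms per byte, so about 3.1 ms for the three-byte reply; a required delay of a whole second therefore points at something other than raw line time.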
In my program I count the received bytes and set the 'data_received' bit to 1 only after the count reaches 3. The code is given below.
```c
void Scc1InterruptHandler() interrupt SIO_VECTOR using 2
{
    ES = 0;                 /* disable serial interrupt */
    if (TI) {               /* transmit-complete interrupt */
        TI = 0;
        xmitempty = 1;
        ES = 1;             /* re-enable serial interrupt */
        return;
    }
    RI = 0;                 /* clear receive flag */
    if (k < 3)
        ph[k] = SBUF;
    k++;
    if (k >= 3) {
        k = 0;
        data_received = 1;
    }
    ES = 1;                 /* re-enable serial interrupt */
}
```
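The counting logic itself looks sound; it can be checked in isolation by recreating it in portable C on a desktop compiler, with a plain function standing in for one RI interrupt. The names `ph`, `k`, and `data_received` are taken from the snippet above; `receive_byte` is a hypothetical stand-in for the ISR body.

```c
/* Portable re-creation of the ISR's receive path: store up to three
   bytes in ph[] and raise data_received on the third. */
static volatile unsigned char ph[3];
static volatile unsigned char k = 0;
static volatile unsigned char data_received = 0;

void receive_byte(unsigned char sbuf)  /* stands in for one RI interrupt */
{
    if (k < 3)
        ph[k] = sbuf;
    k++;
    if (k >= 3) {
        k = 0;              /* reset for the next three-byte reply */
        data_received = 1;  /* main loop polls this flag */
    }
}
```

Fed three bytes, this sets the flag exactly once and leaves `k` reset for the next request, which matches the behaviour described in the thread.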
Do you remember to set them as volatile YES
In the main program I wait for the 'data_received' bit to go high:
```c
data_received = 0;
sendchar('V');
while (!data_received);
```
But this works only if I add a delay of 1 sec. What I don't understand is why it doesn't work without that extra delay.
Where do you set the delay? After the send but before the while loop?
It very much sounds like the while loop initially samples the variable "data_received" before it is toggled and then never performs another read - i.e. that "data_received" isn't volatile, which would force a fresh read on each turn of the loop.
By the way - as I mentioned before, your while loop has no timeout. If you somehow get a problem resulting in only two characters being received, you will never get out of the loop. Any distributed code that expects an answer from another node must _always_ contain timeout handling to cope with link breakage, transfer errors, or the other side rebooting/hanging.
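A minimal sketch of such a bounded wait, in portable C so it can be exercised off-target: the loop gives up after a fixed number of polls instead of spinning forever. The function name and tick budget are hypothetical; on the 8051 you would count against a hardware timer rather than loop iterations.

```c
/* Poll a completion flag for at most `ticks` iterations.
   Returns 1 if the flag went high in time, 0 on timeout.
   A loop counter stands in for a real timer tick here. */
int wait_for_flag(volatile unsigned char *flag, unsigned long ticks)
{
    while (ticks--) {
        if (*flag)
            return 1;   /* answer arrived in time */
    }
    return 0;           /* link broken, byte lost, or slave rebooted */
}
```

On timeout the master can then resend the request or report a link error instead of hanging.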
In your case, you don't even know if the other side is powered up when you send your command the first time - so maybe your delay is needed _before_ you send your command, simply because you power up both processors simultaneously and the master is ready to send the first character before the slave is ready to accept it?
I only quoted part of the program. This is inside another while loop, and I added the delay before the send.
By the way - as I mentioned before, your while loop has no timeout.
With a timeout, it will naturally come out of that loop. But I wanted to see why it doesn't work when checking the data_received bit. Once I have this working, I will definitely add a timeout.
In your case, you don't even know if the other side is powered up when you send your command the first time -
I wait until uC2 starts sending data, using the following code at the beginning:
```c
while (!data_received) {   /* give the sensor time to boot */
    sendchar('V');
    wait(1000);
}
```
This part works. Also, if I replace
```c
while (!data_received);
```
in the main program with a fixed delay of 10 ms, it also works (data is received and displayed correctly).