Hi everyone,
I should start by introducing the context of my application: I am currently working on a cryptosystem using the MCBSTR9. My evaluation board is connected to a modem through UART0, so I can react to specific modem interrupts (Data Carrier, Ring Indicator, etc.). As a first stage, I am required to sample voice and transmit/receive it via the modem.
My problem is the following: what if no data arrives? Going through the reference manual UM0216, I found that I can enable the receive timeout interrupt for the UART. However, I have been struggling since then with what seems like a trivial piece of functionality: how do I set the timeout period for the receive timeout interrupt, especially since I may need to change it at runtime?
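For context, here is how I am trying to enable the interrupt so far. This is only a sketch: the UART looks PL011-like in UM0216, so the register offsets, bit positions and the base address below are taken from the ARM PL011 convention and are unverified assumptions on my part. (If this UART really follows the PL011, the timeout period may simply be fixed at 32 bit periods at the current baud rate; I would welcome confirmation.)

```c
/* Sketch: enable the UART0 receive timeout interrupt, assuming a
 * PL011-style register layout. Offsets, bit positions and the base
 * address are unverified assumptions -- check UM0216 / your headers. */
#include <stdint.h>

#define UART0_BASE  0x54000000u  /* hypothetical base address */
#define UART0_IMSC  (*(volatile uint32_t *)(UART0_BASE + 0x38))
#define UART0_ICR   (*(volatile uint32_t *)(UART0_BASE + 0x44))
#define RTIM        (1u << 6)    /* receive timeout interrupt bit (PL011) */

void uart0_enable_rx_timeout(void)
{
    UART0_ICR   = RTIM;          /* clear any pending timeout interrupt */
    UART0_IMSC |= RTIM;          /* unmask the receive timeout interrupt */
}
```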
Here is why: my application needs to adapt to the modem (DCE) data rate, so if an increase in the data rate is detected, I need to raise the UART baud rate in order not to suffer a potential overrun. (Please bear in mind that the microprocessor will be busy with encryption, decryption, sampling, and so on, in a round-robin fashion; I could have used interrupts instead, and maybe I will. In any case, I do not want to lose any data arriving from the modem before I can actually process it.)
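To make the baud rate part concrete, this is the kind of runtime change I have in mind. Again only a sketch under the same PL011-style assumption: the divisor is UARTCLK / (16 x baud), split into the integer and 6-bit fractional parts that the IBRD/FBRD registers expect; the 48 MHz clock and the addresses are placeholders.

```c
/* Sketch: change the UART baud rate at run time on a PL011-style UART.
 * UART_CLK_HZ, the base address and offsets are placeholder assumptions. */
#include <stdint.h>

#define UART_CLK_HZ  48000000u   /* assumed UART input clock */
#define UART0_BASE   0x54000000u /* hypothetical base address */
#define UART0_IBRD   (*(volatile uint32_t *)(UART0_BASE + 0x24))
#define UART0_FBRD   (*(volatile uint32_t *)(UART0_BASE + 0x28))
#define UART0_LCRH   (*(volatile uint32_t *)(UART0_BASE + 0x2C))
#define UART0_CR     (*(volatile uint32_t *)(UART0_BASE + 0x30))
#define UART_EN      (1u << 0)

void uart0_set_baud(uint32_t baud)
{
    /* divisor * 64 = UARTCLK * 4 / baud, rounded to nearest */
    uint32_t div64 = (UART_CLK_HZ * 4u + baud / 2u) / baud;

    UART0_CR  &= ~UART_EN;       /* disable the UART while reprogramming */
    UART0_IBRD = div64 >> 6;     /* integer part of the divisor */
    UART0_FBRD = div64 & 0x3Fu;  /* 6-bit fractional part (0..63) */
    UART0_LCRH = UART0_LCRH;     /* IBRD/FBRD only latch on an LCR_H write */
    UART0_CR  |= UART_EN;        /* re-enable the UART */
}
```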
Hope I was clear enough.
Thank you for your help
All right, I was not clear enough about what I want.
In a nutshell, my application will:
1- Handle key negotiation using IKE (outside the scope of this thread).
2- Sample voice using the ADC.
3- Encrypt it using the established key.
4- Encode the data using a suitable codec.
Now here is the tricky part: I need to check the DCE data rate in order to select, among the candidate codecs (GSM, ...), the one that lets me preserve some 'quality' in the transmitted speech. For example, if my data rate is 8 kbps I would use G.729. I would then construct a frame in the following manner:
Sequence number | Codec used | Payload | FCS

Sequence number = width in bits still to be decided; it will be used for the chaining mode in encryption
Codec used = 2 or 3 bits, depending on the codecs I will use
Payload = the encrypted and encoded data
FCS = to recover from any transmission error
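To make the frame layout concrete, here is a sketch of how I would build one. The field widths are placeholders (a 16-bit sequence number, a full byte for the codec identifier instead of the final 2-3 bits, and CRC-16/CCITT as a stand-in FCS), since those choices are still open.

```c
/* Sketch of the frame described above; field widths are placeholders. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

enum codec_id { CODEC_GSM_FR = 0, CODEC_G729 = 1 };

/* CRC-16/CCITT, bitwise; a stand-in for the final FCS choice */
static uint16_t crc16(const uint8_t *p, size_t n)
{
    uint16_t crc = 0xFFFF;
    while (n--) {
        crc ^= (uint16_t)(*p++) << 8;
        for (int b = 0; b < 8; b++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}

/* Build: sequence number | codec | payload | FCS. The codec field uses a
 * whole byte here; the real frame would pack it into 2-3 bits. */
size_t build_frame(uint8_t *out, uint16_t seq, uint8_t codec,
                   const uint8_t *payload, size_t len)
{
    size_t i = 0;
    out[i++] = (uint8_t)(seq >> 8);   /* sequence number, big-endian */
    out[i++] = (uint8_t)(seq & 0xFF);
    out[i++] = codec;                 /* codec identifier */
    memcpy(&out[i], payload, len);    /* encrypted + encoded payload */
    i += len;
    uint16_t fcs = crc16(out, i);     /* FCS over header + payload */
    out[i++] = (uint8_t)(fcs >> 8);
    out[i++] = (uint8_t)(fcs & 0xFF);
    return i;                         /* total frame length */
}
```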
Now, to answer you, Per: I would need timeout processing in case the modem is unplugged, for example, so that I can immediately stop the ongoing communication. If I cannot receive anything for a while, it means that something went wrong. I could even display a customized message informing the user of the current state of the modem.
As for your point, Andy: the line quality can degrade during an ongoing communication (at least, that is what many telephony professionals have explained to me). I think the codec selection accounts for this autonomous adjustment by my application.
Thank you guys for your time. Any feedback is appreciated.
The data rate between DCE and DTE is totally unrelated to the data rate on the line.
Thank you, Andy. I understand that the data rate between DCE and DTE is unrelated to the data rate on the line. But do you have any suggestion as to how to control the receiver timeout?
In your case, you are not processing a protocol that relies on very short pauses as the only separator between packets, so you don't need to care about break detection in the serial port.
If you want to detect a modem from which you are no longer receiving any new data, it is enough to update a time stamp whenever data arrives. Then poll that value regularly and, if it is too old, take some form of action.
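A rough sketch of that idea, assuming you have a free-running millisecond tick available (tick_ms below is hypothetical, e.g. incremented in a timer interrupt; the 2-second threshold is just an example):

```c
/* Sketch: detect that the modem has gone silent. tick_ms is an assumed
 * free-running millisecond counter updated from a timer interrupt. */
#include <stdbool.h>
#include <stdint.h>

#define RX_TIMEOUT_MS  2000u              /* example "too old" threshold */

extern volatile uint32_t tick_ms;         /* hypothetical ms tick */
static volatile uint32_t last_rx_ms;

/* Call this from wherever you take bytes out of the UART */
void note_rx_activity(void)
{
    last_rx_ms = tick_ms;
}

/* Poll this regularly from the round-robin main loop */
bool modem_silent_too_long(void)
{
    /* unsigned subtraction handles counter wrap-around */
    return (tick_ms - last_rx_ms) > RX_TIMEOUT_MS;
}
```

When modem_silent_too_long() returns true, stop the ongoing communication and show your status message.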
You may also consider looking at the hardware handshake signals, or possibly XON/XOFF, i.e. how long your application has been blocked from sending more data.
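Along the same lines for the sending side, a sketch that times how long CTS has been holding you off. The flag register offset and CTS bit are again PL011-style assumptions, and tick_ms is the same hypothetical counter as above.

```c
/* Sketch: detect that the modem has blocked transmission for too long.
 * Register names and offsets are PL011-style assumptions. */
#include <stdbool.h>
#include <stdint.h>

#define UART0_BASE     0x54000000u                  /* hypothetical */
#define UART0_FR       (*(volatile uint32_t *)(UART0_BASE + 0x18))
#define FR_CTS         (1u << 0)                    /* CTS asserted (PL011) */
#define TX_BLOCKED_MS  1000u                        /* example threshold */

extern volatile uint32_t tick_ms;                   /* hypothetical ms tick */
static uint32_t blocked_since_ms;
static bool     blocked;

/* Poll this alongside the receive check */
bool modem_blocking_tx_too_long(void)
{
    if (UART0_FR & FR_CTS) {          /* modem ready again: reset the timer */
        blocked = false;
        return false;
    }
    if (!blocked) {                   /* just became blocked: start timing */
        blocked = true;
        blocked_since_ms = tick_ms;
    }
    return (tick_ms - blocked_since_ms) > TX_BLOCKED_MS;
}
```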