I'm trying to communicate with an old device that operates at 1200 baud only. The tablet I'm using to control the old device will only go as low as 2400 baud. Any ideas on how to bridge this gap would be greatly appreciated.
I can't see how you can get normal UART communication with a 2:1 baudrate error, since that is totally outside of what the asynchronous format and the hardware implementations are designed for.
But if you control the software on both sides, then I would think you could potentially run the link with the 1200 baud device running 7N1 and the 2400 baud device running 7N2 - with each transmitted character carrying 3 bits of actual data, used to tunnel more complex messages.
So the start bit of the slow device is twice as long, consuming the time slot of the first data bit directly following the start bit. Then come 6 more data bits that can only encode 3 bits, because each slow bit occupies two fast bit slots. Finally, the single stop bit ends up looking like two stop bits.
I have obviously never tested the above concept, but just as wrong baudrates in a number of situations result in the receiver seeing "random" characters, there are situations where bytes transmitted at the wrong baudrate will end up as something the other side can decode repeatably.
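One way to sanity-check the doubling without hardware is a small simulation. This is only a sketch of the idea as described above - the function names are made up and nothing here has been checked against a real UART:

```python
# Simulate how a 7N1 frame sent at 1200 baud appears to a 7N2 receiver at 2400 baud.
# Every 1200-baud bit occupies two 2400-baud bit slots.

def slow_frame_7n1(data7):
    """Wire bits of one 7N1 character (LSB first: start, d0..d6, stop),
    expanded into 2400-baud slots - each slow bit is doubled."""
    bits = [0] + [(data7 >> i) & 1 for i in range(7)] + [1]
    return [b for b in bits for _ in (0, 1)]

def fast_receive_7n2(slots):
    """Decode the slot stream the way a 2400-baud 7N2 receiver would:
    one start bit, 7 data bits, then two stop bits that must both be high."""
    assert slots[0] == 0, "start bit expected"
    data = sum(slots[1 + i] << i for i in range(7))
    framing_ok = slots[8] == 1 and slots[9] == 1
    return data, framing_ok

# Payload goes in d0..d2; d3..d6 must be 1 so their doubled copies land on
# the fast receiver's stop bits and on the idle tail of the frame.
payload = 0b101
data, ok = fast_receive_7n2(slow_frame_7n1(payload | 0b1111000))
```

The fast receiver then sees each payload bit twice (shifted one slot by the long start bit), so the payload can be recovered by sampling every other data bit of the received byte.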
Here's more information on the old device. It operates at 1200 baud, 8N1. Sadly, this is set in stone.
Well, 1200 baud 8N1 can trigger reception of characters for a listener that runs at 2400 baud, so you can still send 3 bits/byte.
But since each bit will take twice as long, the last data bits must be given the same polarity as the stop bit. So it will basically emit an extreme number of stop bits - but additional stop bits on transmission aren't a problem, since that's just additional idle state.
The thing to remember is that the device running at 2400 baud can't send a burst of characters - it can only manage 3 bits of data in a single transmitted byte. Then it needs to wait a full additional character length, transmitting nothing, just to produce the required stop bit for the 1200 baud device. Then it may emit the next 3-bit character and a new delay.
So pack 8-bit data into many more "3-bit characters". Send character. Wait. Send character. Wait.
And when receiving, extract 3 bits of data from each received byte and convert back into a full message.
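A sketch of that packing for the 8N1 case (the bit positions follow the doubling argument above, but this is an untested assumption, not verified against hardware):

```python
# Pack an 8-bit byte into three 3-bit "characters" and back, plus the
# byte-level encoding for the 2400-baud side. Names and details are assumed.

def pack3(byte):
    """Split one data byte into three 3-bit chunks, low bits first."""
    return [byte & 7, (byte >> 3) & 7, (byte >> 6) & 3]

def unpack3(chunks):
    return (chunks[0] | (chunks[1] << 3) | (chunks[2] << 6)) & 0xFF

def encode_for_fast_tx(chunk):
    """Byte the 2400-baud side transmits so a 1200-baud 8N1 receiver sees `chunk`:
    b0 stays 0 (second half of the slow start bit), each payload bit is doubled,
    and b7 is 1 so it merges into the slow receiver's remaining data/stop bits.
    The sender must then stay idle one extra character time (the 'wait')."""
    p0, p1, p2 = chunk & 1, (chunk >> 1) & 1, (chunk >> 2) & 1
    return (p0 << 1) | (p0 << 2) | (p1 << 3) | (p1 << 4) | (p2 << 5) | (p2 << 6) | 0x80

def decode_fast_rx(rx):
    """3 payload bits from a byte the 2400-baud side received from the 1200-baud
    sender: each slow data bit arrives doubled, so sample every other bit."""
    return ((rx >> 1) & 1) | (((rx >> 3) & 1) << 1) | (((rx >> 5) & 1) << 2)
```

Since both directions use the same doubled-bit pattern on the wire, `decode_fast_rx` also happens to invert `encode_for_fast_tx`, which makes the scheme easy to test end to end.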
Very messy and no fun for bigger data blocks. But it can be a way to transfer commands between two devices - the serial channel would manage 20-50 usable bytes/second, depending on how well it's possible to generate an optimal delay between each single-byte transfer from the "fast" device, and how much stuffing you might need for the receiving side to detect the start of an encoded message. Whether that is usable will then just depend on what you need the link for.
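For the numbers: with ideal timing both directions come out near the top of that range (a rough estimate that ignores message framing and stuffing overhead):

```python
# Rough throughput estimate for the scheme, assuming ideal timing.
SLOW, FAST, FRAME_BITS, PAYLOAD_BITS = 1200, 2400, 10, 3

# Slow -> fast: the 1200-baud side can stream 8N1 frames back to back,
# each carrying 3 usable bits.
slow_to_fast_bytes_per_s = (SLOW / FRAME_BITS) * PAYLOAD_BITS / 8

# Fast -> slow: each frame must be followed by a full character of idle
# time, so only every other fast character slot carries data.
fast_to_slow_bytes_per_s = (FAST / (2 * FRAME_BITS)) * PAYLOAD_BITS / 8
```

Both work out to 45 bytes/second at best, so real-world figures with imperfect delays and message framing land in the 20-50 range mentioned above.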
You say "the tablet". As in, a pre-fabricated device that you can run "apps" on, but have no actual control of the hardware? Then I'm afraid this cannot work.
The obvious solution, and possibly the only feasible one, would be bit-banging. 1200 Baud is slow enough to do entirely in software if you have direct control of port pins and a timer or two. But that won't be possible when all you can do is write an app in some high-level language, that talks to existing lower-level drivers via a nice API.
In that case you'll need a gateway device between the two existing ones, which talks 1200 baud to one side, and some other, possibly much higher baudrate to the other. It may need some buffering of fast messages going to the slow side.
"The obvious solution, and possibly the only feasible one, would be bit-banging"

To me the obvious solution would be to run whatever clock the UART is driven by at half the speed.
Thank you to each of you responding to this thread. I'm traveling at the moment, and don't have time for much of a reply. I'll read carefully and respond when possible. Again, thank you all for giving me your combined knowledge on this thread.
Alright, back to it. I'm using an industrial tablet with an RS-232 port, running CE6. The included software only slows to 2400.
Now I realize there is only one thing to say:
WHAT DOES THIS HAVE TO DO WITH KEIL?
A very valid point.
I do sometimes wonder what the proportion of Keil related (vs unrelated) questions is on this forum.
A forum needs a critical mass of posts to be meaningful to visit. So if filtering too hard, no one will visit the forum and care to write answers.
Better to allow a bit of off-topic questions, and if the volume becomes too high, add subforums or filtering support. 8052.com is an example of a forum where the number of posts got too low, making everyone stop visiting the site.
Maybe you need to be talking to the vendor of the tablet, or writing the drivers to suit your needs? You've got to presume someone selling a WinCE device has people with some programming competence behind them.
Alternatively, why not make a small board that converts/bridges serial ports of different rates. I've seen people use 30 cent Cortex-M0 parts to mediate between serial rates and formats. You'd probably want buffering and flow control to hold off the faster device, but that's not an unduly complicated set of requirements.
Sorry... I suppose the more appropriate question should be, would an ARM11 have any bearing on minimum baud rate?
Most processors can do almost any baudrate (within reasonable limits) - so the driver vendor might create a list of standard baudrates instead of allowing the user to specify arbitrary baudrates.
So you might get a driver with a parameter:
0: 300 baud
1: 600 baud
2: 1200 baud
...
even if the hardware could do 75 baud or 5350 baud.
Most newer processors start from a very high clock frequency, allowing quite a big divisor to span a large baudrate range. And lots of newer processors also have fractional baudrate support, allowing them to divide that input clock by 4.55 if that is what it takes to get a perfect output baudrate.
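As an illustration of how such a fractional divider gets from a fast clock to an exact low baudrate (the numbers are made up - a 48 MHz clock, 16x oversampling, and a 6-bit fractional part - not any specific chip):

```python
# Toy fractional baud-generator calculation, not tied to any real UART.

def baud_divisor(clock_hz, baud, oversample=16):
    """Return (integer_part, fractional_64ths, actual_baud, error_percent)
    for a hypothetical divider with a 6-bit fractional part."""
    ideal = clock_hz / (oversample * baud)        # exact divider needed
    frac64 = round(ideal * 64)                    # quantize to 1/64 steps
    actual = clock_hz / (oversample * (frac64 / 64))
    return frac64 // 64, frac64 % 64, actual, 100 * abs(actual - baud) / baud
```

With these numbers, 1200 baud needs a plain integer divider of 2500 (zero error), and even 115200 lands within a fraction of a percent - which is why a limit like "nothing below 2400" usually points at the driver, not the silicon.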
So I'm pretty sure it's a driver limitation, not a hardware limitation, that the tablet doesn't do 1200 baud. It is normally split-speed operation (different baudrates for transmit and receive), or specific combinations of data bits, stop bits, and even/odd/mark/space/none parity, where there may be actual hardware limitations because of the bit fields used to configure the UART.
For older processors, it was common that you had to select specific clock crystals to be able to get "correct" baudrates, but in a world of PLL-driven hardware that is quite uncommon today.
I'm aware of a couple of chips where the width of the baud rate divider is insufficient to get the APB clock down far enough to reach 1200 or 600 baud. One might be able to slow the APB down to a point where it works, but that would slow down a lot of other peripherals in the process.
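Back-of-envelope, with illustrative numbers rather than any particular part:

```python
# Lowest baudrate reachable with a fixed-width integer divider (illustrative
# only; assumes a hypothetical UART with 16x oversampling).

def min_baud(apb_hz, divider_bits, oversample=16):
    """Baudrate at the maximum divider value."""
    return apb_hz / (oversample * (2 ** divider_bits - 1))

# A 12-bit divider on a 200 MHz APB bottoms out above 3000 baud,
# while a 16-bit divider on a 48 MHz bus reaches well below 1200.
```

So on a narrow-divider part, 1200 baud really can be out of reach without dropping the bus clock.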
Supporting 30 year old equipment is a bit niche at this point.
Thank you gentlemen. It seems clear to me now that the processor is not the probable issue. I will now pursue a driver based solution, with or without the vendor, to reach the necessary speed. Sorry that I jumped ahead about 4 mental pages when initially addressing this problem. All the help has been greatly appreciated. Thank you all again.