I'm trying to communicate with an old device that operates at 1200 baud only. The tablet I'm using to control the old device will only go as low as 2400 baud. Any ideas on how to bridge this gap would be greatly appreciated.
I can't see how you could get normal UART communication with a 2:1 baud-rate error, since that is far outside what the asynchronous format and the hardware implementations are designed to tolerate.
But if you control the software on both sides, I would think you could potentially run the link with the 1200 baud device using 7N1 and the 2400 baud device using 7N2, with only 3 bits of actual data per transmitted byte used to tunnel more complex messages.
So the start bit of the slow device is twice as long, consuming the time slot of the first data bit that follows it. Then come six more data-bit slots that can only encode 3 real bits, because each slow bit occupies two fast bit slots. Finally, the slow device's single stop bit looks like two stop bits to the fast receiver.
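That bit-slot arithmetic can be checked with a small simulation. This is a sketch only; the function names and the choice of putting the 3 payload bits in d0..d2 are mine, not from any standard:

```python
# Simulate what a 2400-baud 7N2 receiver sees when a 1200-baud 7N1
# device transmits one frame. Each 1200-baud bit occupies two
# 2400-baud bit slots.

def slow_frame_7n1(data7):
    """Bit slots, at 2400-baud granularity, of a 7N1 frame sent at 1200 baud."""
    bits = [0] + [(data7 >> i) & 1 for i in range(7)] + [1]  # start, d0..d6, stop
    return [b for b in bits for _ in (0, 1)]                 # each bit lasts 2 slots

def fast_receive_7n2(slots):
    """Decode the slot stream as a 2400-baud 7N2 frame (start + 7 data + 2 stops)."""
    assert slots[0] == 0, "start bit"
    data = sum(slots[1 + i] << i for i in range(7))
    assert slots[8] == 1 and slots[9] == 1, "framing error: stop bits not high"
    return data

# The slow sender encodes 3 payload bits in d0..d2; d3..d6 must be 1 so
# the fast receiver sees valid stop bits and an idle line afterwards.
payload = 0b101
tx = (payload & 0b111) | 0b1111000   # d0..d2 = payload, d3..d6 = 1
print(bin(fast_receive_7n2(slow_frame_7n1(tx))))  # prints 0b1100110
```

The decoded value shows each payload bit doubled (bits 1-2, 3-4, 5-6), with bit 0 forced to zero by the stretched start bit, exactly as described above.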
I have obviously never tested the above concept, but just as a wrong baud rate in many situations makes the receiver see "random" characters, there are situations where bytes transmitted at the wrong baud rate end up as something the other side can decode repeatably.
Here's more information on the old device. It operates at 1200.8.N.1. Sadly, this is set in stone.
Well, 1200 baud 8N1 can still trigger reception of characters in a listener running at 2400 baud, so you can still send 3 bits per byte.
But since each bit takes twice as long, the last data bits must be given the same polarity as the stop bit. The transmitter will basically emit an extreme number of stop bits, but additional stop bits on transmission aren't a problem since they are just additional idle state.
The thing to remember is that the device running at 2400 baud can't send a burst of characters: it can only deliver 3 bits of data in a single transmitted byte, and then it must wait a full additional character length with the line idle just to produce the stop bit the 1200 baud receiver requires. Only then may it emit the next 3-bit character, followed by a new delay.
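For the fixed 1200 8N1 device, the 2400-baud side's encoding could look like the sketch below. The byte layout just follows the doubling argument above, and all names are my own: bit d0 extends the start bit, each payload bit is doubled, and d7 matches the stop polarity.

```python
# Hypothetical sketch: the 2400-baud sender packs 3 payload bits into one
# byte so that a 1200-baud 8N1 receiver decodes them cleanly.

def encode_for_1200(payload3):
    b0, b1, b2 = payload3 & 1, (payload3 >> 1) & 1, (payload3 >> 2) & 1
    # d0 = 0 (extends start bit), d1=d2=b0, d3=d4=b1, d5=d6=b2, d7 = 1
    return (b0 * 0b0000110) | (b1 * 0b0011000) | (b2 * 0b1100000) | 0x80

def fast_frame_8n1(byte):
    """Slot stream of one 8N1 frame at 2400 baud, plus one character of idle."""
    return [0] + [(byte >> i) & 1 for i in range(8)] + [1] + [1] * 10

def slow_receive_8n1(slots):
    """Decode the stream as a 1200-baud 8N1 frame; each slow bit spans two slots."""
    assert slots[0] == 0, "start bit"
    data = sum(slots[2 + 2 * i] << i for i in range(8))
    assert slots[18] == 1, "framing error: stop bit not high"
    return data

for p in range(8):
    rx = slow_receive_8n1(fast_frame_8n1(encode_for_1200(p)))
    assert rx & 0x07 == p      # payload lands in the low 3 data bits
    assert rx | 0x07 == 0xFF   # bits 3..7 read as stop/idle polarity
```

Note the `[1] * 10` tail: without that full character of idle time after each byte, the slow receiver's last data bits and stop bit would overlap the next frame.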
So pack 8-bit data into many more "3-bit characters". Send character. Wait. Send character. Wait.
And when receiving extract from each received byte 3 bits of data and convert back into a full message.
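The packing and unpacking layer might be sketched like this (illustrative only; the LSB-first symbol order is an arbitrary choice on my part):

```python
# Pack an 8-bit message into 3-bit symbols for transmission and
# reassemble it on the receiving side.

def pack_3bit(data):
    """Split a byte string into a list of 3-bit symbols (LSB first)."""
    bits, nbits, out = 0, 0, []
    for byte in data:
        bits |= byte << nbits
        nbits += 8
        while nbits >= 3:
            out.append(bits & 0b111)
            bits >>= 3
            nbits -= 3
    if nbits:
        out.append(bits & 0b111)  # final partial symbol, zero-padded
    return out

def unpack_3bit(symbols, length):
    """Rebuild `length` bytes from a list of 3-bit symbols."""
    bits, nbits, out = 0, 0, bytearray()
    for sym in symbols:
        bits |= (sym & 0b111) << nbits
        nbits += 3
        while nbits >= 8 and len(out) < length:
            out.append(bits & 0xFF)
            bits >>= 8
            nbits -= 8
    return bytes(out)
```

Round-tripping works: `unpack_3bit(pack_3bit(b"Hi"), 2)` gives back `b"Hi"`. A real protocol would also need a length field or delimiter so the receiver knows where a message starts and ends.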
Very messy and no fun for bigger data blocks, but it can work as a way to transfer commands between two devices. Each 3-bit character plus the mandatory one-character idle gap occupies 20 bit times at 2400 baud, so the raw rate is about 360 bit/s; in practice the channel would manage maybe 20-50 usable bytes/second, depending on how precisely an optimal delay between each single-byte transfer from the "fast" device can be generated, and on how much stuffing is needed for the receiving side to detect the start of an encoded message. Whether that is usable just depends on what you need the link for.