Hi,
I would like to optimize some existing old-school code (C99) that demodulates a 1200-baud FSK signal using 3.6 kHz and 4.8 kHz tones (1 start bit, 1 stop bit, odd parity).
There's a 10 µs interrupt, ADC_A_IRQHandler (16+24, combined ADC_A end-of-sequence A and threshold-crossing interrupts).
Inside it, a "THR_LOW upwards" event is triggered every time the sine wave (logical 0, logical 1, or noise) crosses that threshold upwards.
With a 20 µs HW timer I measure the time between two triggers; from that interval I can decide whether it's a 3.6 kHz signal, a 4.8 kHz signal, or noise.
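To make the approach concrete, here is a minimal sketch of that period classification. The 20 µs tick and the two tone frequencies come from my setup; the ±30 µs acceptance window and the names are just placeholders I picked for illustration:

```c
#include <stdint.h>

/* Symbol decision derived from the measured period between two rising
 * threshold crossings.  Tick width and tone frequencies match my setup;
 * the tolerance window is an arbitrary assumption. */
typedef enum { SYM_NOISE = -1, SYM_ZERO = 0, SYM_ONE = 1 } symbol_t;

#define TICK_US     20u    /* HW timer resolution (20 µs)          */
#define PERIOD_0_US 278u   /* 1/3.6 kHz ~ 277.8 µs -> logical '0'  */
#define PERIOD_1_US 208u   /* 1/4.8 kHz ~ 208.3 µs -> logical '1'  */
#define TOL_US      30u    /* acceptance window (assumption)       */

static symbol_t classify_period(uint32_t ticks)
{
    uint32_t us = ticks * TICK_US;

    if (us + TOL_US >= PERIOD_0_US && us <= PERIOD_0_US + TOL_US)
        return SYM_ZERO;   /* ~3.6 kHz cycle */
    if (us + TOL_US >= PERIOD_1_US && us <= PERIOD_1_US + TOL_US)
        return SYM_ONE;    /* ~4.8 kHz cycle */
    return SYM_NOISE;      /* anything else (e.g. a 10 kHz burst): reject */
}
```

With a 20 µs tick a 3.6 kHz cycle measures about 14 ticks and a 4.8 kHz cycle about 10, so the two windows don't overlap, but the coarse tick is a sizeable fraction of the period difference.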
Every 0 (3.6 kHz) or 1 (4.8 kHz) decision is stored in a buffer. At this point it's not clear to me how many 0s or 1s make up one bit. I also see that some bits are missing in my array; that could be a timing issue or caused by noise.
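For the bits-per-symbol question: at 1200 baud one bit lasts 1/1200 s ≈ 833 µs, so a '0' bit should carry 3600/1200 = 3 full cycles of the 3.6 kHz tone and a '1' bit 4800/1200 = 4 cycles of 4.8 kHz. A majority vote over the per-cycle decisions inside one bit window is one simple way to tolerate an occasional missing or misread cycle; a rough sketch (function name and -1/0/1 encoding are my own assumptions):

```c
#include <stdint.h>
#include <stddef.h>

/* Majority vote over the per-cycle decisions collected during one bit
 * time (3-4 entries at 1200 baud).  Encoding assumed: 1 = 4.8 kHz
 * cycle, 0 = 3.6 kHz cycle, -1 = rejected/noise. */
static int majority_bit(const int8_t *decisions, size_t n)
{
    int ones = 0, zeros = 0;

    for (size_t i = 0; i < n; i++) {
        if (decisions[i] == 1)      ones++;
        else if (decisions[i] == 0) zeros++;
        /* -1 (noise) entries are simply ignored */
    }
    if (ones == 0 && zeros == 0)
        return -1;                 /* no usable cycles in this bit */
    return ones > zeros ? 1 : 0;   /* tie resolves to 0 */
}
```

The hard part this sketch glosses over is bit-boundary alignment, i.e. deciding where one 833 µs window ends and the next begins, which is probably where my missing bits come from.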
The results aren't that great, and I suppose I should use some filtering or a different measuring method.
Noise is also a problem: there's a lot of distortion on the signal from interference with other electronics (electric motors) near the cable. There may also be a 10 kHz signal from a nearby wire that degrades the signal quality.
Does anyone have experience with this kind of situation? How can I filter this and get more reliable data, even at a high error rate?
Your advice is much appreciated.
Thanks!