Hi, I am not sure if my question is relevant to this forum, but it's an interesting programming problem that I am facing, and I hope someone here can give useful advice. My microcontroller is receiving a 25-bit stream of data: 0000011100000001000111000. The bits are encoded strangely. A zero appears as follows (the ratio of high-time to low-time is 1:3):

 __
|  |______

while a one appears as follows (the ratio of high-time to low-time is 3:1):

 ______
|      |__

My questions are: what is this encoding scheme, and how can I read it on a microcontroller? I am receiving the stream on a normal pin, not an interrupt pin, so I am looking for a neat algorithm to read the stream faithfully. Any help would be great; thanks and bye.
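What you describe is usually called pulse-width encoding (a bit-level PWM scheme, similar to what some IR remote protocols use): every bit cell starts with a rising edge, and the duty cycle of the cell tells you whether it is a 0 or a 1. Since each cell contains both a high run and a low run, you can classify a bit simply by comparing the two run lengths, which also makes the decoder tolerant of an unknown absolute bit rate. Below is a minimal sketch in C, assuming you poll the pin at a fixed rate (e.g. from a timer tick) into a buffer of 0/1 samples first; the function name `decode_pwm` and the buffer-based interface are my own illustration, not anything standard:

```c
#include <stddef.h>

/* Decode pulse-width-encoded bits from a buffer of pin samples taken at a
 * fixed rate (fast enough that each quarter of a bit cell spans at least a
 * couple of samples). A '0' cell is high for 1/4 and low for 3/4 of the
 * period; a '1' cell is the reverse. Comparing the high run against the
 * following low run therefore classifies each bit without knowing the
 * absolute bit rate. Returns the number of bits decoded.
 *
 * Caveat: if the line idles low after the frame, the low run of a final
 * '1' bit is stretched by the idle time; in a real frame you would bound
 * the low measurement by the expected cell length. */
int decode_pwm(const unsigned char *samples, int n,
               unsigned char *bits, int max_bits)
{
    int i = 0, nbits = 0;
    while (i < n && nbits < max_bits) {
        while (i < n && samples[i] == 0) i++;      /* find rising edge   */
        if (i >= n) break;
        int high = 0, low = 0;
        while (i < n && samples[i] == 1) { high++; i++; }  /* high run   */
        while (i < n && samples[i] == 0) { low++; i++; }   /* low run    */
        bits[nbits++] = (high > low) ? 1 : 0;      /* 3:1 -> 1, 1:3 -> 0 */
    }
    return nbits;
}
```

On the microcontroller itself, the polling loop that fills `samples` should run at a constant rate, ideally from a timer flag rather than a busy loop, so the run-length ratios stay accurate; oversampling each quarter-cell by 2x or more gives comfortable margin against jitter.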
You should rather try the newsgroup comp.arch.embedded.