I’m working on a project where I have a device that is putting out a digital data signal at 1200 bits/s.
The signal is basically a square wave that swings between 0 V and 2.7 V, and I level-shifted it up to 5 V with a hex level shifter.
I’m still waiting for my new Arduino to arrive so I can start programming a decoder. Honestly, I’ve never done this before and I can’t find any sample code that would at least give me a hint on how to tackle this.
I know what protocol it uses, so my task here is to write a program that samples the signal (reads digital 0s and 1s), puts them into some kind of buffer, and then processes the data to convert the bits to ASCII.
I’ve had a few ideas, but I’m not sure they would work; they seem too simple and are probably naive. But I’ve never done this before, so here goes:
Idea 1: Attach an interrupt to a digital pin. Once the interrupt is triggered, start reading the pin every 1/1200 seconds and add each reading to the buffer for decoding.
Idea 2: Also use an interrupt to trigger readings, but time the pulses with the pulseIn() function. Then divide each measured duration by 1/1200 seconds and add the calculated number of bits to the buffer.
Honestly, I think this wouldn’t work due to imprecise timing…
How should I tackle this problem? Has anyone done this before with an Arduino?