Hey guys, hoping someone can give me some guidance on an issue I'm having.
Current setup:

A1 - Arduino 1, set up to transmit a complete frame using a LIN-style protocol
B1 - Arduino 2, set up to receive a frame

Transmission speed is 1 bit per 100 µs (10 kbit/s). I have a potentiometer as an input so I can adjust the timing to fine-tune it.
These units run my own code, written to transmit and receive my custom frame format. It's confirmed working: I have successfully encoded, transmitted, decoded, and displayed frames, so I have no issues with my hardware/software in that sense.
My issue is timing drift. When I receive the frame from the A1 transmitter, I sample every 100 µs after the SOF bit is received and confirmed. What happens next is that if I send a frame with data, the serial console on the B1 receiver shows that my bytes are not right. I adjust the timing with my pot and then all my bytes show correctly, with the timing at, say, 97 µs. Then I'll change the data to something else, send again, and have the same issue as before: none of the bytes show correctly.
I adjust the timing again, maybe down to 95 µs, and it shows them all correct again.
I put it under a scope and can see that I'm sampling in the middle of the bits at first, but the sampling slowly drifts relative to the frame, give or take 50 µs by the end. That puts my sample points on the leading edges of the bits, which sometimes causes a read ahead of the bit it was supposed to be on.
I don't know if it's commonplace in these types of protocols, which have no separate timing/sync line, to build a timing adjustment into the software to match the problem I'm seeing. It always starts at the 3rd byte, so if I made my software subtract 20 µs after the 3rd byte my timing would always be right. That just seems like a hacky way of doing it, but would it be par for the course when it comes to bit-banging a bus?