Hello!
I've written a piece of blocking code to watch a pin and read a char when available. The char is encoded in regular old ASCII over RS232 serial - nothing fancy here. By and large it works - however, half the time I get the wrong char saved. I send an 'e', I get a '%'. I send an 'f', I get an '&' about half the time. The other half of the time, I get the expected 'e' or 'f'.
Those familiar with their ASCII tables will notice that 'e' and '%' are identical save their MSB, which is broadcast last. Ditto for 'f' and '&'. So what's happening is that the last data bit in the serial stream isn't being sampled reliably and comes out undefined.
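A quick sanity check (compiled on the desktop, not the AVR) confirms it's a single-bit difference - both mistaken pairs differ only in bit 6, which is the last data bit of a 7-bit, LSB-first frame:

#include <stdio.h>

int main(void)
{
    // Each mistaken pair differs in exactly one bit: 0x40, i.e. bit 6,
    // the last data bit sent in a 7-bit LSB-first frame.
    printf("'e' ^ '%%' = 0x%02X\n", 'e' ^ '%'); // prints 0x40
    printf("'f' ^ '&' = 0x%02X\n", 'f' ^ '&'); // prints 0x40
    return 0;
}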
int currentChar = 0;
int iInputArray[8] = {0,0,0,0,0,0,0,0};

// Listen for rxPin to go high - that'll be the rise of the start bit
while (!digitalRead(rxPin)) {}
delayMicroseconds(iHalfCycleDuration); // Wait half a cycle (middle of start bit)
delayMicroseconds(iCycleDuration);     // Wait until middle of first data bit

for (int i = 7; i > 0; i--)
{
    iInputArray[i] = !digitalRead(rxPin); // Grab this bit (line is inverted)
    delayMicroseconds(iCycleDuration);    // Wait until middle of next bit
}

for (int i = 0; i < 8; i++) // Show me what I saved, raw!
{
    sendChar(' ');
    sendChar(iInputArray[i] + 48); // 1 or 0, please
}
sendChar(0x0A);

for (int i = 1; i < 8; i++) // Show me what I saved, in character form!
{
    currentChar = currentChar + (iInputArray[i] * round(pow(2, (7 - i))));
}
sendChar(currentChar);
From my experimentation, I find that the bits are received in reverse order (LSB first) and inverted. This is why I use a count-down loop (to fill my bit array in the right order) and invert the digitalRead(). I only grab the first 7 bits (the last of which is the offending bit).
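Incidentally, the pow() in the character-assembly loop could be replaced with a bit shift - it runs after reception, so it shouldn't affect the timing either way, but it avoids the AVR's slow software float. Something like this, with the same variables as above:

// Equivalent to the pow() loop: iInputArray[7] holds the LSB,
// so each bit at index i carries weight 2^(7 - i).
currentChar = 0;
for (int i = 1; i < 8; i++)
{
    currentChar |= iInputArray[i] << (7 - i);
}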
The output is as follows. The MSB is ignored as it's not part of the ASCII definition. The translation between bit array and char is correct.
0 0 1 0 0 1 0 1
%
0 1 1 0 0 1 0 1
e
0 0 1 0 0 1 0 1
%
0 1 1 0 0 1 0 1
e
Now, I'm pretty sure the problem is timing drift. Currently, iCycleDuration = 104, which I derived from the bit pulse width at 9k6 baud (1 / 9600 s ≈ 104 µs). So, I position myself in the middle of the start bit, then wait one full cycle to land in the middle of each data bit in turn. I wouldn't envisage enough drift to put me 52 µs out, which is what it would take to start sampling the wrong bit.
But if I adjust my in-loop delay to
delayMicroseconds(iCycleDuration - 3); // Wait 101 µs instead of 104
then the detected values are always right. No randomness - 'e's and 'f's all the time. This suggests I've massively underestimated the time it takes an ATMega168 to execute the digitalRead() and array assignment in the loop body.
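For what it's worth, one approach I'm considering to take that per-iteration overhead out of the equation entirely is to schedule each sample against an absolute timestamp from the start-bit edge, rather than chaining relative delays. A rough sketch, untested, assuming the same rxPin, inverted line, and 9600 baud (micros() has ~4 µs resolution on a 16 MHz AVR, which should be plenty here):

const unsigned long BIT_US = 104; // one bit period at 9600 baud

char readCharAbsolute()
{
    while (!digitalRead(rxPin)) {}    // wait for the start-bit edge
    unsigned long start = micros();   // anchor all deadlines to this edge

    unsigned char c = 0;
    for (int bit = 0; bit < 7; bit++) // 7 data bits, LSB first
    {
        // Middle of data bit n lies (n + 1.5) bit periods past the edge,
        // so loop overhead can't accumulate from one bit to the next.
        unsigned long deadline = start + (3 * BIT_US) / 2 + (unsigned long)bit * BIT_US;
        while ((long)(micros() - deadline) < 0) {} // overflow-safe busy-wait
        c |= (!digitalRead(rxPin)) << bit;         // line is inverted
    }
    return (char)c;
}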
Sound plausible? Anyone see where I'm going wrong?