The project involves reading data from a Mitutoyo digital indicator into an Arduino Uno. I've found a few people posting about projects similar to mine; this example is comparable. A brief summary of what's going on:
The Arduino pulls a data-request pin high, and the indicator (a height gage with linear encoders) responds with a numerical value. However, the value received is occasionally way off.
I think I know why, but it made me curious about the code, which I took from the aforementioned examples:
digitalWrite(req, HIGH);                  // generate data request
for (i = 0; i < 13; i++) {                // 13 nibbles per frame
  k = 0;
  for (j = 0; j < 4; j++) {               // 4 bits per nibble, LSB first
    while (digitalRead(clk) == LOW)  {}   // hold until clock is high
    while (digitalRead(clk) == HIGH) {}   // hold until clock is low
    bitWrite(k, j, (digitalRead(dat) & 0x1));
  }
  mydata[i] = k;
}
sign = mydata[4];
value_str = String(mydata[5]) + String(mydata[6]) + String(mydata[7]) + String(mydata[8]) + String(mydata[9]) + String(mydata[10]);
decimal = mydata[11];
units = mydata[12];
value_int = value_str.toInt();
if (decimal == 0) dpp = 1.0;
if (decimal == 1) dpp = 10.0;
if (decimal == 2) dpp = 100.0;
if (decimal == 3) dpp = 1000.0;
if (decimal == 4) dpp = 10000.0;
if (decimal == 5) dpp = 100000.0;
value = value_int / dpp;
digitalWrite(req, LOW);                   // drop the data request
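As a sanity check on the conversion step (separate from the timing question), the nibble-to-value decoding can be exercised off the hardware. This is my own host-side restatement of the parsing above, not code from the example; the layout it assumes (nibble 4 = sign, nibbles 5-10 = digits most significant first, nibble 11 = decimal places) is what the parsing code implies:

```cpp
#include <cmath>

// Decode a 13-nibble Digimatic-style frame into a signed reading.
// Assumed layout, matching the parsing code above:
//   nib[4]     sign (8 = negative, 0 = positive)
//   nib[5..10] six decimal digits, most significant first
//   nib[11]    number of decimal places
double decodeFrame(const unsigned char nib[13]) {
    long digits = 0;
    for (int i = 5; i <= 10; i++)            // assemble the six digits
        digits = digits * 10 + nib[i];
    double value = digits / std::pow(10.0, nib[11]);
    return (nib[4] == 8) ? -value : value;   // apply the sign nibble
}
```

Feeding it digits 0,1,2,3,4,5 with three decimal places gives 12.345; if live readings disagree on a known input, the problem is in the bit capture, not the arithmetic.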
This picture might be helpful:
Shouldn't there be an interrupt set up on the clock pin? It looks to me like the clock pin signals when the data starts, but what is keeping the two devices in sync? Is it just coincidence that the two devices run at similar enough speeds that the 13 nibbles of data are polled at the right times? Could this be why I occasionally get bad data? Or is the compiler smart enough to know what I want?
I can later share my thoughts on why I think I get bad data, which may be unrelated to the questions posed above, but does anyone have any comments?
