I’ve been implementing a speedometer for an old vehicle and I’ve noticed a small (around 1-2%) inaccuracy vs. a reference signal.
The speedometer generates 16,000 pulses per mile, so Speed (in mph) = #_of_pulses / Time * Scaling_Constant, where the number of pulses is counted by an interrupt over a defined period of time.
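For reference, this is how the scaling constant works out when Time is measured in milliseconds (the function name is just mine for illustration, not necessarily what's in the attached sketch):

```cpp
// 16,000 pulses per mile, Time in milliseconds, 3,600,000 ms per hour:
//   Speed [mph] = (pulses / 16000) / (Time / 3600000)
//               = pulses * 225 / Time
// so with a fixed 100 ms window, Speed = pulses * 2.25.
float speedMph(unsigned long pulses, unsigned long timeMs) {
  return (pulses * 225.0) / timeMs;
}
```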
My code counts the number of interrupts over a delay, in this case delay(100). It calculates Time by subtracting millis() before the delay from millis() after the delay. Time should be exactly 100 ms each period (assuming the millis() calls themselves take a negligible amount of time to execute), and indeed when I print Time in the serial monitor I get exactly 100 most of the time, with the occasional 99 and 101 mixed in.
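The full sketch is attached below; in simplified form (pin number, interrupt edge, and variable names here are just placeholders), the loop does roughly this:

```cpp
volatile unsigned long pulseCount = 0;

void countPulse() {
  pulseCount++;                       // one pulse from the speedometer sender
}

void setup() {
  Serial.begin(9600);
  pinMode(2, INPUT);                  // pin and edge are placeholders
  attachInterrupt(digitalPinToInterrupt(2), countPulse, RISING);
}

void loop() {
  pulseCount = 0;                            // start a fresh counting window
  unsigned long before = millis();
  delay(100);                                // nominal 100 ms window
  unsigned long Time = millis() - before;    // usually 100, sometimes 99 or 101
  // 3,600,000 ms/hour / 16,000 pulses/mile = 225
  // (note: this simplified version reads the 4-byte counter without guarding it from the ISR)
  float speed = (pulseCount * 225.0) / Time;
  Serial.println(speed);
}
```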
On the other hand, even though I’m artificially feeding a signal corresponding to a defined speed (for example, 80 mph corresponds to 80 * 16000 / 3600 ≈ 356 pulses per second), my Arduino Uno outputs a measured speed of 77-78 mph.
The error seems to grow non-linearly as the speed increases (i.e., at low speeds it is less than 1% and at high speeds it’s closer to 2%). The small inaccuracy does not go away if I use a constant 100 ms for Time rather than calculating the time period on each loop.
At this point my hypothesis is that I’m either (a) “missing” pulses that fall in between cycles (but this would not explain why the inaccuracy seems to grow non-linearly with speed; it should in fact decrease with speed), or (b) the delay or the elapsed-time calculation is somehow affected by the interrupts.
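To rule out (a), I’ve been considering snapshotting and clearing the counter with interrupts briefly disabled, so a pulse arriving between the read and the reset can’t be dropped. This is just an idea using the pulseCount variable from the simplified loop above, not something the attached sketch already does:

```cpp
unsigned long takePulseSnapshot() {
  noInterrupts();                     // block the pulse ISR for a few cycles
  unsigned long count = pulseCount;   // copy the 4-byte counter atomically
  pulseCount = 0;                     // reset it before any new pulse can sneak in
  interrupts();
  return count;
}
```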
I read somewhere that millis() depends on interrupts and is therefore not too reliable when multiple interrupts are in use. I wonder if delay(time) also suffers from the same issue. If so, would using micros() (which I also read somewhere does NOT depend on interrupts) help?
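If micros() really is safer here, I imagine the loop would change to something like the following, timing the actual window with micros() instead of trusting delay(100) to be exactly 100 ms (again just an untested sketch, reusing the pulseCount variable from above):

```cpp
void loop() {
  pulseCount = 0;                               // fresh counting window
  unsigned long start = micros();
  delay(100);                                   // nominal 100 ms window
  unsigned long elapsedUs = micros() - start;   // actual window length in microseconds
  noInterrupts();
  unsigned long pulses = pulseCount;            // atomic copy of the counter
  interrupts();
  // 3,600,000,000 us per hour / 16,000 pulses per mile = 225,000
  float speed = (pulses * 225000.0) / elapsedUs;
  Serial.println(speed);
}
```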
Anyone seen a similar issue?
Code is below…
SpeedometerNoSend.ino (988 Bytes)