Hello all,
please don't be mad at me if something similar has been asked multiple times already; I couldn't find an existing thread that exactly matches my problem.
Here's the setup:
- Arduino Uno, connected to an external device
- the Uno sets a pin (let's say pin 8) to HIGH every 4-5 seconds
- it awaits a response on its interrupt line (pin 2) and measures the time between setting pin 8 HIGH and the response (then sets the output back to LOW); see the sketch after this list
- repeat
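Roughly, this is what I mean (simplified sketch; the ISR details and the Serial output are just placeholders, since the external device isn't built yet):

const byte TRIGGER_PIN = 8;    // output pulse to the external device
const byte RESPONSE_PIN = 2;   // INT0 on the Uno

volatile unsigned long tResponse;
volatile bool gotResponse = false;

void onResponse()
{
  tResponse = micros();        // timestamp the response edge in the ISR
  gotResponse = true;
}

void setup()
{
  Serial.begin(115200);
  pinMode(TRIGGER_PIN, OUTPUT);
  pinMode(RESPONSE_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(RESPONSE_PIN), onResponse, RISING);
}

void loop()
{
  gotResponse = false;
  unsigned long t0 = micros();
  digitalWrite(TRIGGER_PIN, HIGH);
  while (!gotResponse) { }            // wait for the response interrupt
  digitalWrite(TRIGGER_PIN, LOW);
  Serial.println(tResponse - t0);     // elapsed time in microseconds
  delay(4500);                        // trigger roughly every 4-5 seconds
}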
Here is what bothers me:
I need that measurement to be as precise as possible, ideally accurate to single-digit microseconds. Do you think that is feasible with an Arduino, or should I get better hardware? Besides, since this is part of a corporate project, would you say an Arduino (as a hobbyist device, no offense meant) is suitable?
In a simple test like:
void setup()
{
  Serial.begin(115200);
}

void loop()
{
  unsigned long t0 = micros();
  delayMicroseconds(400);
  Serial.println(micros() - t0);  // report the measured duration
}
I'm measuring values with a varying bias/offset of 20-36 microseconds. The same happens, of course, in the actual code described above.
Now to the why...
My first thought was that the 'software solution' (micros/delayMicroseconds) just takes too many cycles. AFAIK the Uno has three counter registers (one 16-bit, two 8-bit) and delay/micros measures in 32 bits, so there is obviously some work going on behind the scenes to provide those 32-bit values. On the other hand, the Uno is clocked at 16 MHz, which should not amount to such a huge error.
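If I read the Arduino core's wiring.c correctly, micros() reconstructs a 32-bit value from Timer0's overflow count plus the current 8-bit counter, roughly like this (paraphrased and simplified for a 16 MHz Uno; details may differ between core versions):

extern volatile unsigned long timer0_overflow_count; // maintained by the Timer0 overflow ISR

unsigned long micros_paraphrased()
{
  uint8_t oldSREG = SREG;
  cli();                               // atomic snapshot of count + counter
  unsigned long m = timer0_overflow_count;
  uint8_t t = TCNT0;                   // current Timer0 value (0..255)
  if ((TIFR0 & _BV(TOV0)) && (t < 255))
    m++;                               // overflow happened but ISR hasn't run yet
  SREG = oldSREG;
  return ((m << 8) + t) * 4;           // Timer0 ticks at clk/64 -> 4 us per tick at 16 MHz
}

So the 4 µs granularity I mention below comes directly from Timer0's clk/64 prescaler.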
Anyway, would it be a feasible approach to implement the timing manually, i.e. using the 16-bit register directly (with a suitable prescaler etc.)? The shorter maximum period is OK, and I don't even need to care about the overflow since I know I only have to measure every 4-5 seconds; a sketch of what I mean follows. Or would it lead to the same error, because of my next suspect?
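For illustration, this is roughly what I have in mind (untested sketch, assuming an ATmega328P-based Uno; the /8 prescaler is just my pick):

void timer1_start()
{
  TCCR1A = 0;                  // normal mode, no output compare / PWM
  TCCR1B = _BV(CS11);          // prescaler /8 -> one tick = 0.5 us at 16 MHz
  TCNT1  = 0;                  // reset the counter when pin 8 goes HIGH
}

unsigned long timer1_elapsed_us()
{
  unsigned int ticks = TCNT1;  // read the 16-bit counter
  return ticks / 2;            // 0.5 us per tick; wraps after ~32.8 ms
}

I'd call timer1_start() when setting pin 8 HIGH and timer1_elapsed_us() in the pin 2 ISR; at 0.5 µs per tick the 16-bit counter only wraps after ~32.8 ms, comfortably above my 10 ms maximum.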
My next guess is the resonator/oscillator that clocks the Arduino (a ceramic resonator, AFAIK). These parts have a typical tolerance of around 0.5%. Is it possible that this small error accumulates enough to explain what I'm seeing?
Roughly doing the math: at 16 MHz one clock cycle is 0.0625 µs, i.e. 62.5 ns; assuming a +0.5% offset I'm looking at ~62.8 ns instead of 62.5 ns.
The time I expect to measure varies between 500 µs and 10 ms (this could change in both directions). In the 500 µs case, again roughly doing the math, I get (500 × 1000)/62.8 vs (500 × 1000)/62.5, which is 7961 vs 8000 'ticks', i.e. an error of about 39 ticks or ~2.4 µs over a 500 µs interval.
Depending on how micros/delayMicroseconds is implemented, could this account for my observed error? And what about getting a better clock source (or a TDC on the extreme end), like a simple external quartz crystal (which, as far as I know, is more precise than the ceramic resonator)? Or just another board with a higher clock speed and maybe a 16- or 32-bit controller (and thus 32-bit counters)?
Third thought: what if just the delay function introduces this offset, while micros itself measures correctly (although I know it only measures in 4 µs increments)? At the moment I'm simulating my external device (which isn't built yet) with a second Arduino that does the critical delay call. A quick way to test this is sketched below.
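To separate the two suspects, I could busy-wait on micros() itself instead of calling delayMicroseconds() and compare the readings (untested):

void setup()
{
  Serial.begin(115200);
}

void loop()
{
  unsigned long t0 = micros();
  while (micros() - t0 < 400) { }  // busy-wait on micros() instead of delayMicroseconds()
  Serial.println(micros() - t0);   // expect ~400-404 us given the 4 us granularity
}

If this prints values close to 400 while the delayMicroseconds version prints 420-436, the offset would come from delayMicroseconds (or the code around it) rather than from the time base.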
TIA for your help