
Topic: Bug: millis() delay off by +1 ms


I like BenF's "compromise." It depends on how much we trust micros().


Hi, err... since this is in the hardware part of this forum there is another potential solution...

Instead of trying to make time sync with the system clock, make the system clock sync with time. Timing crystals with a frequency of 16.384 MHz are commonly available. The timers would overflow on exact 1 ms boundaries. The PWM frequency would be exactly 500 Hz. The USART baud-rate prescalers (the UBRRn registers) would need re-calibrating. Nice clean interrupt code.

Something to think about.


I have done some testing of BenF's code solution and I think it is the best.  The use of uint16_t types saves memory: overall the code is 20 bytes smaller than the existing delay() function.  In addition, the accuracy is impressive: it displays an absolute accuracy of 6 microseconds on average, and the fractional error at the smallest reasonable delay, 1 millisecond, is just 0.6%.  I also confirmed that it handles rollovers in the micros() counter correctly, so there are no glitches in delay() at the rollover that occurs roughly every 70 minutes.  Adopting this code also means not having to update the delay() documentation.

In short, it's smaller, better, and requires less work from developers.  Is someone with the authority to make this change monitoring this bug forum topic?  Could you check in BenF's proposed code?

The only change I would make is in the calling convention for delay(): BenF used 'uint32_t millis_delay', and I think it would be preferable to keep the existing 'unsigned long ms', since that matches the function's prototype in wiring.h.


I created http://code.google.com/p/arduino/issues/detail?id=237
and attached BenF's code...

We'll see...


Why don't you talk to Hans-Juergen Heinrichs about using his delay routines and be done with it:

Unlike the standard avr-libc <util/delay.h> routines, these give you very accurate timings with good resolution even in the sub-microsecond range.

--- bill
