How accurate is millis()?

azarur:
Has anyone encountered issues with the accuracy of millis() when using interrupts?

millis() drifts a bit, by design.

There is an interrupt (the Timer 0 overflow interrupt) every 1024 µs which updates the variable used by millis(). Therefore millis() will be out by 24 µs after the first interrupt, 48 µs after the second interrupt, and so on. The code eventually compensates, so this inaccuracy is not cumulative over time.

In other words, millis() runs slow: it should advance every 1000 µs but actually advances every 1024 µs. However, the overflow interrupt keeps track of the accumulated error, and once that error reaches a full millisecond it adds one extra millisecond and subtracts that amount from the running error. This happens approximately every 42 overflows (interrupts), since 1000 / 24 ≈ 41.7. At that point, of course, the count returned by millis() will "jump" as the extra compensating millisecond is added in.

If you are timing small intervals, micros() will be much more accurate, as it reads the hardware timer directly and does not suffer from this creeping error. However, it wraps around after approximately 71 minutes. Also, micros() has a resolution of 4 µs (not 1 µs) because of the way the timer is configured.
