Simple question, I guess. Does millis() run off the clock, and is therefore totally accurate from a milliseconds perspective, or does it depend on the program load? For example, suppose I had a loop() that took only 5 milliseconds to complete, so I'd expect millis() to increment by 5 each time through. If I then added a whole ton of code that increased the real time for one pass of loop() to 100 milliseconds, would millis() show 100 per iteration, or would the extra load slow down the computation of millis() in some way?
The millis() function counts milliseconds in the background. As long as interrupts are enabled it will keep counting; other processing will not affect the count, not even delay(). However, millis() is not an accurate real-time clock. The amount of inaccuracy depends mainly on the board and on environmental factors: processors clocked by a crystal will be more accurate than processors clocked by a resonator, for instance.
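A quick way to see this for yourself (a minimal sketch of my own, not from any library docs): print the difference between successive millis() readings at the top of loop(). However much work you put inside loop(), the printed delta simply grows to match it, because the counter is driven by a timer interrupt rather than by loop() itself.

// Demo: millis() keeps counting while loop() does work.
// The printed delta tracks the workload, showing the counter
// runs in the timer interrupt, not in loop().

unsigned long lastStamp = 0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long now = millis();
  Serial.println(now - lastStamp);  // how long the previous pass took
  lastStamp = now;

  delay(100);  // stand-in for "a whole ton of code" taking ~100 ms
}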
On the Uno/Mega and similar Arduinos millis() is based on a hardware timer, but it is not perfect: it sometimes skips a value. The underlying problem is that the relevant timer overflows every 1024µs, not every 1000µs.
So if you need timing more accurate than +/-1ms, use micros(), not millis().
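For what it's worth, the usual pattern for sub-millisecond measurements (just a sketch; doSomethingShort() is a made-up placeholder for whatever you are timing) is to take two micros() readings and subtract them as unsigned longs, which stays correct even across the counter rollover:

unsigned long start = micros();
doSomethingShort();                        // placeholder for the code being timed
unsigned long elapsed = micros() - start;  // unsigned subtraction survives rollover
// On a 16 MHz Uno, micros() has a resolution of 4 µs.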
And the timers are only as accurate as the quartz crystal or resonator on the board.
Unos used to use a quartz crystal (reasonable timekeeping, about ±30 ppm); current ones use a ceramic resonator (poor timekeeping, ±0.6% or thereabouts).
You do know you can look at the source code for the whole Arduino runtime? It's open source; simply grep for millis() and micros() for the grisly details.
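For reference, the AVR core's wiring.c does roughly the following in its Timer0 overflow handler (a simplified paraphrase from memory, not the exact code, and meant as an illustration rather than something to paste into a sketch). With a /64 prescaler at 16 MHz, Timer0 overflows every 64 * 256 / 16 MHz = 1024 µs, and the handler carries the extra 24 µs in a fractional accumulator:

volatile unsigned long timer0_millis = 0;
static unsigned char timer0_fract = 0;    // leftover microseconds, in 8 µs units

ISR(TIMER0_OVF_vect) {
  timer0_millis += 1;         // every overflow is at least 1 ms (1024 µs)
  timer0_fract  += 3;         // plus 24 µs of remainder (24 / 8 = 3)
  if (timer0_fract >= 125) {  // 125 * 8 µs = a full extra millisecond accumulated
    timer0_fract -= 125;
    timer0_millis += 1;       // ...so the count steps by 2 and skips a value
  }
}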
MarkT:
On the Uno/Mega and similar Arduinos millis() is based on a hardware timer, but it is not perfect: it sometimes skips a value. The underlying problem is that the relevant timer overflows every 1024µs, not every 1000µs.
@mmitch, this does not mean, however, that millis() runs slow. The routine adjusts for the difference, and that adjustment is where the quirk comes from: when it is applied, the count briefly jumps by an extra 1 ms, skipping a value.
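If you want to watch the adjustment happen, something like this (a rough sketch, assuming nothing else in the sketch is blocking interrupts) flags the moments where millis() steps by 2 between consecutive reads:

unsigned long last = 0;

void setup() {
  Serial.begin(115200);
  last = millis();
}

void loop() {
  unsigned long now = millis();
  if (now - last > 1) {   // a step of 2 ms or more: the skipped value
    Serial.print("jump at ");
    Serial.println(now);
  }
  last = now;
}

You should see a jump roughly every 42 to 43 ms, each time the 24 µs remainders add up to a full millisecond.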