Let's get this nice and clear:
A common idiom:
unsigned long prev_t = 0L ;

void loop ()
{
  unsigned long now = millis () ;
  if (now - prev_t >= DELAY)
  {
    prev_t = now ;            // restart the interval from "now"
    // .. do stuff ..
  }
  // .. do other stuff ..
}
Note the two flaws here:
- If millis() skips values (which it does on many Arduino boards), every skip
  guarantees a 1 ms slip that carries forward into the future.
- If the "stuff" or "other stuff" happens to take longer than DELAY ms, the
  whole sketch slips behind real time, never to recover (see the sketch
  below). If you were meant to be providing time to a clock display, this
  would be an issue.
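A minimal illustration of that second flaw, assuming a 1-second tick driving a
seconds counter and a 30 ms stand-in task (both values are just for the
example):

const unsigned long DELAY = 1000UL ;   // 1-second tick (illustrative value)

unsigned long prev_t = 0L ;
unsigned long seconds = 0L ;

void setup ()
{
  Serial.begin (115200) ;
}

void loop ()
{
  unsigned long now = millis () ;
  if (now - prev_t >= DELAY)
  {
    prev_t = now ;             // any lateness is thrown away here, every tick
    seconds ++ ;
    Serial.println (seconds) ; // steadily falls behind millis()/1000
  }
  delay (30) ;                 // stand-in for "other stuff" taking time
}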
This idiom cures both:
unsigned long prev_t = 0L ;

void loop ()
{
  unsigned long now = millis () ;
  if (now - prev_t >= DELAY)
  {
    prev_t += DELAY ;         // advance by exactly DELAY, so no error accumulates
    // .. do stuff ..
  }
  // .. do other stuff ..
}
The "do stuff" will keep running until the sketch catches up with the absolute
time, should something delay the sketch. Missed millis() values merely have
a 1ms temporary jitter effect (unavoidable if only calling millis()).
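For instance, a hedged sketch of that catch-up behaviour, using a hypothetical
one-second counter and a one-off 3.5-second stall standing in for a slow piece
of "other stuff":

const unsigned long DELAY = 1000UL ;   // illustrative value

unsigned long prev_t = 0L ;
unsigned long seconds = 0L ;

void setup ()
{
  Serial.begin (115200) ;
}

void loop ()
{
  unsigned long now = millis () ;
  if (now - prev_t >= DELAY)
  {
    prev_t += DELAY ;          // step the schedule, don't reset it
    seconds ++ ;
    Serial.println (seconds) ;
  }
  if (seconds == 5)
    delay (3500) ;             // one-off stall; seconds is 6 by the next pass
}

After the stall the counts 6, 7 and 8 print in quick succession and the counter
is back in step with millis().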
When you want drift-free timing via micros(), this idiom is invaluable, as the
individual microseconds then matter.
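A sketch of the micros() version, assuming a 50 ms (50000 microsecond) period
(PERIOD_US is just an illustrative name):

const unsigned long PERIOD_US = 50000UL ;   // 50 ms expressed in microseconds

unsigned long prev_t = 0L ;

void loop ()
{
  unsigned long now = micros () ;
  if (now - prev_t >= PERIOD_US)
  {
    prev_t += PERIOD_US ;     // step the schedule forward, never reset it to "now"
    // .. do stuff on a drift-free 50 ms schedule ..
  }
  // .. do other stuff ..
}

The unsigned subtraction handles the micros() wrap-around (roughly every 70
minutes) just as it does for millis().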