In an example that uses millis() for timing, I saw the following line used to set the delay between samples:
delay((LOG_INTERVAL - 1) - (millis() % LOG_INTERVAL));
I understand setting the delay to LOG_INTERVAL, but I'm not sure what the rest of this expression is doing. Why subtract 1, and why subtract millis() modulo LOG_INTERVAL? Anyone know?
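For context, here's roughly how the line sat in the sketch's loop(). I'm reconstructing from memory, so the LOG_INTERVAL value and the readSensors()/logData() names are placeholders for whatever the example actually used:

#define LOG_INTERVAL 1000  // ms between samples (the value I recall from the example)

void setup() {
  // sensor/SD setup would go here in the real sketch
}

void loop() {
  // sleep until just before the next multiple of LOG_INTERVAL on the millis() clock
  delay((LOG_INTERVAL - 1) - (millis() % LOG_INTERVAL));
  readSensors();  // placeholder for the example's sampling code
  logData();      // placeholder for the example's logging code
}

void readSensors() { /* placeholder */ }
void logData()     { /* placeholder */ }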
Thanks in advance, Dave
BTW, I hate using delay(), preferring to trigger functions based on elapsed time instead. But sometimes delay() can be useful.
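For what it's worth, the elapsed-time pattern I mean is something like this (a minimal sketch; doSample() is just a stand-in for whatever work needs doing):

const unsigned long SAMPLE_INTERVAL = 1000;  // ms between samples
unsigned long lastSample = 0;

void setup() {
}

void loop() {
  unsigned long now = millis();
  // unsigned subtraction, so this stays correct even when millis() rolls over
  if (now - lastSample >= SAMPLE_INTERVAL) {
    lastSample = now;  // or lastSample += SAMPLE_INTERVAL to avoid drift
    doSample();
  }
  // other non-blocking work can run here while we wait
}

void doSample() { /* placeholder for the actual work */ }

The rollover safety and the ability to do other work between samples are the main reasons I prefer this over delay().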