I didn't want to hijack a thread since this is a bit off-topic.
I am asking because I do not quite understand why you would not always add the time interval.
It's not that I disagree, more that I'd like a more in-depth explanation, @Koepel
If that's your question, consider time_interval = 1000 as an example. Your code comes round to where it checks, and it so happens that millis() is 1005, so greater than 1000. If you just do
previous_time = millis();
then previous_time is now 1005, so it has gained 5 ms. What you really want is for previous_time to equal 1000, hence += time_interval.
Does that make sense and have I answered what you were asking?
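To put numbers on it, here is a minimal sketch of that comparison (the variable names follow the post above; the rest is just illustrative scaffolding):

unsigned long previous_time = 0;
const unsigned long time_interval = 1000;

void setup() {}

void loop() {
  if (millis() - previous_time >= time_interval) {
    // Suppose millis() happened to return 1005 on this pass.
    // previous_time = millis();       // would set previous_time to 1005: 5 ms gained, and it accumulates
    previous_time += time_interval;    // sets previous_time to 1000: the 5 ms is not lost
    // ... do the timed work here ...
  }
}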
The delay spoken of is just the fact that if your other code overruns the time slice, you have a choice:
Increment by the interval, and all your "missed" opportunities are addressed rapidly.
or
Update to now, and let the water under the bridge not matter, just get back into the rhythm.
Which to use? It, like all things in code, depends.
I like to let the missed opportunities get a chance. Like if every loop dropped a dollar into my hat.
In other cases, doing N calls to make up for the lost time might be a bad choice.
It only matters if you overrun your slice. Also, it makes the best sense (for me, anyway) to get millis() once per loop right away, and use now (or, if you are full of it today, currentMillis) in all the places where you had millis() calls.
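For example, a sketch (made-up names and intervals, not anyone's actual code) that reads millis() into now once at the top of loop() and mixes the two update styles:

const unsigned long BLINK_INTERVAL = 500;
const unsigned long PRINT_INTERVAL = 2000;

unsigned long blinkTimestamp = 0;
unsigned long printTimestamp = 0;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  unsigned long now = millis();              // read the clock once per pass through loop()

  if (now - blinkTimestamp >= BLINK_INTERVAL) {
    blinkTimestamp += BLINK_INTERVAL;        // catch up any missed slices, one interval at a time
    digitalWrite(LED_BUILTIN, !digitalRead(LED_BUILTIN));
  }

  if (now - printTimestamp >= PRINT_INTERVAL) {
    printTimestamp = now;                    // or just re-sync and let the missed ones go
    Serial.println(F("still alive"));
  }
}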
If you have a perpetually fixed periodic interval, the most basic of which would be a square wave, you would of course use timestamp += interval, provided you have synced with millis() or micros() at some point (if not at the start). One clever way to do this (credit to @cedarlakeinstruments) is to initialize your timestamp to millis() in loop(). Yes, this works per stmt.dcl, and it has been tested with Arduino and GCC:
void loop() {
  static unsigned long timestamp = millis();  // initialized exactly once, the first time loop() runs
  // ...
}
If the value returned by millis() or micros() has moved several intervals past your timestamp, there will be a stutter in your square wave as your periodic function hurries to catch up.
If your application is NOT a perpetually fixed periodic interval, such as a One Shot, then timestamp = millis() is the obvious choice.
Any device that is a combination of the two, for example starting and stopping a square wave, will require a sync (timestamp = millis()) at start and an interval increment (timestamp += interval) while running.
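A sketch of that combination might look like this (pin, half period and function names are made up for the example): sync with timestamp = millis() when the wave is started, then timestamp += interval for every edge while it runs.

const byte WAVE_PIN = 8;                 // made-up output pin
const unsigned long HALF_PERIOD = 250;   // made-up half period in ms

bool waveRunning = false;
unsigned long timestamp;

void startWave() {
  timestamp = millis();                  // sync with the clock at the moment of starting
  waveRunning = true;
}

void stopWave() {
  waveRunning = false;
}

void setup() {
  pinMode(WAVE_PIN, OUTPUT);
  startWave();
}

void loop() {
  if (waveRunning && millis() - timestamp >= HALF_PERIOD) {
    timestamp += HALF_PERIOD;            // fixed increment keeps the period exact while running
    digitalWrite(WAVE_PIN, !digitalRead(WAVE_PIN));   // toggle the output
  }
}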
That said, as long as one makes a habit of non-blocking code (blocking code would break either method for perpetually fixed periodic applications), using timestamp = millis() for everything simplifies things for the novice. The drift is also mitigated by the fact that 16000 clock cycles happen for each increment of millis(), which is plenty of time for most code. And if it were to drift a millisecond or two, a lot of uses, save precise timekeeping etc., will not be affected. For example, if you want to turn on your porch light from 6 to 11 every day, a couple of milliseconds early or late won't matter.
I was thinking about beginners who combine delay() or long execution times (the SD library, Neopixels and more) with a bunch of millis-timers that are only a few milliseconds. The result will be horrific.
A delay in the code that just pushes everything further along in time is so much more relaxing.
For a basic Arduino board from the AVR family, there is a correction in millis() to compensate for the fact that its interrupt does not run exactly every millisecond. Therefore a Leonardo board with a crystal can run a clock with exactly the same accuracy as the crystal when using "previous_time += time_interval;". So yes, as soon as I hear the word "clock", the more dangerous version should be used.
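For illustration, a bare-bones software clock along those lines: no RTC, just millis() and a fixed one-second increment (the timer variable names match the earlier posts; the rest is made up):

unsigned long previous_time = 0;
const unsigned long time_interval = 1000;   // one second per tick

byte seconds = 0, minutes = 0, hours = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (millis() - previous_time >= time_interval) {
    previous_time += time_interval;         // keeps the long-term rate locked to millis(), and so to the crystal
    if (++seconds >= 60) { seconds = 0; minutes++; }
    if (minutes >= 60)   { minutes = 0; hours++;   }
    if (hours >= 24)     { hours = 0;              }
    Serial.print(hours);   Serial.print(':');
    Serial.print(minutes); Serial.print(':');
    Serial.println(seconds);
  }
}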
I doubt you need faster, just something that is reliable, conventional and readable; something you can routinely do without much thought (of course there are exceptions).