Millis timing incrementing best practice

I didn't want to hijack a thread since this is a bit off-topic.
I am asking because I do not quite understand why one would not always add the time interval.
It's not that I disagree, more that I'd like a more in-depth explanation, @Koepel.

Why would you combine delays with millis timing?

1 Like

You wouldn't, ideally.

Are you really asking why use

previous_time += time_interval;

?
If that's your question, consider time_interval = 1000 as an example. Your code comes round to where it checks, and it so happens that millis() is 1005, so greater than 1000. If you just do

previous_time = millis();

previous_time is now 1005, so it's gained 5 ms. What you really want is for previous_time to equal 1000, hence += time_interval.

Does that make sense and have I answered what you were asking?

2 Likes

I don't think that's what was meant.

The delay spoken of is just the fact that if your other code overruns the time slice, you have a choice:

Increment by the interval, and all your "missed" opportunities are addressed rapidly.

or

Update to now, and let the water under the bridge not matter, just get back into the rhythm.

Which to use? Like all things in code, it depends.

I like to let the missed opportunities get a chance. Like if every loop dropped a dollar into my hat.

In other cases, doing N calls to make up for the lost time might be a bad choice.

It only matters if you overrun your slice. Also, it makes the best sense, for me anyway, to get millis() once per loop right away, and use now (or, if you are full of it today, currentMillis) in all places where you had millis() calls.

a7

2 Likes

Be aware of potential problems as discussed here:

1 Like

I think you're asking what the difference is between

    if ((millis() - time) > period) {
        time = millis();
        ...
    }

and

    if (millis() > time) {
        time += period;
        ...
    }

In the latter, consider what happens when millis() wraps back to zero.

In the former case, since both millis() and time are unsigned, the math results in a correct delta even though millis() is a smaller value than time.

1 Like

If you have a perpetually fixed periodic interval, the most basic of which would be a square wave, you would of course use timestamp += interval, provided you have synced with millis() or micros() at some point (if not at the start). One clever way to do this (credit to @cedarlakeinstruments) is to initialize your timestamp to millis() in loop(). Yes, this works, per [stmt.dcl], and has been tested with Arduino and GCC:

void loop() {
    static unsigned long timestamp = millis();
    ...
}

If the return value from millis() or micros() has elapsed several intervals from your timestamp, there will be a stutter in your square wave as your periodic function hurries to catch up.

If your application is NOT a perpetually fixed periodic interval, such as a One Shot, then timestamp = millis() is the obvious choice.

Any device that is a combination of the two, for example starting and stopping a square wave, will require a sync (timestamp = millis()) at start and an interval increment (timestamp += interval) while running.

That said, as long as one makes a habit of non-blocking code (blocking code would break either method for perpetually fixed periodic applications), using timestamp = millis() for everything simplifies things for the novice. The drift is also mitigated by the fact that 16000 clock cycles happen during each increment of millis(), which is plenty of time for most code. And if it were to drift a millisecond or two, a lot of uses, save precise timekeeping etc., will not be affected. For example, if you want to turn on your porch light from 6 to 11 every day, a couple of milliseconds early or late won't matter.

2 Likes

I was thinking about beginners who combine delay() or long execution times (the SD library, Neopixels and more) with a bunch of millis-timers that are only a few milliseconds. The result will be horrific.
A delay caused by code that pushes everything further in time is so much more relaxing.

For a basic Arduino board from the AVR family, there is a correction for millis() to compensate for the fact that its interrupt does not run exactly every millisecond. Therefore a Leonardo board with a crystal can run a clock with exactly the same accuracy as the crystal using "previous_time += time_interval;". So yes, as soon as I hear the word "clock", the more dangerous version should be used.

1 Like

No, I was asking if there was a time not to use it, per the quote from @Koepel

Sure makes more sense now. Pick between a steady rhythm or a do everything flow.

Thank you. I will read that thread, looks interesting.

No, but I see now that I failed to put enough context in my question.

That is now perfectly clear, thank you.

Thanks everybody.

1 Like

From the thread provided by @LarryD, this appears to be one of the safest methods, unless there's a need for the potentially lost cycles.

Thanks @Coding_Badly for the original comment

1 Like

Yes, you win 1st prize !
:slight_smile:

1 Like

Wouldn't it just be easier to initialize millisOne to millis() at the time you want to start the timer?

Would the mechanism for implementing that be any faster than this?
I see this one turning into an if and a jump command for the most part.

I doubt you need faster, just something that is reliable, conventional and readable; something you can routinely do without much thought (of course there are exceptions).

consider

// enable flashing LED with button

const byte LedPin = 13;

const byte ButPin = A1;
byte butState;

unsigned long msecLst;

enum { Off = HIGH, On = LOW };

void loop()
{
    unsigned long msec = millis();

    if (msecLst && (msec - msecLst) > 500)  {   // msecLst == 0 means not flashing
        msecLst = msec;
        digitalWrite (LedPin, ! digitalRead (LedPin));
    }

    byte but = digitalRead (ButPin);
    if (butState != but)  {
        butState = but;
        delay (10);         // debounce

        if (LOW == but)  {              // button pressed
            if (msecLst)  {             // nonzero msecLst == currently flashing
                msecLst = 0;            // stop flashing
                digitalWrite (LedPin, Off);
            }
            else  {
                msecLst = 0 != msec ? msec : 1;   // avoid 0, which means "off"
                digitalWrite (LedPin, On);
            }
        }
    }
}

void setup()
{
    Serial.begin(115200);

    pinMode (LedPin, OUTPUT);
    digitalWrite (LedPin, Off);

    pinMode (ButPin, INPUT_PULLUP);
    butState = digitalRead (ButPin);
}

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.