MsTimer2: f overflow running time

Hi, I'm a newbie and am working on my first project, which is a clock. I was planning on using MsTimer2 with a one minute period. Incrementing the time should take about 200ms.

My question is: if my overflow function f has a 200 ms running time, will the MsTimer2 interrupt get called 1 minute after the beginning of the previous call, or 1 minute after the end of the previous call (i.e. 1 minute and 200 ms after the beginning of the previous call)? In other words, will my clock run late by 200 ms times the number of minutes since it was started, or will it run on time? Does it make a difference whether my overflow function contains a delay() or not?
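For context, the setup being described is presumably something like the following minimal sketch (assuming the stock MsTimer2 API; the tick() callback name and its contents are placeholders):

```cpp
#include <MsTimer2.h>

void tick() {
  // advance the physical clock here (fire solenoids, step reels);
  // this is the part that takes ~200 ms
}

void setup() {
  MsTimer2::set(60000, tick);  // request tick() every 60,000 ms
  MsTimer2::start();
}

void loop() {
}
```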

Sorry for the basic question or if this was answered before, I looked a little through the forum and couldn't find an answer.

Thanks! Phil

Incrementing the time should take about 200ms.

Why, are you doing it manually? It certainly doesn't take 200 ms to perform an update.
You can easily keep track of time down to 10 ms, HH:MM:SS:TH.

unsigned long currentMicros;
unsigned long previousMicros;
unsigned long elapsedTime;

byte hundredths;
byte tenths;
byte secondsOnes;
byte oldSecondsOnes;
byte secondsTens;
byte minutesOnes;
byte minutesTens;


void setup() {

  Serial.begin(115200); // make serial monitor match
  Serial.println("Setup Done");
}

void loop() {

  currentMicros = micros();

  // how long's it been?
  elapsedTime = currentMicros - previousMicros;
  if (elapsedTime >= 10000UL) {  // 0.01 second passed? Update the timers
    previousMicros = previousMicros + 10000UL;
    hundredths = hundredths + 1;
    if (hundredths == 10) {
      hundredths = 0;
      tenths = tenths + 1;
      if (tenths == 10) {
        tenths = 0;
        secondsOnes = secondsOnes + 1;
        if (secondsOnes == 10) {
          secondsOnes = 0;
          secondsTens = secondsTens + 1;
          if (secondsTens == 6) {
            secondsTens = 0;
            minutesOnes = minutesOnes + 1;
            if (minutesOnes == 10) {
              minutesOnes = 0;
              minutesTens = minutesTens + 1;
              if (minutesTens == 6) {
                minutesTens = 0;
              } // minutesTens rollover check
            } // minutesOnes rollover check
          } // secondsTens rollover check
        } // secondsOnes rollover check
      } // tenths rollover check
    } // hundredths rollover check
  } // hundredths passing check

  if (oldSecondsOnes != secondsOnes) {  // show the elapsed time
    oldSecondsOnes = secondsOnes;

    Serial.print(minutesTens);
    Serial.print(":");
    Serial.print(minutesOnes);
    Serial.print(":");
    Serial.print(secondsTens);
    Serial.print(":");
    Serial.println(secondsOnes);
  } // end one second check

} // end loop

Because the clock is in the physical world :) I need to fire solenoids to step up the time. I have reels going from 0 to 9, and in the case of the 10 minutes reel I have to roll it over once it hits 6. So in reality it's more than 200ms, I'd step it 5 times and wait between each to give it time to fire and come back to a ready position.

But I do hear what you're saying with your elapsedTime = currentMicros - previousMicros; approach. I guess I'd have to do the diff with the millis() function to make sure I'm comparing against something trustworthy.

But if you're giving me this solution, does it mean the answer is that a lengthy f function will indeed skew the timer, and it's not actually getting called every 60000 ms but rather 60000 ms after the end of the last execution of f?

micros() is more accurate than millis().

You could change these to 1 second (an unsigned long holds about 71 minutes of microseconds before wrapping)

if (elapsedTime >= 1000000UL) {  // 1 second passed? Update the timers
    previousMicros = previousMicros + 1000000UL;

and take out the hundredth & tenth checks. Then you'd have a full second to do all the physical stuff.
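Putting that together, the trimmed-down loop might look like this (a sketch reusing the variable names from the code above; the solenoid stepping would replace the Serial prints):

```cpp
void loop() {
  currentMicros = micros();
  if (currentMicros - previousMicros >= 1000000UL) {  // 1 second passed?
    previousMicros = previousMicros + 1000000UL;      // schedule the next second
    secondsOnes = secondsOnes + 1;
    if (secondsOnes == 10) {
      secondsOnes = 0;
      secondsTens = secondsTens + 1;
      if (secondsTens == 6) {
        secondsTens = 0;
        // ...cascade into minutesOnes/minutesTens as before...
      }
    }
    // up to a full second is now available for the physical stepping
  }
}
```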

Oh! Sorry, I hadn't seen the currentMicros = micros(); line. I thought you were just incrementing by hand... makes much more sense.

Ok that looks like it would work for sure.

Now, I'd still like to understand how MsTimer2 works. Does it reliably call your function at every period no matter how long the function runs (provided it's not longer than the period itself), and regardless of whether there's a delay() in it? Or is it affected by the function's running time, so it can't be relied on to fire every X milliseconds?

(If you or somebody else don't have an answer I'll try it out myself when I get around to it...)
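One way to test it empirically might be a sketch like this (assuming the stock MsTimer2 API; the 16 ms busy-wait is an arbitrary stand-in for real work): log the spacing between callback entries and see whether it stays at the period or grows by the callback's run time.

```cpp
#include <MsTimer2.h>

volatile unsigned long lastEntry = 0;

void f() {
  unsigned long now = micros();
  Serial.println(now - lastEntry);  // should hover near 1,000,000 if the
  lastEntry = now;                  // period is unaffected by run time
  delayMicroseconds(16000);         // simulate a busy callback (busy-wait)
  // Note: delay() itself depends on the Timer0 interrupt, which is blocked
  // while we're inside this interrupt handler, so avoid delay() in here.
}

void setup() {
  Serial.begin(115200);
  MsTimer2::set(1000, f);  // 1000 ms period for a quick test
  MsTimer2::start();
}

void loop() {
}
```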

Thanks for your help!