Arduino MKR1010 - RTC skips seconds when polling every second

Hi everyone,

I'm using an Arduino MKR1010 and I want to capture the sensor's data every second and assign a timestamp to each reading. I use the RTCZero and NTP libraries. Once the Arduino starts, it requests the time from an NTP server (pool.ntp.org), converts it to epoch time and finally feeds the RTC with that epoch time.

void RTC_init(void)
{
  unsigned long epochTime = 0; //variable to store the UNIX time
  sendNTPpacket(NTP_SERVER, NTP_PACKET_SIZE, packetBuffer);
  delay(1000);                                              // wait to receive the reply from the NTP server; if you omit this delay the server has no time to respond
  epochTime = UDP_parseData(NTP_PACKET_SIZE, packetBuffer); // Unix Time Stamp acquisition
  rtc.setEpoch(epochTime);                                  // provide the epoch time to RTC
}

I have a non-blocking timer based on the millis() function that expires every 1 s to poll the sensor and capture the epoch time from the RTC.
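Simplified, the polling part of my sketch looks roughly like this (readSensor() is just a placeholder for the actual sensor code; the RTC has already been set via RTC_init() above):

unsigned long lastPoll = 0;
const unsigned long POLL_INTERVAL = 1000; // 1 s period timed with millis()

void loop()
{
    if (millis() - lastPoll >= POLL_INTERVAL)
    {
        lastPoll += POLL_INTERVAL;                // keep the 1 s period stable
        unsigned long timestamp = rtc.getEpoch(); // RTC epoch time, 1 s resolution
        float reading = readSensor();             // placeholder for the real sensor read
        // store or print reading together with timestamp...
    }
}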

My problem is that the RTC occasionally skips 1 s, as you can see from the picture below. I do not believe this is due to RTC resolution or drift. Could it be related to RTC rounding? Does anyone have experience with this issue? Any workaround?

Why not use the RTC for what it is good at, i.e. keeping time?

Read the RTC more frequently than once per second, say every millisecond, and read the sensor when the RTC time changes by one second
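In outline, something like this (untested, with readSensor() again standing in for your sensor code):

unsigned long previousEpoch = 0;

void loop()
{
    unsigned long currentEpoch = rtc.getEpoch(); // read the RTC on every pass through loop()

    if (currentEpoch != previousEpoch)           // the RTC second has just changed
    {
        previousEpoch = currentEpoch;
        readSensor();                            // placeholder for the sensor reading
        // the timestamp is simply currentEpoch
    }
}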

Thank you for your feedback, @UKHeliBob. I will try it now and report back.

Perhaps something else in your sketch is blocking and keeping your millis() timer from being checked often enough.


@johnwasser That's exactly what I am investigating right now.

Hence my suggestion to use the RTC to do the timing

Thanks guys, issue solved. It was indeed some blocking code causing the delay!! Thanks a lot!!

Since the oscillator that runs the millis() timing is independent of (and asynchronous to) the oscillator that runs the RTC, I'm thinking there's still a chance you could occasionally observe a reading with an increment of 0 or 2.

That would not happen if you read the time say every millisecond using millis() and only took action when the second changed.

I hope the OP takes note of that, because it's absolutely correct - sampling a one-second signal once per second will produce occasional doubles or noughts unless the two timebases are exactly synchronised.

@UKHeliBob's solution is how it should be done, of course.

Seems like a lot of thrashing around on the I2C bus to read the RTC 1000 times / second. Don't many RTCs have the option for a 1Hz square wave output?

If so, I'd poll it in loop() for the rising or falling edge (i.e. state change detection). I'd also maintain an int32_t variable whose value is set to the epoch time when first obtained, then increment it every second as the RTC's 1Hz edge is detected.
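In outline, assuming a hypothetical external RTC with its 1Hz output wired to pin 2 (untested sketch):

const uint8_t SQW_PIN = 2;        // hypothetical pin wired to the RTC's 1Hz square wave output
int32_t epochNow;                 // set once from the epoch time when first obtained
uint8_t lastSqwState = LOW;

void setup()
{
    pinMode(SQW_PIN, INPUT_PULLUP);
    // ... get the epoch time (e.g. via NTP) and assign it to epochNow here ...
}

void loop()
{
    uint8_t sqwState = digitalRead(SQW_PIN);

    if (sqwState != lastSqwState)  // state change detection on the 1Hz line
    {
        lastSqwState = sqwState;
        if (sqwState == HIGH)      // act on the rising edge only
        {
            epochNow++;            // one more second has elapsed
            // read the sensor here and timestamp it with epochNow
        }
    }
}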

EDIT:
OK, my bad. I just looked up the RTCZero library. It's for accessing the processor's internal RTC. My comments above don't apply.

I agree. Reading every millisecond really isn't necessary, depending on how quickly you want to react to the change of second

Thank you for your help guys!! So the proper way is something like this?

  • Every 10 ms acquire the epoch time
  • Every 1 s of RTC (epoch) change, do something.

#define millis_interval 10 //10ms RTC polling
#define epoch_interval 1   //1s timer based on epoch time
unsigned long previousMillis = 0;
unsigned long epochTime;
unsigned long currentMillis;
unsigned long previous_epochTime = 0; //the epoch time resolution is 1s; initialise from rtc.getEpoch() in setup(), after the RTC has been set

void loop()
{

    currentMillis = millis();

    if (currentMillis - previousMillis >= millis_interval)
    {
        epochTime = rtc.getEpoch();
        previousMillis = currentMillis;
        if (epochTime - previous_epochTime >= epoch_interval) //trigger as soon as the epoch second changes
        {
            previous_epochTime = epochTime;
            //Do something...
        }
    }
}

Probably not. Use millis or micros to control the interval. When it's time to do the thing, read the RTC.

I am confused: @UKHeliBob suggests reading the RTC frequently and taking action when the RTC time changes (e.g. by 1 s). Am I doing something wrong?

Please, guys, if you have any working example I would appreciate it.

I don't see any reason to read the clock at a high rate. I see that you found that the problem was blocking from other code. Once that is resolved, millis should do what you need. If blocking remains, reading the RTC won't help you.

Hi @Nikosant03

The easiest way is to use the RTCZero library's alarm interrupt service routine to time the 1 second intervals.

You should be able to do this by using the library's SimpleRTCAlarm example: https://github.com/arduino-libraries/RTCZero/blob/master/examples/SimpleRTCAlarm/SimpleRTCAlarm.ino ... then in the RTC's callback function increment the alarm time by 1 second each time. This will trigger the callback every second, allowing you to get the epoch time synchronized to the RTC itself.
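Roughly like this (untested sketch based on that example; it assumes the NTP/epoch initialisation you already have):

#include <RTCZero.h>

RTCZero rtc;

void setup()
{
    rtc.begin();
    // ... get the time from the NTP server and call rtc.setEpoch() here, as in your RTC_init() ...

    rtc.setAlarmEpoch(rtc.getEpoch() + 1);  // first alarm one second from now
    rtc.enableAlarm(rtc.MATCH_HHMMSS);
    rtc.attachInterrupt(alarmMatch);
}

void loop()
{
}

void alarmMatch()
{
    unsigned long epochTime = rtc.getEpoch(); // timestamp, synchronised to the RTC itself
    rtc.setAlarmEpoch(epochTime + 1);         // increment the alarm time by 1 second
    // capture the sensor reading / epoch time here
}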

The problem is the RTC running asynchronously to the oscillator that times the CPU and millis(). So, one second as timed by millis() will be different from one second as timed by the RTC. If you measure one second with the former, the latter will occasionally increment by 0 or 2 rather than 1 during the same period.


I agree that's a good option. But beware: the callback will be running in interrupt context. Best to just set a volatile flag in the callback that will be picked up (and cleared) in loop(), where any heavy lifting should be done.
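i.e. something along these lines (sketch only, building on the alarm example above):

volatile bool secondElapsed = false;        // shared between the ISR and loop(), hence volatile

void alarmMatch()                           // runs in interrupt context: keep it short
{
    rtc.setAlarmEpoch(rtc.getEpoch() + 1);  // re-arm the alarm for the next second
    secondElapsed = true;                   // just raise the flag
}

void loop()
{
    if (secondElapsed)
    {
        secondElapsed = false;              // clear the flag
        unsigned long timestamp = rtc.getEpoch();
        // read the sensor and do any heavy lifting here, outside the ISR
    }
}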

Thank you @MartinL for the suggestion, I will give it a try in another project. @gfvalvo, could you please explain the purpose of the volatile flag in more detail?