Internal clock accuracy

I'm building a clock and I have most of it working fine. It occasionally manages to receive an MSF signal and sets the RTC. All this works well, and cycling the power allows it to continue telling the time properly.

My issue is that, since reading the time from the RTC isn't that quick, I'm relying on the internal clock to maintain the time between a new MSF signal or a power cycle. However, the internal clock appears to lose 3 min/hour (1 second every 20), which seems abnormal to me.

The time is displayed on 7-segment LEDs (SPI) along with the date, and also on an LCD (I2C), and a motor is pulsed for 10 ms each second. MSF time is gathered via an interrupt routine. No changes to these displays or the motor take place unless previousSecond != currentSecond.

Can anyone think of areas that I should concentrate my investigations on as to where this time is "going", and what might be causing the loss? Should I expect such a loss? Is there any way to see if I'm giving the board too much to do?

The board is a clone Leonardo from DIYMORE ( https://smile.amazon.co.uk/gp/product/B07CM5Z7MJ ) but I have ordered a genuine version to try and eliminate the board as a cause.

I can find no reference for an Arduino Leonardo that describes an RTC. Have you added an RTC board, or what are you actually using?

Paul

Yes I've added a DS3231 for the RTC and that works fine. It's just the internal Arduino clock that's giving me trouble. The interface is I2C for the RTC.

LeChatQuiRit:
Yes I've added a DS3231 for the RTC and that works fine. It's just the internal Arduino clock that's giving me trouble. The interface is I2C for the RTC.

How are you using the internal clock? If you are using millis() and looking for a difference in time to be so many milliseconds, just adjust the number of milliseconds you are looking for.
Otherwise, you need to replace the crystal or resonator for the processor and adjust its frequency using a frequency counter.
Paul

However the internal clock appears to lose 3 min/hour

:o
If the clock was off that much the Leo couldn't communicate with your computer. Post that code.

My issue is, since reading the time from the RTC isn't that quick, I'm relying on the internal clock to maintain time between either a new MSF signal or power cycle.

How frequently do you synchronize the internal clock to the values of the RTC?

This can certainly be done more frequently than only on a new MSF reading or a restart.

The Time Library employs an automatic synchronization of the millis() based internal clock with the RTC at user set periodic intervals.

Paul_KD7HB:
How are you using the internal clock? If you are using millis() and looking for a difference in time to be so many milliseconds, just adjust the number of milliseconds you are looking for.
Otherwise, you need to replace the crystal or resonator for the processor and adjust its frequency using a frequency counter.
Paul

at boot I read the time from the DS3231

 tinfo = Clock.getTime();
 setTime(tinfo.hour,tinfo.min,tinfo.sec,tinfo.date,tinfo.mon,tinfo.year);

Then in loop()

if (oldSecond != second()) {
  oldSecond = second();
  // do my things
}

how often does millis overflow?

cattledog:
How frequently do you synchronize the internal clock to the values of the RTC?

At the moment only when there's an MSF update or reboot. I'm guessing you're hinting it should be much more frequent.

cattledog:
The Time Library employs an automatic synchronization of the millis() based internal clock with the RTC at user set periodic intervals.

I've now added a syncProvider. I'm just not sure of one thing: when I return the time, does TimeLib automatically set the time, or should I be setting it in the setSyncProvider function, or returning the current unixtime plus an offset for summer/winter, since that's UTC?

I have just added

      time_t LocalTime = UK.toLocal(Clock.getUnixTime(Clock.getTime()));
      setTime(LocalTime);
      return 0;

Although I don't think I really need to worry about the timezone (unless I'm returning unixtime), since the other sync source is MSF, and that's always the current UK time regardless of BST/GMT. It's that current time that I save to the RTC after a successful MSF time is received.

If you are using a DS3231, you can configure the INT pin to generate a 1Hz squarewave by configuring INTCN, RS1 & RS2 in the Control Register. Link the INT pin to one of the external interrupt pins on the processor. That might be more accurate depending on the crystal on your RTC module.

Unfortunately I'm all out of pins.

reading the time from the RTC isn't that quick

Have you timed it? It's easy enough to do with micros(). Read the RTC 1000 times and take the average.

LeChatQuiRit:
Unfortunately I'm all out of pins.

Show us your schematic. We'll suggest ways to save pins.

I've now added a syncProvider I'm just not sure on one thing, When I return the time does TimeLib automatically set the time

Yes, the syncProvider function returns a uint32_t value which the Time Library function setSyncProvider uses to set the time.

It looks like you are not using the Time Library setSyncProvider() but rather have created your own function.

With this approach you will need to call it at periodic intervals. The Time Library handles the periodic synchronization for you.

What RTC library are you using which uses these commands

Clock.getUnixTime(Clock.getTime())

PaulRB:
Have you timed it? It's easy enough to do with micros(). Read the RTC 1000 times and take the average.

It's noticeable, especially with 7-segment displays showing hh:mm:ss. With the syncProvider set up and updating every 26 s, I can see it 'skip' seconds in the display, which isn't a very good UX. That's the only concern I have.

cattledog:
Yes, the syncProvider function returns a uint32_t value which the Time Library function setSyncProvider uses to set the time.

It looks like you are not using the Time Library setSyncProvider() but rather have created your own function.

With this approach you will need to call it at periodic intervals. The Time Library handles the periodic synchronization for you.

What RTC library are you using which uses these commands

Clock.getUnixTime(Clock.getTime())

I have added the syncprovider, I'm just using a timezone library too...

unsigned long refreshClockTime() {
  unsigned long ct = Clock.getUnixTime(Clock.getTime());
  time_t LocalTime = UK.toLocal(ct);

  if (UK.locIsDST(LocalTime)) {
    return ct + 3600;
  } else {
    return ct;
  }
}


...



  setSyncInterval(26);
  setSyncProvider(refreshClockTime);

The DS3231 library I'm using is
http://www.rinkydinkelectronics.com/library.php?id=73

"skip seconds" could simply be due to sampling error. The question is whether the displayed time is accurately maintained.

A clock displaying HH:MM:SS gives no indication of when SS might change, and the unix time stamp resolution is one second.

I have added the syncprovider, I'm just using a timezone library too...

Yes, what you have done looks correct.

With the syncprovider setup and updating every 26 sec I can see it 'skip' seconds in the display

However, if the internal clock is really losing 3 minutes/hour, that is 3 seconds/minute, and I'm not certain that resynchronization every 26 seconds is really the best solution.

As previously mentioned, using the one second tick from the RTC may be a better solution.

The Leonardo uses a crystal oscillator, but I'm not certain about the specs of the specific component used on the board. The 5% inaccuracy you are experiencing seems totally unreasonable.

LeChatQuiRit:
It's noticeable, especially with 7-segment displays showing hh:mm:ss. With the syncProvider set up and updating every 26 s, I can see it 'skip' seconds in the display, which isn't a very good UX. That's the only concern I have.

Post your complete code. We can suggest ways to fix it.

Are you doing anything in your code that disables interrupts for extensive amounts of time (over 1 millisecond) ?

david_2018:
Are you doing anything in your code that disables interrupts for extensive amounts of time (over 1 millisecond) ?

Using interrupts would pretty much be a mistake for a start. :astonished:

Paul__B:
Using interrupts would pretty much be a mistake for a start. :astonished:

Well, from the OP's initial post:

MSF time is gathered via an interrupt routine.

Disabling interrupts was the most obvious thing I could think of that would cause the millis count to be off by that much.