Arduino Clock Accuracy without external source

I have built a clock prototype using my Arduino Uno board and an LED matrix, and I have attached some buttons so you can set the time. So far so good. I would like to build a permanent version of this to use as a clock in my house.

My problem is that it doesn't keep very good time: it's losing about 2 seconds an hour (~555 ppm). That sounds like more than can be explained by the error rate of the crystal (although I haven't been able to find the error rate for the crystal that came with my Arduino - SPK16.000G).

I have done some googling to try to find a solution. So far I have found several possibilities:

  1. Use an external time source. (Sounds like an expensive option.)
  2. Use a real time clock board. (This doesn't feel necessary, as I can get the same quality crystal to clock the Arduino.)
  3. Use a better quality crystal to time my Arduino. (This would be my preferred option, as I will need to build a standalone Arduino circuit to power my clock long term.)

If someone could explain what the problem is here and help me fix it I would be very grateful.

Here is my code for updating the time in case anyone can see something wrong with it:

Thanks in advance.

  // Assuming these are declared globally elsewhere in the sketch:
  // unsigned long lastMillis = 0;
  // int milli = 0, seconds = 0, minutes = 0, hours = 0;
  unsigned long currentMillis = millis();
  int milliDiff = currentMillis - lastMillis;
  lastMillis = currentMillis;
  
  milli += milliDiff;
  
  while (milli >= 1000)
  {
    milli -= 1000;
    seconds++;
  }
  
  while (seconds >= 60)
  {
    seconds -= 60;
    minutes++;
  }
  
  while (minutes >= 60)
  {
    minutes -= 60;
    hours++;
  }
  
  while (hours >= 24)
  {
    hours -= 24;
  }

I don't see why you're taking such a roundabout way to count seconds.

unsigned long lastSecond = 0;
unsigned long seconds = 0;

void loop()
{
    while(millis() - lastSecond >= 1000)
    {
        lastSecond += 1000;
        seconds++;
    }
    // ... update display etc.
}

The local clock is never going to be accurate enough for use as a long-term wall clock. You need to keep it synced, either with an external time source (NTP, GSM, etc.) or an RTC.

Thanks for the answer, but I have more questions now.

I don't see why you're taking such a roundabout way to count seconds.

It was just the first way I thought of doing it; since I had an on-board milliseconds counter, it seemed to make sense to count milliseconds. Thinking about it, I believe your code would achieve the same thing, but I don't think it would improve accuracy.

The local clock is never going to be accurate enough for use as a long term wall clock.

I am someone who questions everything and likes to understand what is going on. I would like to know the following:

  1. Why isn't the on-board clock as accurate as the crystal that drives it?
  2. What is different about an RTC chip that gives it its accuracy?

Thanks again

I tend to use a GPS module to sync the time on such projects (as long as there is reception where you want to situate the clock - or you could have the GPS in a nearby window).

By the time you have buttons to initially set the time, and to reset it whenever it drifts - which even a battery wall clock will do - you can buy a Fastrax UP501 GPS module (we pay about US$15 for one here in South Africa).

I usually get it to sync a free-running clock (counting milliseconds as you are doing) every ten minutes or so, if it is locked on to the satellites.

CrossRoads posted a millis()-based clock that uses the Serial Monitor. [.PDE attached]
It's pretty accurate over extended periods.
It needs a routine to set the initial time (or stay up late and start it up at exactly midnight).

CrossRoads_clock.pde (739 Bytes)

At room temperature I've been able to calibrate my Uno in code so it's accurate to 1 µs/second (1 ppm), or 0.1 ppm over the long term without any temperature changes.
I did this using the PPS pin of the Ultimate GPS from Adafruit, which is accurate to within about 10 ns.
In theory you could measure the speed of light with this over 1000 ft between 2 units!
Let me know if you want code.
It can also be done using the serial NMEA sentences, which are accurate to a few ms, from any GPS.
After you find your constant, you no longer need the GPS or an RTC.
I also have code to correct the DS1307; just search the forums for my name.

I believe the clock on an Uno is not crystal-controlled, but rather uses a ceramic resonator. These can be less expensive, but typical accuracy is only on the order of ±0.5% (5000ppm).

I'd look at adding a real time clock. The Maxim DS3231 is a favorite, and very accurate (2ppm). One can be had on a breakout board from Macetech. Several of the other popular retailers carry the Chronodot as well.

Alternative time source.
http://g7kse.co.uk/projects/arduino-msf-receiver/

What is different about an RTC chip that gives it its accuracy?

Nothing, if you just get a normal RTC; but if you get one with a built-in TCXO (temperature-compensated crystal oscillator) it will be very accurate.

The DS3231 that Jack mentioned is such a device.


Rob

  1. Why isn't the on-board clock as accurate as the crystal that drives it?
  2. What is different about an RTC chip that gives it its accuracy?

I have an idea that the RTC is not doing anything other than counting clock pulses, whereas the board's clock is doing other work.

I don't know the internals, but when I have a free-running clock using the board's crystal and put in a couple of interrupts and Serial.print()s, I have to adjust the millis count to get the clock right again (like, if clockMillis >= 59998 for a minute).

Thanks for all the responses. I have an RTC chip on order and will have a play with it, from a learning point of view if nothing else.

I will also play around with calibrating my Arduino, as several people have suggested.

I also like the idea of having a clock that is free-running but takes note of any subtle changes you make to the timing and uses these to auto-correct itself in future. In that case, as you manually make corrections to the clock, it would get more accurate.

Would disabling interrupts make the clock more accurate? Or is this going to break something in a really bad way? (I don't really fully understand what interrupts are used for at the moment.)

if (rectime <= now() && tm1 == 0) // adjust clock
{
  setTime(now() + 1);
  tm1 = 1000; // note: writing "int tm1 = 1000;" here would shadow the global and break the countdown
  rectime = now() + 3217; // after this many seconds, add 1 second
  Serial.println("++++++++++++++++++++++++++++++++++++++++++++++++++++");
}
if (tm1 > 0)
{
  tm1 = tm1 - 1;
}

I use the above routine in my programs to keep the Arduino in sync. I have tested it for a week at a time and adjusted the 3217 up and down a little to get better accuracy. So far it seems to be accurate to about 1 second a week. The rectime = now() + 3217; needs to be in setup() at the start of the program.

Your clock accuracy is consistent with what I'd expect from a regular Arduino without external timekeeping.
For the price of a FET and a few (!safety!) well-chosen resistors, you can get it to amend its timekeeping against your local mains, which averages either 50.00 or 60.00 Hz over 24 hours.
I've had a lot more fun getting an external radio lock to the atomic clock at the National Physical Laboratory.

If you definitely want it without an external time reference, then expect the sort of drift you've been seeing.

ElectronicsNoobie:
I also like the idea of having a clock that is free running but takes note of any subtle changes you make to the timing and use these to auto correct its self in future. In this case as you manually make corrections to the clock it would get more accurate.

Hi, this is quite an old topic, but there was no definite conclusion. This is exactly what I would like to make. Has someone already done something like that? As far as I know, it is possible to use an external crystal, a ceramic resonator, or the on-chip oscillator as the clock source. They are all inaccurate; the frequency changes with temperature, voltage and time. How precise a time measurement can I get with a standalone chip using its own oscillator? How much more precise does adding a crystal or a ceramic resonator make it? Is it possible to measure minutes or even seconds reliably? Or maybe even fractions of a second? I expect the chip to be at home, powered from the wall, with no heavy load (motors or anything like that), hidden from direct sunlight, so the temperature varies at most from 18°C (winter) to 35°C (hot summer).

+1 for the DS3231
I did exactly the same as OP - five years ago, and haven't re-set the time yet.
(Apart from daylight saving - because I was too lazy to add that in code).

My original reason was the rubbish quality of the displays and timekeeping in retail clocks. Add a PIC 16F628A, a DS3231, a supercap and a $10 2-inch LED display that I can read across the room (without glasses!). All done for under $25 total.
PWM day/night auto brightness. Perfect.

Is it me, or is there more than the average thread necro'ing going on these days? I am now using a DS3231 module that has an onboard 3.3V regulator. I figure that should make it even more accurate.

As an aside, I started measuring the 60 Hz mains frequency at home, and was shocked to find that it is quite accurate - within a few parts per million. Perhaps with the larger interconnected grid they have now, there is a need to be more synchronized than in the past. Just a guess.

This is an old post, but I thought someone might find this of interest. I wrote an AutoIt script to accurately calculate the clock drift and a correction factor in about half an hour. I tested two boards:
An Uno board lying around: +23 s/day
A Nano board lying around: -13 s/day
(Actual drift was calculated with more precision; I'm just rounding for convenience.)

Interesting thing to note: I repeated the test with an elevated die temperature of 63°C, and the clock drift went from -13 s/day to 0 s/day (yes, "exactly" zero within measurement accuracy, which happened to be a complete coincidence; again, rounding for convenience). An increase of 0.02662%! Pretty cool.

Here is the AutoIt code (NOT Arduino code!). It was too long for a forum post, sorry:
https://dl.dropboxusercontent.com/u/22193875/temp/TimerCalib.au3

And here is the paired code that runs in an arduino:

/*
Timer CALIBRATION
v 0.1
Author: Terry Myers, Control System Engineer, EZSoft, Inc, Malvern PA
Coordinated with an AutoIt script to accurately measure the timer drift
*/

void setup() {
  Serial.begin(115200);
}

//===============================================================================
void loop() {
  Serial.print(millisCorrected());
  Serial.print("|");
  delay(1000);
}

unsigned long millisCorrected() {
  // Once the drift is measured, apply the correction factor here, e.g.:
  // unsigned long correction = (unsigned long)(float(millis()) * 0.0003667);
  unsigned long correction = 0; // zero while calibrating
  return millis() + correction;
}

Try this code, which does not use delay() for ticking off seconds.

unsigned long currentMicros;
unsigned long previousMicros;
unsigned long elapsedTime;

byte hundredths;
byte tenths;
byte secondsOnes;
byte oldSecondsOnes;
byte secondsTens;
byte minutesOnes = 1; // set to match your clock
byte minutesTens = 3;
byte hoursOnes = 1;
byte hoursTens = 0;

void setup(){

  Serial.begin(115200); // make serial monitor match
  currentMicros = micros();
  previousMicros = currentMicros;
  Serial.println ("Setup Done");
}

void loop(){

  currentMicros = micros();

  // how long's it been?
  elapsedTime = currentMicros - previousMicros;
  //Serial.print ("Elapsed: ");  
  //Serial.println (elapsedTime);
  if ( elapsedTime >=10000UL){  // 0.01 second passed? Update the timers
    elapsedTime = 0;
    previousMicros  = previousMicros + 10000UL;
    hundredths = hundredths+1;
    if (hundredths == 10){
      hundredths = 0;
      tenths = tenths +1;
      if (tenths == 10){
        tenths = 0;
        secondsOnes = secondsOnes + 1;
        if (secondsOnes == 10){
          secondsOnes = 0;
          secondsTens = secondsTens +1;
          if (secondsTens == 6){ 
            secondsTens = 0;
            minutesOnes =  minutesOnes + 1;
            if (minutesOnes == 10){
              minutesOnes = 0;
              minutesTens = minutesTens +1;
              if (minutesTens == 6){
                minutesTens = 0;
                hoursOnes = hoursOnes + 1;
                if (hoursOnes == 10){
                  hoursOnes = 0;
                  hoursTens = hoursTens + 1;
                }
                if (hoursTens == 2 && hoursOnes == 4){ // 24:00 rolls over to 00:00
                  hoursOnes = 0;
                  hoursTens = 0;
                }
              } // minutesTens rollover check
            } // minutesOnes rollover check
          } // secondsTens rollover check
        } // secondsOnes rollover check
      } // tenths rollover check
    } // hundredths rollover check
  } // hundredths passing check



  if (oldSecondsOnes != secondsOnes){  // show the elapsed time
    oldSecondsOnes = secondsOnes;
    Serial.print ("Time: ");
    Serial.print (hoursTens);
    Serial.print(hoursOnes);
    Serial.print(":");
    Serial.print(minutesTens);
    Serial.print(minutesOnes);
    Serial.print(":");
    Serial.print(secondsTens);
    Serial.print(secondsOnes);
    Serial.print(" micros: ");
    Serial.println (currentMicros);

  } // end one second check

} // end loop

Rollover demo!

Time: 22:59:32 micros: 4292000060
Time: 22:59:33 micros: 4293000064
Time: 22:59:34 micros: 4294000068
Time: 22:59:35 micros: 32776
Time: 22:59:36 micros: 1032764
Time: 22:59:37 micros: 2032764

Terryjmyers:
This is an old post, but I thought someone might find this of interest. I wrote an AUTOIt script to accurately calculate the clock drift and a correction factor in about half an hour. I tested two boards:

First of all, "drift" is not the same as an almost constant clock speed difference; drift is the variance in that difference. Terminology.

Yes, you can compensate for a constant clock error. But you can't compensate for the temperature sensitivity of a ceramic resonator in software, at least not straightforwardly. So it's usually still a losing game, compared to simply adding a good RTC at very low cost.