Time moving backwards

I'm a huge fan of the Time library, and this post in no way implies that there is an issue with it. But I observed an interesting situation a little earlier and wanted to pass it on in case someone else notices similar behavior. This is really an artifact of the way the library operates, combined with normal variation in MCU clock speeds. I likely had a worse-than-average situation.

My setup is:

  1. ATmega328P running on the internal RC oscillator at 8MHz.
  2. DS1307 RTC (so also using the DS1307RTC library).
  3. Using the default 5-minute sync interval between the hardware RTC and the Time library's software clock.
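
For reference, the sync setup described above boils down to just a couple of calls. This is a minimal sketch, not my exact code; `setSyncProvider()` and `setSyncInterval()` are the standard Time library API, and `RTC.get` comes from the DS1307RTC library (the header name varies between Time library versions):

```cpp
#include <Wire.h>
#include <Time.h>       // newer releases use <TimeLib.h>
#include <DS1307RTC.h>  // provides RTC.get(), which reads the DS1307 over I2C

void setup() {
  Serial.begin(9600);
  setSyncProvider(RTC.get);  // use the DS1307 as the time source
  setSyncInterval(300);      // re-sync every 5 minutes (the library default)
}

void loop() {
  // application code
}
```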

I happened to notice that time moved back by a second. When I realized that it was happening every 5 minutes, I concluded that the MCU's RC oscillator was probably off far enough to require an adjustment that frequently.

I suppose this could be minimized by syncing more frequently or tuning the internal oscillator. But it probably can't be eliminated entirely, regardless of clock source, as the RTC and the MCU clock can never be expected to be perfectly in sync. Depending on the application, this could be something to consider when coding. Sooner or later, a second may slip one way or the other. In my case, it happened to make no difference at all to the operation of the project, so I just left things as they were. Had me scratching my head just for a minute, though! XD

Does it really move backwards:
12:00 => tick => 11:59

or does it "freeze":
12:00 => tick => 12:00

Moving backwards is, IMHO, an adjustment of 2 seconds, while freezing would be an adjustment of 1 second.
Both can (and will) affect program logic, but moving backwards definitely affects it more than freezing does.

Here is what I saw. loop() calls now() every time through and prints the time when it changes from the previous call. So yes, that is probably an adjustment of between 1 and 2 seconds. Given that logic, I wouldn't have noticed an adjustment of one second (a freeze never prints a changed value). Hmmm, interesting. I probably should print millis() as well.

22:46:01 Sat 09 Jun 2012
22:46:02 Sat 09 Jun 2012
22:46:03 Sat 09 Jun 2012
22:46:04 Sat 09 Jun 2012
22:46:03 Sat 09 Jun 2012
22:46:04 Sat 09 Jun 2012
22:46:05 Sat 09 Jun 2012
22:46:06 Sat 09 Jun 2012
22:46:07 Sat 09 Jun 2012
22:46:08 Sat 09 Jun 2012
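
The loop in question is roughly this (a sketch, not my exact code; `now()`, `hour()`, `minute()`, and `second()` are the standard Time library accessors, and I've omitted the date formatting):

```cpp
time_t prevTime = 0;  // last value of now() that we printed

void loop() {
  time_t t = now();
  if (t != prevTime) {      // print only when the time changes
    prevTime = t;
    Serial.print(hour(t));
    Serial.print(':');
    Serial.print(minute(t));
    Serial.print(':');
    Serial.println(second(t));
  }
}
```

With this logic a backward step shows up as a decreasing timestamp, as in the 22:46:04 -> 22:46:03 lines above, while a freeze would have printed nothing unusual at all.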

I'm actually seeing both one-second and two-second adjustments, which is what I would expect depending on the relative phase of the two clocks. The first column is millis(), the second is the time, and the third is the difference from the prior row. The code just calls now() as fast as it can and prints only when the value of now() has changed since the last call. Note that the two events below are five minutes apart.

298102	15:41:05	999
299102	15:41:06	1000
301105	15:41:07	2003	<-- adjusted back by one sec
302106	15:41:08	1001
303106	15:41:09	1000
		
596105	15:46:02	1000
597106	15:46:03	1001
598106	15:46:04	1000
599105	15:46:05	999
600109	15:46:04	1004	<-- adjusted back by two sec
601110	15:46:05	1001
602109	15:46:06	999
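
The millis()-instrumented version is essentially this (again a sketch under the same assumptions, with the date formatting omitted):

```cpp
time_t prevTime = 0;
unsigned long prevMillis = 0;

void loop() {
  time_t t = now();
  if (t != prevTime) {
    unsigned long m = millis();
    Serial.print(m);                 // column 1: millis() at the change
    Serial.print('\t');
    Serial.print(hour(t));           // column 2: the new time
    Serial.print(':');
    Serial.print(minute(t));
    Serial.print(':');
    Serial.print(second(t));
    Serial.print('\t');
    Serial.println(m - prevMillis);  // column 3: delta since last change
    prevTime = t;
    prevMillis = m;
  }
}
```

Reading the output: a delta near 2000 ms with the displayed second still advancing (like the 2003 row) means a one-second backward correction was absorbed, while a delta near 1000 ms with the displayed second going backwards (like the 1004 row) means roughly a two-second correction.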

Have you tried what happens when you sync once per 10 minutes, or per 60 minutes?
I'd expect the deltas to be bigger.

robtillaart:
Have you tried what happens when you sync once per 10 minutes, or per 60 minutes?
I'd expect the deltas to be bigger.

Have not, but that is what I would expect as well. Another interesting thing to try would be a crystal-controlled MCU. I'd expect the time slips to still occur, but less frequently. Might have to capture output for a while to catch one.