
Topic: Millis Accuracy Again



I did read what you said. But I don't see how it applies in my situation.

1) I don't need my routine to run at precisely 1000 ms intervals

2) I don't need to (and don't) keep track of how long atan takes

3) I don't see how keeping track of the rollover, etc. is any more accurate than reading millis() directly.

It's not that I don't appreciate the help, but that I don't understand why your approach is better than reading millis().

Again, to reiterate: I do not need my routine (or event, or what-not) to happen at fixed intervals. I just need to know when it happens.


You keep saying you need to know what time it is and how many seconds there are in a sidereal day.  I gave you a mechanism that will precisely keep track of seconds as time passes.  All you have to do is add one to a counter each time the while loop finishes.  micros() rolls over roughly every 70 minutes, so you may need to keep that in mind.
Experience, it's what you get when you were expecting something else.


Apr 18, 2013, 01:41 pm Last Edit: Apr 18, 2013, 01:49 pm by orly_andico
Agreed. But the mechanism also relies on millis(). Or micros() for that matter.

How is it superior to just reading one of the above functions directly?

And if we are agreed that millis() / micros() are drifting, wouldn't the above approach also drift?

From my understanding of the stability of crystals and resonators, they drift based on temperature. If the temp is fairly constant, their deviation from rated frequency should be more-or-less the same, and not all over the place.

I also don't understand why accumulating a count yourself (instead of relying on the millis() routine to do that) is any better than using millis() directly.

If millis() is losing ticks, then any routine that relies on millis() would also lose ticks.


Apr 18, 2013, 01:48 pm Last Edit: Apr 18, 2013, 01:54 pm by afremont
No, I'm relying on the long-term accuracy of millis().  In the long run it is accurate, because the ISR self-compensates for the 1.024 ms interrupt interval.  Any drift using what I gave you comes from the oscillator and cannot be removed without going to a more accurate oscillator such as a TCXO, or a super-accurate reference such as a Chronodot.

EDIT:  Catching up with your edit.  If anything delays interrupts long enough for the Timer0 ISR to miss a tick, that is a severe problem.  The only thing I know of that does that is SoftwareSerial, and possibly the IR library.

In my experience, temperature drift of the ceramic resonator is quite small over room-temperature variations.  Perhaps I've been lucky, but my two Unos are within 300 ppm and 50 ppm, respectively, of being on the dot.


Yes. And that's why I don't understand your approach.

Right now I am reading millis() directly.  So you could say I am also relying on the long-term accuracy of millis().  And while it's true that the ceramic resonator in my Mega is not that good, I'm also seeing drift in millis() with the Max32, which has a +/- 30 ppm crystal.

Again - I'm not doing anything special with millis(). I'm just reading it. And it's losing time.

Why would your approach result in better time-keeping when it also relies on millis()?
