Being from the future, I assumed you wouldn't need me to explain all this.
A photo was taken of the Arduino and stopwatch at approximately 1-minute intervals for 10 minutes. The columns represent the time difference between the two clocks. At the start, the Arduino was 1.11 seconds behind the stopwatch; after approximately 10 minutes, it was 0.27 seconds ahead. I hope that explains the data.
I repeated the experiment using three different Arduinos and several stopwatches, with similar results.
Pushing data out over SPI and writing to the serial port will both need interrupts.
Interrupts are "robbed time".
Are you sure your timekeeping is not being disturbed by these interrupts?
I can tell you that here in the future, we expect proper data, just as they did back in 2010 (and 1910 and 1810, etc.)
Proper data has an explanation for each line. Does your list represent readings at 1min, 2min, etc? Then say so. Even back in 2010, not doing so was called "unlabeled".
Proper data is concise. It does not include the same number represented as seconds and milliseconds. Even back in 2010 repeating values was called "redundant".
Your responses have become redundantly useless.
I'll try the same test with an LCD hooked up via a parallel interface (eliminating SPI) and report back.
I promise to post concise labeled unredundant proper data in the future.
As I see it the bottom line is that even the crappiest crystal is good for an accuracy of about 100ppm and most are a lot better.
Surely this is accurate enough, and if so the problem must be in the code.
I'm thinking of using a DS3231 (like a chronodot) to create an accurate 1Hz interrupt on the Arduino.
I gather then that you're timing a long event, not a drag race.
The RTCs with built-in TCXOs can be calibrated to about 2 ppm, and even uncalibrated they are very good.
A couple of questions.
Is this a start-time-stop application?
How many things are being timed?
Do you need to display the time while the timer is running?
How long is the event being timed?
What accuracy do you need?
The methodology looks OK to me. The internal millis() timer shouldn't be subject to drift because of other code, but it seems that his Arduino is drifting about 1.3 s over 10 minutes (600 s). That's only about 0.2% error, not completely "unreasonable" for a resonator, though better would be ... better.
I'm seeing unacceptable timing errors for anything more accurate than an egg timer.
0.2% error is unacceptable? I guess it would be nicer if it looked more regular...
Is this a start-time-stop application?
How many things are being timed?
Do you need to display the time while the timer is running?
How long is the event being timed?
What accuracy do you need?
I am making a split timer, where the timer will run continuously with intervals tripped by an electronic switch.
One individual is timed over several intervals. The overall event could be from 15 minutes to several hours, with the splits lasting in the range of 1-10 minutes long.
I need to display the interval times.
I want at least a tenth-of-second accuracy, hundredths would be better.
Man, it looks like I was having a conversation with myself?
Even my crappy Uno with a resonator is good to 4 seconds a day (72 seconds fast after 20 days). As I've said before, my Duemilanoves (even my Chinese knockoff) are much better. The secret is not to count millis between events but to work directly with the big number, the raw millis() count since the board started.
Note that the divide by 100 is throwing away relatively important precision, and probably explains some of the non-linearity in the times reported in your first post.
If you need 1/10 or 1/100 accuracy, then you don't want a 1 PPS time source. You would use the 32 kHz output on the ChronoDot or other RTC chips. I think the Arduino hardware interrupt pins would be able to handle it. Or set up one of the timers with the external clock option and set up an overflow at the appropriate time. The only difficulty is that an external RTC designed for timekeeping is not going to have a frequency evenly divisible by 10 or 100, so you'll have to just get close and live with some inaccuracy. For 1/100 you would see maybe +/- 0.3% of one count error.
For a 16 MHz clock (assuming it is EXACTLY accurate) you get 64 clock cycles per timer tick (as set by the Arduino's init() code) and 256 ticks per interrupt (overflow).
So that's: 16000000 / 64 / 256 = 976.5625 timer interrupts per second.
But this number is used in integer math, so it comes out to 976. So even with a perfectly precise clock you get error. You can see all this by reading wiring.c.
This is why many microcontrollers accept TWO clocks: one CPU clock at, say, 16 MHz and another from a 32768 Hz (= 2^15) watch crystal. Checking the datasheet, Timer2 can be run from a watch crystal (at least on the Mega 2560).
If we do the same math with a 32768 Hz clock we get
32768 / 64 / 256 = 2 (or 32768 / 256 = 128 if you don't prescale the clock),
which gives no loss of precision!
So, if you want good timing you should probably attach a watch crystal and use timer 2.
If there is an automatic error of 576 ppm (roughly 50 seconds a day) due to the integer maths, why are all my Arduinos so accurate? The worst is the Uno at 4 seconds a day.
One of my Duemilanoves measured the period between 08:00 on 24th July and 08:00 on 11th September (7 weeks, 49 days) as 4233620357 milliseconds, an error of only 20357 milliseconds, giving it better than 5 ppm against the NTP-controlled host computer. It keeps better time than the Casio on my wrist.
The Chinese knockoff is good to just over 2 seconds a day and my other real Duemilanove is good to a second a day. They do vary slightly with temperature, as you would expect of a crystal.
So even with a perfectly precise clock you get error. You can see all this by reading wiring.c
No, there is no error in the millis() calculation. The fractional calculations in that code compensate for the fact that each timer overflow is slightly more than one millisecond.