RoboticsProfessor:
I'm hearing and reading how millis() is not the answer for accurate time.
Several small clock chips, like DS3231, have month, day, hours, seconds and such, but not a good way to time milliseconds.
I only need milliseconds up to 59 minutes, 59 seconds.
Well, if you can state your requirement for the maximum +/- error in milliseconds over a one-hour period, it would shed light on the possibilities or difficulties involved. Note that the main source of basic timing error is unit-to-unit variation of the 16 MHz crystal or ceramic resonator used on most Arduino boards, which drives the controller's basic clock speed. Adding to this is the tolerance of the crystal's padding capacitors and, of course, error due to ambient temperature variation in all of those parts. Suffice it to say that any crystal-controlled device will have some basic timing error; whether that error is significant or not can only be answered by the designer of a given application or project.
So while there will always be some basic timing error for any Arduino board, nothing prevents you from measuring the specific error of your specific board and applying a compensating correction factor in software, i.e. a unit-to-unit calibration. This is of course limited by the accuracy and resolution of the test equipment used to measure the error; typically the test equipment needs to be ten times more accurate than the specification to be met.
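As a sketch of that calibration idea: suppose a bench measurement against better test equipment showed that a particular board's millis() gains 120 ms over a true hour (a hypothetical figure; the function and constant names here are my own, not from any library). A correction factor can then rescale raw readings:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical calibration constant: suppose a bench measurement showed this
// board's clock registers 3,600,120 ms over a true hour (3,600,000 ms),
// i.e. it runs fast. Scale raw readings down to correct them.
const uint32_t MEASURED_MS_PER_TRUE_HOUR = 3600120UL; // from your own test gear

// Convert a raw elapsed-ms reading into corrected milliseconds.
// The 64-bit intermediate avoids overflow for intervals up to an hour.
uint32_t correctedMillis(uint32_t rawElapsedMs) {
    return (uint32_t)(((uint64_t)rawElapsedMs * 3600000UL) / MEASURED_MS_PER_TRUE_HOUR);
}
```

A reading of a full (fast) hour corrects back to exactly 3,600,000 ms; shorter intervals are scaled proportionally.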
Also keep in mind the Arduino micros() function, which can measure up to about a 70-minute period with a 4-microsecond step resolution. It is no more accurate than the crystal time base, but in your case it might be accurate enough and an improvement over using millis() for the same period.
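One wrinkle with micros() over long periods is the rollover at 2^32 microseconds (about 71.6 minutes). The standard unsigned-subtraction idiom handles a single wrap correctly; here it is as a sketch with the timestamps passed in explicitly so it can be tested off-board:

```cpp
#include <cassert>
#include <cstdint>

// Rollover-safe elapsed time: micros() wraps to 0 after 2^32 us (~71.6 min).
// Unsigned subtraction still yields the correct interval as long as the
// interval itself is shorter than one full wrap.
uint32_t elapsedMicros(uint32_t startUs, uint32_t nowUs) {
    return nowUs - startUs; // wraps correctly in unsigned arithmetic
}
```

On a real board you would call it as `elapsedMicros(startUs, micros())`.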
Since you're saying you need accurate timing to milliseconds, I'm assuming you don't want to be more than 0.5 ms off the actual timing. That's 60 min × 60 s × 1000 ms × 2, or 7,200,000. So, you are looking for something that is accurate to 1 part in 7.2 million, or 0.138888... ppm. You might save time by looking at sub-140 ppb (parts per billion) oscillators instead.
Edit: forgot to mention that the Arduinos with crystals will usually be +/- 20ppm and the Arduino Uno with resonator will be +/- 100ppm. So, almost a thousand times more inaccurate than you need (based on my assumption of how accurate you need the measurements).
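That arithmetic generalizes: the required accuracy in ppm is just the allowed error divided by the timed period. A small helper (the name is my own, not from any library) makes it easy to re-run for other race lengths:

```cpp
#include <cassert>
#include <cmath>

// Required fractional accuracy, in parts per million, to keep the timing
// error within allowedErrorMs over a timed period of periodMs.
// E.g. 0.5 ms over one hour (3,600,000 ms) -> ~0.139 ppm.
double requiredPpm(double allowedErrorMs, double periodMs) {
    return (allowedErrorMs / periodMs) * 1e6;
}
```

For a 5-minute race the same formula gives about 1.67 ppm, a far easier target than the one-hour figure.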
It's an oven-controlled oscillator (OCXO). Temperature drift is managed by heating the crystal to a regulated temperature. It takes about one minute to warm up before readings are accurate. Also, the accuracy will degrade by up to 10 ppb per day and 300 ppb per year, so be prepared to test and replace these as needed. This one is also voltage-controlled (a VC-OCXO), which lets you apply an external voltage to trim the frequency a tiny amount to compensate for aging. You'd need to calibrate it every few days of operation for best results.
My Uno is better, but I did have to replace the resonator and solder in a crystal + caps... One way to improve the performance of millis() and micros()!
My actual application is to time racing robots.
Fast ones would clock in the 10-30 seconds range, slow ones could go 1 to 3 minutes on the same course.
Here is what I'm looking for...
Say a super accurate clock registers two different robots at exactly 12.345 seconds.
If my Uno would record the same two racers at 12.345, that would be great. If one read 12.343 and the other 12.347, the contestants would think the 12.343 robot was faster. That would seem too far off to me.
Since I'm testing the Gravitech 4-digit, 7-segment display, I can only get 4 digits, and therefore only readings to the hundredth of a second. So for this example, 12.343 and 12.347 would both read as 12.34, a tie, which would be OK.
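To illustrate why those two readings collapse to a tie on a 4-digit display, here is a sketch of the formatting step, assuming the firmware truncates milliseconds to hundredths rather than rounding (function name is my own):

```cpp
#include <cassert>
#include <cstdint>
#include <cstdio>
#include <cstring>

// Truncate a millisecond reading to hundredths of a second and format it
// for a 4-digit display: 12347 ms and 12343 ms both come out as "12.34".
void formatHundredths(uint32_t elapsedMs, char *out, size_t n) {
    uint32_t hundredths = elapsedMs / 10; // drop the milliseconds digit
    snprintf(out, n, "%lu.%02lu",
             (unsigned long)(hundredths / 100),
             (unsigned long)(hundredths % 100));
}
```

Note that truncation, not rounding, is what makes 12.347 display as 12.34; a rounding display would show 12.35 and break the tie.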
If buying a reasonably inexpensive oscillator/chip/timer would be an assurance that identical times wouldn't be recorded as different times, then it seems prudent to go for better accuracy.
The racing takes place on 2 and 3 March, but if the contest is popular, it could become an event every quarter. In that case, I'd want something all would agree is accurate enough so the fastest robot wins.
Temperature differential/compensation: I think not applicable since the races would be held in the same area within minutes of each other.
RoboticsProfessor:
My actual application is to time racing robots ...
I'd probably go with a clone or purpose-built board with a decent crystal, ±20 or ±30 ppm. It's important to be fairly accurate, but it's more important that it be consistent from one run to the next, as opposed to being atomic-clock accurate. With such a crystal, it might be a couple tenths of a second off after an hour if I did the maths right. We mostly just need to know who is fastest. I assume this is not a competition that takes place at several locations but then the actual times (as opposed to rankings) are compared between locations. That would require identical and probably traceable timing equipment.
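The "couple tenths after an hour" estimate checks out; as a sketch, the worst-case drift for a given crystal tolerance is (helper name is my own):

```cpp
#include <cassert>
#include <cmath>

// Worst-case timing error, in milliseconds, for a crystal with the given
// tolerance (ppm) over a timed period (seconds).
// E.g. a 30 ppm crystal over an hour: 30e-6 * 3600 s = 0.108 s.
double maxErrorMs(double ppm, double periodSeconds) {
    return ppm * 1e-6 * periodSeconds * 1000.0;
}
```

For a 3-minute race on a 20 ppm crystal, that's only 3.6 ms of worst-case drift, well under a hundredth of a second.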
A lot of LED displays can be stacked end-to-end to give as many digits as needed.
Is more than one robot being timed simultaneously?
You said the timer would have to go up to 59 minutes, 59 seconds. That's a huge difference from 3 minutes. If you go back and use my calculations, you can easily see what ppm is acceptable for your requirements. For 5 minutes I get 1.67 ppm, which is a lot easier to set up! Correctly communicating your requirements is key.
If you can make sure your eventual solution really doesn't change temperature between races, you may do a lot better than the raw specs would indicate. The quoted PPM variation is usually over the operating temperature range. Put it all in a box with no ventilation, maybe a nice thermal mass in there too. Let it sit at the race location for a few hours.
Your biggest problem will probably be reliably triggering your sensors at the accuracy scale you're attempting.
Who is defining the timing resolution you need to provide?
I would have thought that if you're timing robots scurrying about, aiming for millisecond resolution is overkill. If you limit yourself to hundredths of a second, you will probably still find that accuracy is more than you need. Do you have an existing timing mechanism? How often do you get results that tie within a hundredth of a second?