Hi everyone,
If I want to count 3 hours, is it more accurate to use micros(), or do I get the same result with millis()?
Can someone explain how the two functions count time? I mean, what is the trigger for these two counters?
"is it more accurate"
It depends on the hardware you are using, which you haven't told us.
"if I want to count 3 hours" ... what accuracy do you require? Even for a perfect boiled egg you DON'T need 4 mins +/- 1 microsecond.
micros() is usually implemented as millis() plus the microseconds since the last millis timer tick, so it's not going to be any more accurate over a 3-hour timeframe.
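For a 3-hour interval, plain millis() with unsigned subtraction is the usual approach. Something like this untested sketch (names and baud rate are just placeholders):

```cpp
// Minimal, untested sketch: timing 3 hours with millis().
// Unsigned subtraction keeps the comparison correct even across the ~49.7 day rollover.
const unsigned long THREE_HOURS_MS = 3UL * 60UL * 60UL * 1000UL;  // 10,800,000 ms

unsigned long startMs;

void setup() {
  Serial.begin(9600);
  startMs = millis();              // mark the start of the interval
}

void loop() {
  if (millis() - startMs >= THREE_HOURS_MS) {
    Serial.println("3 hours elapsed");
    startMs += THREE_HOURS_MS;     // schedule the next interval without accumulating drift
  }
}
```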
How accurate does the 3-hour interval need to be? You can get an RTC module, or better yet a board with a built-in RTC, instead.
Or to put it differently, what board do you have?
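For example, with a DS3231 module and Adafruit's RTClib (my assumptions, adapt to whatever RTC and library you actually use), counting 3 hours against the RTC's clock looks roughly like this:

```cpp
#include <Wire.h>
#include <RTClib.h>          // Adafruit RTClib, assuming a DS3231 module

RTC_DS3231 rtc;
uint32_t startEpoch;

void setup() {
  Serial.begin(9600);
  rtc.begin();                           // start talking to the RTC over I2C
  startEpoch = rtc.now().unixtime();     // RTC time in seconds since 1970
}

void loop() {
  if (rtc.now().unixtime() - startEpoch >= 3UL * 60UL * 60UL) {   // 10800 s
    Serial.println("3 hours elapsed (per the RTC)");
    startEpoch += 3UL * 60UL * 60UL;
  }
  delay(1000);                           // checking once a second is plenty
}
```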
All in all, it boils down (see reply #2, second link) to the accuracy of the crystal itself. Is an error of a few seconds per hour OK? If it isn't, why not? What application requires extremely accurate timekeeping over 3 hours?
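For rough scale (nominal datasheet-style figures, not measurements): a typical +/- 50 ppm crystal can drift about 50e-6 x 10800 s ≈ 0.5 s over 3 hours, while a +/- 0.5 % ceramic resonator (used instead of a crystal on some boards) can drift around 0.005 x 10800 s ≈ 54 s.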
It's more precise and it obviously has more resolution but it's not more accurate... And a lot of "things" the processor does take several microseconds so it's hard to control exactly when micros() is read.
I haven't tried this, but if you read and print micros() in a loop you'll get a lot of missing numbers in the sequence, because micros() only gets read once each time through the loop. In other words, although it's continuously counting in the background, it's not being read continuously, and some values will be skipped over.
Note too that, depending on the board, micros() does not increment in steps of 1 (on a 16 MHz AVR board it goes up in steps of 4 µs).
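If you want to see both effects, a quick (untested) test is to print successive micros() readings and the gap between them:

```cpp
// Untested: print consecutive micros() readings and the difference between them.
// On a 16 MHz AVR board micros() only has 4 µs resolution, and Serial printing
// takes far longer than that, so expect large, irregular jumps in the output.
void setup() {
  Serial.begin(115200);
}

void loop() {
  static unsigned long last = 0;
  unsigned long now = micros();
  Serial.print(now);
  Serial.print("  (+");
  Serial.print(now - last);
  Serial.println(" us)");
  last = now;
}
```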