Hi
I was wondering, will using Timers instead of micros() be more precise/accurate?
Thank you
Simon
A bit more info:
What I am trying to do is a simple metronome. But I have been recording the output; I tried both with an Arduino Uno (with a resonator) and a (Chinese) Pro Mini, which has a crystal oscillator.
Running the metronome over 5 minutes:
And for a musician, hearing a click even 72ms late is really annoying, trust me!
I think the numbers kinda make sense, since the precision of the resonator is about 0.5% - i.e. 0.005 * 300s = 1.5s over 5min - and I guess the precision of the crystal is 50ppm = 0.005% - i.e. 0.00005 * 300s = 15ms over 5min.
But for now I am using micros() to get the timing, hence my question: will using the timers make it more accurate? In particular, will it get it down to the 15ms precision of the crystal?
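To be concrete, the kind of micros() loop I mean looks roughly like this (the pin and the 120 BPM tempo are just example values):

// Minimal micros()-based metronome sketch - 120 BPM, click pulse on pin 8
const byte CLICK_PIN = 8;
const unsigned long INTERVAL_US = 500000UL;  // 120 BPM = one click every 0.5s

unsigned long nextClick;

void setup() {
  pinMode(CLICK_PIN, OUTPUT);
  nextClick = micros() + INTERVAL_US;
}

void loop() {
  // Compare via subtraction so the micros() rollover (~70 min) is handled
  if ((long)(micros() - nextClick) >= 0) {
    nextClick += INTERVAL_US;       // advance the deadline by an exact amount
    digitalWrite(CLICK_PIN, HIGH);
    delayMicroseconds(1000);        // crude 1 ms click pulse
    digitalWrite(CLICK_PIN, LOW);
  }
}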
Thank you in advance!
"micros()" and "millis()" both use one of the hardware timers under the covers, so there is no particular advantage to explicitly using a timer. The drift you are seeing is almost entirely down to the accuracy of the oscillator.
If you need better time, you need a better oscillator. Some real-time clock modules (e.g. the DS3231) have a clock signal output that can be used as a time base, and these modules have much better oscillator performance than any reasonably priced alternative.
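As a rough, untested sketch of the idea - assuming the module's SQW pin is wired to Uno pin 2 - you can set the DS3231 to output its 1.024 kHz square wave and count edges in an interrupt; 512 ticks is then exactly 0.5s (120 BPM). The register write below is from the DS3231 datasheet (control register 0x0E), so verify it against your module:

#include <Wire.h>

const byte CLICK_PIN = 8;
const unsigned int TICKS_PER_CLICK = 512;   // 512 / 1024 Hz = 0.5 s = 120 BPM

volatile unsigned int ticks = 0;

void onSqwEdge() {            // one falling edge per SQW cycle
  ticks++;
}

void setup() {
  pinMode(CLICK_PIN, OUTPUT);
  pinMode(2, INPUT_PULLUP);   // SQW is open drain, so it needs a pull-up

  // Control register 0x0E: INTCN=0 enables the square wave output,
  // RS2:RS1 = 01 selects 1.024 kHz (per the DS3231 datasheet)
  Wire.begin();
  Wire.beginTransmission(0x68);  // DS3231 I2C address
  Wire.write(0x0E);
  Wire.write(0x08);
  Wire.endTransmission();

  attachInterrupt(digitalPinToInterrupt(2), onSqwEdge, FALLING);
}

void loop() {
  noInterrupts();             // read/modify the counter atomically
  bool due = (ticks >= TICKS_PER_CLICK);
  if (due) ticks -= TICKS_PER_CLICK;
  interrupts();

  if (due) {
    digitalWrite(CLICK_PIN, HIGH);
    delay(1);
    digitalWrite(CLICK_PIN, LOW);
  }
}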
And for a musician, hearing a click even 72ms late is really annoying, trust me!
Late compared to what? No human can keep time to within 240 parts per million.
Thank you both for your answers.
@MrMark: Thanks for the info - kinda what I was thinking. But still, wouldn't using timers and interrupts avoid the small "running code" delay (i.e. the time the rest of the code takes to run before it gets to the "is it time to send a new click" check)?
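Something like this Timer1 sketch is what I had in mind (just a rough sketch for a 16 MHz ATmega328, 120 BPM example):

const byte CLICK_PIN = 8;

void setup() {
  pinMode(CLICK_PIN, OUTPUT);

  // Timer1 in CTC mode: 16 MHz / 256 prescaler = 62500 ticks/s,
  // so 31250 ticks = 0.5 s = 120 BPM
  noInterrupts();
  TCCR1A = 0;
  TCCR1B = (1 << WGM12) | (1 << CS12);  // CTC, prescaler 256
  OCR1A = 31249;                        // 31250 ticks (counter starts at 0)
  TIMSK1 = (1 << OCIE1A);               // enable compare match interrupt
  interrupts();
}

ISR(TIMER1_COMPA_vect) {
  digitalWrite(CLICK_PIN, HIGH);
  delayMicroseconds(1000);              // keep ISRs shorter in real code!
  digitalWrite(CLICK_PIN, LOW);
}

void loop() {}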
@jremington: a valid point, ha ha! But I was thinking of hearing the click together with a backing track, for example, which is played perfectly on the click... And actually, thinking about it, how can a computer (for example) play a track 100% on time? I mean, it must have the same issue I have... o_O?
How do you know the backing track timing is accurate? It can't be any more accurate than the oscillator driving it.
Any time you have two clocks, you have a synchronization problem.
the time the rest of the code takes to run before it gets to the "is it time to send a new click" check
That is a constant delay (usually less than a microsecond) and does not contribute to a cumulative effect.
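To illustrate: as long as the next deadline is advanced by an exact increment instead of being re-read from micros(), the polling latency shifts every click by the same tiny amount but never accumulates. A minimal sketch of the two patterns:

const unsigned long INTERVAL_US = 500000UL;  // example: 120 BPM

unsigned long nextClick;

void doClick() { /* pulse the click pin here */ }

void setup() { nextClick = micros() + INTERVAL_US; }

void loop() {
  if ((long)(micros() - nextClick) >= 0) {
    nextClick += INTERVAL_US;  // exact increment: loop latency stays a
                               // constant offset on every click
    // nextClick = micros() + INTERVAL_US;  // this instead would fold the
                               // latency into each interval and accumulate
    doClick();
  }
}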
Again, a very valid point, mister - like I said, I do not know...
What I did was record the click's output in a DAW (Pro Tools), and then compare the recorded click with the internal click. Oooh, but I think I understand now - are you saying that in that case the precision comes down to the ADC on my computer? i.e. if it does not take exactly 44100 samples per second but maybe per 928ms, then my output looks perfect while in fact the computer is just a little too fast?
Exactly.
The recorder should have a relatively accurate and stable clock, so if you record your metronome and use those clicks, the only further timing errors to expect would be due to the temperature sensitivity of the cheap resonator or crystal in the metronome.
Okay! Nice to know man - thank you again!
Funnily enough, the error (72ms) seems to be quite stable - so I was thinking: maybe simply correct for the error in the code? Or is it a better idea to go with an RTC module - e.g. the DS3231, like MrMark suggested?
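i.e. by correcting in the code I mean something like this (using my measured 72ms over 5min, about 240ppm - the exact factor would obviously need to be calibrated per board):

const byte CLICK_PIN = 8;
const unsigned long NOMINAL_US = 500000UL;  // 120 BPM nominal
// 72ms late over 300s means the clock runs about 240ppm slow, so
// shorten the programmed interval by the same ratio:
const unsigned long INTERVAL_US =
    (unsigned long)(NOMINAL_US * (300.000 / 300.072) + 0.5);  // ~499880 us

unsigned long nextClick;

void setup() {
  pinMode(CLICK_PIN, OUTPUT);
  nextClick = micros() + INTERVAL_US;
}

void loop() {
  if ((long)(micros() - nextClick) >= 0) {
    nextClick += INTERVAL_US;      // same drift-free loop as before,
    digitalWrite(CLICK_PIN, HIGH); // just with the corrected interval
    delayMicroseconds(1000);
    digitalWrite(CLICK_PIN, LOW);
  }
}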
Musical devices don't rely on all their crystals having exactly the same frequency. Instead, all connected devices follow the clock of one device. Say you are editing a video and you want a DAW like Pro Tools or Reaper to play a soundtrack for the video editor to record. You have exact seconds and frame numbers for every important point where a video event and a musical event have to match. You can edit your video and you can edit your music track, but if you just render a wav file from your DAW and import it into the video editor, you might be off by some ms after some minutes. But if you manage to have your DAW read the time code from your video editor, you get them matching perfectly. Or the other way around: have your video editor read the time code from the DAW.
That's great info, Johan, thanks! Kinda what I am starting to understand! I measured the precision of the click coming out of Pro Tools, and it turned out to be even less precise than mine.