
Topic: Important FIX to core library - millis(), micros(), digitalWrite(), pinMode() (Read 8235 times)


Still not seeing it.  The time that millis() or micros() turns off interrupts is MUCH smaller than the time interrupts are disabled by the occurrence of any interrupt service routine.

My comments are essentially identical to what "dBC" said on your link:

Surely there'd have to be two interrupts from the same source during the disabled window before you'd lose anything?   Otherwise the interrupt just gets delayed until they're globally re-enabled.  Looking at millis() for example, it disables them for about 9 cycles, or 562.5 nsecs at 16MHz.    So worst case, if an interrupt fired just as they were disabled, that event would get delayed by 562.5 nsecs.   Is the PLL code really that sensitive that it can't cope with 1/2 usec of jitter in the timer interrupt?

Calling millis() or micros() "very frequently" in the main loop would be an example of the "badly coded" comment I threw out.  Although, it's not clear that there is a great alternative.  I think you'd be better off asking for a delay() or whatever that doesn't call millis/micros.
You haven't found a bug, you've found a requirement of your code that the arduino environment doesn't meet.  Since people who should know seem to define a "hard real time" OS as one with less than 100us of jitter in interrupt response time (http://www.lisha.ufsc.br/wso/wso2009/papers/st04_03.pdf ), even on cpus that are much faster than the AVR, you're likely to have a hard time finding anything better.


Coding Badly is right: the only way to ensure low noise from ADC sample-time jitter is to auto trigger the ADC with Timer/Counter1 Compare Match B.
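A minimal sketch of that arrangement, assuming an ATmega328P at 16 MHz sampling ADC channel 0 at 1600 Hz (the register values and ISR body here are illustrative, not chaveiro's actual code). Because the trigger comes from the timer hardware, the sample point no longer depends on interrupt latency in software:

```c
/* Auto-trigger the ADC from Timer/Counter1 Compare Match B so the
   sample instant is fixed in hardware, independent of ISR latency. */
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t sample;

void adc_timer_trigger_init(void)
{
    /* Timer1 in CTC mode with OCR1A as TOP, prescaler 8:
       16 MHz / 8 / 1250 = 1600 samples per second. */
    TCCR1A = 0;
    TCCR1B = _BV(WGM12) | _BV(CS11);
    OCR1A  = 1249;          /* period: 1250 timer ticks        */
    OCR1B  = 0;             /* Compare B fires once per period */

    /* ADC: AVcc reference, channel 0, auto-trigger source =
       Timer/Counter1 Compare Match B (ADTS2:0 = 101). */
    ADMUX  = _BV(REFS0);
    ADCSRB = _BV(ADTS2) | _BV(ADTS0);
    ADCSRA = _BV(ADEN) | _BV(ADATE) | _BV(ADIE)
           | _BV(ADPS2) | _BV(ADPS1) | _BV(ADPS0);  /* clk/128 */
    sei();
}

ISR(ADC_vect)
{
    sample = ADC;           /* read conversion result */
    TIFR1 = _BV(OCF1B);     /* clear the trigger flag so the next
                               compare match re-triggers the ADC */
}
```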

At the rate chaveiro is sampling, 1600 samples per second, sub-microsecond jitter is needed to avoid jitter noise in the AVR's 10-bit ADC.  The jitter-limited SNR must be greater than about 62 dB to preserve full 10-bit accuracy.

The formula for ADC jitter noise is


SNR = 20*log10(1/(2*pi*f*t))

where f is the sample frequency in Hz and t is the jitter in seconds.

With 0.1 usec of jitter the SNR is about 60 dB, just barely good enough.


SNR = 20*log10(1/(6.28*1600*1e-7)) = 59.96

Here is a calculator for ADC jitter http://www.maximintegrated.com/design/tools/calculators/general-engineering/jitter.cfm

It gives 9.71e-8 seconds max jitter for full accuracy of a 10-bit ADC at 1600 samples per second.
