Today I noticed that my timer-IRQ-driven ADC polling is slightly off the mark: it runs ~2-4% slower than I would expect. The issue is not the crystal frequency, because I already checked the timing with all IRQs off and there I get exactly what I expect.
Then I started analyzing and now I have a theory about what happens. I have a timer IRQ at 10 kHz (the standard timer is off). Inside the IRQ I call analogRead().
Inside the standard libraries I found out that this will block for some time due to the following loop:
// ADSC is cleared when the conversion finishes
while (bit_is_set(ADCSRA, ADSC));
And in wiring.c I found
// set a2d prescale factor to 128
// 16 MHz / 128 = 125 KHz, inside the desired 50-200 KHz range.
// XXX: this will not work properly for other clock speeds, and
// this code should use F_CPU to determine the prescale factor.
sbi(ADCSRA, ADPS2);
sbi(ADCSRA, ADPS1);
sbi(ADCSRA, ADPS0);
According to the datasheet a conversion takes 13 ADC clock cycles, and 13 cycles @ 125 kHz is about 104 microseconds. But at 10 kHz the timer period is only 100 microseconds, so the busy-wait in analogRead() can still be blocking when the next timer interrupt is already due. This would explain the effect perfectly well.
The obvious solution of decreasing the sample frequency is not exactly what I would like to do.
Now here come my two questions:
- Is my assessment correct?
- What would be the highest sampling frequency that is guaranteed to give proper timing? 9 kHz? Or is this too aggressive?
Cheers, Udo