Following on from this thread about converting analogRead results into voltages, I started experimenting with the ADC. The first thing I wanted to investigate was the conversion time.
According to http://arduino.cc/en/Reference/AnalogRead:
It takes about 100 microseconds (0.0001 s) to read an analog input, so the maximum reading rate is about 10,000 times a second.
Well, we know that is a bit of a simplification, because the datasheet says that conversions (apart from the first one after the ADC is turned on) take 13 ADC clock cycles, and because the Arduino core sets the ADC prescaler to 128, that is in fact 13 * 128 CPU clock cycles, namely:
1/16e6 * 128 * 13 = 0.000104
In other words, 104 µs.
However, a quick test does not confirm that:
unsigned long startTime;
unsigned long endTime;

void setup ()
{
  Serial.begin (115200);
  analogRead (0);  // do the first (longer) conversion and discard it

  noInterrupts ();
  startTime = micros ();
  analogRead (0);
  endTime = micros ();
  interrupts ();

  Serial.print ("Time taken = ");
  Serial.println (endTime - startTime);
}  // end of setup

void loop () { }
Output:
Time taken = 112
(Board: Uno R3, IDE 1.0.6, running at 16 MHz)
Now, I know that micros() has a granularity of 4 µs, but that doesn't really account for the difference.
I tried to eliminate the micros() issue by just toggling an output pin like this:
void setup ()
{
  pinMode (8, OUTPUT);
  analogRead (0);  // do the first (longer) conversion and discard it

  noInterrupts ();
  bitSet (PINB, 0);  // toggle D8
  analogRead (0);
  bitSet (PINB, 0);  // toggle D8
  interrupts ();
}  // end of setup

void loop () { }
I measure 111 µs on the oscilloscope between the pulses, so the difference is not just micros() granularity.
Now you might be thinking that maybe my processor clock is slow, but in that case micros() would be slow too and it would cancel out. I tried the second sketch above on a different board with a crystal rather than a resonator, and got the same results.
So maybe the code in analogRead has some overhead, although 8 µs sounds like a lot. So I rewrote the test to use direct port manipulation. I also used Timer 1 with a prescaler of 1 to do the timing, which gives 62.5 ns resolution.
const byte port = 0;  // A0

void setup ()
{
  Serial.begin (115200);
  Serial.println ();
  pinMode (8, OUTPUT);
  analogRead (port);  // initial (longer) conversion

  // reset Timer 1
  TCCR1A = 0;
  TCCR1B = 0;
  TCNT1 = 0;

  ADCSRA = bit (ADEN) | bit (ADIF);  // enable ADC, clear any pending interrupt flag
  // set a2d prescale factor to 128
  // 16 MHz / 128 = 125 kHz, inside the desired 50-200 kHz range.
  ADCSRA |= bit (ADPS0) | bit (ADPS1) | bit (ADPS2);
  ADMUX = bit (REFS0) | (port & 0x07);  // AVcc reference

  unsigned int start = TCNT1;

  noInterrupts ();
  bitSet (PINB, 0);  // toggle D8

  // start Timer 1
  TCCR1B = bit (CS10);  // no prescaling

  // start the conversion
  bitSet (ADCSRA, ADSC);

  // ADSC is cleared when the conversion finishes
  while (bit_is_set (ADCSRA, ADSC))
    { }

  TCCR1B = 0;  // stop Timer 1
  bitSet (PINB, 0);  // toggle D8
  interrupts ();

  unsigned int timeTaken = TCNT1 - start;
  Serial.print ("Time taken = ");
  Serial.println (timeTaken);
}  // end of setup

void loop () { }
I turned interrupts off in case they were affecting the timing.
Results:
Time taken = 1751
At 16 MHz each Timer 1 tick is 1/16 µs, so that is 1751 / 16 = 109.4375 µs. Even allowing for a couple of cycles to start and stop the timer, I can't believe that it has taken some 5.4 µs more than the 104 µs the datasheet predicted.
Can anyone reproduce these results? Have an explanation?