Analog conversion (ADC) time doesn't agree with the datasheet

Following on from this thread about converting analogRead results into voltages, I started experimenting with the ADC. The first thing I wanted to investigate was the conversion time.

According to http://arduino.cc/en/Reference/AnalogRead:

It takes about 100 microseconds (0.0001 s) to read an analog input, so the maximum reading rate is about 10,000 times a second.

Well, we know that is a bit of a simplification, because the datasheet says that conversions (apart from the first one after you turn the ADC on) take 13 ADC clock cycles, and because the library sets the prescaler to 128, that is in fact 13 * 128 processor clock cycles, namely:

1/16e6 * 128 * 13 = 0.000104

In other words, 104 µs.

However a quick test does not confirm that:

unsigned long startTime;
unsigned long endTime ;

void setup ()
  {
  Serial.begin (115200);
  analogRead (0);  // first read

  noInterrupts ();
  startTime = micros ();
  analogRead (0);
  endTime = micros ();
  interrupts ();
  Serial.print ("Time taken = ");
  Serial.println (endTime - startTime);
  }  // end of setup

void loop () { }

Output:

Time taken = 112

(Board: Uno R3, IDE 1.0.6, board running at 16 MHz)

Now, I know that micros() has a granularity of 4 µs, but that doesn't really account for the difference.


I tried to eliminate the micros() issue by just toggling an output pin like this:

void setup ()
  {
  pinMode (8, OUTPUT);
  analogRead (0);  // first read

  noInterrupts ();
  bitSet (PINB, 0);  // toggle D8
  analogRead (0);
  bitSet (PINB, 0);  // toggle D8
  interrupts ();
  }  // end of setup

void loop () { }

I measure 111 µs between pulses on the oscilloscope, so the difference is not just micros() granularity.


Now, you might be thinking that maybe my processor clock is slow, but in that case micros() would be slow too, and the errors would cancel out. I also tried the second sketch above on a different board with a crystal rather than a resonator, and got the same results.

So maybe the code in analogRead has some overhead, although 8 µs sounds like a lot. I rewrote the test to use direct port manipulation, and used Timer 1 with a prescaler of 1 to do the timing, giving 62.5 ns resolution.

const   byte port = 0;  // A0

void setup ()
  {
  Serial.begin (115200);
  Serial.println ();
  pinMode (8, OUTPUT);

  analogRead (port);  // initial conversion
  
  // reset Timer 1
  TCCR1A = 0;
  TCCR1B = 0;
  TCNT1 = 0;  

  ADCSRA = bit (ADEN) | bit (ADIF);  // enable ADC, turn off any pending interrupt
  // set the ADC prescale factor to 128
  // 16 MHz / 128 = 125 kHz, inside the desired 50-200 kHz range.
  ADCSRA |= bit (ADPS0) | bit (ADPS1) | bit (ADPS2);   
  ADMUX = bit (REFS0) | (port & 0x07);  // AVcc   

  unsigned int start = TCNT1;
  noInterrupts ();
  bitSet (PINB, 0);    // toggle D8
  
  // start Timer 1
  TCCR1B = bit (CS10);  //  no prescaling
  // start the conversion
  bitSet (ADCSRA, ADSC);
  // ADSC is cleared when the conversion finishes
  while (bit_is_set(ADCSRA, ADSC));
  TCCR1B = 0;    // stop timer 1
  bitSet (PINB, 0);  // toggle D8
  
  interrupts ();
  unsigned int timeTaken = TCNT1 - start;
  Serial.print ("Time taken = ");
  Serial.println (timeTaken);
  }  // end of setup

void loop () { }

I turned interrupts off in case they were affecting the timing.

Results:

Time taken = 1751

At 16 MHz that is 1751 / 16 µs, which is 109.4375 µs. Even allowing for a couple of cycles to start and stop the timer, I can't believe it took more than 5 µs longer than the datasheet predicted.


Can anyone reproduce these results? Have an explanation?

It's not an explanation, but the first thing I'd try if I were investigating is to change the prescaler, and see if the difference between expectation and reality was constant or scaled with the conversion time...

Following on from the above sketch, if I loop testing times like this:

const   byte port = 0;  // A0

void setup ()
  {
  Serial.begin (115200);
  Serial.println ();
  pinMode (8, OUTPUT);

  analogRead (port);  // initial conversion
  
  // reset Timer 1
  TCCR1A = 0;
  TCCR1B = 0;
  TCNT1 = 0;  

  ADCSRA = bit (ADEN) | bit (ADIF);  // enable ADC, turn off any pending interrupt
  // set the ADC prescale factor to 128
  // 16 MHz / 128 = 125 kHz, inside the desired 50-200 kHz range.
  ADCSRA |= bit (ADPS0) | bit (ADPS1) | bit (ADPS2);   
  ADMUX = bit (REFS0) | (port & 0x07);  // AVcc   

  for (int i = 0; i < 10; i++)
    {
    unsigned int start = TCNT1;
    noInterrupts ();
    bitSet (PINB, 0);    // toggle D8
    
    // start Timer 1
    TCCR1B = bit (CS10);  //  no prescaling
    // start the conversion
    bitSet (ADCSRA, ADSC);
    // ADSC is cleared when the conversion finishes
    while (bit_is_set(ADCSRA, ADSC));
    TCCR1B = 0;    // stop timer 1
    bitSet (PINB, 0);  // toggle D8
    unsigned int timeTaken = TCNT1 - start;
    
    interrupts ();
    Serial.print ("Time taken = ");
    Serial.println (timeTaken);
    Serial.flush ();
    }
    
  
  }  // end of setup

void loop () { }

I now get different values:

Time taken = 1751
Time taken = 1796
Time taken = 1706
Time taken = 1701
Time taken = 1701
Time taken = 1696
Time taken = 1706
Time taken = 1716
Time taken = 1706
Time taken = 1741

That's a lowest of 1696 and a highest of 1796: a hundred clock cycles of difference! I can accept that it might take 8 or so cycles to detect that the conversion has finished and break out of the loop, but not 100.

DrAzzy:
It's not an explanation, but the first thing I'd try if I were investigating is to change the prescaler, and see if the difference between expectation and reality was constant or scaled with the conversion time...

Good thinking, 99! And in fact I had been trying that:

const   byte port = 0;  // A0

void doConversion (const int which, const byte prescaler)
  {
  // reset Timer 1
  TCCR1A = 0;
  TCCR1B = 0;

  ADCSRA = bit (ADEN);
  // set the ADC prescale factor to the value passed in
  ADCSRA |=  prescaler;  
  ADMUX = bit (REFS0) | (port & 0x07);  // AVcc   

  noInterrupts ();
  bitSet (PINB, 0);    // toggle D8
  
  // start Timer 1
  TCCR1B = bit (CS10);  //  no prescaling
  unsigned int start = TCNT1;
  // start the conversion
  bitSet (ADCSRA, ADSC);
  // ADSC is cleared when the conversion finishes
  while (bit_is_set(ADCSRA, ADSC));
  unsigned int finish = TCNT1;
  TCCR1B = 0;    // stop timer 1
  bitSet (PINB, 0);  // toggle D8
  
  interrupts ();
  unsigned int timeTaken = finish - start;
  Serial.print ("Time taken for prescaler of ");
  Serial.print (which);
  Serial.print (" = ");
  Serial.print (timeTaken);
  Serial.print (", expected = ");
  Serial.print (13 * which);
  Serial.print (", discrepancy = ");
  Serial.println (timeTaken - (13 * which));
//  Serial.print ("Result = ");
//  Serial.println (ADC);    
  Serial.flush ();
  }
  
  
void setup ()
  {
  Serial.begin (115200);
  Serial.println ();
  pinMode (8, OUTPUT);

  analogRead (port);  // initial conversion
  
  doConversion (1, 0);
  doConversion (2,   bit (ADPS0) );
  doConversion (4,   bit (ADPS1) );
  doConversion (8,   bit (ADPS0) |  bit (ADPS1) );
  doConversion (16,  bit (ADPS2));
  doConversion (32,  bit (ADPS0) | bit (ADPS2));
  doConversion (64,  bit (ADPS1) | bit (ADPS2));
  doConversion (128, bit (ADPS0) |  bit (ADPS1) | bit (ADPS2));

  }  // end of setup

void loop () { }

Output:

Time taken for prescaler of 1 = 43, expected = 13, discrepancy = 30
Time taken for prescaler of 2 = 43, expected = 26, discrepancy = 17
Time taken for prescaler of 4 = 68, expected = 52, discrepancy = 16
Time taken for prescaler of 8 = 123, expected = 104, discrepancy = 19
Time taken for prescaler of 16 = 238, expected = 208, discrepancy = 30
Time taken for prescaler of 32 = 458, expected = 416, discrepancy = 42
Time taken for prescaler of 64 = 848, expected = 832, discrepancy = 16
Time taken for prescaler of 128 = 1728, expected = 1664, discrepancy = 64

The datasheet quotes two different prescaler bit settings that both give a division factor of 2, so I labelled one of them "1", just for display purposes.

There seems to be a big jump in the discrepancy for the prescaler of 128.

OK, I have a theory.

Rewriting the sketch above, which took 10 readings, to take 100 readings and save the lowest and highest timings:

#define nop asm volatile ("nop\n\t")

const   byte port = 0;  // A0
unsigned int lowest = 0xFFFF;
unsigned int highest = 0;

void setup ()
  {
  Serial.begin (115200);
  Serial.println ();
  pinMode (8, OUTPUT);

  analogRead (port);  // initial conversion
  
  // reset Timer 1
  TCCR1A = 0;
  TCCR1B = 0;
  TCNT1 = 0;  

  ADCSRA = bit (ADEN) | bit (ADIF);  // enable ADC, turn off any pending interrupt
  // set the ADC prescale factor to 128
  // 16 MHz / 128 = 125 kHz, inside the desired 50-200 kHz range.
  ADCSRA |= bit (ADPS0) | bit (ADPS1) | bit (ADPS2);   
  ADMUX = bit (REFS0) | (port & 0x07);  // AVcc   

  for (int i = 0; i < 100; i++)
    {
    noInterrupts ();
    nop;

    unsigned int start = TCNT1;
    bitSet (PINB, 0);    // toggle D8
    
    // start Timer 1
    TCCR1B = bit (CS10);  //  no prescaling
    // start the conversion
    bitSet (ADCSRA, ADSC);
    // ADSC is cleared when the conversion finishes
    while (bit_is_set(ADCSRA, ADSC));
    TCCR1B = 0;    // stop timer 1
    bitSet (PINB, 0);  // toggle D8
    unsigned int timeTaken = TCNT1 - start;
    lowest = min (lowest, timeTaken);
    highest = max (highest, timeTaken);
    
    interrupts ();
    Serial.print ("Time taken = ");
    Serial.println (timeTaken);
    Serial.flush ();
    }
    
  
  Serial.print ("Lowest = ");
  Serial.println (lowest);
  Serial.print ("Highest = ");
  Serial.println (highest);
  Serial.print ("Difference = ");
  Serial.println (highest - lowest);
  }  // end of setup

void loop () { }

With a prescaler of 128:

Lowest = 1676
Highest = 1806
Difference = 130

With a prescaler of 64:

Lowest = 846
Highest = 911
Difference = 65

With a prescaler of 32:

Lowest = 431
Highest = 461
Difference = 30

With a prescaler of 16:

Lowest = 221
Highest = 236
Difference = 15

It seems that the spread is at most the prescaler amount, give or take a couple of cycles for the time taken to break out of the loop that detects the conversion has finished.

So here's my theory:

When you start an ADC conversion, the ADC starts converting (doing its 13 cycles) on the next ADC clock cycle, not the next processor cycle.

So, if the ADC clock has just ticked, and we have a prescaler of 128, the ADC has to wait up to 127 processor cycles before the conversion starts, hence a worst-case extra delay of 127 processor clock cycles.

As an analogy, say the Post Office promises to deliver your letter within 12 hours provided you post it by 8 pm. If you post it at 8 pm and it arrives at 8 am the next day, they met that promise. But if you post it at 10 am, the letter takes extra time (another 10 hours), because the postal truck doesn't collect the letters until 8 pm.

So it seems to me that the datasheet is understating the conversion times a bit, because it doesn't mention (that I could see) that the conversion time might be lengthened by the need to synchronize with the ADC clock.

Does this explanation sound reasonable?

Yes.

I have found a reference in the datasheet, now that I knew what to look for:

When initiating a single ended conversion by setting the ADSC bit in ADCSRA, the conversion starts at the following rising edge of the ADC clock cycle.

In other words, the conversion doesn't start immediately; it waits for a rising edge on the ADC clock.

And the diagram in section 23.4 makes it clear that the ADC clock is simply the processor clock divided down by the prescaler. Therefore, at a prescaler of 128, the ADC clock only "ticks" every 128 processor cycles (presumably counting from the moment the ADEN bit is asserted).