Faster Analog Read?

Bear in mind that the Arduino digital and analog read/write abstractions that make this platform so easy to use do add execution time compared to what could be achieved by using low-level code to access the hardware directly. The Arduino functions will be slower than the timings quoted above.

For example, the Arduino digitalRead function first does a lookup to convert the Arduino pin number to an actual port and pin. It then disables any PWM function that might be running on that pin. And finally, it executes another dozen or so instructions to actually read the port. I would expect each digitalRead to take well over twenty times longer than directly accessing a specific low-level port pin.

analogRead also does the Arduino pin-mapping lookup, and it sets the analog reference bits each time it is called, although this probably represents a small fraction of the total ADC conversion time.

If your application does need digital read times under a microsecond, you can read more about direct port manipulation on the Arduino Reference page.
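For a rough idea of what that looks like, here is a minimal sketch (assuming an ATmega168/328-based board, where Arduino pin 2 is bit 2 of port D):

// Direct read of Arduino digital pin 2 (PD2 on ATmega168/328-based boards).
// PIND is the input register for port D; reading it compiles to a couple of instructions.
void setup() {
  pinMode(2, INPUT);           // one-time setup can still use the normal Arduino call
  Serial.begin(9600);
}

void loop() {
  int val = (PIND & _BV(2)) ? HIGH : LOW;   // equivalent of digitalRead(2)
  Serial.println(val);
  delay(500);
}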

Wow. Thank you jmknapp and oracle.
So if I just include your defines in the top of my code and the function calls in my setup, I should be good to go? As in, analogReads will just return a lot faster (at a bit lower precision)?
Chris

Yes, analogRead() will just return faster if you set the prescale; I've tried it. Here's a little test program that shows the effect:

#define FASTADC 1

// defines for setting and clearing register bits
#ifndef cbi
#define cbi(sfr, bit) (_SFR_BYTE(sfr) &= ~_BV(bit))
#endif
#ifndef sbi
#define sbi(sfr, bit) (_SFR_BYTE(sfr) |= _BV(bit))
#endif

void setup() {
  unsigned long start ;
  int i ;
  
#if FASTADC
  // set prescale to 16
  sbi(ADCSRA,ADPS2) ;
  cbi(ADCSRA,ADPS1) ;
  cbi(ADCSRA,ADPS0) ;
#endif

  Serial.begin(9600) ;
  Serial.print("ADCTEST: ") ;
  start = millis() ;
  for (i = 0 ; i < 1000 ; i++)
    analogRead(0) ;
  Serial.print(millis() - start) ;
  Serial.println(" msec (1000 calls)") ;
}

void loop() {
}

As it stands above, with FASTADC defined as 1, the 1000 calls to analogRead() take 16 msec (16 microseconds per call). With FASTADC defined as 0, the default prescale gives 111 microseconds per call.

Personally I've been looking into ways to make ADC processing less expensive in CPU cycles, to avoid calling analogRead() which burns over 100 microseconds sitting in a loop waiting for the conversion to finish. Turns out there's a mode where the conversion finishing can generate an interrupt, so that's the route I'm going, so the processor can be doing other things during the conversion (as long as the result is not needed right away!).
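For anyone curious, a rough sketch of that interrupt-driven approach might look like this (my own outline, not tested code from this thread; it assumes an ATmega168/328, the AVcc reference, and ADC channel 0):

#include <avr/interrupt.h>

volatile int adcValue;            // last completed conversion
volatile bool adcReady = false;

void setup() {
  Serial.begin(9600);
  ADMUX  = _BV(REFS0);                           // AVcc reference, channel 0
  ADCSRA = _BV(ADEN) | _BV(ADIE)                 // enable the ADC and its interrupt
         | _BV(ADPS2) | _BV(ADPS1) | _BV(ADPS0); // prescale 128 (the default rate)
  sei();
  ADCSRA |= _BV(ADSC);                           // kick off the first conversion
}

ISR(ADC_vect) {
  adcValue = ADC;        // read the 10-bit result (ADCL then ADCH)
  adcReady = true;
}

void loop() {
  // do other work here; the conversion runs in the background
  if (adcReady) {
    adcReady = false;
    Serial.println(adcValue);
    ADCSRA |= _BV(ADSC); // start the next conversion
  }
}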

As for how the above code relates to a prescale value of 16, that comes from a table in the ATmega datasheet:
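The table didn't make it into this post, but the relevant part of the ATmega datasheet boils down to the three ADPS bits in ADCSRA:

ADPS2  ADPS1  ADPS0   Division factor
  0      0      0            2
  0      0      1            2
  0      1      0            4
  0      1      1            8
  1      0      0           16
  1      0      1           32
  1      1      0           64
  1      1      1          128

So sbi(ADCSRA,ADPS2); cbi(ADCSRA,ADPS1); cbi(ADCSRA,ADPS0); is the 1-0-0 row, i.e. divide-by-16.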

Do you guys have a sense of how much the resolution or accuracy of the inputs vary as the speed increases? I'm happy to decrease the default prescale factor if we still get accurate readings. (I think it's better in most cases to be accurate and a bit slow than the other way around.) Any thoughts on which value provides the best balance of accuracy and speed?

If the tradeoff isn't too bad, you can add a new function, something like analogReadFast, and leave the existing functionality unchanged.

I don't know how much the accuracy drops if you exceed the recommended rates in the datasheet, but I would think it would not be good practice to add functionality to the standard Arduino platform that clocked the ADC faster than the manufacturer's recommendation. And if my math is correct, at 16 MHz a prescale of 64 or less will exceed the 200 kHz maximum ADC clock rate stated in the datasheet.
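Spelling out that arithmetic with the 16 MHz Arduino clock:

16 MHz / 128 = 125 kHz  (the default, within the 200 kHz recommendation)
16 MHz /  64 = 250 kHz  (already over 200 kHz)
16 MHz /  32 = 500 kHz
16 MHz /  16 =   1 MHz

so the stock prescale of 128 is the only setting that stays under the 200 kHz figure.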

Perhaps someone who has the time to explore and document the effect on accuracy could post some information and code in the playground.

Oracle, did you have a chance to do any tests on accuracy when you ran the speed tests?

When I was doing the speed tests, I didn't know there was a prescaler for the ADC speed. I thought it was fixed for the chip, so it didn't occur to me to run speed or accuracy tests on that part.

It probably wouldn't have mattered to me since I knew my bottleneck was the LCD. And my project is working perfectly now with the timer interrupt library from playground, so I didn't actually need to speed up my loop.

I would think it would not be good practice to add functionality to the standard Arduino platform that clocked the ADC faster than the manufacturer's recommendation.

Still, the Atmel document referred to above (http://www.atmel.com/dyn/resources/prod_documents/DOC2559.PDF) says:

The ADC accuracy also depends on the ADC clock. The recommended maximum ADC clock frequency is limited by the internal DAC in the conversion circuitry. For optimum performance, the ADC clock should not exceed 200 kHz. However, frequencies up to 1 MHz do not reduce the ADC resolution significantly. Operating the ADC with frequencies greater than 1 MHz is not characterized.

When using single-ended mode, the ADC bandwidth is limited by the ADC clock speed. Since one conversion takes 13 ADC clock cycles, a maximum ADC clock of 1 MHz means approximately 77k samples per second. This limits the bandwidth in single-ended mode to 38.5 kHz, according to the Nyquist sampling theorem.

To me, that's as good as a green light, direct from the manufacturer, to go to 1 MHz and a 77k sample rate--with no significant effect on resolution.
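For what it's worth, that 1 MHz figure also lines up with the measurement earlier in the thread:

16 MHz / 16 = 1 MHz ADC clock
13 ADC clocks per conversion at 1 MHz = 13 microseconds
measured: about 16 microseconds per analogRead() call

so at a prescale of 16 the conversion itself dominates, and the Arduino overhead is only a few microseconds per call.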

I wonder if the use of the term resolution rather than accuracy is significant. That note discusses calibrating an ADC; it is not clear what errors an uncalibrated ADC will have if used outside the recommended clock speeds. It could well be no problem to do so, but I would not take that app note as a green light to exceed the specs in the datasheet without some real-world testing.

But the note says that calibration applies only to differential ADC mode:

For most applications, the ADC needs no calibration when using single ended conversion. The typical accuracy is 1-2 LSB, and it is often neither necessary nor practical to calibrate for better accuracies.

However, when using differential conversion the situation changes, especially with high gain settings.

I believe the note uses accuracy/resolution interchangeably:

The AVR uses the test fixture's high accuracy DAC (e.g. 16-bit resolution) to generate input voltages to the calibration algorithm.

There are standalone ADC chips which are a lot faster, but I have no experience with them.

Does anyone have experience with ADC chips?

Also, would using a multiplexer speed up analog reads?

I have used the MUX shield to read 16 analog pots, one every 16th cycle of the main loop (which also counts through the MUX output). Previously I had used separate analog inputs, each also on a main-loop-based counter.

Only taking one reading per main loop cycle made a huge difference, and this ought to help it even more.
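For anyone who wants to try the same round-robin approach, here's a rough outline (the select pins, channel count, and analog input are placeholders for whatever your mux board actually uses):

// One multiplexer channel read per pass through loop().
// Select-pin numbers and wiring are hypothetical; adjust for your own setup.
const int selectPin[4] = {2, 3, 4, 5};   // S0..S3 of a 16-channel analog mux
const int muxInput = A0;                 // mux common output into analog input 0
int potValue[16];                        // last reading for each channel
byte channel = 0;

void setup() {
  for (int i = 0; i < 4; i++)
    pinMode(selectPin[i], OUTPUT);
}

void loop() {
  // set the mux address for the current channel
  for (int i = 0; i < 4; i++)
    digitalWrite(selectPin[i], (channel >> i) & 1);
  // (a short settling delay after switching may help, depending on the mux and source impedance)
  potValue[channel] = analogRead(muxInput);
  channel = (channel + 1) % 16;          // move on to the next channel next time around

  // ... the rest of the main loop works from potValue[] ...
}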

edit:
OK, I got home and tried it. My ISR which generates sound uses about 40% of the processing time, and I have a lot of stuff going on in my main loop, with one analogRead per loop (different pots through the MUX).

The test bit was cycling at 460 Hz. I changed 3 analogWrites to OCRxx writes and some digitalWrites to direct bit writes to the ports, and the speed improved to 480 Hz. Then I did this prescale tweak and the speed improved to 530 Hz. So even with all the other things happening in my loop (quite a lot, including a lot of map() and other multiplication and division), that's quite a nice improvement!
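In case it helps anyone, here's roughly what those two substitutions look like (pin and timer choices are only examples; which OCRxx register applies depends on which PWM pin you use, e.g. pin 9 is OC1A and pin 8 is PB0 on ATmega168/328-based boards):

byte duty = 0;

void setup() {
  pinMode(8, OUTPUT);
  pinMode(9, OUTPUT);
  analogWrite(9, 128);   // one normal call so the core connects pin 9 to Timer1's compare output
}

void loop() {
  OCR1A = duty++;        // instead of analogWrite(9, duty): update the compare register directly
  PORTB |= _BV(0);       // instead of digitalWrite(8, HIGH)
  PORTB &= ~_BV(0);      // instead of digitalWrite(8, LOW)  (makes a very short pulse on pin 8)
  delay(10);
}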

edit:
Oh yes, I don't notice any difference in the accuracy of the potentiometers. As far as I can tell, I can't hear any wavering of the controls when setting the filter frequency or whatever.

I know this is an old thread, but I want to thank anyone who is still around for contributing. I ran the Duemilanove using a prescale of 16 (thanks jmknapp!) and performed an analogRead on a 1 kHz signal. Here is the result (100 samples of a 1 kHz sine wave):

That gives a 56k sample rate! Next up: prescale of 16, a 20 MHz crystal, and I2C to move the data. We're very close to a very usable $30 DAQ! Thanks guys.

Just for comparison, here are the same 100 samples of a 1 kHz sine wave using the normal prescale of 128 (it automatically resets itself back to 128 when the Arduino reboots/resets).

Here's the code ...

/*
  Analog Input with prescale change
  Reading a 1 kHz sine wave, 0 to 5 volts
  Using analog 0
  Results stored in memory for highest speed
  using code from:
  http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1208715493/11
  with special thanks to jmknapp
 */

#define FASTADC 1
// defines for setting and clearing register bits
#ifndef cbi
#define cbi(sfr, bit) (_SFR_BYTE(sfr) &= ~_BV(bit))
#endif
#ifndef sbi
#define sbi(sfr, bit) (_SFR_BYTE(sfr) |= _BV(bit))
#endif

int value[100];   // array to store the readings from the sensor
int i = 0;

void setup()
{
  Serial.begin(9600);

#if FASTADC
  // set prescale to 16
  sbi(ADCSRA, ADPS2);
  cbi(ADCSRA, ADPS1);
  cbi(ADCSRA, ADPS0);
#endif
}

void loop()
{
  // grab 100 samples back to back, then print them
  for (i = 0; i < 100; i++)
  {
    value[i] = analogRead(0);
  }
  for (i = 0; i < 100; i++)
  {
    Serial.println(value[i]);
  }
  Serial.println();
  Serial.println();
  Serial.println();
  delay(5000);
}

You'll (hopefully) find continuing progress at:

Thanks to everyone on this thread. I'm getting my first Arduino in a couple of days, and I need to read 32 analog inputs several hundred times quickly. The prescale factor of 16 will make this possible. I will try to post how this turns out when/if I get it working.

I know this is an old thread, but I'm looking at speeding up the sampling time for analogRead also and have a couple of questions. The datasheet talks about increasing the ADC clock to 1 MHz, but it may have an impact on resolution. The datasheet alludes to the fact that the reason behind using the default 200 kHz clock is that it is optimized around the connected voltage source having an output impedance of 10k or less. Is it reasonable to assume that you might be able to maintain the same resolution at higher speeds if the output impedance of the voltage source is significantly lower? Is the resolution loss a function of capacitance charge time in the sample-and-hold circuitry? I'm going to be connecting a MAX4372 to the Arduino, and it has an output impedance of 1.5 ohms, so I'm thinking I should be OK with increasing the frequency to a higher speed. Anyone have any thoughts on this matter? Thanks.

Hi!
I'm very interested in fast analog ports on the Arduino too.
So I tried some speed tests on this instruction: v = analogRead(pin); (v and pin are ints).

Prescaler   Maximum sampling frequency
16          62.5 kHz
32          33.2 kHz
64          17.8 kHz
128          8.9 kHz

To get the maximum precision, I would use just the sampling speed I need, not more. I didn't test at prescalers lower than 16 because of the datasheet's 1 MHz limit on the ADC clock.

Is it possible to get an analog result faster than 13 ADC clocks, say if less resolution is required? Like, maybe 10 or 11 clocks for 8 bit accuracy?

Also, it appears the 1MHz ADC clock rate "limit" is a soft limit. The spec sheet says that faster ADC clock rates haven't been characterized, but it doesn't say it won't work at all.

I want to push this to the limit because I intend to try to get at least 8 analog signals (16 if I go for the MEGA) streaming into my computer as fast as possible... hopefully 10kHz each. May take some low-level picking around (especially to stream it reliably over USB... that will be a challenge!), but this is my goal.

Does Arduino have separate sample-and-hold circuitry for each analog input pin, allowing simultaneous sampling of all the analog input pins? This would be really nice (though not absolutely required).

Is it possible to get an analog result faster than 13 ADC clocks

No, not according to the datasheet.

Does Arduino have separate sample-and-hold circuitry for each analog input pin

No. The channels are multiplexed. I wonder if you might have better luck with an external A/D converter.

On this thread, someone succeeded in analog sampling at 360 kHz (actually, he was operating the ADC at 8 MHz--instead of the usual 1 MHz--to achieve 360 kHz), though I believe at reduced precision:
http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1242967991
(I believe they also achieved 10-bit precision at 124 kHz.)
Also of note is that gabebear is sending data at 2 Mbit/s.

These figures are what we should be shooting for. Now that it has been shown to be possible, we can do it.

Since it appears the analog input pins don't each have dedicated sample-and-hold circuitry, I'll have to live with non-simultaneous sampling for multiple inputs. That's okay.

As far as using a dedicated ADC... As long as I'm running it through an Arduino, I don't think it will help my performance considerably, since it still has to be put through the serial connection, which seems to be able to do 2Mbit/s if you tweak it just right (but no more, at least not considerably more).

After thinking about it for a little while, I realized that my application (a phased array of microphones, with the processing done on the PC-side) doesn't require simultaneous sampling, as long as I know which analog input is being sampled and at what time. I would still have to interpolate between samples when combining the signals even if simultaneous sampling were possible.

And I've looked at different A/D chips and acquisition interfaces. No commercially sold and packaged DAQ device comes close to an Arduino (IF I can sample 8 analog signals at at least 8 bits and, say, 8 kHz each, while streaming to my PC at almost 2 Mbit/s) or some similar kind of microcontroller for a similar price. Since the Arduino seems to have such a rich community, and because I like the relatively easy-to-learn environment, and because it's so cheap, I think it's a pretty good fit. I hope to later do some similar work with ultrasound in the >1 MHz range, but for that I will need a different microcontroller (or, actually, many microcontrollers, with the data captured initially onto their on-board SRAM, since I don't think I'll be able to stream that much data continuously at full duty cycle), since the typical Arduino-type microcontrollers don't support that many samples per second.

If I had thousands of dollars to spend, I would just buy myself some NI high-speed DAQ equipment and LabVIEW (that's what I did before... I'm trying to replicate my senior thesis project using equipment that I own instead of my alma mater's equipment, which I no longer have access to). But I don't have much of a disposable income right now.

Eventually, though, I'd like to get into the >100 MHz range so I can do some phased-array imaging with radio waves (underground imaging for amateur archeology). But that is clearly out of the realm of microcontrollers. I will need either a different approach or more money for that. Perhaps an Arduino driving some programmable analog delay lines.

DAQ equipment (and software like Labview) is expensive. Arduino and associated software are far cheaper.