
Topic: How to test actual ADC resolution?


A while back, some folks went over the datasheets for the Atmel ATmega328 and came to the conclusion that if Atmel sanctions speeds up to 1 MHz for the ADC clock without significant loss of resolution, then it must be OK to use a 16:1 prescaler on a 16 MHz CPU, for example.

However, I have also read that faster reads may cost resolution on the part of the ADC, i.e., you might end up with a net 8-bit ADC in terms of resolution/accuracy instead of a 10-bit one. So I wonder if anyone has conducted any tests? If so, how did you test and what were the results? I presume there is a reason the Arduino core chose the most conservative ADC clock prescaler of 128...


Haven't conducted any tests, but I'm wondering where the "1 MHz without significant loss of resolution" info comes from. Looking at Table 29-16 of the ATmega328P datasheet, they quote 2 LSBs (typical) of absolute accuracy at an ADC clock of 200 kHz, but 4.5 LSBs (typical) at an ADC clock of 1 MHz.

2 LSBs of accuracy is like saying the 10-bit ADC is really an 8-bit ADC. 4.5 LSBs is like losing another bit (almost). Is that a big deal? It depends on your application. In the audio world, 8-bit resolution is pretty bad but serviceable, while 7-bit would be downright annoying (to me). In other applications it just wouldn't matter.

The Gadget Shield: accelerometer, RGB LED, IR transmit/receive, speaker, microphone, light sensor, potentiometer, pushbuttons


Thanks for the reply!

I based my statement on some forum posts from others, but looking through the ADC section of the full datasheet, I come to the same conclusion you did. I will try my hand at oversampling and decimating to increase resolution, but there are limits to that too.

BTW, your products are great!
