I’m trying to use analogRead() to read a voltage in a project (eventually it will read the battery supply voltage to a 328p). I’m using the usual voltage divider and 1.1v internal reference setup; however, my readings seem a bit low, although at least they are consistent.

If I use R2=4.7k and R1=10k to drop the 3.3v supply (actually ~3.28v) to under the 1.1v reference, I get 3.20v from the sketch. VOUT (which is fed into analog 0) measures 1.04v according to the meter, with a raw analog reading of 992 or 993.

If I use 470k/1M I get 3.17v reported (smoothed with a 100nF cap), VOUT=0.76v according to the meter (odd?!), and a raw analog reading of 982 or 983.

I tried replacing my breadboard power supply with a coin cell that had about 3v left in it, and measured 2.85v.

I wondered if I should divide the analog reading by 1023 or 1024; it doesn’t seem to make a difference. I’d assume 1023, but some sites quote 1024. Should I be multiplying the reading by the 3.3v maximum supply, or perhaps by the ~3.13 of (r2+r1)/r2?

I’m using a mega2560, so I added some code to select the correct reference voltage. The grounds on the mega and breadboard are connected.

Am I missing some maths in my code (I thought of something like the ratio of the 1.1v aref to VOUT?), or are the inaccurate readings due to using 5% resistors or something? Adding 5% to the readings makes them too high.

``````// vout = rb/(ra+rb)*vin
// 1.055 = 4.7k/(10k+4.7k)*3.3
// 1.055 = 470k/(1M+470k)*3.3 with 100nF cap

void setup()
{
  // init serial monitor
  Serial.begin(9600);

  // choose 1.1v aref based on chip
#if defined(__AVR_ATmega2560__)
  analogReference(INTERNAL1V1);
#else
  analogReference(INTERNAL);
#endif
}

void loop()
{
  // read the divider output on analog 0
  int reading = analogRead(A0);
  Serial.print("raw: ");
  Serial.println(reading);

  // convert 0-1023 reading to 0-3.3v input voltage
  float voltage = reading * (3.3 / 1023.0);
  Serial.print("calc: ");
  Serial.println(voltage);

  delay(2000);
}
``````

If I use R1=4.7k and R2=10k, to drop the 3.3v supply (actually ~3.28v) to under the 1.1v reference,

10 / 14.7 x 3.3 = 2.25V

if i use 470k/1M i get 3.17v reported (smoothed with a 100nF cap), VOUT=0.76v according to the meter (odd?!) and raw analog reading of 982 or 983.

``````470000 / 1470000 * 3.3 / 1.1 * 1023 = 981
``````

Sounds OK.

@AWOL: sorry, got my r1/r2 backwards - R1=10k, R2=4.7k, so 4.7/14.7 x 3.3 = 1.055

@Nick, what are you calculating there? It seems like the ratio of VOUT to AREF to get back to the analog reading? But my resistor divider is not returning perfect values: r2/(r1+r2)*3.3 should be 1.055, but I'm getting 1.04 or 0.76.

I wondered if I should divide the analog reading by 1023 or 1024; it doesn't seem to make a difference. I'd assume 1023, but some sites quote 1024.

You divide by 1024 if you are a mathematician or if you truly understand ADC successive approximation. For joe-programmer, however, there is a good reason to use 1023 instead, because that realigns the results with what most programmers really want. This explains it fairly well: http://blog.codeblack.nl/post/NetDuino-Getting-Started-with-ADC.aspx

In theory, you should divide by 1024, because there are that many steps. The maximum digital value of 1023 in that case represents a voltage between 3.2967 and 3.3V, but the formula would give the lower value. In practice it is easier to divide by 1023, so a digital value of 1023 represents the full scale of 3.3V. And with ADC precision errors and a Vref that's probably not exactly 3.3V, the result is pretty accurate.

So, select 1024 or 1023 based upon what you want the software to report...

Ray

In practice it is easier to divide by 1023

sp. "In practice it is easier to divide by 1024"

@ray - I'll have a grok through that link when I've got my thinking hat on.

@awol - yeah, I guess he meant 1024 is easier to divide by programmatically.

In the meantime, I think I've got my issues solved - it seems the 1.1v aref is more like 1.09v on my mega2560, so I've adjusted it in software, and now I'm getting pretty near the vout=1.045 and vin=3.28 my multimeter reads. The annoying thing is that I'll have to use the meter and figure out a good aref offset for each chip/circuit.

``````// vout = rb/(ra+rb)*vin
// 1.055 = 4.7k/(10k+4.7k)*3.3

// measured aref for this mega2560; recalibrate for a 328p
float aref_fix = 1.09;

void setup()
{
  // init serial monitor
  Serial.begin(9600);

  // choose "1.1v" aref based on chip
#if defined(__AVR_ATmega2560__)
  analogReference(INTERNAL1V1);
#else
  analogReference(INTERNAL);
#endif
}

void loop()
{
  // read the divider output on analog 0
  int reading = analogRead(A0);
  Serial.print("reading: ");
  Serial.println(reading);

  // convert 0-1023 reading to fraction of aref
  float vout = (reading * aref_fix) / 1023.0;
  Serial.print("vout: ");
  Serial.println(vout);

  // vin to divider = (r2+r1)/r2 * vout of divider
  float vin = (14.7 / 4.7) * vout;
  Serial.print("vin: ");
  Serial.println(vin);

  delay(2000);
}
``````

gives:

``````reading: 993
vout: 1.05
vin: 3.28
reading: 993
vout: 1.05
vin: 3.28
``````

I'm trying to use analogRead() to read a voltage in a project (eventually it will read the battery supply voltage to a 328p). I'm using the usual voltage divider and 1.1v internal reference setup; however, my readings seem a bit low, although at least they are consistent.

As fun as the /1023 vs /1024 issue is, there are a couple of issues with the AVR hardware that can have a bigger effect on "accuracy variation". First, the internal 1.1vdc voltage reference is a nominal value: the actual reference voltage for your specific chip might be 1.0 to 1.2, or whatever the datasheet tolerance allowance states. And then of course there is the overall AVR ADC accuracy variation specification, which as I recall is +/- 2 LSB. So don't lose sight of what is actually required, which is a known accuracy in your measurements.

I generally test such readings by feeding a known accurate input voltage to the analog input pin, for both the top and bottom of the desired measurement range, and then adjust the voltage-to-units conversion in the sketch to match these 'known' input test voltages, generally by small adjustments to the map() arguments. Another method is to use a 20-turn precision trimming pot as part of your input voltage divider network, which you can 'tweak' to match the reading when inputting known test voltages.

Lefty

@lefty - yes, in my final circuit I'm going to be using a 3.3v boost regulator, which I hope will have quite a stable output and can be used as an external reference rather than this horribly variable 1.1v internal one. I might try that with my current setup actually, as the vin is through a regulated supply at the moment.

1023 or 1024 isn't going to make any real difference in this case, I agree.

I quite like the idea of using the trim pot to adjust the reading rather than tweaking the code for each chip/circuit/psu variant. I'm a software guy at heart, so I tend to scoff at RC circuits and pots when I can fix things without getting the iron hot :grin:

sej7278: @Nick, what are you calculating there? It seems like the ratio of VOUT to AREF to get back to the analog reading? But my resistor divider is not returning perfect values: r2/(r1+r2)*3.3 should be 1.055, but I'm getting 1.04 or 0.76.

``````470000 / 1470000 * 3.3 = 1.055v at the analog pin, right?
``````

With a 1.1v reference voltage you will therefore be reading:

`````` 1.055 / 1.1 = 0.959  (as a ratio)
``````

Since the maximum value from the ADC is 1023, the ratio is multiplied by 1023:

``````0.959 * 1023 = 981
``````

Thus a reading of 981 is to be expected.

And if you multiply by 1024 because there are 1024 steps, then you get:

`````` 0.959 * 1024 = 982
``````

Which you did.

if i use 470k/1M i get 3.17v reported (smoothed with a 100nF cap), VOUT=0.76v according to the meter (odd?!) and raw analog reading of 982 or 983.

VOUT=0.76v according to the meter (odd?!)

With a 1 M resistor in your voltage divider, the meter's internal resistance (say, 5 M) will be appreciable when measuring that voltage.

You need to look at the spec for the meter to see how much it is affecting the reading.

I quite like the idea of using the trim pot to adjust the reading rather than tweaking the code for each chip/circuit/psu variant. I'm a software guy at heart, so I tend to scoff at RC circuits and pots when I can fix things without getting the iron hot

Where I work, we use software calibration. (It's not an Arduino.) There are some commands in firmware that allow us to change variables in FLASH. In one case there are 100 correction-factor variables over a 0-10V range. In most cases there are just gain & offset variables for each range, but there are always at least 10 readings over the DAC or ADC range.

It's a fully-automated process. Our test & calibration software runs on a PC and communicates with the device's firmware. And, there is also a precision DMM connected. The calibration software compares the DMM readings to the UUT readings, updates the calibration/correction variables, and then re-tests. There are multiple DACs & ADCs and multiple ranges, so it takes a few minutes.

I've made some Arduino-based audio-activated lighting effects. In this application I don't care about absolute accuracy, but I use software to automatically adjust the sensitivity & thresholds depending on music volume (and I also automatically switch between the 1.1V and 5V references as needed).