I have a voltage divider set up on my Arduino to read voltages up to almost 14 volts (100K and 56K resistors, plus a 10uF cap to filter noise). However, when I measure voltages from batteries or from the Arduino's 5V output, the Arduino's reading always differs from the two multimeters I tested against.
A 4xAA NiMH battery pack measures 5.03v on the Arduino, 5.41v on one multimeter, and 5.40v on the other multimeter. If I run the Arduino's 5v output through the voltage divider and into analog 0 I read 4.98v, while it reads 5.37v on one multimeter, and 5.59v on the other. Which of these voltage readings is correct? I first thought the Arduino reading was the correct one, since the 5v output actually read as 4.98v, but I got confused when both multimeters read around 5.4v instead.
Here's the sketch:
float v = 0;

void setup()
{
  pinMode(10, INPUT);  // note: unused by this sketch; analog pins need no pinMode
  Serial.begin(9600);
}

void loop()
{
  delay(100);
  read_voltage();
}

void read_voltage()
{
  v = analogRead(0);          // raw 10-bit reading from A0 (0-1023)
  v *= 0.0136154168412233;    // volts per count, derived below
  Serial.print("Voltage: ");
  Serial.println(v);
}
The math for the voltage divider:

56000 / 156000 = 0.358974358974359   (divider ratio, R2 / (R1 + R2))
5 / 0.358974358974359 = 13.92857142857143   (full-scale input voltage)
13.92857142857143 / 1023 = 0.0136154168412233   (volts per ADC count)
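(The same math can be expressed with named constants so the magic number documents itself; the names below are just illustrative.)

const float R_TOP    = 100000.0;  // divider resistor from battery to A0
const float R_BOTTOM = 56000.0;   // divider resistor from A0 to ground
const float VREF     = 5.0;       // assumed ADC reference voltage
// volts per ADC count = VREF / 1023 * (R_TOP + R_BOTTOM) / R_BOTTOM = 0.0136154...
const float SCALE    = VREF / 1023.0 * (R_TOP + R_BOTTOM) / R_BOTTOM;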
Apologies if this is a dumb question; I'm new to electronics and still have a lot to learn! I plan on building a 12V boat battery charge controller, so I need voltage readings that are as accurate as possible.
One thing about the math: the resistors all have an accuracy tolerance, so you can't calculate the expected voltage from their stamped values alone. You would have to measure them with a known-accurate ohmmeter and work from the measured values.
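For example, with made-up measured values of 98.7K and 56.4K, the factor shifts noticeably:

56400 / 155100 = 0.363636...
5 / 0.363636... = 13.75
13.75 / 1023 = 0.0134409

That is about 1.3% away from the stamped-value factor of 0.0136154, which already moves a 5.4V reading by about 0.07V.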
As for how to deal with two meters and one Arduino A/D measurement, there's an old Chinese saying: a man with one watch always knows the time; a man with two watches is never quite sure.
As the Arduino A/D is only 10 bits and its published accuracy is +/-2 LSB, I would trust your meters more than the Arduino value. The meters themselves have some accuracy error of their own, and unless they have been calibrated recently you can't really know for sure. One could keep a stable reference voltage around to check meters against; in the old days we used to use a mercury cell as a reference, since they had good stability.
Anyway, welcome to the world of instrumentation, where we have so many watches we never really know what time it is.
A 4xAA NiMH battery pack measures 5.03v on the Arduino, 5.41v on one multimeter, and 5.40v on the other multimeter. If I run the Arduino's 5v output through the voltage divider and into analog 0 I read 4.98v, while it reads 5.37v on one multimeter
It seems to me that this might be because you were not supplying exactly 5.0V to the Arduino. My understanding is that by default the internal voltage reference is not used, so the analog reading depends on what Vcc actually is: the 10-bit ADC value is 1024 * (Vi / Vcc).
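To put numbers on that (taking the meters as roughly correct): the divider puts 5.41 * 56000/156000 = 1.94V at A0; a reported 5.03V means a raw count of about 5.03 / 0.0136154 = 369; and solving 1023 * 1.94 / 369 gives roughly 5.38V for Vcc. So the USB rail was likely sitting near 5.4V, not 5.0V.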
I'd suggest you try a couple of things.
Try measuring the Vcc and take that into consideration to see if the result matches the multimeter readings.
Alternatively, you can activate the internal reference and use that as an accurate voltage reference point.
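A minimal sketch of the first suggestion, assuming an ATmega328-based board (Uno/Nano): the ADC can measure the chip's internal ~1.1V bandgap against Vcc, which lets you solve for Vcc. Note the 1.1V figure is only nominal and varies a little from chip to chip, so the constant is worth calibrating against a trusted meter once:

long readVcc()
{
  // Select the internal 1.1V bandgap as the ADC input, with AVcc as reference
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                         // let the reference settle
  ADCSRA |= _BV(ADSC);              // start a conversion
  while (bit_is_set(ADCSRA, ADSC))  // wait for it to finish
    ;
  long result = ADCL;               // ADCL must be read before ADCH
  result |= ADCH << 8;
  return 1125300L / result;         // Vcc in mV; 1125300 = 1.1 * 1023 * 1000 (nominal)
}

The divider reading can then be scaled by the measured supply instead of an assumed 5.0V:

v = analogRead(0) * (readVcc() / 1000.0 / 1023.0) * (156000.0 / 56000.0);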
I'd be astonished if both multimeters were that far out by the same amount. Your Arduino +5V clearly isn't +5.00V. Are you using the onboard regulator?
I used the USB port to power the Arduino. Just out of curiosity I tried powering it from a 9V battery through a 5V regulator (regulator output to the Arduino's 5V pin) and measured the battery pack again. This time the Arduino measured 5.37 while the multimeter measured 5.40, which is much closer. Could it be that my USB port is supplying the Arduino with slightly over 5V, which is throwing the readings off?