Hi, I've just got an Arduino and I'm mostly playing about with its features. I was quite interested to try measuring the voltage of a lithium-ion battery (4.2V maximum, so well within the 5V the Arduino can take).
I took a battery that I had recently discharged to 3V using my charger (I have no reason to believe this is exactly 3V, as it is a cheap charger) and measured it with my cheap digital multimeter, which gave a consistent 3.31V; I was surprised at how much higher than 3V that was. The Arduino, however, gave a consistent value of 3.44V, which is significantly different from my multimeter.
I'm not sure which of the measurements to believe. Am I just seeing the inaccuracy of measuring with these devices, or is there another reason I would be getting such different results? I have included my code below, which is just a slightly changed version of the example that comes with the software. The positive terminal of the battery is connected to analog pin 0, and the negative to the ground pin.
That is a very good point. Would you be able to say which of the DMM and Arduino is likely to be more accurate? While the DMM is a cheap unit, it was designed to measure voltages, whereas the ADC is just a tiny part of the Arduino.
Further testing with different batteries at different voltages reveals that the Arduino consistently measures around 0.15V higher than the DMM.
I would say neither. You really cannot say with certainty unless you measure with a high-end, calibrated voltmeter.
You may want to measure the 5V output from your board with your DMM. This voltage is maintained by the onboard regulator, and although it is pretty accurate, it is typically never exact. In any case, the voltage measured with the ADC will be relative to this voltage, and you can use it to calculate a calibration constant (rather than 0.0048828, which is just 5/1024).
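A minimal sketch of that calibration, written as plain C++ so the arithmetic is easy to check. The raw count of 704 is a made-up example value; 4.81 V is the rail voltage reported later in this thread, and with it the numbers line up with the DMM reading:

```cpp
#include <cassert>
#include <cmath>

// Convert a raw 10-bit ADC reading to volts, given the actual measured
// reference (Vcc) rather than assuming an ideal 5.000 V. The constant
// 0.0048828 in the original sketch is just 5.0 / 1024.
double adcToVolts(int reading, double vref) {
    return reading * (vref / 1024.0);
}
```

With an assumed raw count of 704, `adcToVolts(704, 5.0)` gives about 3.44 V, while `adcToVolts(704, 4.81)` gives about 3.31 V, which matches the discrepancy seen here.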
This has solved it. I measured the 5V rail to be 4.81V. Changing the calibration constant to use this value now gives identical results from the Arduino and the DMM. Of course, this assumes the DMM is correct; I'll see if I can get hold of a well-calibrated voltmeter from a friend to make sure.
As a final thought, is it possible to read the actual voltage of the 5V rail internally, such that if it ever changes, the calculation can be done within the program itself? I would guess not, as it would need something else to calibrate against, but I could be wrong.
You could wire the supply via a suitable divider (use 1% or better resistors) to one of the analogue inputs, and use the internal reference (1.1V or 2.56V) to measure the supply.
As a final thought, is it possible to read the actual voltage of the 5V rail internally, such that if it ever changes, the calculation can be done within the program itself? I would guess not, as it would need something else to calibrate against, but I could be wrong.
No, not in the standard A/D configuration. The A/D measurement is a ratio of the unknown voltage on the analog input pin to the reference voltage. The standard setup is to use the on-board +5Vdc or the PC's USB voltage as the analog reference, so an analogRead value of 1023 just means that the measured voltage is equal to the reference, whatever that reference is. You can sometimes see this when you apply a fixed voltage to the input and only change from USB power to external power (or vice versa): you will most likely see a slight change in the A/D value read.
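That ratio can be sketched in plain C++. The 3.31 V input matches the DMM reading from earlier in the thread; the 4.81 V and 5.00 V references are the two supply cases being discussed:

```cpp
#include <cassert>

// The ADC output is ratiometric: count = Vin / Vref * 1024, clipped to 1023.
// The same fixed input therefore reads differently when the reference shifts,
// e.g. between USB power (~4.81 V here) and a regulated 5.00 V supply.
int adcCount(double vin, double vref) {
    int count = static_cast<int>(vin / vref * 1024.0);
    return count > 1023 ? 1023 : count;
}
```

For a fixed 3.31 V input, the count changes from 677 at a 5.00 V reference to 704 at a 4.81 V reference, even though the input never moved.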
If accuracy is important, then changing the reference to use a more stable voltage is needed. But keep in mind that the Arduino's 10-bit A/D does not really offer instrument-quality accuracy. There are fairly inexpensive external A/D modules available at higher than 10-bit resolution if better accuracy is desired or needed.
You could wire the supply via a suitable divider (use 1% or better resistors)
To create a precise voltage divider, we only need an accurate ratio between the resistors, and fortunately we can measure this with a low-cost DMM. We can then use this ratio in our ADC calculations. The absolute values of the resistors, however, are not important. That is, we don't really care whether the resistors are 1%, 5% or even 10%, as long as we have an accurate ratio to calculate against. (For assembly-line volume production, where you want to avoid an end-of-line manual tuning step, the story is different.)
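A small sketch of that idea. The resistor values here are hypothetical DMM measurements of nominal 10k and 5k parts; only their ratio enters the calculation:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical values measured with a cheap DMM; only the ratio matters,
// not how close each resistor is to its nominal marking.
const double r1 = 9940.0;  // ohms, top resistor (nominal 10k)
const double r2 = 4980.0;  // ohms, bottom resistor (nominal 5k)
const double ratio = r2 / (r1 + r2);  // fraction of Vin seen at the tap

// Recover the full input voltage from the voltage measured at the tap.
double vinFromTap(double vTap) {
    return vTap / ratio;
}
```

Whether the parts are 1% or 10% tolerance, the measured ratio makes the scaling exact (to the accuracy of the DMM used to measure it).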
I measured the 5V rail to be 4.81V.
This is low for a decent-quality 5V regulator. I suspect either your input voltage is too low for the regulator (the voltage measured is correct, but the regulator is unable to maintain a 5V output) or your DMM is reading low.
To create a precise voltage divider, we only need an accurate ratio between the resistors, and fortunately we can measure this with a low-cost DMM. We can then use this ratio in our ADC calculations. The absolute values of the resistors, however, are not important. That is, we don't really care whether the resistors are 1%, 5% or even 10%, as long as we have an accurate ratio to calculate against. (For assembly-line volume production, where you want to avoid an end-of-line manual tuning step, the story is different.)
That won't help the problem if the voltage divider uses the USB or on-board 5V as the reference voltage. Variation in Vcc will cause variation in the voltage divider's absolute output voltage regardless of how precise the ratio is.
I thought that was the object of the exercise - it's what we want to measure against the internal reference.
Maybe; that wasn't really clear to me. That has been done before: using the internal band-gap reference and a voltage divider to measure the actual Vcc, then switching back to Vcc as the reference before actually performing an analogRead(). Also, one has to do a dummy read when switching references, as the first read tends to be unreliable after such a switch.
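The arithmetic behind that trick, sketched in plain C++. The 47k/10k divider is an assumed example, and 1.1 V is only the nominal band-gap value (the datasheet allows it to vary part to part, so calibrating it once against a DMM improves matters):

```cpp
#include <cassert>
#include <cmath>

// With the ADC reference set to the internal 1.1 V band-gap and Vcc fed
// through a divider into an analog pin, Vcc can be recovered in software.
// Assumed divider: 47k over 10k, so the tap sees 10/57 of Vcc.
const double kRatio = 10.0 / 57.0;  // divider ratio (assumed values)
const double kVref  = 1.1;          // nominal band-gap voltage

double vccFromCount(int count) {
    double vTap = count * kVref / 1024.0;  // voltage actually at the pin
    return vTap / kRatio;                  // scale back up to Vcc
}
```

For a 4.81 V supply, the tap sits at about 0.84 V, which reads as roughly count 785; feeding that back through `vccFromCount` recovers the supply voltage without any external reference.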
There are several methods of improving the accuracy of the A/D system. I think that for the standard USB Arduino, using the 3.3Vdc pin as an external reference is the simplest and cheapest method, at the expense of reducing the measurement range from 0-5Vdc to 0-3.3Vdc. That 3.3V value is not affected by which power source one is using for the board.
I assumed you powered your Arduino through the barrel jack. If it is running off USB power, the onboard regulator is bypassed altogether and 4.81V is more likely. This leaves open the possibility that your DMM is indeed correct.
You have a number of possibilities as suggested in this thread to get a "second opinion".
Power your Arduino through the barrel jack and use the onboard 5V regulator as analog reference (reference DEFAULT).
Power via USB and connect a wire from the 3V3 output to AREF (reference EXTERNAL).
Power via USB (or barrel jack) and use the internal 1V1 reference (reference INTERNAL).
Note that you should never wire a voltage reference to AREF unless your sketch is configured with analogReference(EXTERNAL).
When using 1V1 or 3V3 as the ADC reference, you may need a voltage divider to scale your input signal accordingly (there is no harm for inputs in the Gnd-to-Vcc range, but ADC readings will max out for voltages above the reference).
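A quick check of that scaling for the battery in question, with assumed divider values (33k over 10k; any pair with a similar ratio would do):

```cpp
#include <cassert>
#include <cmath>

// Checking that a candidate divider keeps a 4.2 V (fully charged) Li-ion
// cell below the 1.1 V internal reference. R1/R2 values are assumptions.
const double r1 = 33000.0, r2 = 10000.0;  // ohms
const double ratio = r2 / (r1 + r2);      // 10/43, about 0.233

double tapVoltage(double vin) { return vin * ratio; }

// True if the divided input stays inside the 1.1 V reference range.
bool withinReference(double vin) { return tapVoltage(vin) < 1.1; }
```

A 4.2 V cell divides down to roughly 0.98 V at the tap, safely inside the 1.1 V range, while anything the divider pushes above 1.1 V would simply read as 1023.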
You can also measure the ADC reference voltage with a DMM between AREF and GND for any mode.
Thanks to everyone who has helped me on this. As per the advice here, I have started powering the board through the barrel jack, which now gives me precisely 5.00V when measured with my DMM. I think my computer's front USB ports aren't very well powered.