I am trying to see how accurate the analog input voltage readings are on my Arduino Duemilanove. As a test I wired the 3.3V source into one of my analog inputs. In my software I calculate the voltage as follows: (5V * read_value)/1024.
I would expect to see something very close to 3.3V but I am seeing 3.5V. Am I missing something?
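For reference, my sketch boils down to something like this (A0 is just the input I happened to pick):

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int read_value = analogRead(A0);           // raw ADC result, 0..1023
      float voltage = (5.0 * read_value) / 1024; // assumes a 5.000V reference
      Serial.println(voltage);
      delay(500);
    }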
One thing to keep in mind (not sure how it factors in here, as I'm no math whiz): when I measure the 5V pin or the AREF pin, I get about 4.87 volts, not an exact 5V. I'm guessing this is what's throwing your reading off.
The analog inputs are measured relative to the AREF pin, so try calculating with 4.87 instead of 5V.
Also make sure the value you're using is a float; it took me a second to realize this, since integer math was only giving me 3.
When I use the code you provided, I consistently get 3.34 to 3.41. When I change it from 5 * read_value to 4.87 * read_value, it's right at 3.3 (it fluctuates some with noise, of course, but it's much closer to 3.3).
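In sketch form, the two fixes together look like this (4.87 is just what my DMM reads on the 5V pin, so substitute your own measurement; A0 is assumed):

    const float AREF_VOLTS = 4.87;  // 5V/AREF value measured with a DMM

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int read_value = analogRead(A0);
      // The .0 suffix matters: (5 * read_value) / 1024 in pure integer
      // math truncates, which is the "only getting 3" symptom above.
      float voltage = (AREF_VOLTS * read_value) / 1024.0;
      Serial.println(voltage);
      delay(500);
    }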
Well, before you can do any meaningful calibration you must first know the exact voltages being supplied to the chip. The AVCC pin on the processor provides the default reference voltage, and it comes from either USB power or the on-board 5 VDC regulator if you are using external power. Those two sources are bound to differ from each other, and neither is exactly 5.000 VDC. Also, the 3.3 VDC pin (which you are measuring on an analog input pin) comes from a voltage regulator inside the FTDI serial converter chip, which is most likely not exactly 3.3000 VDC either.
So all these voltages must be verified with an accurate DMM before you have a real chance at a good calibration value. Then you must decide whether the calibration should apply on USB power or on external power; one calibration correction will not apply accurately to both.
If one wants the best possible calibration accuracy from an Arduino, it's best to either use the internal 1.1 VDC band-gap reference (though that limits your measurement range to 0-1.1 VDC) or use an external regulated voltage reference wired to the Arduino AREF pin.
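If you go the band-gap route, the selection is one call; here's a minimal sketch (INTERNAL selects the 1.1 VDC reference on the Duemilanove's ATmega168/328, and the actual band-gap value varies from chip to chip, so measure it at the AREF pin for best accuracy):

    void setup() {
      Serial.begin(9600);
      analogReference(INTERNAL);  // 1.1 VDC band-gap reference
      analogRead(A0);             // discard first reading after switching references
    }

    void loop() {
      int read_value = analogRead(A0);
      // Anything above ~1.1 VDC just reads full scale (1023) now.
      float voltage = (1.1 * read_value) / 1024.0;
      Serial.println(voltage, 3);
      delay(500);
    }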
Using a signal generator, I input a voltage, let's say 4V. I will assume that this voltage is accurate (I will verify it with a high-quality multimeter). Let's say that I start off assuming my reference voltage is 5V, and I measure 3.5V instead of 4V.
Can I then determine my actual reference voltage from that ratio? Since the computed value is (assumed 5V) * Vin / Vref, the proportion runs 4/3.5 = x/5, so x = 5 * 4/3.5. So I would then say that my reference voltage is actually about 5.71V.
Yes, that ratio method works, given your example scenario. Gross errors in the reference value like that can result from, for example, using a battery to power the processor instead of a regulated voltage source: the battery's voltage drops as it discharges, and the reference drops with it.
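A quick sketch of that one-point calibration (KNOWN_INPUT and pin A0 are placeholders; KNOWN_INPUT should be whatever your DMM says the generator is actually putting out):

    const float KNOWN_INPUT = 4.000;  // generator output verified with a DMM

    float calibrated_vref = 5.0;      // starting assumption

    void setup() {
      Serial.begin(9600);
      int raw = analogRead(A0);       // generator connected to A0
      // raw/1024 is the fraction of the reference the input represents,
      // so Vref = KNOWN_INPUT * 1024 / raw. With the numbers above, raw
      // would be about 717 (the count that computes to 3.5V at an assumed
      // 5V), giving a calibrated reference of about 5.71V.
      calibrated_vref = (KNOWN_INPUT * 1024.0) / raw;
      Serial.print("Calibrated reference: ");
      Serial.println(calibrated_vref, 3);
    }

    void loop() {
      float voltage = (calibrated_vref * analogRead(A0)) / 1024.0;
      Serial.println(voltage, 3);
      delay(500);
    }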
The best accuracy to be expected from a 10-bit ADC is around 0.1% of full scale (one step is 5V/1024, about 4.9 mV), but that assumes the voltage reference used is at least that accurate. In the pro instrumentation world, calibration standards, voltage references, etc. should be 10X more accurate than the device you are attempting to calibrate.