Accuracy

Hi,

I am using the internal analog reference (1.1 V) on a Duo and consistently read analog values about 10 mV higher than the input voltage (verified with a multimeter). The project logs temperature from a sensor with a 0-1 V output range, so 10 mV corresponds to a temperature error of 1 degree Celsius, which is really big for me. Is there any way to improve the accuracy? I'm reading values of roughly 580, which corresponds to 580/1023 * 1.1 ≈ 0.62 V, when I should be reading ~570.

Thanks

Steve.
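(For reference, here is a minimal sketch of the kind of reading described above, assuming an ATmega328-based board; the A0 pin and the nominal 1.1 V constant are illustrative, not from the original post.)

[code]
// Read a 0-1 V sensor against the internal ~1.1 V bandgap reference.
void setup() {
  analogReference(INTERNAL);   // select the internal 1.1 V reference (ATmega328)
  analogRead(A0);              // discard one reading while the reference settles
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(A0);           // 0..1023
  float volts = raw / 1023.0 * 1.1;   // nominal 1.1 V; the actual ref may be 1.0-1.2 V
  Serial.println(volts, 3);           // e.g. raw 580 -> ~0.624 V
  delay(1000);
}
[/code]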

I think the accuracy you have is pretty good: according to the specification, the internal 1.1 V reference may vary from 1.0 to 1.2 V. If the error is a consistent delta of ~10 counts, just correct for it with a calibration constant (add or subtract). If it's not stable and looks like noise, you'd better try oversampling; look for Atmel's AVR121 application note.
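(To make that concrete, a sketch of both fixes; the offset value and the 16x averaging factor are assumptions for illustration, not measured values.)

[code]
// Two fixes in one sketch: a fixed calibration constant and simple averaging.
const int OFFSET_COUNTS = -10;   // assumed: determined once against a trusted meter

int readCalibrated(int pin) {
  // Average 16 readings to knock down random noise. (AVR121 describes true
  // oversampling-and-decimation, which can also gain extra resolution.)
  long sum = 0;
  for (int i = 0; i < 16; i++) {
    sum += analogRead(pin);
  }
  int avg = sum / 16;
  return avg + OFFSET_COUNTS;    // apply the constant calibration delta
}

void setup() {
  analogReference(INTERNAL);
  Serial.begin(9600);
}

void loop() {
  Serial.println(readCalibrated(A0));
  delay(1000);
}
[/code]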

Who says your multimeter is more accurate?

Confucius say "Man with two watches never know the right time."


Rob

And a stopped watch tells the correct time twice a day.

Now that may be obvious and a bit stupid, but when a person glances at their watch to see the time, how do they know (a) the watch is correct and (b) it is actually going? In this case accuracy is in the eye of the beholder. My better half always has clocks running some 10 minutes ahead of "real" time. Don't ask why - it's a woman thing!

Accuracy is a subjective criterion - unless you are comparing against a national standard.

For "in-house" use you do not need accuracy - you need repeatability

I got around the problem by not wearing a watch for the last 10 years or so.


Rob

[quote author=James C4S]Who says your multimeter is more accurate?[/quote]

Because it uses a dual-slope integrating ADC, and the design avoids ground loops....

The offset is likely caused by you running current through the ground wire to the sensor. You mustn't do that. Keep the ground connection to the sensor separate from the ground to the supply so there is minimal current flowing along it. In fact, use a different GND socket on the Arduino.

conleysa: Is there any way to improve the accuracy?

You could calibrate the internal 1.1V reference using the code described here http://arduino.cc/forum/index.php/topic,92074.msg691991.html#msg691991 or here http://provideyourown.com/2012/secret-arduino-voltmeter-measure-battery-voltage/
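(For reference, the gist of the technique in those links, as commonly posted for ATmega328-based boards, is to read the internal bandgap against Vcc. A sketch along these lines; calibrate the constant as the second link describes:)

[code]
// Measure Vcc in millivolts by reading the internal ~1.1 V bandgap (ATmega328).
// Once you know your chip's real bandgap voltage (measure Vcc once with a good
// meter and back-solve), replace the nominal 1.1 V in your conversions.
long readVcc() {
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1); // bandgap vs. AVcc
  delay(2);                          // let the mux and reference settle
  ADCSRA |= _BV(ADSC);               // start a conversion
  while (bit_is_set(ADCSRA, ADSC));  // wait for it to finish
  long result = ADCL;                // must read ADCL before ADCH
  result |= ADCH << 8;
  return 1125300L / result;          // 1.1 * 1023 * 1000, assuming a nominal bandgap
}

void setup() { Serial.begin(9600); }

void loop() {
  Serial.println(readVcc());         // prints Vcc in mV
  delay(1000);
}
[/code]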

Maybe it's the wrong question...

Maybe the right question is "What sensor would give me this temperature range [x..y deg. C] with this [Z%] accuracy?"

Then maybe we could help better -- maybe not.

My point is that 1V is a small swing and a voltage output is susceptible to noise. Something like the Dallas One Wire sensors may do a better job -- and a library is available.
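(If you go that route, a minimal sketch using the commonly available OneWire and DallasTemperature libraries; the data pin is an assumption:)

[code]
#include <OneWire.h>
#include <DallasTemperature.h>

OneWire oneWire(2);                   // DS18B20 data on pin 2 (assumed), 4.7k pull-up to 5 V
DallasTemperature sensors(&oneWire);

void setup() {
  Serial.begin(9600);
  sensors.begin();
}

void loop() {
  sensors.requestTemperatures();               // trigger a conversion
  Serial.println(sensors.getTempCByIndex(0));  // degrees C, all digital: no ADC or reference involved
  delay(1000);
}
[/code]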

The Dallas DS18B20 has half the midpoint accuracy of an LM34/LM35, and an LM334 can easily be calibrated to 0.1-0.2% accuracy. Use a separate 5 V source and a voltage follower to drive the Arduino's analog input. It is also very important to pick a ground point as close as possible to the actual chip's ('328) AGND pin so the ground noise is minimal. Sloppy use of the board's 5 V source and poor grounding technique cause about 80% of analog measurement issues. If it were me, I would use a TL431 fed from the raw (Ext) DC input and a pot to set the bias on the analog sensor (the TL431 is an adjustable shunt regulator, a "programmable Zener," adjustable from about 2.5 to 36 volts). That way you KNOW it is clean. The other REAL issue is, as you mentioned, grounding... a star topology for ground connections is essential for successful analog measurements.

Doc