I'm trying to measure a voltage from 0-25V and have a voltage divider using a 40k and a 10k resistor (so the Arduino pin sees 1/5 of the input). Here's my code:
Vo1=0; //Clears vo1 to be ready to start the average
for(int i=0;i<400;i++) Vo1+=analogRead(0); //Reads 400 values of Vo from A0
Vo1=Vo1/400.0; //Averages the 400 readings
Vo=(Vo1/1023.0)*Vin*5; //Converts counts to volts at the pin, then undoes the 5:1 divider
Vo=0.9929*Vo+0.1797; //Linear calibration against the multimeter
It gets a little ugly at the end, but after I average the readings I divide by the maximum count, 1023, to get the fraction of full scale; multiply by Vin, my measured USB supply voltage (typically around 4.94V), since full scale corresponds to the Arduino's supply; and multiply by 5 to undo the voltage divider.
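In case the conversion is clearer as one self-contained function, here's the same math wrapped up (the function and constant names are just placeholders I made up for this post, not what's in my actual sketch):

const int NUM_SAMPLES = 400;      // samples to average
const float DIVIDER_RATIO = 5.0;  // (40k + 10k) / 10k
const float VIN = 4.94;           // measured USB supply, in volts

// Averages NUM_SAMPLES ADC readings and converts to the voltage at the divider input
float readDividerVoltage(int pin) {
  float sum = 0;
  for (int i = 0; i < NUM_SAMPLES; i++) sum += analogRead(pin);
  float counts = sum / NUM_SAMPLES;               // average ADC counts
  return (counts / 1023.0) * VIN * DIVIDER_RATIO; // counts -> volts at pin -> divider input
}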
The last line calibrates the output against my multimeter. When I upload the code and compare, I'm within about 20mV of the multimeter when the measured voltage is <1V or >19V, and within 10mV in between.
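For what it's worth, those two constants come from comparing readings against the multimeter; a minimal two-point version of that kind of fit would look like this (vMeter1/vMeter2/vArduino1/vArduino2 are hypothetical names, and this isn't necessarily exactly how I derived mine):

// Two known voltages read on the multimeter, and what the Arduino reported for each
float slope = (vMeter2 - vMeter1) / (vArduino2 - vArduino1);
float intercept = vMeter1 - slope * vArduino1;
// corrected = slope*raw + intercept, which is where numbers like 0.9929 and 0.1797 come from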
But if I leave it sitting for a couple of hours and come back, the accuracy drops to about 100mV at the extremes and 70mV in between, and I have to recalibrate. I don't believe the USB voltage changes much, but should I use a Zener or some other form of external reference voltage so this doesn't keep happening? Or is the problem something else?
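One diagnostic I've seen suggested, assuming an ATmega328-based board like the Uno, is to sample the internal 1.1V bandgap against AVcc to log what the supply is actually doing over time. This is the well-known "read Vcc" trick, not code from my sketch above, so treat it as a sketch:

// Measures AVcc in millivolts using the internal 1.1V bandgap
// (ATmega168/328 only; other chips need different MUX settings)
long readVccMillivolts() {
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1); // ref = AVcc, input = bandgap
  delay(2);                                               // let the reference settle
  ADCSRA |= _BV(ADSC);                                    // start one conversion
  while (bit_is_set(ADCSRA, ADSC));                       // wait for it to finish
  return 1125300L / ADC;                                  // 1.1V * 1023 * 1000 / reading
}

If the logged Vcc moves by a few tens of millivolts over a couple of hours, that alone would account for most of the drift I'm seeing, since the error scales up 5:1 through the divider math.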