Voltmeter help (Yep, another stupid voltmeter question :P)

I'm trying to measure a voltage from 0-25 V and have a voltage divider using a 40k and a 10k resistor. Here's my code:

Vo1 = 0;                                  // Clear Vo1 before accumulating the average
for (int i = 0; i < 400; i++) Vo1 += analogRead(0); // Sum 400 readings from A0
Vo1 = Vo1 / 400.0;                        // Average the 400 readings
Vo = (Vo1 / 1023.0) * Vin * 5.0;          // Convert counts to volts (Vin = measured USB supply) and undo the 1:5 divider
Vo = 0.9929 * Vo + 0.1797;                // Linear calibration against my multimeter

It gets a little ugly at the end, but after I average the readings I divide by the maximum reading (1023), multiply by 5 to undo the voltage divider, and multiply by Vin, the measured USB supply voltage (typically around 4.94 V), since a reading of 1023 corresponds to the Arduino supply voltage.
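In other words, the conversion I'm doing is equivalent to something like this (the helper name, the constants, and the pin argument are just for illustration, not my actual sketch):

const float VIN = 4.94;                                     // measured USB supply voltage
const float DIVIDER_RATIO = (40000.0 + 10000.0) / 10000.0;  // 40k over 10k -> 1/5 of the input at the pin

float readVoltage(int pin) {
  float sum = 0;
  for (int i = 0; i < 400; i++) sum += analogRead(pin);     // accumulate 400 samples
  float counts = sum / 400.0;                               // average ADC counts (0-1023)
  return counts / 1023.0 * VIN * DIVIDER_RATIO;             // volts at the pin, scaled back up through the divider
}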

The last line calibrates the output against my multimeter. When I upload the code and compare, I get pretty good resolution: within 20 mV of the multimeter when the measured voltage is below 1 V or above 19 V, and within 10 mV in between.

But if I leave it sitting for a couple of hours and come back, my accuracy drops to about 100 mV at the extremes and 70 mV in between, and I have to recalibrate. I don't believe the USB voltage changes much, but should I use a zener or some other form of external reference voltage so this doesn't keep happening? Or is the problem something else?

Or is the problem something else?

That's where my money is.
But it would help to post all the code, using the proper code tags, along with a schematic of how you have it wired.

Are you sure the voltage you are trying to measure is stable over this period?

The 5V of the USB varies, so your reading can never be accurate.

If you use

analogReference(INTERNAL); // switch the ADC to the internal ~1.1 V bandgap reference
delay(20);                 // give the reference time to settle before the next analogRead()

the internal reference voltage of 1.1V is used.

You have to adapt the divider resistors for that voltage. The internal reference is not precisely 1.1 V either, but it is much more stable than the 5 V from the USB bus.
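Something like this, for example (the 240k/10k divider and the measured reference value are just example numbers; you have to check both for your own board):

const float INTERNAL_REF  = 1.081;                            // measure your board's internal reference; nominally 1.1 V
const float DIVIDER_RATIO = (240000.0 + 10000.0) / 10000.0;   // 240k over 10k keeps 25 V input at about 1.0 V on the pin

void setup() {
  Serial.begin(9600);
  analogReference(INTERNAL);  // use the internal ~1.1 V reference instead of the USB 5 V
  delay(20);                  // let the reference settle
  analogRead(0);              // throw away the first reading after changing the reference
}

void loop() {
  float sum = 0;
  for (int i = 0; i < 400; i++) sum += analogRead(0);  // average 400 samples as before
  float counts = sum / 400.0;
  float volts  = counts / 1023.0 * INTERNAL_REF * DIVIDER_RATIO;
  Serial.println(volts, 3);
  delay(500);
}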