You are assuming that the reference voltage the A/D unit is comparing against is EXACTLY 5.0 volts. That is very rarely the case. Measure the 5 volt rail the Arduino is actually using with your DVM, use that value in the calculation, and see if the computed voltage comes out closer.
That sensor is just a 7500 ohm : 30000 ohm voltage divider, and the 0.2 comes from 7500/(30000+7500) = 0.200. Given the way you wired it up (+ to 5V) and programmed it, it will always show you 0.2 * Vref, i.e. analogRead(A0) ≈ 204, no matter what the voltage of the battery pack is.
It's wishful thinking to print that voltage with four decimal places when you only have about 200 A/D counts to work with (1024 * 0.2). You can't even get two meaningful decimal places with that sketch.
Measuring with the 1.1 volt Aref is the way to go, but that restricts the input range of that sensor's voltage divider to about 5.5 volt (1.1 V / 0.2).
Leo..
I don't know if it was calibrated, because I measured it with a multimeter that was in the lab. I checked with more than three multimeters, including a Fluke 17B+ and other models.
I thought that printing four decimal places would give me a more precise value, but that was a meaningless step. And when I measured Aref on the Uno board, it read 4.89~4.90 V.
And then the project gets plugged into a different power source, and the user has to do the calibration all over again.
A more sensible way is to determine Vcc on the fly using the 1.1 V internal reference, as outlined earlier. That way the system compensates, to a large extent, for any supply voltage fluctuations.
An even better way would be to use an ultra-stable external reference of known value, but I guess we all know that is likely overkill for this particular application.