I have a project that controls several relays while reading the voltage of a lead-acid battery. The voltage-sense pin reads from a voltage divider that drops the battery's roughly 12.8 V down to about 4.40 V. The divider's total impedance is 10.4 kOhms (6.8k on top, 3.6k on the bottom), giving a theoretical ratio of 3.6/(3.6+6.8) = 0.346.
Oddly, while the divider output measures almost exactly the theoretical 4.40 V on my multimeter, the Arduino ADC pin returns a significantly higher value of 4.73 V. That sounds minor, but it extrapolates to a battery voltage of 13.67 V instead of the correct 12.8 V.
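For reference, the conversion in my sketch is essentially the following (the pin name and constants are illustrative; my real code differs only in names):

```cpp
// Minimal sketch of the measurement path (A0 is a placeholder pin).
const float DIVIDER_RATIO = 3.6 / (3.6 + 6.8);  // = 0.346
const float ADC_REF       = 5.0;                // assumes the default 5 V reference

void setup() {
  Serial.begin(9600);
}

void loop() {
  int   raw   = analogRead(A0);                 // 0..1023
  float vPin  = raw * ADC_REF / 1023.0;         // voltage seen at the ADC pin
  float vBatt = vPin / DIVIDER_RATIO;           // back out the battery voltage
  Serial.println(vBatt);                        // prints ~13.67 instead of ~12.8
  delay(1000);
}
```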
Stranger still, the ADC reading increases as digital pins are switched on in sequence as outputs. Specifically, each of the four digital pins driving a set of four relays adds about 70 mV to the measured voltage when it turns on. Meanwhile the multimeter still shows the correct 4.40 V at the divider, while the Arduino ADC reading has climbed to almost 5 V.
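A stripped-down version of the relay sequencing that triggers the drift looks roughly like this (pin numbers are placeholders for my actual wiring):

```cpp
// Illustrative only: energize the four relay pins one at a time and watch
// the ADC reading climb ~70 mV per relay, even though the multimeter
// shows a steady 4.40 V at the divider the whole time.
const int relayPins[4] = {2, 3, 4, 5};          // placeholder pin numbers

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) {
    pinMode(relayPins[i], OUTPUT);
    digitalWrite(relayPins[i], LOW);
  }
}

void loop() {
  for (int i = 0; i < 4; i++) {
    digitalWrite(relayPins[i], HIGH);           // turn on the next relay
    delay(500);
    float vPin = analogRead(A0) * 5.0 / 1023.0;
    Serial.println(vPin);                       // steps up ~0.07 V each time
  }
  while (true) {}                               // stop after one pass
}
```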
I understand that the Arduino ADC is recommended for source impedances of 10 kOhms or less, but why do its readings change when the rest of the Arduino is under load?