What would be the best way to read the ADC reference voltage in code, so I can update the calculation for my sensors (like a calibration)?
I have noticed the reference changes a lot depending on the power input level and type:
- via USB it is around 4.1 V
- via VIN it is around 4.9 V
At the moment I calculate with: float volts = analogRead(0) * 4.11 / 1023;
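In other words, I want the same formula with the hard-coded constant pulled out into a variable the sketch can fill in by itself (vref here is just a placeholder name for whatever the measurement would return):

```cpp
float vref = 4.11;  // currently hand-measured; I want the sketch to find this itself
float volts = analogRead(0) * vref / 1023;
```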
Right now I have to check it with a multimeter and adjust the constant each time I change power sources.
I know I could use a hardware reference, but I am looking for a software solution for now.
Measured this way the accuracy is about 0.03-0.05 V, which is good enough for my purposes.
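From what I have read, the usual software trick is to measure the chip's internal 1.1 V bandgap against Vcc and work backwards to the actual supply voltage. A minimal sketch of that idea, assuming an ATmega328P board like the Uno or Nano (the register bits are chip-specific; other AVRs need different ADMUX settings):

```cpp
// Derive Vcc from the internal 1.1 V bandgap reference (ATmega328P).
long readVcc() {
  // Reference = AVcc, input = internal 1.1 V bandgap (MUX[3:0] = 1110)
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                          // give the reference time to settle
  ADCSRA |= _BV(ADSC);               // start one conversion
  while (bit_is_set(ADCSRA, ADSC));  // wait until it finishes
  long result = ADCL;                // must read ADCL before ADCH
  result |= (long)ADCH << 8;
  return 1125300L / result;          // Vcc in mV: 1.1 * 1023 * 1000 / reading
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  float vcc = readVcc() / 1000.0;              // actual supply voltage in volts
  float volts = analogRead(0) * vcc / 1023.0;  // sensor reading scaled by real Vcc
  Serial.println(volts);
  delay(1000);
}
```

The catch is that the bandgap itself is only specified to roughly ±10%, so to reach the 0.03-0.05 V accuracy mentioned above you would still take one multimeter measurement to replace the nominal 1100 mV (inside the 1125300L constant) with your particular chip's value; after that the sketch tracks the USB vs. VIN differences on its own.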