Please correct me if/where I am wrong here. I'm still trying to get my head around it all.
I'm trying to find a way to accurately calculate the exact reference voltage for the analogue inputs. I understand that the default is the 5v supply voltage, but does this drop if the supply voltage drops? I measured the 5v rail on my Arduino + Ethernet Shield and I got 4.80v. Does this mean that the reference voltage becomes 4.8v instead of 5v? This is while being powered from USB.
Assuming I'm correct about the above, here's my problem: if the reference tracks the supply, and the supply can sit at exactly 5v or sag to 4.8v, how can I ever make an accurate reading on an analogue input?
I noticed that when I measured the 3.3v rail it did indeed read exactly 3.3v, so could I feed that rail into, say, analogue input 0 and use the following formula to work out the actual reference voltage? (Note the .0 suffixes: without them, C's integer division would truncate the intermediate result to zero.) referenceVoltage = 1.0 / ( ( analogRead(0) / 1023.0 ) / 3.3 ); which simplifies to referenceVoltage = 3.3 * 1023.0 / analogRead(0);
This gave me a value of 4.71v, which could be about right if my formula is correct.