I have a sensor connected to analog input 0 on my board, and measured with my meter it's delivering 4.08 V to the pin.
I wrote a simple sketch to start experimenting with reading the voltage accurately:
int aReading;
float Vin1;
float Vin2;

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  aReading = analogRead(0);

  // Using a "default" 5 V reference value
  Vin1 = (5.00 / 1024) * aReading;

  // Using the measured supply voltage
  Vin2 = (5.06 / 1024) * aReading;

  Serial.print("Reading = ");
  Serial.print(aReading);
  Serial.print("\t5.00 = ");
  Serial.print(Vin1);
  Serial.print("\t5.06 = ");
  Serial.println(Vin2);

  delay(2000);
}
From everything I've read, Vin = (reference voltage / resolution) * reading should give me the input voltage.
If I use 5.00 V I'm "missing" about 0.05 V off the reading (which is a big difference for what I'm doing). But when I measure the 5 V supply on the board it's actually 5.06 V, and if I use that value I get an accurate voltage reading.
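To put numbers on it (the reading of 826 below is my guess at what 4.08 V would give, since 4.08 / 5.06 * 1024 is about 826):

Vin1 = (5.00 / 1024) * 826 = 4.03 V
Vin2 = (5.06 / 1024) * 826 = 4.08 V

So the 0.06 V error in the assumed reference turns into roughly 0.05 V of error at this reading, which matches what I'm seeing.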
So my question is: do I really have to measure the supply voltage and use that, or have I missed something along the way?
If I do have to measure the supply, would dipping the supply (perhaps by turning on a lot of LEDs, etc.) affect the calcs?
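One thing I've come across while searching (I haven't verified it on my own board yet, so treat this as a sketch rather than a known-good answer): on ATmega328-based boards like the Uno you can apparently measure Vcc itself by reading the internal 1.1 V bandgap reference against it, which would let the sketch re-calibrate on the fly even if the supply sags:

// Measure Vcc in millivolts by reading the internal 1.1 V bandgap
// against AVcc. Assumes an ATmega328-based board (Uno/Nano);
// the register bits differ on other AVRs, and the bandgap is only
// nominally 1.1 V, so the constant may need a one-off calibration.
long readVcc()
{
  // Select AVcc as reference, internal 1.1 V bandgap as input
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                         // let the reference settle
  ADCSRA |= _BV(ADSC);              // start a conversion
  while (bit_is_set(ADCSRA, ADSC))  // wait for it to finish
    ;
  long result = ADCL;               // read ADCL first, then ADCH
  result |= ADCH << 8;
  return 1125300L / result;         // 1.1 V * 1023 * 1000 / reading
}

Then the conversion would become something like:

Vin = (readVcc() / 1000.0 / 1024) * aReading;

Does that approach hold up, or is there a better way to deal with a sagging supply?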
Thanks in advance, Easty.