# Not clear on how to use the internal ref when reading analog...

I need to read a 0-1.2 V analog input as accurately as possible, but I know the measurement is affected by the supply voltage. I bought some Zeners and was going to read A0 (the Zener), then read the input on A1 and correct the reading against it.

Now I learn that I can use the internal reference and don't need the Zeners. I've read sample code, but I just can't grasp how it works.

Can someone post a code snippet showing me how to set the reference and how to use it to correct A1?

I'm using a ProMini if that matters.

The internal reference is approximately 1.1 volts, so the voltage you are testing must be lower than that; a voltage divider can bring it down if needed. What is the maximum voltage you expect to see, and the output impedance, if you know it?

```
const float aRef = 1.1;     // internal reference, measured with an accurate voltmeter
const byte aInPin = A0;     // signal input pin
float volts;

void setup()
{
  Serial.begin(9600);        // set serial monitor baud rate to match
  analogReference(INTERNAL); // use internal ~1.1 volt reference
}

void loop()
{
  volts = analogRead(aInPin) * aRef / 1024;
  Serial.println(volts, 3);  // 3 decimal places
  delay(500);                // don't flood the serial monitor
}
```

The FIRST project I'm trying with this is an LM34 temp sensor, so the input voltage will probably only go up to 1.05 V.

1.1 V worst case.

1.05 V = 105 F
1.1 V = 110 F

So when I call `analogReference(INTERNAL)`, all of the ADC channels are measured against it automatically?

And then `analogRead(aInPin) * aRef / 1024` (which is the same as `(A0 / 1024) * aRef`) gives the voltage?

THAT makes sense. Outsider, you're awesome!