I'm having some trouble reading accurate values from the analog inputs. Using a modified version of the AnalogInput example, with 0.88 V on analog input 3, I'm reading a value of 699, which works out to 0.75 V.
I'm currently using analogReference(INTERNAL), but I also tried analogReference(EXTERNAL) with the AREF pin tied to 5 V and got the same 0.750 V.
My 5 VDC source is a 7805 fed from a 12 VDC supply. I don't have an oscilloscope to check that it sits flat at 5 V, so I swapped the 12 VDC supply for a different source, and nothing changed.
I have an ATmega328 on a breadboard. I swapped the chip, and I also moved it to an Arduino board; the same 0.75 V was read.
Here's my code:
int sensorPin = A3;   // select the analog input pin
int sensorValue = 0;  // variable to store the value coming from the ADC

void setup() {
  Serial.begin(9600);
  analogReference(INTERNAL);  // use the internal 1.1 V reference
}

void loop() {
  // read the raw value from the ADC:
  sensorValue = analogRead(sensorPin);

  // print the raw count and the converted voltage:
  Serial.print(sensorValue);
  Serial.print(" - ");
  Serial.println((sensorValue * 1.10) / 1024.00);

  delay(sensorValue);  // pause proportional to the reading
}
What does your board's +5 VDC rail read on a good digital multimeter? Most regulators have a +/- tolerance, and the exact value has a direct bearing on the ADC's 'accuracy'. Even switching from the on-board 5 V regulator to USB power can change the ADC readings, in proportion to the difference in voltage between those two sources.
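If you run with the default 5 V reference, you can fold your multimeter reading into the conversion instead of assuming exactly 5.000 V. A sketch of that idea (the 5.03 value is hypothetical; substitute your own measurement):

// Hypothetical measured rail voltage; replace with your own multimeter reading.
const float VREF_MEASURED = 5.03;

float readVolts(int pin) {
  // Convert a raw ADC count to volts using the measured reference,
  // rather than a nominal 5.00 V.
  return analogRead(pin) * VREF_MEASURED / 1024.0;
}

The same logic applies to the internal reference: the ATmega328 datasheet specifies the internal bandgap as 1.1 V nominal but allows 1.0 to 1.2 V, so the 1.10 constant in the conversion is itself only approximate until you calibrate it.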
My 5 VDC source is a 7805 fed from a 12 VDC supply. ...... I swapped the 12 VDC supply for a different source, and nothing changed.
No, nothing should change: the regulator produces the same output voltage regardless of the input source driving it (within limits). That is its job, and it is exactly what you would expect.
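As a worked example of why the exact rail voltage matters: if the rail actually sits at 5.10 V but the sketch assumes 5.00 V, a 0.88 V input reads about 0.88 / 5.10 * 1024 ≈ 177 counts, which converts back to 177 * 5.00 / 1024 ≈ 0.86 V. The 2% deviation in the assumed reference shows up directly as a 2% error in the computed voltage.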