ADC accuracy problem

Hi,
after running into problems measuring analog voltages with an Arduino Nano in my application, I made a test setup:
an Arduino Nano, powered over the USB connection, continuously reads an analog voltage and prints the value read to the Serial Monitor.

Circuit:

  • the output of a reference-voltage chip is connected to the AREF pin of the Nano; reference voltage = 2.46 V
  • a potentiometer between GND and the 5V output of the Nano provides a variable voltage to A3
  • a voltmeter is connected to A3

Code:

int V_batt;
int i;

void setup() {
  Serial.begin(9600);          // required before printing to the Serial Monitor
  analogReference(EXTERNAL);   // use the 2.46 V chip on AREF as ADC reference
}

void loop() {
  V_batt = 0;
  for (i = 0; i < 16; i++) {
    V_batt += analogRead(A3);  // sum 16 consecutive samples
    delay(1);
  }
  V_batt = (V_batt + 8) >> 4;  // rounded average of the samples (see note below)
  Serial.print("V=");
  Serial.println(V_batt);
  delay(500);
}
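
A note on the shift line, since it looks odd: (V_batt + 8) >> 4 is a rounded integer division by 16. A minimal standalone check in plain C++, with example numbers of my own choosing:

#include <cstdio>

int main() {
  int sum = 16 * 503 + 9;    // e.g. 16 samples that average 503.56
  int avg = (sum + 8) >> 4;  // add half the divisor (8), then divide by 16
  printf("%d\n", avg);       // prints 504 (rounded, not truncated to 503)
  return 0;
}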

With this simple test setup I varied the analog input via the potentiometer and built a table of several different voltmeter readings at A3 together with the corresponding values of V_batt from the Serial Monitor.

With the reference voltage Vref = 2.46 V, back-computing Vref = V_A3 * 1023 / V_batt from each table row should always give 2.46 V. But it never does! My measurements yield values between Vref = 2.48 V and 2.82 V (mean value 2.7258 V).

The values vary because the voltmeter only has a resolution of 10 mV; that part is fine.
But I cannot find an explanation for the high mean value of the computed Vref. The ADC behaves as if 2.7258 V were present on the AREF pin instead of 2.46 V.
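
To make the arithmetic concrete, here is the back-computation as a small standalone C++ program. The numbers are made up for illustration (they are not from my table), but they are representative of what I see:

#include <cstdio>

int main() {
  float v_a3  = 1.50f;  // hypothetical voltmeter reading at A3, in volts
  int   count = 563;    // hypothetical averaged analogRead() result

  // rearranging count = v_a3 * 1023 / Vref gives:
  float vref = v_a3 * 1023.0f / count;
  printf("implied Vref = %.4f V\n", vref);  // prints about 2.7256 V, not 2.46 V
  return 0;
}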

Does anybody have an explanation?

"But I cannot find an explanation for the high mean value of the computed Vref."

Since you didn't show us how you arrived at that value, you can't expect us to explain it.
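
If you want a check that everyone can follow, post the raw data. A sketch along these lines would do it; this is my sketch, not yours, it assumes the wiring you described, and V_A3 is a hypothetical constant you would have to type in from your voltmeter:

const float V_A3 = 1.50;  // hypothetical: enter what YOUR voltmeter shows

void setup() {
  Serial.begin(9600);
  analogReference(EXTERNAL);
}

void loop() {
  long sum = 0;
  for (int i = 0; i < 16; i++) {
    sum += analogRead(A3);   // sum 16 consecutive samples
    delay(1);
  }
  int count = (sum + 8) >> 4;                // rounded average
  Serial.print("count=");
  Serial.print(count);
  Serial.print("  implied Vref=");
  Serial.println(V_A3 * 1023.0 / count, 4);  // back-computed reference voltage
  delay(500);
}

With that output posted next to the voltmeter readings, the arithmetic can be checked directly.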