Battery voltage tester giving offset results

I'm combining an op amp-based constant current load with an Arduino to measure the capacity of 18650 cells. I've got the CC load working on its own, but I'm running into issues measuring the battery voltage. Attached is the whole circuit; here's the code:

unsigned int reading;
unsigned long testTime;
float VBatt, Vcc;
bool doneMeasuring = false;

long readVcc() {
  long result;
  // Select the internal 1.1V bandgap as the ADC input, measured against AVcc
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2); // let the mux/reference settle before converting
  ADCSRA |= _BV(ADSC); // start a conversion
  while (bit_is_set(ADCSRA, ADSC)); // wait for it to finish
  result = ADCL;        // read ADCL first; it locks ADCH until ADCH is read
  result |= ADCH << 8;
  result = 1121280L / result; // back-calculate Vcc in mV (measured 1095 mV * 1024)
  return result;
}

void setup() {
  pinMode(8, INPUT);
  Serial.begin(9600);

  Serial.println("Starting test...");
  testTime = millis();
}

void loop() {
  if (!doneMeasuring) {
    reading = analogRead(A3);
    Vcc = readVcc() / 1000.0;
    VBatt = reading * Vcc / 1024.0;
    Serial.print("Battery voltage: ");
    Serial.println(VBatt);

    if (VBatt < 2.7) {
      testTime = millis() - testTime;
      doneMeasuring = true;

      pinMode(8, OUTPUT);
      digitalWrite(8, LOW); // Pull the non-inverting input of the op amp low to disable the output...maybe? (nope)
      float totalHours = testTime / 1000.0 / 60.0 / 60.0; // hours
      float mAh = totalHours * 200; // Test load is 200 mA constant
      Serial.print("Test finished, took ");
      Serial.print(totalHours);
      Serial.println(" hours");
      Serial.print("Capacity: ");
      Serial.print(mAh);
      Serial.println(" mAh");
    }

    delay(10000);
  }
}

The battery voltage consistently reads low. I measure it with my DMM and get 3.84 V while the serial monitor says it's 3.61 V; the offset seems to hover between -0.15 and -0.2 V.

I guess that register-reading method uses the internal 1.1V reference? That reference is very stable, but it has a tolerance, so you may have to apply a calibration/correction.

Yep, it reads the 1.1V reference and back-calculates Vcc from that. That bit seems to work on its own: measuring the AREF pin with the DMM I get 4.94V, which is what that function returns. This bit:

result = 1121280L / result;

Is the constant. I measured my "1.1V" reference as actually 1.095V, so I adjusted it: the original constant was 1126400 (1100mV * 1024), but with my measured reference it becomes 1095mV * 1024 = 1121280. Given that that works, I'm confused why the measured battery voltage is different. All I'm doing is putting the meter leads across the battery while it's plugged into the breadboard. Measuring with the DMM while the test is running shouldn't affect it, right?

I'm intuitively unhappy with a few parts of this project.

Firstly, really, measuring Vcc against the internal "1.1V" reference is a bit of a kludge and gives very limited resolution. Suppose Vcc is 5V and Vref is 1.1V: your reading is n = 1024 * 1.1 / 5 = 225; only about 8 bits of resolution.

Also there is no direct way to measure the 1.1V reference for calibration.

For accurate results I'd use an external Vref and a potential divider to scale the battery voltage to match.

Secondly, you are applying a dramatically changing Vcc to the Arduino. I'd be happier if Vcc were constant, i.e. power it from USB.

Third, your sketch doesn't allow for noise reduction. I'd be happier to see a repeated voltage measurement (better still, a few repeats averaged together) to get better readings.

And finally, I had a similar problem which turned out to be a protection diode on the board. More info here, and a better sketch for the "secret voltmeter" thanks to JRemington.

Arduino measure vcc

johnerrington:
Firstly, really, measuring Vcc against the internal "1.1V" reference is a bit of a kludge and gives very limited resolution. Suppose Vcc is 5V and Vref is 1.1V: your reading is n = 1024 * 1.1 / 5 = 225; only about 8 bits of resolution.

Also there is no direct way to measure the 1.1V reference for calibration.

For accurate results I'd use an external Vref and a potential divider to scale the battery voltage to match.

I found a sketch somewhere that does indeed read the internal 1.1V reference using the secret voltmeter. It gave me a six-digit floating point number (0.22-something) and told me to measure the voltage at the AREF pin (4.94V), then multiply it by that number and by 1024 to get the constant in the readVcc method. It reported that my 1.1V reference was actually 1.095V.

Secondly, you are applying a dramatically changing Vcc to the Arduino. I'd be happier if Vcc were constant, i.e. power it from USB.

It is powered from USB, more incidentally than anything else, so that I can log data through the serial monitor. I'm using a genuine Arduino Uno R3, at least for prototyping. I'm not sure why you think I'm applying a changing Vcc: the 2xAA battery holder in the schematic represents the 18650, not the power supply. Sorry, I meant to annotate that and forgot.

Third, your sketch doesn't allow for noise reduction. I'd be happier to see a repeated voltage measurement (better still, a few repeats averaged together) to get better readings.

And finally, I had a similar problem which turned out to be a protection diode on the board. More info here, and a better sketch for the "secret voltmeter" thanks to JRemington.

Arduino measure vcc

I actually did originally have a loop that took 64 readings with a delay(1) between each one and averaged them together. I took it out for some reason I genuinely can't remember. Thank you for that link, I'll definitely look it over; lots of good information.

I think the flaw in my plan is that I'm using the op amp-based constant current load separately from the battery voltage measurement. I'm only measuring the battery voltage under load, and I'm not measuring the current (voltage across the sense resistor, obviously), so my results will be inherently inaccurate. There's no feedback between the discharge part of the circuit and the measurement part: I start the test at power-up, stop it when the voltage is low, and multiply the elapsed time by the "constant" current, which I just assume is correct. I'm starting to question the accuracy of my multimeter, and even a 10mA discrepancy over several hours can really skew a capacity result. Anyway, I've rambled. I'm going to rethink this project, look at more examples, and start with the battery measurement part to work out the kinks.

Thank you both for the help!

Forgive me in advance, this might get lengthy. I'd like to see if I understand exactly how this secret voltmeter works.

The UNO board only breaks out six analog inputs (A0-A5), but the ADC mux has more channels than that. Using the registers directly, one can select the internal channel where the 1.1V reference...lives. Since that measurement is taken as a fraction of Vcc, some math can figure out what Vcc is. Let's say we get 223 back from the bandgap channel, and we can measure our actual Vcc at the AREF pin as 4.96V. That means our 223 represents a reference voltage of about 1.08V, right? Then we get our calibration constant by

223 / 1024 * 4.96 * 1024000 = 1106080

Now, in code, by reading that bandgap value (223) and dividing the constant by it, we get 4960, i.e. 4.960V. If I read 221 instead, it'd be:

1106080 / 221 = 5004.89

or about 5.005V. From my understanding the reading of the reference does fluctuate a bit, so I could probably help myself by taking several readings of that channel and averaging them, like you said. I think I'll also try measuring under different input voltages; I'll use LEDs to display the 10-bit output since I won't have it connected to the computer.