A To D voltage scaling

I want to make a LiPo battery voltage indicator. My concern is that the battery voltage only varies from 12.6 v full down to 11.6 v empty. Using a traditional voltage divider to scale 12.6 v down to 5 v for the A to D input means the voltage at the analog pin only swings about 0.4 v (5 v down to 4.6 v), which I am guessing results in pretty average accuracy. Is it possible to make that battery voltage range correspond to 5 v down to 0 v at the analog input somehow? This would give me the full range of the A to D and better accuracy....?

Is there a better way to achieve accuracy to two decimal places?

Try this.
Leo..

// displays the voltage of a battery/supply/USB/Vin/etc. on the serial monitor and/or LCD shield
// works with 3.3volt and 5volt Arduinos
// uses the internal 1.1volt reference > independent of supply fluctuations
// max readout is 20.460volt, 0.001volt resolution, last digit is averaged
//
// ~180k resistor + 10k trimpot (calibration) in series from A1 to +batt/supply
// 10k resistor from A1 to ground
// 100n capacitor from A1 to ground for stable readings
//
// calibration: connect e.g. two 9volt batteries in series (~18volt) and a DMM, adjust trimpot to the same reading
//
#include <LiquidCrystal.h>
LiquidCrystal lcd(8, 9, 4, 5, 6, 7); // your LCD pins could be different
int ledPin = 10; // backlight pin
unsigned long total = 0;
//
void setup() {
  analogReference(INTERNAL); // use the internal ~1.1volt reference, change (INTERNAL) to (INTERNAL1V1) for a Mega
  Serial.begin(115200); // ---set serial monitor to this value---
  //analogWrite(ledPin, 255); // optional dimming
  lcd.begin(16, 2); // shield with 2x16 characters
  lcd.setCursor(0, 0); // first row
  lcd.print("Voltmeter"); //info text
  lcd.setCursor(0, 1); // second row
  lcd.print("0-20.460 volt");
  delay(2000); // info display time
  lcd.clear(); // clear
  lcd.setCursor(0, 0);
  lcd.print("Voltmeter 0-20V"); // print once
}
//
void loop() {
  analogRead(1); // one unused reading, discarded, to let the ADC settle after the reference change
  for (int x = 0; x < 200; x++) { // 200 readings for averaging
    total = total + analogRead(1); // add each value
  }
  // print to LCD
  lcd.setCursor(0, 1);
  if (total == 204600) {
    lcd.print("---overflow---");
  }
  else {
    lcd.print("A1=  ");
    lcd.print(total * 0.0001, 3); // scale summed readings to volts, 3 decimal places
    lcd.print("volt");
  }
  // print to serial monitor
  Serial.print("Raw average = ");
  Serial.print(total * 0.005, 2); // sum of 200 readings, divide by 200, 2 decimal places
  if (total == 204600) {
    Serial.print("   ----overflow----");
  }
  else {
    Serial.print("   The battery is ");
    Serial.print(total * 0.0001, 3); // scale summed readings to volts, 3 decimal places
    Serial.println(" volt");
  }
  delay(1000); // readout delay
  total = 0; // reset value
}

Thank you for your kind assistance . I appreciate it. This looks as if it may solve my problem. I will try it tonight and let you know how I go.
Ken

5172ken:
My concern is the battery voltage varies from full 12.6 v down to 11.6 v empty. Using a traditional voltage divider 12.6 down to 5 v for the A to D input means that the the maximum to minimum voltage at the analog in varies approx 0.4 v (4.6 v) which I am guessing results in a pretty average accuracy. Is it possible to have the battery voltage change at the analog in equivalent to 5 v to 0 v somehow? This will give me the full range of the A to D and better accuracy....?

The ADC resolution is about 1/1024 (roughly 0.1%) of the full range. When the full range is 15V, you'll have about 15mV resolution.

Offsetting the input voltage is critical, because any change in the subtracted bias voltage shows up 1:1 in the reading, and the required circuit depends on the overall stability (over time and temperature) of the reference and battery voltages. E.g. shifting the input down by 10V requires a circuit with <1mV error, equivalent to 0.01% of 10V! Even if you reach 1% with moderate effort and high-precision resistors, the error will be 100mV, far worse than the unbiased resolution of 15mV. Much effort for worse results, proving the rule of thumb: keep it simple! (applies to both hard- and software)

Thanks for that. The issue I am faced with is that the terminal voltage of the battery changes by 1.2 volts from full charge to flat: 12.6 v down to 11.4 v.

Feeding this into a voltage divider at the analog input results in a change of 5 volts down to 4.52 volts. This means 1028 channels down to 929 channels used in the A to D.

A relatively small change on the basis I have 1028 to play with. I was hoping there was a way to make use of all 1028 channels spread across a change of 0.48 volts. So based on the above feedback (which I appreciate very much) it sounds as if changing the reference voltage is the key to this??

The code I gave you uses techniques to increase the resolution.
One of them is averaging multiple readings.
And it uses the 1.1volt Aref, which is more stable than the default Aref.

A "voltage loupe" with an opamp could work if you only care about 11.4-12.6volt.
This post tries to tackle mapping 9-14volt onto 0-5volt.
http://forum.arduino.cc/index.php?topic=326950.0
You might need different component values.
Let us know.
Leo..

5172ken:
A relatively small change on the basis I have 1028 to play with. I was hoping there was a way to make use of all 1028 channels spread across a change of 0.48 volts. So based on the above feedback (which I appreciate very much) it sounds as if changing the reference voltage is the key to this??

The reference voltage level is of no importance. The ADC provides values 0-1023, NOT 1028, with a (hypothetical) 1024 for Vin=Vref. What matters is only the accuracy of the reference voltage, and the internal 1.1V source is more stable than the (default) Arduino Vcc.

The reference voltage level is also unimportant for the achievable resolution, as long as a voltage divider scales the input voltage into the acceptable 0-Vref range.