The cheapest and safest method to read Car Battery Voltage

Since I wanted to contribute my own design to this topic, and most (if not all) of the threads where it is discussed are closed by the famous "120-day topic lock", I have decided to open my own.

The variety of electrical and mechanical systems that interface with a vehicle's battery can cause wild voltage excursions on the nominally 12 V (12.5 V) supply. In reality, that 12 V can vary from -14 V to +35 V for extended periods of time, and can experience voltage spikes with extremes ranging from +150 V to -220 V. Some of these surges and transients arise from everyday use, others from fault conditions or human error. Regardless of the cause, the damage they can produce in electronic systems can be difficult to diagnose and expensive to fix, especially for your Arduino inputs. The International Organization for Standardization (ISO) has compiled this industry knowledge into the ISO 16750-2 and ISO 7637-2 specifications. Automotive electronic control units (ECUs), and your Arduino, should survive these conditions without damage. So:

There are many different ways to sense or read a car battery voltage using an Arduino, a GPIO, etc., from a simple resistor divider to more complicated electronics such as operational amplifiers, special sensors, and so on. Let's analyze the cheapest way, using a simple resistor divider as most people suggest. The most common of all reads 0-15 Vdc with the 10-bit (1024-count) ADC, using a 1:3 ratio (resistor divider factor = 2, i.e. the top resistor is twice the bottom one). Example (image1).

The cheapest way to get rid of the car's "transients" is to add a 100 nF capacitor; fast transients and car noise will be absorbed by it (image2).
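For anyone who wants numbers: the capacitor forms an RC low-pass filter with the Thevenin resistance of the divider. A quick sketch of that arithmetic in plain C++ (the 20k over 10k values are only an assumption for illustration, since the post specifies just the 1:3 ratio):

```cpp
#include <cassert>
#include <cmath>

const double kPi = 3.141592653589793;

// Thevenin source resistance the capacitor sees: the two divider
// resistors in parallel (the battery side is treated as a stiff source).
double theveninOhms(double rTop, double rBottom) {
    return (rTop * rBottom) / (rTop + rBottom);
}

// -3 dB corner frequency of the resulting RC low-pass filter
double cornerHz(double rOhms, double cFarads) {
    return 1.0 / (2.0 * kPi * rOhms * cFarads);
}
```

With the assumed 20k:10k divider and 100 nF, the corner lands around a couple of hundred hertz, which is why fast spikes get absorbed while the slow battery voltage passes through unchanged.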

The cheapest way to get rid of the car's "surges" is to add a 5.1 V zener. A0 will follow the battery voltage from 0 to 15 Vdc, and the zener keeps your GPIO input safe at no more than 5.1 V even if a surge jumps to 150 V. Notes: (1) The zener will add about 1% reading error at the top of the range, approximately between 14.5 and 15 Vdc, but readings will be almost perfect from 0 to 14.5 V. (2) All resistors can be 250 mW, so they can handle surge events such as 35 Vdc, or fast surge transients lasting no more than 1 second, and the zener will keep working fine (image3).

My own design is based on a 0 to 16 Vdc input divider, since 16 divides evenly into the Arduino's 10-bit (1024-count) ADC range, so I use a 1:3.2 divider (factor 2.2). Obviously, with that factor it can be tricky to find the exact resistors, so to compensate for that, and for real resistor tolerances, I add a multi-turn pot to adjust the divider. I tested this circuit with a digital voltmeter, adjusting P1 to read exactly 3.751 Vdc at A0 while injecting exactly 12 Vdc at Vin. With that done, the circuit gives me 1% error between 15 and 16 Vdc and ±0.12% from 0 to 14.90 Vdc, tested in a 40 °C environment (image4).
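The conversion arithmetic behind this 0-16 Vdc design is convenient precisely because 16 V divides evenly into 1024 counts. A quick sketch in plain C++ (on the Arduino itself, the count would come from analogRead(A0)):

```cpp
#include <cassert>
#include <cmath>

// 0-16 Vdc design: with a 5 V reference and a divide-by-3.2 input,
// full scale is 5 V * 3.2 = 16 V, i.e. exactly 16/1024 = 15.625 mV per count.
double battVoltsFromCount(int count) {
    return count * (16.0 / 1024.0);
}

// Voltage actually presented at A0 for a given battery voltage
double a0Volts(double battVolts) {
    return battVolts / 3.2;  // 1:3.2 divider (factor 2.2)
}
```

So a 12 V battery sits at count 768 and 3.75 V at the pin, which matches the 3.751 V calibration point described above (the extra millivolt is absorbed by the pot adjustment).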
Hope this helps. Have a nice day.

Image1.png

Image2.png

Image3.png

Image4.png


Hi,

The variety of electrical and mechanical systems that interface with a vehicle's battery can cause wild voltage excursions on the nominally 12 V (12.5 V) supply. In reality, that 12 V can vary from -14 V to +35 V for extended periods of time, and can experience voltage spikes with extremes ranging from +150 V to -220 V. Some of these surges and transients arise from everyday use, others from fault conditions or human error. Regardless of the cause, the damage they can produce in electronic systems can be difficult to diagnose and expensive to fix, especially for your Arduino inputs. The International Organization for Standardization (ISO) has compiled this industry knowledge into the ISO 16750-2 and ISO 7637-2 specifications. Automotive electronic control units (ECUs), and your Arduino, should survive these conditions without damage. So:

Where did you read this, especially the underlined bit?

Tom.... :slight_smile:

Thank you for sharing your golden knowledge, it will help me a lot for my future project
thx

If you use the internal 1.1 volt reference, as you should for any accuracy, then since the analog input is still "protected" up to 5 V, you can in effect have ~60 V spikes on the top of your divider without damaging the input by exceeding 5 V at the pin.
You can add the capacitor, and possibly a lower-voltage zener, but check the calibration for any non-linearity.
(Use something like 10k for the top resistor.)

You can write a calibration routine in your software to take out any tolerances in the resistors or the reference voltage (e.g. with map()) and save the cost of a trim pot.

In reality, 12 V can vary from -14V to +35V for extended periods of time

Perhaps with antique cars, but certainly not in a functioning modern vehicle.

Transient spikes can be large, and are generated in the wiring, but will never be long-term.
It's normal for "automotive grade" MOSFETs to be rated at 55 V or more, which gives an idea
of the extremely noisy environment.

A lead-acid 12 V battery's voltage will range from as low as 6 V or so while the starter motor is
cranking, up to 14 V or so during charging (nominally 13.8 V). Most of this variation is due to heavy
currents flowing into or out of the battery. If the battery voltage strays outside these limits, it's
probably time to replace it.

35 V for an "extended time" would burn out ALL of the light bulbs in your car in no time! And who knows what else. 35 V gives you more than 8 times the power of 12 V, so a 60 W headlight becomes almost 500 W!

You can get unpredictable spikes, so a protection diode following the voltage divider (which acts as a current limiter for the diode) is a good idea.

Dividing to 5 volt is mistake #1.
Comparing battery voltage to a potentially unstable 5 volt supply (the default Aref) will kill your "0.12% precision".
A blinking LED, or a PWM-ed LCD, will drop that stability to maybe 5%.
Post #3 has the answer to that.
Dividing to ~1.1 volt also gives higher over-voltage protection.

Using a 5.1 volt zener diode on an MCU input is mistake #2.
MCU input limits are -0.3 volt to +0.3 volt until you power up the Arduino.
The zener starts conducting well below 5.1 volt, stuffing up the linearity at the top end of the A/D,
where your battery voltage is.
Schottky clamping diodes could be used if needed (likely not).

Do more homework, before you make those poor recommendations in your first post.
All of the above has been covered here many times.
Leo..

hammy:
If you use the internal 1.1 volt reference, as you should for any accuracy, then since the analog input is still "protected" up to 5 V, you can in effect have ~60 V spikes on the top of your divider without damaging the input by exceeding 5 V at the pin.
You can add the capacitor, and possibly a lower-voltage zener, but check the calibration for any non-linearity.
(Use something like 10k for the top resistor.)

You can write a calibration routine in your software to take out any tolerances in the resistors or the reference voltage (e.g. with map()) and save the cost of a trim pot.

I think I understand what you mean, but is there any chance you could elaborate? An automotive voltage meter is exactly what I am working on right now, and 5 V stability (likely due to a TFT screen) is exactly the issue I am facing.

@ trailsurfer604

Which Arduino? There is no 'one size fits all'.
See the Aref page.

For an Uno, you design the voltage divider for ~1.1volt, like 1k8:27k or 2k2:33k or even 10k:150k.
With of course the 100n ceramic cap from pin to ground.

For your ProMicro, the divider must dial down to 2.56volt, like 10k:56k.

Then use this line in setup()
analogReference(INTERNAL); // use the internal ~1.1volt reference (2.56volt for the ATmega32U4)

Converting to voltage could be as simple as
float voltage = analogRead(A0) * 0.01678; // calibrate !

That calibration factor will be different for the ProMicro, and also depends on the resistor ratio. About 0.0165
Leo..
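For what it's worth, the theoretical starting value of that calibration factor can be computed directly from the divider and the reference; the gap between the theoretical ~0.0172 and the measured 0.01678 simply reflects an actual Aref nearer 1.07 V than 1.1 V, which is exactly why it says "calibrate!". A quick check in plain C++ (divider and reference values taken from the post above):

```cpp
#include <cassert>
#include <cmath>

// Battery volts per ADC count = (Vref / 1024) / (divider ratio).
// Values from the post: Uno with ~1.1 V internal Aref and a 1k8:27k divider,
// ProMicro with 2.56 V Aref and a 10k:56k divider.
double calFactor(double vref, double rBottom, double rTop) {
    double dividerRatio = rBottom / (rBottom + rTop);  // fraction of Vbatt seen at A0
    return (vref / 1024.0) / dividerRatio;             // battery volts per ADC count
}
```

Plugging in the ProMicro values (2.56 V, 10k:56k) reproduces the "about 0.0165" quoted above.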

Wawa:
Do more homework, before you make those poor recommendations in your first post.
All of the above has been covered here many times.

Why so harsh? Maybe the OP's post is not ideal, but there are members with thousands of posts writing crap. Is that better?

The internal voltage reference is stable but not accurate, and needs calibration. If the Arduino's 5 V comes from a good regulator, it may be better than using the internal reference.

If the Arduino is powered from the car battery, it is easy to ensure it is always powered whenever the battery is connected to the analog input. So the zener diode provides good protection (and also protects the 5 V power rail from overload). Of course it will skew readings in the upper range, which can be mitigated by adjusting the divider (i.e., so the maximum battery voltage gives 4 V).

Wawa:
@ trailsurfer604

Which Arduino? There is no 'one size fits all'.
See the Aref page.

For an Uno, you design the voltage divider for ~1.1volt, like 1k8:27k or 2k2:33k or even 10k:150k.
With of course the 100n ceramic cap from pin to ground.

For your ProMicro, the divider must dial down to 2.56volt, like 10k:56k.

Then use this line in setup()
analogReference(INTERNAL); // use the internal ~1.1volt reference (2.56volt for the ATmega32U4)

Converting to voltage could be as simple as
float voltage = analogRead(A0) * 0.01678; // calibrate !

That calibration factor will be different for the ProMicro, and also depends on the resistor ratio. About 0.0165
Leo…

Thank you, this is very useful. I am currently using a Micro, so the reference will be 2.56 V for me. I will try it tonight to see if this addresses my issues.
How do you determine the calibration factor? I checked the Aref page, but it doesn’t address calibration.

My own design is based on a 0 to 16 Vdc input divider, since 16 divides evenly into the Arduino's 10-bit (1024-count) ADC range, so I use a 1:3.2 divider (factor 2.2). Obviously, with that factor it can be tricky to find the exact resistors, so to compensate for that, and for real resistor tolerances, I add a multi-turn pot to adjust the divider.

In my opinion, this approach is a waste of time and an unnecessary component, namely the potentiometer. An end-to-end calibration that aligns ADC values with standard test voltages is the sensible way to achieve good accuracy. It can be performed entirely in software, since you only need to scale the values (intelligently, of course, using fixed-point scaled arithmetic). The inaccuracy of the 1.1 V reference was mentioned, but that reference is very stable at whatever voltage it actually sits. The end-to-end calibration would subsume the 1.1 V reference error, along with all the resistor errors.

If you want to see how automotive input protection really works, check out the application notes from manufacturers that make automotive ICs. I did, and I really learned a lot. Those reflect a deeper understanding and decades of industry experience that are not fully captured in this post.

I don't think it's harsh to criticize things that people say when they put themselves up on the podium. Educational material shouldn't only reflect the instructor's opinions; it should encompass the wider field of knowledge in the subject.

trailsurfer604:
I checked the Aref page, but it doesn’t address calibration.

Correct.

trailsurfer604:
How do you determine the calibration factor?

I calculated it, from the voltage divider and the average 1.1volt Aref I’ve seen.

Just see what voltage you get on the display, and compare that with a DMM.
If there is a difference, then multiply the calibration factor by the ratio of difference.
Example: the Arduino shows 12, the battery is really 13 → 13/12 = 1.08333, and 1.08333 × 0.01678 ≈ 0.01818 = new calibration factor.
Upload, and try again.
Leo…
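That correction is just scaling the old factor by the ratio of true to displayed voltage; a one-line sketch in plain C++ (same numbers as the example above):

```cpp
#include <cassert>
#include <cmath>

// One-point calibration correction: scale the old factor by the
// ratio of the true (DMM) reading to the displayed reading.
double correctedFactor(double oldFactor, double dmmVolts, double displayedVolts) {
    return oldFactor * (dmmVolts / displayedVolts);
}
```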

You can write a calibration routine:

Use a digital input and check it in setup(). If it's set, run a calibration procedure.
In that procedure, connect a "high" voltage and type in its value (use the Serial Monitor: print the instructions to it and read the replies back), then connect a low voltage and type that in. You can then work out a calibration factor in your software and save it to EEPROM.

hammy:
You can write a calibration routine:

Use a digital input and check it in setup(). If it's set, run a calibration procedure.
In that procedure, connect a "high" voltage and type in its value (use the Serial Monitor: print the instructions to it and read the replies back), then connect a low voltage and type that in. You can then work out a calibration factor in your software and save it to EEPROM.

Exactly. Or else you can upload it as a separate sketch. One calibration is probably enough for the lifetime of the circuit.
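The arithmetic such a routine would store is a simple two-point linear fit. Here is a minimal sketch of just the math in plain C++ (on the Arduino, the counts would come from analogRead() and the resulting slope/offset would go to EEPROM; the function names are illustrative, not from any post above):

```cpp
#include <cassert>
#include <cmath>

// Result of a two-point calibration: volts = slope * count + offset
struct Cal { double slope; double offset; };

// Derive slope and offset from two (ADC count, true volts) pairs,
// e.g. the "low" and "high" test voltages typed in over Serial.
Cal twoPointCal(int countLo, double voltsLo, int countHi, double voltsHi) {
    Cal c;
    c.slope  = (voltsHi - voltsLo) / (double)(countHi - countLo);
    c.offset = voltsLo - c.slope * countLo;
    return c;
}

// Apply the stored calibration to a fresh reading
double toVolts(const Cal& c, int count) {
    return c.slope * count + c.offset;
}
```

Two points capture both gain errors (reference, resistor ratio) and any fixed offset, which a single scale factor cannot.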

The best method to read a car / truck battery voltage is to use an INA226 module.

This module sells in China for $1.50 and provides an accurate, factory-calibrated voltage reading up to 36 V. Nothing to trim, nothing to calibrate; it is precise out of the box.
Additionally, the chip provides averaging of up to 1024 samples, so you can just read the result with one call.
Besides that, it provides off-rail (high-side) current measurement that you could use to measure the battery current across a shunt.

Measuring voltages with A0 is approximate and depends on the 5 V supply.
I did it that way previously and stopped once I found the INA226, which does a far better job with 0.5% precision.

Besides that, if you let the magic blue smoke out, the loss is limited to a cheap $1.50 device that is easy to replace.
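For reference, reading the bus voltage from the INA226 needs no calibration at all, because the datasheet fixes the bus-voltage register's LSB at 1.25 mV; the conversion is one multiply. A minimal sketch of that conversion in plain C++ (on an Arduino the raw register value would be fetched over I2C with the Wire library, default device address 0x40):

```cpp
#include <cassert>
#include <cmath>

const int INA226_REG_BUS_VOLTAGE = 0x02;  // bus voltage register address

// Convert the raw 16-bit bus voltage register value to volts.
// The INA226 datasheet fixes this register's LSB at 1.25 mV.
double ina226BusVolts(unsigned int raw) {
    return raw * 0.00125;
}
```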