What happens when battery keeps getting 'charged'?

I’m adding a second battery to my car for camping purposes. The second battery will be a deep cycle SLA. It will be connected to my starter battery/alternator via a voltage-sensitive relay (cuts in at ~14V to parallel the two batteries so they both charge while the car is running, cuts out at ~12V to isolate the secondary battery so it can drain in peace when the car is off).

I got to thinking, how does a car alternator charge a car battery? Doesn’t it just constantly output 14.4V or so for as long as the car engine is running? My question is, then, what is it doing to the battery if a constant voltage is applied indefinitely?
My curiosity is in the context of any other kind of battery: what would happen if you applied a constant 4.2V to a lithium cell indefinitely?
A lot of battery chargers advertise that they cut off charging when batteries are full, some switch to a trickle maintenance function, but I’m not understanding why some special mode is required. Isn’t the system inherently at a standstill when battery voltage equals voltage coming in?

Google is your friend

batteryuniversity.com/ is the definitive reference for the charge/discharge behavior of most common types of batteries.

Lead acid batteries are very tolerant of abuse, but lithium based batteries are very quickly destroyed by any deviation from proper charge/discharge protocols.

Yes, I know how an alternator works, I'm just asking about how it, um... works.

In essence, it's a 'dumb' charger, right? It's not a fancy computer that's monitoring the battery's internal health.

In essence, it’s a ‘dumb’ charger, right?

That depends on what you mean by dumb.

Automobiles have charging circuitry that regulates the battery charge cycle, but for lead acid batteries, that is fairly straightforward.

What I mean by 'dumb' is that an alternator outputs a single voltage with a limited maximum current. It's as dumb as a wall wart: it puts out a single voltage, and it just sits there waiting to be drawn from.

So with what was said so far, it seems lead acid batteries simply cut themselves off once input voltage equals battery voltage. I mean, that's the electronics 101 lesson on voltage potential. In practice, if you go for a 10 hour road trip, then your battery is being 'charged' that whole time, even if it's full up at 14.whatever volts.

My question is, wouldn't any battery, regardless of chemistry, still obey the same law of voltage potential? Again, as an example, if I had a power supply putting out a reliable 4.2 volts, what detriment could come from feeding that to a single lithium cell indefinitely? (assuming current is limited to within the cell's rated charging current)

Maybe chargers that advertise features like automatically stopping the charge once battery is full, are just being silly. When voltage input matches battery voltage, charging stops, there's nothing magical about it, it's automatic, it requires no action whatsoever. That's the impression I'm trying to either validate or educate away.

Nope, not as "dumb" as you seem to think. In many modern cars, charging is indeed computer controlled. Adding an extra battery could create a problem in such a vehicle.

Welp, I don't consider my 16 year old car as 'modern' and tons of people have been adding second batteries for camping or audio systems for quite some time with commercially available products, so I'm not looking for naysaying in that regard.

Heard of a regulator? http://www.secondchancegarage.com/public/83.cfm

"You should see something between 13 and 15 volts when running. No change in voltage means either the regulator or alternator isn't working, while higher voltage means the regulator isn't properly "regulating." "

So alternators put out various voltages, and transistor-based regulators use a sort of PWM at about 2000Hz to regulate the output voltage. Interesting, but it still doesn't satisfy my curiosity. It still sounds like alternators indiscriminately put out 13-15 volts, always, when the engine is running. They never stop trying to top up the batteries when the voltage gradient swings that way.
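To picture what that regulator is doing, here's a toy model (my own invented numbers and a bang-bang control scheme, not any real regulator's firmware) of holding alternator output near a setpoint by rapidly switching the field winding on and off:

```python
SETPOINT = 14.2    # target system voltage
BAND = 0.1         # switching hysteresis, volts

def field_should_be_on(measured_volts, currently_on):
    """Bang-bang decision made once per switching cycle."""
    if measured_volts > SETPOINT + BAND:
        return False        # too high: cut field current, output sags
    if measured_volts < SETPOINT - BAND:
        return True         # too low: energize field, output rises
    return currently_on     # inside the band: hold state

# Crude simulation: volts rise while the field is on, sag while it's off.
volts, field_on, history = 13.0, True, []
for _ in range(2000):       # roughly one second at the ~2 kHz quoted above
    field_on = field_should_be_on(volts, field_on)
    volts += 0.01 if field_on else -0.008
    history.append(volts)

print(f"output settles around {sum(history[-500:]) / 500:.1f} V")
```

The output ripples between roughly 14.1 and 14.3 V but averages at the setpoint, which is all "regulation" means here. Note the regulator only watches its own output voltage; it knows nothing about the battery's state of charge.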

I believe it said that a relay disconnects the alternator from the system when the battery voltage equals the charging voltage, but I'm not a mechanic, so perhaps you should post on an Auto forum to ask "Auto" questions. Come back when you have "General Electronics" questions. (There's nothing "General" about your question; it is a specific automobile electrical system question.)

My question was, generally: what happens to a battery if it is held indefinitely at the fixed voltage it reaches when full? I have mentioned this several times. The automotive setup was just a scenario where such a thing probably plays out.

Battery chemistry is not as simple as you’d apparently like to think, particularly not lithium-based chemistry. But in general if the charger is regulated never to exceed the battery full voltage then you’re o.k. Most battery types will effectively taper the charging current down to near zero themselves.

Most chargers, including auto alternators, are not so regulated but since your question is “What if the charger puts out a fixed voltage…” then that’s your answer.
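That self-tapering can be sketched with a toy model (all numbers invented for illustration; real cells and alternators differ): the charge current is driven only by the gap between the fixed source voltage and the battery's internal EMF, so as the EMF rises the current falls on its own.

```python
SOURCE_V = 14.4      # fixed charger/alternator output
R_INT = 0.05         # ohms, assumed lumped internal resistance
CAPACITY_AH = 50.0   # assumed battery capacity

emf = 12.6           # open-circuit voltage of a full-ish battery; rises with charge
dt_h = 0.01          # simulation step, hours
currents = []
for _ in range(50_000):
    amps = max((SOURCE_V - emf) / R_INT, 0.0)   # Ohm's law across R_INT
    emf += 2.0 * (amps * dt_h / CAPACITY_AH)    # crude: EMF creeps toward SOURCE_V
    currents.append(amps)

print(f"current: {currents[0]:.0f} A at the start, {currents[-1]:.4f} A at the end")
```

The current decays exponentially toward zero with no controller involved. (A real system would also current-limit that large initial surge; the point here is only the endgame, where charging winds itself down.)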


Connect both batteries in parallel without a relay. If the second battery is completely discharged, wait 20 minutes before you start the engine. If the second battery is not portable, put a diode between them to prevent draining your car battery while camping.

INTP: My curiosity is in the context of any other kind of battery- what would happen if you applied a constant 4.2V to a lithium cell indefinitely?

If the lithium cell happens to be 4.2 V and the applied voltage is 4.2 V, then there'd be no voltage difference - so no current flow. So..... maybe nothing will happen.

Southpark: If the lithium cell happens to be 4.2 V and the applied voltage is 4.2 V, then there'd be no voltage difference - so no current flow. So..... maybe nothing will happen.

That's certainly true for lithium chemistry. That's why Li-ion chargers use CC/CV charging. That's Constant Current to 4.2V followed by Constant Voltage of 4.2V until the current reduces to near zero.
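The CC/CV sequence described above can be sketched against a toy cell model (R_INT, capacity, and the EMF update are my assumptions, not real cell data): hold constant current until the terminal voltage reaches 4.2V, then hold 4.2V and let the current taper until it falls below a cutoff.

```python
CC_AMPS = 1.0        # constant-current phase
CV_VOLTS = 4.2       # constant-voltage phase limit
CUTOFF_AMPS = 0.05   # taper current at which the charger declares "full"
R_INT = 0.1          # ohms, assumed internal resistance
CAP_AH = 2.5         # assumed capacity

emf, dt_h, phase = 3.0, 0.001, "CC"
while True:
    if phase == "CC":
        amps = CC_AMPS
        if emf + amps * R_INT >= CV_VOLTS:   # terminal voltage hit 4.2 V
            phase = "CV"
    else:
        amps = (CV_VOLTS - emf) / R_INT      # current now set by the shrinking gap
        if amps < CUTOFF_AMPS:
            break                            # taper complete: stop charging
    emf += 1.2 * (amps * dt_h / CAP_AH)      # crude EMF rise with accumulated charge

print(f"charge terminated in {phase} phase at {amps * 1000:.0f} mA")
```

The cutoff matters because, unlike lead acid, a lithium cell shouldn't be float-held at its full voltage indefinitely; the charger actively disconnects once the taper current gets small.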


Interesting, constant current reduces

ted: Interesting, constant current reduces

It's a circuit controlling thing. A voltage monitor will keep an eye on the voltage during a constant current charging condition. If the voltage of the cell reaches somewhere around 4.2V, then the controller system starts changing behaviour..... and goes into some other charging mode.

Interesting, constant current reduces

Well, no, not really, it limits, hence the name "Constant Current". There are many ways to do what you are trying to do. The best way is to just ask a mechanic.