Solar charger IC for NiMH just like the TP4056 for LiPo

Hey Guys,

I have been looking for a smart solution to power a SAMD21G18 on an ItsyBitsy M0 from Adafruit:

using a 5-6V solar panel (I could also use a solar panel with a different rating) that charges some NiMH batteries. The board accepts a voltage of 3.5-6V, so I guess four 1.2V low self-discharge (LSD) NiMH batteries would be good. I'd like to use the LSD batteries to run the processor for a long time.

I saw that there is no easy and cheap solution such as the TP4056 module for LiPo-batteries. Do you have an idea?

Thank you,
Marco

With NiMH batteries, you do not need a charge controller, as long as you obey the "trickle charge" restrictions.

More info here: BU-408: Charging Nickel-metal-hydride - Battery University

I've had great success with constant voltage charging, along with constant current when the voltage is below the voltage set-point.

If slightly less than a full charge -- on the order of 5% to 20% less than a full 100% -- is acceptable [i.e. a trade-off of slightly more battery vs. expense/charging complexity], then this may be the technique for you!

I usually set the maximum charge voltage to around 1.43V per cell. For instance, for a 7-cell "9V" battery, the max charge voltage would be

7 × 1.43V ≈ [b]10V[/b] -- which works out to about 1.429V per cell
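If you want to sanity-check that arithmetic, here's a quick Python sketch (the 1.43V/cell figure is just the rule of thumb from this post, not a datasheet value):

```python
# Rule-of-thumb NiMH constant-voltage set-point (~1.43 V/cell, per above).
CELL_V = 1.43

def max_charge_voltage(cells, per_cell=CELL_V):
    """Maximum charge voltage for a NiMH pack of `cells` cells in series."""
    return cells * per_cell

print(round(max_charge_voltage(7), 2))  # ~10V for a 7-cell "9V" pack
print(round(10.0 / 7, 3))               # 1.429V/cell if you regulate at exactly 10V
```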

I've used an LM317L for charging 8.4V "transistor batteries" [the ones with the snaps on top]. I set the regulation voltage to 10V, and the inherent 100mA [or so] current limit sets the maximum charge current to around 0.5C.

Some may argue that such an arrangement will stress the LM317L. Maybe. But I've had such chargers work for years with no problems -- and, unless the battery is run down considerably, it doesn't charge for very long at that maximum rate. Most of the time it charges at the regulated voltage, so at less than 100mA. As the battery charges, the current gradually falls and eventually, after several hours, drops below the 0.05C rate considered safe for NiMH trickle charging. After even more time, the charge rate drops to near zero.

Another regulator that I've used is ST's L200CV, which is designed to regulate both voltage AND current. Thus, you can set whatever charge limitations the device is capable of.

This is definitely not the quickest way to charge a NiMH battery, but it's cheap and easy, and a good compromise between performance and expense/complexity. Also, because the max charge current can be set high, for an initial 0.5C to 1.0C, you can get a "quick" 50ish% charge. So if you select a battery that will, at 50% charge, run your device for the desired duration, then a full charge gives you bonus run time. In other words, double the battery size so you can use this cheaper/easier quick-charging scheme -- which may or may not be practical, depending on how much more expensive a 2× larger battery turns out to be.

I've also teamed the L200CV with an MCU that monitors battery temperature, and even discharge voltage, with a battery disconnect to prevent it from discharging too low. With an MCU, you can also set a maximum charge time.

Also, this is probably best for cases where charging will occur at, or near room temperature.

Hey guys

Thank you for your helpful comments!

ReverseEMF:
I usually set the maximum charge voltage to around 1.43V per cell. For instance, for a 7-cell "9V" battery, the max charge voltage would be

7 × 1.43V ≈ [b]10V[/b] -- which works out to about 1.429V per cell

I've used an LM317L, for charging 8.4V "Transistor Batteries" [the ones with the snaps on top]. I set the regulation voltage to 10V, and the inherent 100mA [or so] current limiting sets the maximum charge current to around 0.5C.

ReverseEMF, your comment was very helpful to me! I am thinking about building a NiMH solar charger using this 12V, maximum 100 mA solar panel:

and the LM317 or LM317L. Here is a tutorial for the circuit:

I guess the diodes are just there to prevent current from flowing back into the solar cell, or the battery from discharging into the panel. The resistors then set the output voltage of the LM317.
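For reference, the LM317's output voltage is set by the feedback resistor pair according to the datasheet formula. The names and values below follow the usual datasheet convention (R1 from OUT to ADJ, R2 from ADJ to ground) and may not match the tutorial's labels:

```python
# LM317 output-voltage formula from the datasheet:
#   Vout = 1.25 * (1 + R2/R1) + I_ADJ * R2
# The I_ADJ term (~50 uA typical) is usually negligible.
V_REF = 1.25    # reference voltage across R1, volts
I_ADJ = 50e-6   # typical adjust-pin current, amps

def lm317_vout(r1_ohms, r2_ohms):
    return V_REF * (1 + r2_ohms / r1_ohms) + I_ADJ * r2_ohms

# Example: 240 ohm and 720 ohm give roughly 5V out
print(round(lm317_vout(240, 720), 2))
```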

I am wondering how much the voltage of the solar cell will drop when connected to the voltage regulator and under load. I guess drawing too much current will collapse the supply voltage.

Best and thank you,
Marco

I am wondering how much the voltage of the solar cell will drop when connected to the voltage regulator and under load

The short circuit current for the panel you linked (with zero volts across the cell) won't be much more than 100 mA.

Typical current-voltage relationship:

jremington:
The short circuit current for the panel you linked (with zero volts across the cell) won't be much more than 100 mA.

Typical current-voltage relationship:

Thank you a lot for your useful comment! Alright, I've looked a bit more online and I will probably go for following circuit:


from: https://www.electronicshub.org/solar-battery-charger-circuit/

I have also thought about going for this circuit:

from: Solar Charger Circuit using IC LM317 | Electronics Project

However, I don't see a reason for the Zener diode and the transistor that shunts away the current once the Zener diode breaks down. I think the R1 and R2 resistors of the LM317 can be selected to give a very steady output voltage.

In my case, I want to charge three LSD NiMH batteries with a rating of 1.2V:

I will wait for their arrival, fully charge them and see the actual voltage and then select the resistances for the LM317:

What do you think? Is that circuit smart and efficient enough? I don't want to waste the solar energy.

Best,
Marco

I don't want to waste the solar energy.

For the tiny solar panel linked in reply #3, adding a voltage regulator will be a waste of solar energy.

The XCell LSD NiMH battery is rated at 2100 mAh, so the safe trickle charge maximum rate is about 100 mA, which is the maximum current that the linked panel can provide.

It will take the panel alone, with a series diode, well over 21 hours (2100 mAh/100 mA) of full sun, or 2 to 3 summer days, to charge a dead cell at that rate.
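That charge-time estimate is just capacity divided by current; a minimal sketch of it (assuming roughly 100% charge efficiency, so real NiMH will take even longer):

```python
# Lower-bound charge time: capacity / current. Real NiMH charge
# efficiency is well below 100%, so this underestimates the time.
def charge_hours(capacity_mah, current_ma):
    return capacity_mah / current_ma

print(charge_hours(2100, 100))  # 21.0 hours of full sun, minimum
```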

One caveat, though: using a linear regulator like the LM317 [especially the LM317!] is not only a cheap solution, it's a dirty one as well.

Why? Because it's rather inefficient. A lot of power is lost in the regulator because of something called dropout voltage. So if that's a concern, it can be minimized by matching the output voltage of the solar panel as closely as possible to the regulator's minimum required input voltage -- but there's really no way to make it as efficient as a switch-mode solution.
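To put numbers on that inefficiency: the best-case efficiency of any linear regulator is roughly Vout/Vin, since everything above the output voltage is burned as heat. A quick sketch (the 4.3V charge rail is a hypothetical figure for a 3-cell NiMH pack, not a value from this thread):

```python
# Best-case linear-regulator efficiency ~= Vout / Vin
# (quiescent current ignored; the rest is dissipated as heat).
def linear_efficiency(v_in, v_out):
    return v_out / v_in

# Hypothetical example: 12V panel feeding a ~4.3V (3-cell NiMH) charge rail
print(round(linear_efficiency(12.0, 4.3) * 100, 1))  # ~35.8% at best
```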

BUT, if using a larger solar panel than would be needed for an efficient solution is not an issue, then what the heck -- there's probably plenty of sun for what you need.

I would advise against using a Schottky diode between the battery and the LM317 circuit. Schottky diodes tend to have very high reverse leakage current, which would work against the low-self-discharge feature of your batteries. Use, instead, a regular silicon diode -- like a 1N4004. Using a diode here is a necessary evil: it not only makes the charger even more inefficient, it also slightly degrades the regulation, since the diode's forward voltage varies slightly with current. So adjust the voltage beyond the diode to compensate for the diode drop.

Why is the diode needed? Because, without it, the battery will, when not being charged, discharge through the R1, R3 path.

mbobinger:
I have also thought about going for this circuit:

from: Solar Charger Circuit using IC LM317 | Electronics Project

However, I don't see a reason for the Zener diode and the transistor that shunts away the current once the Zener diode breaks down. I think the R1 and R2 resistors of the LM317 can be selected to give a very steady output voltage.

I agree. The ZD, R2, T1 addition really isn't needed, since the LM317, if adjusted right, will already limit the charge level. I suppose, if the goal is to get as close to a 100% charge as possible, that addition makes sense, but expecting a 6.8V Zener to be precise enough, voltage-wise, to limit at the proper set-point is a crap-shoot, in my opinion.

jremington:
For the tiny solar panel linked in reply #3, adding a voltage regulator will be a waste of solar energy.

The XCell LSD NiMH battery is rated at 2100 mAh, so the safe trickle charge maximum rate is about 100 mA, which is the maximum current that the linked panel can provide.

It will take the panel alone, with a series diode, well over 21 hours (2100 mAh/100 mA) of full sun, or 2 to 3 summer days, to charge a dead cell at that rate.

Thanks a lot for that remark! For me, it would be fine if the solar panel can recharge a total of around 3 × 2100 mAh (1.2V NiMH batteries) per year. Very likely, if the power consumption and the sleep modes are designed well, the required charging will be much less.

ReverseEMF:
One caveat, though. Using a linear regulator, like the LM317 [especially the LM317!] is not only a cheap solution, it's a dirty one, as well.

Why? Because it's rather inefficient. A lot of power is lost in the regulator, because of something called drop out voltage. So, if that's a concern, then this can be minimized by matching the output voltage of the Solar Panel, as closely to the minimum required input voltage, as possible--but, there's really no way to make it as efficient as can be achieved with a Switch Mode solution.

BUT, if using a larger Solar Panel, than would be needed for an efficient solution, is not an issue, then what the heck, there's probably plenty of sun for what you need.

Dear ReverseEMF,

Thank you a lot for your helpful comments! Since the efficiency of the solar charger will play a role (I might also want to place the panels in a forest, with little sunlight but hopefully enough to sustain the required voltage), the dropout voltage of the LM317 is not ideal.

How does a switch mode solution work? Are there maybe any tutorials? Would it also be possible to use a voltage regulator with a lower dropout voltage to increase the efficiency?

Are there also switching regulators on the market, e.g.:

Best,
Marco

I think I have found my favourite DC-DC converter, a switching regulator with up to 93% efficiency:

However, it is only available in 3.3V or 5.0V versions. I could go with four 1.2V NiMH batteries in series, or use another DC-DC converter...

Best,
Marco

mbobinger:
I think I have found my favourite DC-DC converter, a switching regulator with up to 93% efficiency:
https://cdn-reichelt.de/documents/datenblatt/D400/LMO78_10_DS_EN.pdf

However, it is only available in 3.3V or 5.0V versions. I could go with four 1.2V NiMH batteries in series, or use another DC-DC converter...

Or, how about one of these:

mbobinger:
How does a switch mode solution work? Are there maybe any tutorials?

Very complicated, and not for the beginner [in fact, in many ways it's over my head too -- I'm more of an MCU/blinky-lights guy]. Better to find an off-the-shelf solution.

mbobinger:
Would it also be possible to use a voltage regulator with a lower dropout voltage to increase the efficiency?

Perhaps, but even a Low DropOut [LDO] regulator will not guarantee low dropout. LDO merely means as low as it can go. If the input voltage is well above the output voltage, the voltage dropped across the regulator will still be large.

For instance: let's say we use a 5V LDO regulator IC with a rated minimum dropout voltage of 300mV, but we put 12V on its input. The voltage actually dropped across the regulator will be 12V - 5V = 7V! So, what's the point of an LDO? -- you might ask. Here's another example that should clear that right up:

Suppose we want to run a device at 5V, using 4 AA cells. And we want to squeeze as much juice [thank you Ben] out of those cells as possible -- like still getting 5V out of our regulator even when the cells are down to 1.3V each... Well, let's do the math!

1.3V * 4 = 5.2V

So, if we use an LDO regulator with a 200mV minimum dropout voltage or better [like an LD2981CU50TR], then HA!-HA! It can happen!
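That headroom math, as a small sketch (the 200mV dropout figure is from the example above, not a measured value):

```python
# LDO headroom check: can a 5V LDO still regulate when
# four AA NiMH cells have sagged to 1.3V each?
cells = 4
cell_v = 1.3      # volts per cell, near end of discharge
v_out = 5.0       # regulator output, volts
dropout = 0.2     # rated minimum dropout, volts (LD2981-class part)

v_in = cells * cell_v        # 5.2V
headroom = v_in - v_out      # 0.2V
print(headroom >= dropout)   # True: just barely enough
```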

ReverseEMF:
Or, how about one of these:

Dear ReverseEMF,

Thank you a lot for your useful comments! The adjustable switching voltage regulator ICs you sent me are just perfect.

Probably my last question is: at what voltage should I recharge the NiMH batteries? I have purchased the following batteries:
https://www.reichelt.de/nimh-akku-aa-mignon-2100-mah-1er-pack-xc-x2100aa-p190936.html?&trstct=pos_0

In the datasheet, there are some plots with Chinese axis labels, but it is clear what is shown (page 7 of the PDF):

Fully charged, the batteries hold a voltage of around 1.4V. For lower temperatures, that voltage exceeds even 1.5V.

Looks to me like I can go for a charging voltage of 1.4V. What do you think?

The solar panel should be placed outdoors to recharge the NiMH batteries over the whole year and power an MCU that records sensor values. I am not sure whether the NiMH battery and the charger will work over a temperature window of -10 to 50°C.

Thank you!
Marco

mbobinger:
Probably my last question is: at what voltage should I recharge the NiMH batteries? I have purchased the following batteries:
XC X2100AA: NiMh Akku, AA (Mignon), 2100 mAh, 1er-Pack bei reichelt elektronik

In the datasheet, there are some plots with Chinese axis labels, but it is clear what is shown (page 7 of the PDF):
https://cdn-reichelt.de/documents/datenblatt/D500/XC_X2100AA_DB.pdf

Fully charged, the batteries hold a voltage of around 1.4V. For lower temperatures, that voltage exceeds even 1.5V.

Looks to me like I can go for a charging voltage of 1.4V. What do you think?

The solar panel should be placed outdoors to recharge the NiMH batteries over the whole year and power an MCU that records sensor values. I am not sure whether the NiMH battery and the charger will work over a temperature window of -10 to 50°C.

OK... I didn't know about the wide operating-temperature stipulation. Like I said before, this is a solution for charging at, or near, room temperature. I've never dealt with temperatures outside roughly 55°F to 104°F [~13°C to ~40°C], so I can't guarantee this is a viable solution for such extreme temps.
Also, at temps below 0°C and above 40°C you should only do a trickle charge [i.e. C/10: e.g. if the battery capacity is 220mAh, then the C/10 charge rate would be 22mA]. So you might want to consider a constant-voltage charger with current limited to a trickle charge -- that way, it will also properly reduce the charge rate when the battery reaches the set voltage plateau, 'cuz it's not good to trickle charge a NiMH battery beyond its full-charge point.
And, reminder: as I said before, this solution will likely NOT fully charge the battery. It's a trade-off between the cost/complexity of the charger and battery economy.
But if you're still onboard, I've generally used ~1.42V per cell. Which, if you wind up using a series diode to prevent battery discharge, is pretty hard to hit -- so yeah, let's call it 1.4V. The chargers I design typically have some sort of switching circuit -- e.g. an optoisolator "solid-state relay" with two MOSFETs on the output, or an N-Ch low-gate-threshold MOSFET that relies on the regulator dropout voltage to turn it on.
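The C/10 arithmetic is trivial, but worth a sanity check against the 2100mAh cells discussed earlier; a minimal sketch:

```python
# C/10 trickle-charge rate from capacity (per the advice above:
# trickle only, below 0 degC or above 40 degC).
def trickle_ma(capacity_mah):
    return capacity_mah / 10

print(trickle_ma(220))   # 22.0 mA, the example above
print(trickle_ma(2100))  # 210.0 mA for the 2100 mAh AA cells
```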

For Instance:

D1 is for reverse-voltage protection, but the RTR030N05TL is vulnerable to voltages greater than -12V, so there's that. D1 also provides a bit more MOSFET turn-on voltage, though with this MOSFET that's not really needed.

Be advised: this is an old design, and that MOSFET has since become "Not recommended for new designs". Mouser still has over 120 thousand of them, but choose accordingly :wink:

And, I just noticed: that note that says "Adjust R6..." should say R8. The note was composed in an even earlier schematic; since then, the part reference numbers were updated, and I forgot to update the note!