Charging a battery

I feel like I am missing something.

I have a battery which is rated at 4.1 V and, according to the manufacturer, can be charged from a constant-voltage source:
http://www.infinitepowersolutions.com/products.html

I am using the Adafruit MCP4725 breakout board (12-bit DAC with I2C interface) and outputting 4.05 volts. I am connecting the Vout of the DAC (+4.05 V) to the plus terminal of the battery (initially +3.8 V) and the minus side of the battery to ground.

I have successfully discharged the battery.

Any idea what I am doing wrong here?

(If somebody just has a vague idea or a hint at what I could look into, please don't hesitate to post. Ideally I need to figure this out in the next two hours, so any idea is really appreciated.)

thanks

p.

Your DAC is not a battery charger. You may have damaged it so it now acts like a "load" rather than a DAC.

Why do you think this might have damaged the DAC? In what way? It seems to be working fine (i.e. it's outputting the voltages I tell it to, etc.)

(I am asking not to be defensive, but to understand what's going on.)

But hooking up plus to plus and minus to minus is correct, right? (Bleh, I am beginning to doubt everything. I think I just killed the battery somehow as well...)

+ to + and - to - is correct for battery charging.

The DAC in question can only output a maximum of 25 mA. If there was a difference in voltage between your DAC output and the battery terminals before connection, the DAC output stage may have been damaged by the excess current.

Constant voltage charging makes no allowance for load current so you must have some means of regulating the charge current to a "safe" value. A simple method of doing this is to insert a series resistor which will limit current. In your example

Output voltage = 4.05
Battery voltage = 3.8

Voltage difference = 0.25 volts
Maximum permitted current = 0.025 A

Limiting resistor = 0.25 / 0.025 = 10 ohms

So a value somewhere between 10 and 22 ohms should be used to prevent DAC damage.
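The resistor sizing above can be sketched as a quick calculation (values taken from this thread; the 25 mA limit is the DAC's rated maximum output current):

```python
# Series-resistor sizing: limit the initial charge current to what the
# DAC output stage can safely source.
v_dac = 4.05    # DAC output voltage (V)
v_batt = 3.8    # initial battery voltage (V)
i_max = 0.025   # maximum DAC output current (A)

r_min = (v_dac - v_batt) / i_max  # smallest safe series resistor
print(f"Minimum series resistor: {r_min:.0f} ohms")  # -> 10 ohms
```

Picking the next standard value or two above this minimum (e.g. 10 to 22 ohms) keeps the worst-case current under the limit with some margin.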

Once the battery voltage rises, the current reduces so the voltage dropped across the resistor reduces and eventually the battery voltage matches the DAC output.

HOWEVER, if your DAC output is set to less than the battery voltage, current will flow from the battery into the DAC output stage and may well damage it. This can be prevented by inserting a series diode, but your calculations then have to account for the voltage dropped across the diode.

As I said at the start, a DAC is not designed to be used as a battery charger.

cool.

Thanks for the explanation.

Given the largest of these thin-film batteries is rated at 2.2 mAh and has an internal resistance of about 20 ohms, the maximum currents that can flow are about 200 mA (90 mA is the recommended maximum).
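That ~200 mA figure follows from Ohm's law across the cell's internal resistance; a rough worst-case check (using the charge voltage and internal resistance quoted above):

```python
# Worst-case current into the thin-film cell: full charge voltage
# across roughly 20 ohms of internal resistance.
v_charge = 4.1      # charge voltage (V)
r_internal = 20.0   # approximate internal resistance (ohms)

i_worst = v_charge / r_internal
print(f"Worst-case current: {i_worst * 1000:.0f} mA")  # -> 205 mA
```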

I'd suggest an LM317 or other adjustable voltage regulator: just add a couple of external resistors to set the output to 4.1 V, plus a series resistor to limit the current (30 to 100 ohms, perhaps). If using one of the smaller batteries, a larger series limiting resistor would be appropriate.
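The LM317's resistor pair can be chosen with its standard output equation, Vout ≈ 1.25 × (1 + R2/R1), ignoring the small adjust-pin current. A sketch (R1 = 240 ohms is a common choice from the datasheet's example circuits, not something specified in this thread):

```python
# LM317 feedback resistor choice for a 4.1 V output.
# Vout ~= Vref * (1 + R2/R1), with Vref = 1.25 V; Iadj term neglected.
v_out = 4.1    # desired output voltage (V)
v_ref = 1.25   # LM317 reference voltage (V)
r1 = 240.0     # resistor from output to ADJ pin (ohms); assumed value

r2 = r1 * (v_out / v_ref - 1)  # resistor from ADJ pin to ground
print(f"R2 = {r2:.0f} ohms")  # -> 547 ohms
```

In practice you would pick the nearest standard value (or use a trimmer) and accept the small resulting error in the output voltage.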