I have a device that looks something like:
7.5V wall wart (actually gives 11V !) -> polyswitch 1.1A fuse -> Safety rectifier -> 1117v50 regulator -> depletion mode MOSFET -> Backflow prevention diode -> voltage sensor input -> 3.7V LiPo cell -> mc34063 based switching boost converter -> 5.35V supply for AVR microcontroller.
I tried separate parts of this on breadboard, then built a PCB in Eagle and sent the gerbers off to iteadstudio -- and, amazingly, it mostly works!
The voltage sensor is there so I can turn off input power when the LiPo is fully charged -- which I'm currently calling 4.1V (the cell is rated up to 4.2V).
The depletion-mode N-channel MOSFET has a fairly high Rds(on) of about 3 Ohms. That was actually intentional: it doubles as a current limiter for the charger. There's a heat sink on both the LDO and the MOSFET. The gate is held high by a pull-up resistor, and also by a digital pin driving it through a small resistor. When the cell reaches 4.1V, I pull the gate to ground through that small resistor and digital pin. This makes the board entirely battery powered, even if it's plugged in, so when the battery sags back down to 4.0V, I turn charging back on.
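The on/off-with-hysteresis behavior above can be sketched in a few lines. This is just a model of the logic, not my actual firmware -- the function name, thresholds as constants, and the boolean return are all my own framing; on the real AVR this would read the ADC and drive the gate pin.

```python
# Hypothetical sketch of the charge-gating hysteresis described above.
# On the hardware, "charging" True means the gate is released (pull-up
# holds the depletion-mode MOSFET on); False means the gate is pulled low.

CHARGE_OFF_V = 4.1  # cut input power at or above this cell voltage
CHARGE_ON_V = 4.0   # resume charging at or below this cell voltage

def update_charge_gate(cell_v, charging):
    """Return the new charging state given the measured cell voltage."""
    if charging and cell_v >= CHARGE_OFF_V:
        return False   # fully charged: board runs from the battery
    if not charging and cell_v <= CHARGE_ON_V:
        return True    # cell has sagged: turn charging back on
    return charging    # inside the hysteresis band: no change
```

The 100 mV band is what keeps the gate from chattering right at the cut-off.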
The problem I have now is that the charger works great while the LiPo is depleted, at about 3.0V -- it delivers about 600 mA, which is about ideal for the cell I'm using. However, as the cell voltage rises, the charge current falls off, and charging starts taking a long time. I'd like to push more current in, right up to the 4.1V cut-off.
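If I model the charge path as the 5.0V regulator output, minus the Schottky drop, pushed through the MOSFET's Rds(on) into the cell (my assumption of where the losses sit), the taper falls straight out of the arithmetic:

```python
# Rough model of the charge current, using assumed values from the build:
# 5.0 V LDO output, 0.36 V Schottky drop, ~3 ohm MOSFET Rds(on).
V_REG = 5.0
V_SCHOTTKY = 0.36
R_DSON = 3.0

def charge_current(v_cell):
    """Approximate charge current in amps at a given cell voltage."""
    return max(0.0, (V_REG - V_SCHOTTKY - v_cell) / R_DSON)

for v in (3.0, 3.7, 4.0):
    print(f"{v:.1f} V cell -> {charge_current(v) * 1000:.0f} mA")
```

That gives roughly 547 mA at 3.0V (close to the ~600 mA I measure) but only about 213 mA at 4.0V, which is why the last part of the charge crawls.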
There are at least two things preventing this:
1) The backflow diode (which prevents current from flowing back into the regulator when the board isn't plugged in) drops some voltage. I originally used a 1N4004 I had lying around, but at about 0.8V forward drop, that was a loser. I changed it to a Schottky with a 0.36V drop, which is better.
2) The depletion-mode MOSFET is great as a switch, because it keeps the circuit ON at boot -- there's no bootstrapping problem like I'd have with an enhancement-mode device. But the resistance, which nicely limits current when the LiPo is low, gets in the way too much when the LiPo is nearly full.
So, long story short, I have some possible re-designs under consideration:
a) Quick fix: use a 6V or maybe even 7V regulator. Keep everything the same. Declare success (-ish).
b) Lower losses: use some low-Rds(on) device to switch the charge current on/off. Perhaps a P-channel depletion-mode part. If I pull it down with a resistor, it may still stay "on" at power-up, assuming I can pull the gate high enough with a digital output pin to turn it off. However, with less loss here, I'll need some other way to limit current when the LiPo is low.
c) No regulator. Just push the wall wart into the battery. Again, I'll need some way of limiting current here, through the entire charge cycle.
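For option (a), the same back-of-envelope model (still my assumption: fixed Schottky drop and ~3 ohm Rds(on) doing all the limiting) shows what a higher regulator voltage buys -- and what it costs:

```python
# Compare charge current across the cycle for 5 V, 6 V, and 7 V regulators,
# keeping the assumed 0.36 V Schottky drop and ~3 ohm Rds(on).
V_SCHOTTKY = 0.36
R_DSON = 3.0

def charge_current(v_reg, v_cell):
    """Approximate charge current in amps for a given regulator and cell voltage."""
    return max(0.0, (v_reg - V_SCHOTTKY - v_cell) / R_DSON)

for v_reg in (5.0, 6.0, 7.0):
    lo = charge_current(v_reg, 3.0) * 1000
    hi = charge_current(v_reg, 4.0) * 1000
    print(f"{v_reg:.0f} V reg: {lo:.0f} mA at 3.0 V cell, {hi:.0f} mA at 4.0 V cell")
```

The catch: a 6V regulator gives about 880 mA into a 3.0V cell, already past the ~600 mA the cell likes, so the Rds(on) alone would no longer be a safe limiter at the bottom of the charge -- which is why (b) and (c) both come with a "need some way to limit current" caveat.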
Separately, I'm considering:
d) Move the MOSFET so it only sits between the regulator output and the battery, and the switching step-up regulator doesn't pull power through it. Use the MOSFET's body diode (or bypass it with a Schottky) to let the battery power the switcher when not plugged in. This solves the "being plugged in still uses charge/discharge cycles of the cell" problem, but does nothing for the slow charge towards full. It also adds a source of loss (the Schottky) between the cell and the step-up regulator that powers the AVR when not plugged in.
So, I'm looking for advice, comments, or anything else you'd like to contribute.
And, yes, I've considered delegating the entire shebang to a dedicated power control circuit, but where's the fun in that? :-)