Efficiency question (running on batteries)

I am having a difficult time calculating the relative efficiency of two alternative solutions to a voltage control problem.

The power source for my board is a LiPo battery that will run at 3.5-3.7 volts for the majority of its discharge curve (at least my understanding is that this chemistry produces a very stable voltage until suddenly plummeting as it nears its discharged state). I am stepping this up to 5 volts with an off-the-shelf step-up PCB. I can't get at the datasheet at the moment, but let's assume it is 85% efficient (which seems to be in the middle of the range for low-power step-ups meant for logic circuits).

I also need to power a component that requires a nominal 3.3 volts, but is fine between 3 and 4.2 volts.

The entire board will shut down when the battery voltage drops to 3-3.3 volts (using a Philips voltage-trigger IC).

Version one is to use the raw battery output to power the component. This avoids regulation losses and should be sufficient to power the component until the battery is low enough to shut down the entire board (on average slightly above 3.0 volts). The downside is that, for an unrelated reason, this approach requires me to place a diode right before the 5 V step-up PCB. The voltage to the 3.3 volt component will be unaffected by the diode, but the step-up PCB will now be converting on average 3 volts to 5 volts rather than 3.6 volts to 5 volts. The load on the step-up PCB is likely to be in the range of 100 milliamps (and about 50 milliamps on the 3.3 volt component, but that is not getting stepped up here).

In version two, I get 3.3 volts by using a low-dropout (LDO) linear regulator to go from the 5 V output of the step-up down to 3.3 volts. The max load on the 3.3 volt component is 50 milliamps, so the waste should be roughly 1.7 V * 0.05 A = 0.085 W. This avoids the diode, and thus the step-up is converting 3.6 -> 5 rather than 3.0 -> 5 (with a ~150 milliamp load).

In the end I'll likely use a Schottky diode, so the voltage drop will be more like 0.3 V than 0.6 V, but I'd like to get my head around the classic case using a standard diode.

My confusion arises from not knowing how to calculate the losses in the step up. I know I'm not making things easy by not including specs, but let's assume a generic case. I just don't know how to work the quiescent current and switching losses into the equation.

It boils down to this: what's more efficient, (1) stepping up 3.6 volts to 5 volts for a ~150 milliamp load and using a linear regulator to drop 5 volts to 3.3 volts for a 40-50 milliamp load; or (2) stepping up 3.0 volts to 5.0 volts for a ~100 milliamp load?
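To make the comparison concrete, here's the back-of-envelope model I've been sketching in Python. The 85% efficiency and the 5 mA quiescent current are placeholder assumptions (I don't have the datasheet), and switching losses are simply lumped into the efficiency figure:

```python
# Rough battery-draw comparison of the two options. All figures are
# assumptions: 85% boost efficiency (switching losses lumped in) and
# a 5 mA quiescent current for the boost IC.

EFFICIENCY = 0.85
I_QUIESCENT = 0.005  # amps (assumed)

def boost_input_power(v_in, i_load, v_out=5.0):
    """Power drawn at the boost converter's input pins for a given load."""
    return v_out * i_load / EFFICIENCY + v_in * I_QUIESCENT

# Option 1: no diode; boost 3.6 V -> 5 V at ~150 mA. The LDO's 50 mA load
# is part of that 150 mA, so its 0.085 W of heat is already included.
p1 = boost_input_power(3.6, 0.150)

# Option 2: the diode drops the boost input to ~3.0 V; boost 3.0 V -> 5 V
# at ~100 mA, and the 3.3 V component (50 mA) runs straight off the battery.
p2_boost = boost_input_power(3.0, 0.100)
i_diode = p2_boost / 3.0  # input current flowing through the diode
p2 = p2_boost + 0.6 * i_diode + 3.6 * 0.050

print(f"Option 1: {p1:.3f} W from the battery")  # ~0.900 W
print(f"Option 2: {p2:.3f} W from the battery")  # ~0.904 W
```

With these placeholder numbers the two options come out nearly identical, which suggests the real answer hinges on how the converter's efficiency actually varies with input voltage (it usually falls as the input drops) and on the diode choice.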

Thanks!

Why do you need +5V?

I can't really answer your question, as there are several variables, and I would want real circuit measurements made before saying which idea is best. The real reason I'm posting is your statement:

(at least my understanding is that this chemistry produces a very stable voltage until suddenly plummeting as it nears its discharged state).

While that statement does apply to NiCd and NiMH batteries, LiPo batteries have a pretty linear discharge curve. They start off at 4.2V fully charged and discharge in a pretty straight line down to the cut-off value, below which damage will result. 3.0V is a common stopping point, but some conservative people stop at 3.5V to be easier on the cell. So a simple measurement of the cell voltage under load can be used to determine the % of charge left.
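In code terms, a crude charge estimate could look like the sketch below. It assumes the roughly linear curve described above with 4.2 V and 3.0 V endpoints; the real curve depends on the particular cell, the temperature, and the load, so treat it as an approximation:

```python
# Crude state-of-charge estimate from cell voltage, assuming a roughly
# linear discharge curve. The 4.2 V / 3.0 V endpoints are common figures,
# but the real curve depends on the cell and the load.

V_FULL = 4.2   # volts, fully charged
V_EMPTY = 3.0  # volts, common cut-off to avoid damaging the cell

def charge_percent(v_cell):
    """Map a cell voltage measured under load to an approximate % of charge."""
    fraction = (v_cell - V_EMPTY) / (V_FULL - V_EMPTY)
    return max(0.0, min(1.0, fraction)) * 100.0

print(charge_percent(3.6))  # ~50% under this linear approximation
```

For anything serious you'd want to calibrate against your actual cell under your actual load.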

Lefty

I don't know if this might help or not, but I've been messing with a sample of this chip . . .
http://datasheets.maxim-ic.com/en/ds/MAX1759.pdf
It's a buck/boost chip that only requires 3 caps - 100mA output, 3.3V or adjustable for 5V. Other than its small package (µMAX) it has some good potential in LiPo circuits.

Pololu has a nifty little device that does just that:
http://www.pololu.com/catalog/product/798 ($5.45).
"This extremely compact boost regulator generates 5 V from voltages as low as 0.8 V and delivers up to 200 mA, making it perfect for powering small 5V electronics projects from 1 to 3 NiMH, NiCd, or alkaline cells or from a single lithium-ion cell."

Crossroads, that's a nice little boost regulator at a great price.

However, the MAX1759 chip I mentioned is a little different.
It boosts the voltage when it is below the set voltage and it bucks (lowers) the voltage when it is above the set voltage.
When it bucks the voltage, it is generally more efficient than a typical linear regulator (I think the datasheet said 90%).
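A quick way to see the difference: a linear regulator's best-case efficiency is Vout/Vin, while a buck stage stays roughly flat across input voltage. The ~90% buck figure below is just the datasheet number I recall, so treat it as an assumption:

```python
# Best-case linear regulator efficiency is Vout/Vin; a buck converter's
# is roughly flat across input voltage (~90% assumed here).

V_OUT = 3.3
for v_in in (4.2, 3.9, 3.6):
    linear = V_OUT / v_in * 100
    print(f"Vin = {v_in} V: linear ~{linear:.0f}% vs buck ~90%")
```

So the buck wins near full charge, and the two converge as the cell discharges toward Vout.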

I don't see much need for it in 5V circuits, but for 3.3V running on LiPos, it may be useful.

When fully charged a LiPo is around 4.1V, so you need to lower it to 3.3V. When it's under 3.3V you can either switch off your circuit (as I think the OP is doing) or boost it back up to 3.3V (see the datasheet linked above for a better description).

Thank you all for the helpful insight!

Although my initial question assumed I already had a step-up circuit, it turns out that the PCB I have is limited to 100 mA, and the loose ICs I have (MC34063A) are pretty inefficient at the voltages involved (single LiPo cell to 5V). So the voltage regulation component is in play as well. Although a buck/boost topology is probably the best fit, I haven't had any luck finding buck/boost ICs in DIP packages on Digikey. I'd rather avoid surface mounting for the time being (prototype stage, frequent minor redesigns).

Any thoughts on Linear Technology's LT1110? It comes in adjustable and fixed (5V or 12V) versions and is available in a DIP package. The fixed versions appear pretty easy to implement, requiring only 3 external components. I've ordered some samples and will investigate, but if anyone has experience with it, please share your thoughts. The topology is described as a gated-oscillator switcher.

Important tangent aside, my initial problem had to do with powering both 3.3V and 5V components in my system in the most energy-efficient manner possible. I need to run at least one of my ATmega328s at 5V because it is doing quite a bit of resource-intensive floating point math and thus needs the full 16 MHz. At 16 MHz, the computations for each system cycle take 30-60 milliseconds (usually ~30 ms, but a bit over 50 ms in the mode where time is most critical). Add inter-chip communications (another ATmega handles the LCD and user interaction, and a third handles SD logging) and GPS parsing, and we arrive just short of 150 ms. GPS fixes arrive every 200 ms, so I can't really afford to give up any processing power.

The 3.3V is for the GPS chip. It has a pretty tolerant input of 3-4.2V, but is not 5V tolerant.

My two options are:
(1) Step up the LiPo output (3-4.2V) to 5V, then use a low-dropout (LDO) linear regulator to get 3.3V for the GPS (max 50mA load, average ~40mA). At max load, this wastes 0.085 W as heat.

(2) Run the GPS directly off the battery (the circuit shuts down at 3V to protect the battery, so the GPS will run until shutdown), and use the stepped-up 5V only for the 5V components. Unrelated design considerations demand that in this case I place a diode between the GPS junction and the step-up converter, dropping the step-up's input by 0.6V (assuming I don't use a Schottky). The average load on the step-up's output will be approximately 100mA, but will rise to 350mA from time to time to power an LCD backlight (possibly for several minutes at a time).

So the question boils down to whether the extra loss from boosting from an input that is 0.6V lower (plus the diode's own dissipation) at a 100-350 mA output load exceeds the 0.085 W wasted in the linear regulator. I am assuming the step-up is 85% efficient.
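Plugging the updated loads into the same sort of estimate (assumptions: 85% efficiency flat across input voltage, a 3.6 V nominal cell, a 0.6 V diode drop, ~40 mA average for the GPS, and quiescent current ignored since it is a few milliwatts either way):

```python
# Back-of-envelope battery draw for the two options with the updated loads.
# Assumptions: 85% boost efficiency (flat across input voltage), 3.6 V
# nominal cell, 0.6 V diode drop, GPS at ~40 mA, quiescent current ignored.

EFF = 0.85
I_GPS = 0.040  # amps, average GPS draw

def boost_in(i_load, v_out=5.0):
    """Power drawn at the boost converter's input for a given output load."""
    return v_out * i_load / EFF

for i_5v, label in ((0.100, "average"), (0.350, "backlight on")):
    # Option 1: the boost carries the 5 V load plus the GPS through the LDO
    # (the LDO passes its output current straight through from the 5 V rail).
    p1 = boost_in(i_5v + I_GPS)
    # Option 2: the diode drops the boost's input from 3.6 V to 3.0 V, so the
    # battery delivers the boost's input power scaled by 3.6/3.0, and the
    # GPS runs directly off the battery.
    p2 = boost_in(i_5v) * (3.6 / 3.0) + 3.6 * I_GPS
    print(f"{label}: option 1 {p1:.2f} W, option 2 {p2:.2f} W")
```

By this estimate option 1 edges out option 2 at the average load and wins more clearly with the backlight on; with a Schottky (0.3 V drop) the average-load case actually flips in favor of option 2, so the backlight duty cycle probably decides it.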

Whichever is likely to have the lower waste will win.

[I think I've seen buck/boost regulators with dual outputs that can be set to output 5V and 3.3V at separate pins...but these were tiny surface mount components. Perhaps at a subsequent prototype stage when I transition to manufactured PCBs and outsourced assembly]

Thanks again for your responses. You've already helped to make the design more robust.

You can get rid of the diode drop using this