I am having a difficult time calculating the relative efficiency of two alternative solutions to a voltage control problem.
The power source for my board is a LiPo battery that will sit at 3.5-3.7 volts for the majority of its discharge curve (at least, my understanding is that this chemistry holds a fairly stable voltage and then drops off sharply as it nears full discharge). I am stepping this up to 5 volts with an off-the-shelf boost board. I can't get to the datasheet at the moment, but let's assume 85% efficiency (which seems about mid-range for low-power step-ups meant for logic circuits).
I also need to power a component that requires a nominal 3.3 volts, but is fine between 3 and 4.2 volts.
The entire board will shut down when the battery voltage drops to 3-3.3 volts (using a Philips voltage-trigger part).
Version one uses the raw battery output to power the component. This avoids regulation losses and should keep the component running until the battery is low enough to shut down the whole board (on average slightly above 3.0 volts). The downside is that, for an unrelated reason, this approach requires a diode directly before the 5 V step-up board. The voltage to the 3.3 V component is unaffected by the diode, but the step-up is now converting on average 3.0 volts to 5 volts rather than 3.6 volts to 5 volts. The load on the step-up is likely around 100 milliamps (plus about 50 milliamps on the 3.3 V component, but that current doesn't pass through the step-up).
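To make the version-one numbers concrete, here's the back-of-envelope arithmetic I'm doing, assuming a flat 85% converter efficiency and a 0.6 V diode drop (both numbers are my guesses, not datasheet values), and ignoring quiescent current for now:

```python
# Version one: 3.3 V component runs straight off the battery,
# step-up sees the battery voltage minus one diode drop.
V_BATT = 3.6      # typical LiPo voltage over most of the curve (assumed)
V_DIODE = 0.6     # standard silicon diode drop (assumed; ~0.3 V for a Schottky)
V_OUT = 5.0       # step-up output
I_5V = 0.100      # load on the 5 V rail (A)
I_3V3 = 0.050     # load on the 3.3 V component (A)
EFF_BOOST = 0.85  # assumed boost efficiency, flat across load and input voltage

# Power delivered by the 5 V rail, and what the boost draws at its input
p_out_5v = V_OUT * I_5V                               # 0.50 W
p_in_boost = p_out_5v / EFF_BOOST                     # ~0.588 W at the boost input
i_in_boost = p_in_boost / (V_BATT - V_DIODE)          # ~0.196 A through the diode
p_diode = i_in_boost * V_DIODE                        # ~0.118 W lost in the diode

# The 3.3 V component draws straight from the battery, no conversion loss
p_3v3 = V_BATT * I_3V3                                # 0.18 W

p_battery_v1 = p_in_boost + p_diode + p_3v3
print(f"Version one battery drain: {p_battery_v1:.3f} W")
```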
In version two, I get 3.3 volts by using a low-dropout (LDO) linear regulator to drop the 5 V output of the step-up to 3.3 volts. The max load on the 3.3 V component is 50 milliamps, so the waste should be roughly 1.7 V * 0.05 A = 0.085 watts. This avoids the diode, so the step-up converts 3.6 volts to 5 volts rather than 3.0 to 5 (with a ~150 milliamp load).
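And the same sketch for version two, where the extra 50 mA now goes through the boost and then the LDO (again assuming a flat 85% efficiency and ignoring quiescent current):

```python
# Version two: no diode; boost carries ~150 mA at 5 V,
# LDO drops 5 V -> 3.3 V for the component.
V_BATT = 3.6
V_OUT = 5.0
V_LDO_OUT = 3.3
I_5V = 0.100      # original 5 V load (A)
I_3V3 = 0.050     # 3.3 V component load, now drawn through the LDO (A)
EFF_BOOST = 0.85  # assumed

# The LDO draws its full 50 mA from the 5 V rail, so the boost output power is
p_out_5v = V_OUT * (I_5V + I_3V3)            # 0.75 W
p_in_boost = p_out_5v / EFF_BOOST            # ~0.882 W from the battery (no diode here)
p_ldo_loss = (V_OUT - V_LDO_OUT) * I_3V3     # 1.7 V * 0.05 A = 0.085 W of LDO heat

p_battery_v2 = p_in_boost
print(f"Version two battery drain: {p_battery_v2:.3f} W "
      f"(of which {p_ldo_loss:.3f} W is LDO heat)")
```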
In the end I'll likely use a Schottky diode, so the drop will be closer to 0.3 volts than 0.6, but I'd like to get my head around the textbook case with a standard diode first.
My confusion comes from not knowing how to calculate the losses in the step-up. I know I'm not making things easy by omitting the specs, so let's assume a generic case. I just don't know how to work the quiescent current and switching losses into the equation.
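If it helps, the mental model I'm trying to validate looks something like the sketch below: the quiescent current shows up as a fixed extra draw at the input voltage on top of the load-dependent term, and the datasheet efficiency figure would (I think) already fold the switching losses into the efficiency number. Both numbers here are placeholders, not datasheet values:

```python
def boost_input_power(p_out, eff, v_in, i_q):
    """Rough boost model: load-dependent loss via the efficiency figure,
    plus a fixed quiescent draw at the input voltage."""
    return p_out / eff + v_in * i_q

# e.g. 0.5 W out at 85% efficiency, 3.0 V in, 1 mA quiescent current
print(boost_input_power(0.5, 0.85, 3.0, 0.001))  # ~0.591 W
```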
It boils down to this: which is more efficient, (1) stepping 3.6 volts up to 5 volts for a ~150 milliamp load and using a linear regulator to drop 5 volts to 3.3 volts for a 40-50 milliamp load, or (2) stepping 3.0 volts up to 5.0 volts for a ~100 milliamp load, with the 3.3 V component running straight off the battery?
Thanks!