Transistor and Battery Life

I'm trying to understand what effect the base current of a transistor has on battery life, and I'm hoping someone can shed some light on this for me. Specifically, I'm wondering whether a transistor with a base current of 100mA will cause a corresponding drain on the battery that could lead to poor battery life.

As an example I'm looking at using this transistor to drive a 1w or 3w high power LED:
http://www.fairchildsemi.com/ds/BD/BD681.pdf

The things I'm most concerned about are power dissipation and battery consumption (brightness will be controlled by a potentiometer, so I'll just use the microcontroller to switch the transistor on/off).

The Arduino can sink/source at most 40mA per pin. If you just want to switch on/off, go for a FET, as its gate draws essentially no current except while switching. However, if you use a potentiometer for brightness regulation, you will have lots of unwanted power dissipation. Google for pulse width modulation to learn how to avoid it. You can also find some PWM examples on my website http://www.blinkenlight.net; just search for the "knight rider" examples.
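For example, here is a minimal sketch of the PWM approach, assuming the transistor base is driven from pin 9 through a base resistor and the pot wiper goes to A0 (both pin choices are just assumptions). The pot then only sets the brightness, while the actual power is switched by PWM instead of being burned in the pot:

const int LED_PIN = 9;   // PWM-capable pin driving the transistor base via a resistor
const int POT_PIN = A0;  // pot used only as a brightness input, not in the LED path

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // Map the pot reading (0..1023) to a PWM duty cycle (0..255).
  // The pot only feeds the ADC input here, so it dissipates next to nothing.
  int brightness = analogRead(POT_PIN) / 4;
  analogWrite(LED_PIN, brightness);
}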

SilentDirge:
I'm trying to understand what effect the base current of a transistor has on battery life, and I'm hoping someone can shed some light on this for me. Specifically, I'm wondering whether a transistor with a base current of 100mA will cause a corresponding drain on the battery that could lead to poor battery life.

Whatever load your circuit creates will drain the battery faster, thus shortening battery life.
The tips Udo Klein posted are just great for your specific situation.
About the general part of your question, my suggestion is: when using bipolar transistors, calculate the base current you actually need to drive the load you plan for the transistor. That way you avoid wasting power.
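For example (all numbers here are placeholder assumptions, so check them against the datasheet of whatever transistor you actually use): with a Darlington like the BD681, assuming a minimum gain around 750, and a 1 A LED current,

Ib ≈ Ic / hFE(min) = 1 A / 750 ≈ 1.3 mA
drive it a few times harder so it saturates properly, say Ib ≈ 4 mA
Rb ≈ (Vpin − Vbe) / Ib ≈ (5 V − 2 V) / 4 mA ≈ 750 Ω, so a standard 680 Ω or 820 Ω resistor

A few milliamps is well within the 40 mA pin limit Udo mentioned and is negligible for the battery.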

Note that I'm just starting to learn about transistors, so please point out if I'm talking gibberish here! :*

SilentDirge:
I'm wondering whether a transistor with a base current of 100mA will cause a corresponding drain on the battery that could lead to poor battery life.

If it is a BJT, more current into the base means more current is allowed to flow from collector to emitter. That collector current is the one you should be concerned about, not the current turning on the transistor.

SilentDirge:
As an example I'm looking at using this transistor to drive a 1w or 3w high power LED:

Well a 3W LED draws around 1A. That's more concerning for battery life than 100mA.
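For a rough sense of scale (the 3.4 V forward voltage is only an assumption, so check the LED's datasheet): 3 W / 3.4 V ≈ 0.9 A of LED current. With a Darlington's gain in the hundreds, only a few milliamps of base current are needed to support that, so the base drive is a rounding error next to the LED current itself.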

All great suggestions, thanks much for the feedback.

Udo, PWM is a great suggestion. I did try it already, but I liked the potentiometer better because I had less flickering at low intensity/frequency (and used one less pin). I may have to go the PWM route after all, though, due to the heat issue you pointed out.

James: indeed, 3W at full power (which it won't be driven at) would kill a battery pack very quickly, which is why I'm trying to find a reasonable solution. I just didn't want to compound the issue further by choosing the wrong transistor. You're absolutely right, though; in the grand scheme of things I'm probably worrying about the wrong thing.

If you wire the LED in series with the emitter, the base current will flow to the LED and will not be wasted. (The emitter current is the sum of the base & collector current.)

However, your biggest power loss will be from the current-limiting series resistor. For maximum efficiency, you'd want to use some sort of constant-current switching regulator. (Sorry, I don't have a design or link handy.)

For maximum efficiency, you'd want to use some sort of constant-current switching regulator.

Like this one:
http://uk.farnell.com/diodes-inc/al3158fsg-7/led-drvr-chg-pmp-qfn3030-20/dp/1904035

If you get flicker using PWM, you are doing it wrong. If the cycle frequency is above 10 kHz, you will not see any flicker. 10 kHz is not at all hard to reach; you can even get above 100 kHz in software. See my experiments on how to do it:
Removing Flicker | Blinkenlight.
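If you prefer to stay with the hardware PWM, a quick way to get above the flicker threshold on an ATmega328-based board (an assumption; the register names differ on other boards) is to drop Timer1's prescaler to 1, which moves the PWM on pins 9 and 10 from roughly 490 Hz to about 31 kHz:

void setup() {
  pinMode(9, OUTPUT);                      // assumes the transistor base hangs off pin 9
  TCCR1B = (TCCR1B & 0b11111000) | 0b001;  // Timer1 prescaler = 1: ~490 Hz -> ~31 kHz on pins 9/10
  analogWrite(9, 64);                      // ~25% duty cycle; adjust for the brightness you want
}

void loop() {
  // nothing to do; the timer keeps generating the PWM on its own
}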