Typically, high-power LEDs are driven by a special LED driver circuit... a constant-current switching supply, which means the power supply itself doesn't have to dissipate much power. You can build or buy a high-power LED driver, but if you build one for the first time and you don't know what you are doing, you might end up burning out some components (or your LEDs) and spending more money than if you'd just bought one (or three for an RGB LED).
Normally (with a regular low-power LED) there is a resistor that limits current... You can do the same thing with high-power LEDs, but I don't see any current-limiting resistors in your circuit... Either the transistors or the LEDs will fry... Or both.
As far as the transistors go, since they are used as switches you should only have to worry about the current rating. To be safe, I'd choose a transistor rated for at least 1 amp (maybe 1/2 amp minimum). I don't think there are any transistors rated for less than 12V, so I wouldn't worry about the voltage rating.
Do I need a NPN or PNP transistor?
Probably NPN, depending on how you design your circuit... But, your circuit won't work... The current needs to flow through the LED and transistor in series (through the collector & emitter of the transistor), and the "arrow" in the transistor & diode needs to point in the direction of current flow (positive to negative). You don't have a path to ground, so you've got no current flow...
With a high-power LED, the resistor will need to dissipate about the same power as the LED (with a 5V supply), so you'd have to use 2W resistors if you want some safety margin. If you are running off a battery, you probably don't want to be wasting half of the power heating up resistors and cutting battery life in half...
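To see why the resistor ends up wasting about as much power as the LED on a 5V supply, here's a rough Python check. The ~2.75V forward drop and 350mA are just example figures (like the ones used below), not your LED's datasheet values:

```python
# Rough comparison of LED power vs. resistor power on a 5V supply.
# Example figures only: ~2.75V forward drop, 350mA drive current.
supply_v = 5.0
led_v = 2.75
current_a = 0.350

led_power = led_v * current_a                    # power in the LED
resistor_power = (supply_v - led_v) * current_a  # power wasted in the resistor

print(f"LED power:      {led_power:.2f} W")      # ~0.96 W
print(f"Resistor power: {resistor_power:.2f} W") # ~0.79 W
```

Close enough to call "about the same"... and that resistor power is pure waste heat.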
Here's how you calculate the current limiting resistor for an LED -
1. The voltage gets divided between the resistor and the LED. (The same current flows through both when they are in series.)
If there is a transistor, there is also a small voltage drop across the transistor when it's on (~0.2V or so), but we are making some approximations anyway so we can ignore that for now. (There is a bigger voltage drop across the transistor when it's off, but who cares...)
2. So, let's say your battery has a nominal voltage of 7V (the actual voltage depends on the charge). For the RED LED, we'll say the voltage drop is 2.75V. (Since they give you a range, that means it varies, and if you have two "identical" LEDs, they may have different operating voltages at 350mA.) With a 7V supply and 2.75V across the LED, that leaves 4.25V across the resistor.
3. Now that we know the voltage across the resistor and the current through it, we can calculate the required resistance with [u]Ohm's Law[/u]. 4.25V / 0.350A = ~12 Ohms.
Note that you cannot use Ohm's Law directly with the LED because the LED is not "linear"... its resistance changes when the voltage and/or current changes. This is why you have to use something else to set the current... a resistor or a constant-current source to drive the LED.
4. We can also calculate power dissipated by the resistor. 4.25V x 0.35A = ~ 1.5 Watts.
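If you want to plug in your own numbers, steps 1-4 boil down to a few lines of Python. The 7V / 2.75V / 350mA figures are just the example values from above... swap in your own supply voltage and your LED's rated forward voltage and current:

```python
# Current-limiting resistor calculation for an LED (steps 1-4 above).
# Example values from the post: 7V supply, 2.75V LED drop, 350mA current.
supply_v = 7.0
led_v = 2.75
current_a = 0.350

resistor_v = supply_v - led_v        # voltage left across the resistor
resistance = resistor_v / current_a  # Ohm's Law: R = V / I
power_w = resistor_v * current_a     # power dissipated in the resistor

print(f"Resistor voltage: {resistor_v:.2f} V")    # 4.25 V
print(f"Resistance:       {resistance:.1f} Ohms") # ~12.1 Ohms
print(f"Resistor power:   {power_w:.2f} W")       # ~1.49 W
```

Then pick the next standard resistor value at or above the result (12 Ohms here), with a power rating comfortably above the calculated dissipation.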