Apologies for using an Arduino forum to post a non-Arduino question... at least, no Arduinos are involved as of yet...
I want to drive a string of high-power LEDs. Each one is rated at 700 mA with a 3.25 V forward voltage. I figured a 10 V supply driving 3 chips in series, with suitable current limiting, would be acceptable: three forward drops come to 9.75 V, leaving only a tiny 0.25 V of excess to burn off, which with active cooling should be fine.
According to my calculations (er, ahem, a calculator I found on t'net), a 5 W 1 Ω resistor in series with the 10 V supply should adequately limit the current, and the circuit should draw 2100 mA. My small bench PSU is limited to 2 A, so that would be a good start before bringing in the big PSU for the final test.
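For anyone checking my sums, here's the basic Ohm's-law arithmetic as a quick Python sketch (my own, not the online calculator; the helper name and the single-string assumption are mine):

```python
# Series resistor for one string of three LEDs on a 10 V supply.
# Assumes a single string at 700 mA; the 2100 mA total would then
# come from three such strings in parallel, each with its own resistor.

def series_resistor(v_supply, v_forward, n_series, i_string):
    """Resistor needed to drop the excess supply voltage at the target current."""
    v_excess = v_supply - n_series * v_forward       # voltage left over for the resistor
    return v_excess / i_string, v_excess * i_string  # Ohm's law, plus power burned in R

r, p = series_resistor(v_supply=10.0, v_forward=3.25, n_series=3, i_string=0.7)
print(f"R = {r:.2f} ohm, dissipating {p:.2f} W")     # R = 0.36 ohm, dissipating 0.18 W
```

On those assumptions each string wants nearer 0.36 Ω than 1 Ω, and the 2100 mA only makes sense to me as three strings in parallel, so the calculator may have been working from a different topology.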
So, I wondered about using a 7810 linear regulator, except they're limited to 1 A, and even with one per string that 2 V drop is going to cause some warmth... Meanwhile, I found a circuit diagram for a 30 A 12 V PSU which uses a linear regulator to drop rectified 24 V to 12 V, with paralleled TIP2955 transistors to pass up to 30 A: http://electronics-diy.com/12v-power-supply.php
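To put numbers on that warmth, the dissipation in any linear regulator or pass transistor is just the voltage drop times the current. A minimal sketch with my own figures, assuming a 12 V input:

```python
def linear_dissipation(v_in, v_out, i_load):
    """Power burned in a linear regulator or pass transistor: P = (Vin - Vout) * I."""
    return (v_in - v_out) * i_load

print(linear_dissipation(12.0, 10.0, 0.7))  # 1.4 W: one 7810 per 700 mA string
print(linear_dissipation(12.0, 10.0, 2.1))  # 4.2 W: one pass element carrying all 2.1 A
```

Either way it's heatsink territory, hence my interest in spreading the load across a beefier pass transistor.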
So... I copied the circuit, substituting a 7810 for the 7812, and ditched all the capacitors, since I was feeding it nice smooth 12 V DC rather than rectified 24 VAC.
The voltage regulation works perfectly, but the circuit I built only passes about 1.4 A (max), and the TIP2955 gets chuffing hot, so I can only conclude it isn't being driven into saturation and the regulator is passing the bulk of the current. The regulator itself gets warm, but not hot.
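For context on why I expected the TIP2955 to take over: as I understand the usual 78xx-plus-PNP bypass topology, the transistor only conducts once the regulator's input current drops about one V_BE (roughly 0.65 V) across the base-emitter resistor. A rough sketch; the resistor value here is a placeholder, as I haven't re-checked the one in the linked schematic:

```python
# Hand-over point in a 78xx + PNP bypass circuit: the PNP starts to
# conduct once I_regulator * R_BE exceeds its base-emitter turn-on voltage.

V_BE_ON = 0.65   # typical silicon base-emitter turn-on voltage, in volts
R_BE = 0.33      # ohms -- PLACEHOLDER, not read off the linked schematic

i_handover = V_BE_ON / R_BE
print(f"PNP starts conducting above ~{i_handover:.1f} A through the regulator")  # ~2.0 A
```

If the real resistor is anywhere near that small, the hand-over current could sit above what my load ever draws, which would fit the regulator doing most of the work.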
So, what have I done wrong? Do I need more input volts to drive this circuit properly? Have I picked a bad transistor for a 10 V output?