RGB LED driver circuit - works but resistor gets hot

It's been a good few years since I've designed and built a circuit. Previously I've been using my Arduino to control a hacked RGB controller PCB that I took from an existing kit.

I've now replaced the hacked board with a circuit that I've come up with by copying the components on the board and tracing the circuit, and here's what I have...

But R2 is getting hot. The only part that's different from the PCB is Q2, for which I've used a BD912. Q1 is a 2N3904.

I'm totally lost at this point :S

When Q1 is conducting, there are two paths for current to go through R2. One is through R3. The other is through the base-emitter junction of Q2, which has only R2 to limit the current. I think you might need to add a resistor on the base of Q2 - see the rough numbers below.

Pete
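
To put rough numbers on Pete's point, here's a sketch of the current split. The actual R2, R3 and supply values aren't quoted in the thread, so 100Ω, 1k and 5V are assumptions, as are the typical junction drops:

```cpp
// Rough current split when Q1 is saturated.
// Assumed values (not from the original schematic): R2 = 100R, R3 = 1k,
// Vin = 5V, Q2 emitter-base drop ~0.8V, Q1 Vce(sat) ~0.2V.
#include <cstdio>

int main() {
    const double vin = 5.0, veb = 0.8, vce_sat = 0.2;
    const double r2 = 100.0, r3 = 1000.0;

    double i_r2 = (vin - veb - vce_sat) / r2;  // total current through R2, ~40mA
    double i_r3 = veb / r3;                    // R3 parallels Q2's E-B junction, ~0.8mA
    double i_b  = i_r2 - i_r3;                 // the rest is Q2 base current

    printf("I(R2) = %.1f mA\n", i_r2 * 1000);
    printf("I(R3) = %.1f mA\n", i_r3 * 1000);
    printf("I(Q2 base) = %.1f mA\n", i_b * 1000);
    return 0;
}
```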

R2 dissipates about 0.2W in that circuit, so if it is rated 1/4W or greater then it should be OK even though it gets hot. However, depending on the amount of current taken by the LED module, you may be able to use a slightly higher value, which will run cooler. How much current does the LED module take?
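
To illustrate the arithmetic (assuming R2 is around 100Ω, which matches the quoted 0.2W figure - the actual value isn't given in the thread), here's how the dissipation falls as R2's value rises:

```cpp
// R2 dissipation for a few candidate values, assuming a 5V supply.
// The ~1V total lost across Q2's E-B junction and Q1 is a typical figure.
#include <cstdio>

int main() {
    const double vin = 5.0, veb = 0.8, vce_sat = 0.2;
    const double v_r2 = vin - veb - vce_sat;        // ~4V across R2
    const double values[] = {100.0, 220.0, 470.0};  // assumed candidates

    for (double r2 : values) {
        printf("R2 = %3.0f ohm -> P = %.2f W\n", r2, v_r2 * v_r2 / r2);
    }
    return 0;
}
```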

Another possibility is to replace the two-transistor combination with a single P-channel logic-level mosfet and a couple of resistors.

Do you need R2 to be so low? You're putting gob loads of current into the base of that transistor - do you need that much?

Agreed. If you're unable to replace R2 with a higher value, I'd get another resistor and put it at the base of Q2.

I copied the circuit and R values from a PCB. I've since tried it with different values: at 1k the LEDs are too dim, and at 500Ω they're still a little dimmer than they should be, but R2 doesn't get hot. Although the transistor then gets hot instead :/

I think I'm gonna swap it for a FET.

The LEDs are on a strip; it's 666mA per channel per 5m. But I have a 2A PSU and I want to allow as much LED strip as possible to be connected. So the actual current draw will vary depending on how much strip is connected and what PWM duty cycle is being used.
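
As a rough budget check against the PSU (worst case, all three channels fully on with no PWM - a sketch using the figures above):

```cpp
// Worst-case supply budget: all three channels fully on.
#include <cstdio>

int main() {
    const double i_per_channel = 0.666;  // A per channel per 5m, from the strip spec
    const double psu_limit = 2.0;        // A

    double i_per_5m = 3.0 * i_per_channel;       // R+G+B all on: ~2A per 5m
    double metres = 5.0 * psu_limit / i_per_5m;  // ~5m of strip at full white
    printf("Full white draws %.2f A per 5m -> about %.1f m max\n",
           i_per_5m, metres);
    return 0;
}
```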

The BD912 only has a gain of 40 at 0.5A, and this drops down to a gain of 15 at 5A, so that is your problem.

For a 666mA load you need a base current of at least 17mA, so 500Ω should be enough provided your Vin is 12V - what is it? Remember there is a 1A limit on the Vin current due to the series diode on the Arduino's power input.
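
Spelling that arithmetic out (the ~0.8V emitter-base drop and ~0.2V Q1 saturation voltage are assumed typical values):

```cpp
// Base drive check for the BD912 switching 666mA from a 12V Vin.
#include <cstdio>

int main() {
    const double i_load = 0.666;  // A, one channel of 5m strip
    const double hfe = 40.0;      // BD912 gain around 0.5A
    const double vin = 12.0, veb = 0.8, vce_sat = 0.2, r2 = 500.0;

    double ib_needed = i_load / hfe;               // ~17mA to saturate Q2
    double ib_avail = (vin - veb - vce_sat) / r2;  // ~22mA through 500 ohm

    printf("Base current needed: %.1f mA\n", ib_needed * 1000);
    printf("Base current available: %.1f mA\n", ib_avail * 1000);
    return 0;
}
```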

When I said the power dissipation in that resistor was 0.2W, I assumed that you were using a +5V supply. However, I now see that the voltage you are switching is connected to Vin of the Arduino, so I guess it is higher than 5V, and the power dissipation in the resistor will be higher too.
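
The difference matters because the dissipation scales with the square of the voltage across R2 (again assuming R2 ≈ 100Ω, which matched the 0.2W figure at 5V):

```cpp
// R2 dissipation at 5V vs 12V, same assumed R2 = 100 ohm.
#include <cstdio>

int main() {
    const double veb = 0.8, vce_sat = 0.2, r2 = 100.0;
    const double supplies[] = {5.0, 12.0};

    for (double vin : supplies) {
        double v = vin - veb - vce_sat;  // voltage across R2
        printf("Vin = %4.1f V -> P(R2) = %.2f W\n", vin, v * v / r2);
    }
    return 0;
}
```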

Yes, a mosfet is a better solution than a bipolar transistor for switching high currents. I suggest connecting it like this:

  • Single 1K resistor from Q1 collector to +ve supply, instead of the 2 resistors you have now
  • Mosfet gate also connected to Q1 collector
  • Mosfet source connected to +ve supply
  • Mosfet drain connected to LEDs

This is OK for supply voltages up to 20V (which is the maximum Vgs rating for typical mosfets). If the supply voltage is 10V or more, you can use a mosfet specified for 10V Vgs (e.g. IRF9540) instead of a more expensive logic-level mosfet.
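
The Arduino side doesn't change with this wiring: driving Q1's base high pulls the gate low and turns the LEDs on, so ordinary PWM works. A minimal one-channel sketch (the pin number is an assumption, not from the original setup):

```cpp
// One channel of the Q1 + P-channel mosfet driver, faded with PWM.
// Pin 9 is an assumed choice - any PWM-capable pin driving Q1's base
// resistor will do.
const int CHANNEL_PIN = 9;

void setup() {
  pinMode(CHANNEL_PIN, OUTPUT);
}

void loop() {
  for (int duty = 0; duty <= 255; duty++) {  // fade up
    analogWrite(CHANNEL_PIN, duty);          // higher duty = Q1 on longer = brighter
    delay(5);
  }
  for (int duty = 255; duty >= 0; duty--) {  // fade down
    analogWrite(CHANNEL_PIN, duty);
    delay(5);
  }
}
```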