Driving a high-power (10W) LED with constant current

Hi all. I am trying to build an LED driver that can (1) regulate maximum current and (2) be dimmable.
The LED chip has RGB color LEDs in series; the forward voltage is 6-7V for red and 9-11V for blue, and the maximum current is 350mA.
Based on the YouTube video below, I designed this circuit to limit the current to a maximum of 300mA.

I made a test circuit on a breadboard, and there are two problems.

(1) This circuit always puts 12V across the LED, which drives 670mA through each LED chip.
(2) The voltage regulator and the 1Ω resistor get really hot within a few seconds of VCC being switched on.

So I want to know what's wrong with my schematic, and hopefully how to improve it.
Thank you all!

[Links for reference]
(1) The original YouTube video I watched

(2) The LED chip

Why not just get a constant-current driver from AliExpress? $5 and it's done. Small and efficient.
I use Meanwell drivers; the LDD series are good. All have worked well.

Edit: On your circuit, remove the GND below R3. That is what is keeping it active.

No sh...
At 300mA, your 1Ω resistor will dissipate less than 100mW, which should be OK. But due to the wiring error (good catch, @SurferTim!), you're running 650mA through it, and that means it'll dissipate roughly 450mW.

As for the regulator: it drops 7V, so if you draw 75mA from the 5V rail, it'll burn about 0.5W, which is enough to make it get pretty darn hot.
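The figures above follow directly from P = I²R for the sense resistor and P = ΔV·I for the linear regulator. A minimal sketch, using the current and voltage values quoted in this thread (the function names are mine):

```python
def resistor_power(current_a, resistance_ohm):
    """Heat dissipated in a series resistor: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

def regulator_power(v_in, v_out, load_current_a):
    """A linear regulator burns the headroom as heat: P = (Vin - Vout) * Iload."""
    return (v_in - v_out) * load_current_a

print(resistor_power(0.300, 1.0))         # intended operating point: ~0.09 W, fine
print(resistor_power(0.650, 1.0))         # with the wiring error: ~0.42 W, runs hot
print(regulator_power(12.0, 5.0, 0.075))  # 7 V drop at 75 mA: ~0.53 W in the regulator
```

This is why the 1Ω resistor only became a problem once the wiring error pushed the current past its design point: dissipation grows with the square of the current.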

Btw, you don't really need U2.1. You could feed the output from U2.2 into the gate of Q1 through a resistor (e.g. 1k) and apply your PWM (through a series diode) to the gate of Q1 as well. The IRF520 is furthermore an unfortunate choice for Q1, although as a TO-220 part it's nice and beefy, so you can drop some watts into it. I assume/hope you have a heatsink on there, as its purpose is to burn off the power the LED doesn't use.

A much less wasteful approach here would be to do as @SurferTim says and get something like a PicoBuck module (there are plenty of variants on the theme). Running high-power LEDs with a shunt current limiter is kind of silly: it wastes a lot of power, requires large parts and heatsinks, and offers very little advantage.

Look at Q1: both the source and drain are grounded. That will not hurt the MOSFET, but nothing will get past it. You need to correct the circuit. @SurferTim has a great idea unless this is for a commercial application, but he may still be right.

By disconnecting R3 from GND, it kind of worked. The voltage regulator still gets pretty hot, so I decided to buy a stable, ready-made RGB driver for this project. I tried to make one with existing parts, but I guess this method is not a safe way. Thanks!

Yes, the power supply will still get hot, but not as hot. That is the downside of a linear (resistive) type of current control.
PWM is much more efficient.

It is, but you need to run the numbers and decide on a suitable topology. I've made plenty of LED drivers with simple parts, and for a 10W LED the best approach is really a buck-mode LED driver. A linear approach is not a great idea; I've done that too, but it's inherently compromised in a couple of ways, especially when you're working with currents above 100mA.
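Running the numbers for one channel shows why buck wins at this power level. A back-of-envelope sketch with my own assumed values (12V supply, a ~6.5V red channel at 350mA, and a typical ~90% buck converter efficiency), not figures from the thread:

```python
# Heat generated by a linear limiter vs. a buck converter for one LED channel.
# Assumed values (mine): 12 V rail, 6.5 V forward voltage, 350 mA, 90% buck efficiency.
V_SUPPLY, V_LED, I_LED = 12.0, 6.5, 0.350
BUCK_EFF = 0.90

p_led = V_LED * I_LED                        # power actually delivered to the LED, ~2.3 W
p_linear_loss = (V_SUPPLY - V_LED) * I_LED   # linear: all headroom burned as heat, ~1.9 W
p_buck_loss = p_led / BUCK_EFF - p_led       # buck: only converter losses, ~0.25 W

print(p_led, p_linear_loss, p_buck_loss)
```

With a linear limiter, the driver dissipates almost as much power as the LED emits; the buck converter cuts that waste by roughly a factor of eight under these assumptions.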

Just FYI: a resistive regulator can be fairly efficient if you are using multiple LEDs in series where the sum of the forward voltages is a little bit less than Vsupply - 3V.
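That point can be made quantitative: a linear limiter's best-case efficiency is just the ratio of the total LED forward voltage to the supply voltage, since everything above the LED string is dropped as heat. A small illustration with example voltages of my own choosing:

```python
# Efficiency of a linear (resistive) current limiter:
# everything above the LED string voltage is burned as heat,
# so efficiency = sum(Vf) / Vsupply.

def linear_efficiency(forward_voltages, v_supply):
    v_led = sum(forward_voltages)
    if v_led >= v_supply:
        raise ValueError("not enough headroom left to regulate the current")
    return v_led / v_supply

# Three ~3.2 V LEDs in series on a 12 V rail: ~80% efficient.
print(linear_efficiency([3.2, 3.2, 3.2], 12.0))
# A single 3.2 V LED on the same rail: only ~27%.
print(linear_efficiency([3.2], 12.0))
```

So stacking LEDs in series until the string voltage sits just below the supply (minus a few volts of regulation headroom) is exactly what makes the resistive approach tolerable.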

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.