Using an N-channel MOSFET with a High Power LED

Hi all,

I am working on a controller for a high power LED light (200W).

I've chosen to use an N-channel MOSFET, as in this diagram:

Browsing the internet, I've found people using optocouplers to isolate the two separate 'circuits'.
Is this necessary when working with these high voltages?

Best regards,

Philip

Whether to use an optocoupler really depends on the overall application. It is not always necessary.

The circuit you have drawn is not good. You need LED current control.

I would not consider 45V DC high voltage. Yes, precautions must be taken, but it's not as worrying as working with mains voltage.
Use a 150 ohm or greater resistor from the D9 pin to the gate of the MOSFET, and replace the 220 ohm resistor with a 10k ohm one.
And as Paulcet said, you need a current controller to drive the LED, if one is not already implemented.
You will be passing around 4.5 amps through the MOSFET so a heatsink should be used.

You will be passing around 4.5 amps through the MOSFET so a heatsink should be used.

How do you know that? In the absence of any current-limiting resistor, the LED will draw current that is limited only by the RDS(on) (0.2 ohms) of the FET; with a 45V supply, Ohm's law puts that on the order of 200A. It might draw that much, but not for very long.

And as Paulcet said, you need a current controller to drive the LED, if one is not already implemented.

After that line I wrote:

You will be passing around 4.5 amps through the MOSFET so a heatsink should be used.

Paulcet:
The circuit you have drawn is not good. You need LED current control.

I will supply the voltage using a lab power supply. As far as I know, the LED will draw whatever current it needs (I assume around 4 amps). The MOSFET serves as a switch, and with a PWM signal the LED can be dimmed (not by regulating the current, but with PWM).

Am I right?

Thanks for all your replies

No, that is not right. See: LEDs

An LED is not a current limiting device.

Could you give me an example of such a resistor value, or explain how to determine it? Google isn't much help.

Best regards,

Philip

ElectroPhilip:
As far as I know, the LED will draw whatever current it needs (I assume around 4 amps).

Am I right?

No!

If you do this, you're apt to burn out either a very expensive lab supply, or a very expensive high-power LED, or both.

LEDs have no way to limit the current going through them; they are, after all, diodes. You have to limit the current externally, either by putting a resistor in series with them or through some more sophisticated means, such as a current-limited switching regulator.

How to calculate the required series resistance:

Let Vsupply be the output voltage of your power supply, Vforward be the characteristic forward voltage drop of the LED stack, and Iforward be the maximum current you wish to run the LEDs at (or the maximum they're rated for, whichever is less).

Then Rseries = (Vsupply - Vforward) / Iforward

Use a resistor of AT LEAST that value, to avoid overdriving your LEDs.

Given this high-power LED stack is going to have several amps running through it, you're going to need to use a power resistor with sufficient wattage rating. The power dissipated by the resistor will be

P (in watts) = I^2 * R, where I is in amps and R is in ohms.
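
To see the arithmetic in one place, here is a minimal C++ sketch of the two formulas above. The numbers are only placeholder assumptions (the 45V supply and 4.5A figures mentioned earlier in the thread, plus an assumed 36V forward drop); substitute your own datasheet values:

    // Series-resistor sizing for an LED stack (placeholder values).
    #include <cstdio>

    int main() {
        double v_supply  = 45.0;  // supply output (V) -- example value
        double v_forward = 36.0;  // LED stack forward drop (V) -- check your datasheet
        double i_forward = 4.5;   // target LED current (A) -- at or below the rating

        double r_series   = (v_supply - v_forward) / i_forward;  // Rseries = (Vsupply - Vforward) / Iforward
        double p_resistor = i_forward * i_forward * r_series;    // P = I^2 * R

        printf("Rseries >= %.2f ohms, dissipating about %.1f W\n", r_series, p_resistor);
        return 0;
    }

With those example numbers it works out to a 2 ohm resistor burning roughly 40 W, which gives you a feel for why a power resistor becomes unwieldy at this scale.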

Hope this helps a bit...

Do not use a resistor with a 200W LED; buy a constant-current LED driver that has PWM capability, or build a proper circuit.

There are LED resistor calculators online... However, for your expected 200W, you need a better way.

You could possibly get acceptable results with a MOSFET/feedback-transistor arrangement, but we will need some details: Vf of the LED (an array, I assume)? Typical current? Got a datasheet?

Most 200W LED chips require around 6A at about 30V-36V, and they also need quite a large heatsink, usually with a fan attached.

Use a CPU heatsink + fan to cool the LED.

You don't need a constant current supply to drive a high power LED - it's best, but you can also use a plain old "constant voltage" supply, and slowly work up the voltage while monitoring the current, until you find the voltage that gives you the approximate current you want - with a bench-top power supply, this is easy. That's what I did for my 150W flashlights, and I've had no issues doing PWM dimming with a suitably sized mosfet. The fet doesn't even get warm - it's amazing what the process engineers have done with MOSFETs.

The I-V curve of high power LEDs is considerably more forgiving than small LEDs (which, of course, isn't saying much)

DrAzzy seems to suggest that you don't need current limiting for the LED. This is wrong.

DrAzzy:
slowly work up the voltage while monitoring the current, until you find the voltage that gives you the approximate current you want - with a bench-top power supply, this is easy.

That's exactly what I was planning to do!

But as Dave said: "LEDs have no way to limit the current going through them; they are, after all, diodes."

That would suggest that the LED would suck all the current from the power supply it could get.

Doesn't that apply to high-power LEDs (I'm using a Cree XHP50, btw)?

Thank you all for your replies; I'm learning tons of stuff thanks to you 🙂

Philip

ElectroPhilip:
But as Dave said: "LEDs have no way to limit the current going through them; they are, after all, diodes."

That would suggest that the LED would suck all the current from the power supply it could get.

Doesn't that apply to high-power LEDs (I'm using a Cree XHP50, btw)?

Correct, the same thing applies.

While there is a bit of margin between "not enough voltage" and "too much voltage", it's pretty slim: according to the XHP50 data sheet, the 12V version of the XHP50 has essentially zero conduction below about 10.5 volts, and rises to full maximum rated current at about 12.2 volts. And between 2/3 of rated current (and therefore brightness) and full rated current/brightness, you have a span of only about 0.4 volts. If you shoot for the center of that 400 mV range and try to keep the voltage stable to +/- 200 mV to stay within it, that's +/- 1.6%. Pretty tight, not much margin there!
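
Just to make that margin arithmetic explicit, here is a throwaway C++ check (the 2/3-current voltage is inferred from the 0.4 volt span quoted above):

    // Voltage margin on the 12V XHP50, per the figures above.
    #include <cstdio>

    int main() {
        double v_full     = 12.2;          // voltage at full rated current (V)
        double v_twothird = v_full - 0.4;  // voltage at ~2/3 rated current (V)
        double v_center   = (v_full + v_twothird) / 2.0;         // aim point (V)
        double tolerance  = (v_full - v_center) / v_full * 100;  // +/- percent

        printf("Aim for %.1f V, held to +/- %.1f%%\n", v_center, tolerance);
        return 0;
    }

That prints an aim point of 12.0 volts held to about +/- 1.6%, matching the figure above.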

That's the basic reason why LEDs are usually driven from current sources, not voltage sources: with a voltage source, there's not much room between total darkness and catastrophic overheating.

You really should be using either a series resistor or a current-regulating switching supply, if you want to be sure of not burning out your LEDs.

ETA: you also need to consider the effects of temperature: these LEDs run very hot, and they have a -9mV/°C temperature coefficient of forward voltage. So if you powered them from a regulated voltage supply, that regulator would have to adjust the voltage to compensate. In other words, the range of allowable voltage moves with temperature. That's another thing a series resistor helps with, in reducing the effect of that voltage tempco.
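
To put a number on that tempco effect, a rough C++ calculation follows; the 50 degree rise and the 2 ohm series resistor are purely illustrative assumptions:

    // Effect of the -9mV/degC forward-voltage tempco (illustrative numbers).
    #include <cstdio>

    int main() {
        double tempco  = -0.009;  // forward-voltage tempco (V per degC)
        double delta_t = 50.0;    // assumed temperature rise (degC)

        double delta_vf = tempco * delta_t;  // forward-voltage shift (V)
        printf("Vf shift over %.0f degC: %.2f V\n", delta_t, delta_vf);

        // With a series resistor R, the same Vf shift produces a current
        // change bounded by roughly delta_Vf / R, instead of running away
        // on a stiff voltage source.
        double r_series = 2.0;  // ohms -- assumed value
        printf("Approximate current increase with %.1f ohm in series: %.2f A\n",
               r_series, -delta_vf / r_series);
        return 0;
    }

A 0.45 volt shift is larger than the entire 0.4 volt usable span discussed above, which is exactly why a bare voltage source is so unforgiving here.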

The XHP50 is not 200W, more like 18W, and it comes in two flavors: 6V @ 3000mA and 12V @ 1500mA.
Either you have a pre-built array of those chips, or you plan on building one yourself.
Give the full story of what hardware you have to work with, how far you have gotten, and what the completed project will be.

Bench power supplies are usually both voltage and current controlled, so setting a current limit and then adjusting the voltage will ensure that the LED can't draw too much current.
But getting 30+V @ 4+A for a 200W build will require a good-quality supply.
I would not drive a 200W LED or an XHP50 from only a constant-voltage source.
As the LED heats up, its voltage requirement changes; how do you deal with that using constant voltage?

What are you using for a bench power supply?

Here are two good articles to read that should help you.
This one is a quick read.
At least take a look at #6 on that page.
And this one is a bit more info.

Again, thanks for your elaborate answers.

Let me give you some background information.
The XMLs are indeed used in 3 series.

Normally a regular Meanwell driver is used to power the LED, and sufficient heatsinking is provided.
But the Meanwell driver only dims between 10% and 100% with PWM input. (I don't get why they don't just dim to 0%, but the datasheet mentions this, and I've tested it.)

For a special application I want to dim between 0% and about 20%; it's a temporary setup, so a lab power supply will suffice.
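
For reference, the gate drive I have in mind is something like this (untested sketch; pin D9 comes from my diagram above, and the ceiling of 51 counts is just 20% of the 8-bit PWM range; note it does nothing about current limiting, which still has to come from the supply):

    // Untested Arduino sketch: 0-20% PWM dimming on D9 (MOSFET gate).
    const int GATE_PIN = 9;
    const int MAX_DUTY = 51;  // 20% of 255

    void setup() {
      pinMode(GATE_PIN, OUTPUT);
    }

    void loop() {
      // Sweep slowly from fully off up to the 20% ceiling and back down.
      for (int duty = 0; duty <= MAX_DUTY; duty++) {
        analogWrite(GATE_PIN, duty);
        delay(50);
      }
      for (int duty = MAX_DUTY; duty >= 0; duty--) {
        analogWrite(GATE_PIN, duty);
        delay(50);
      }
    }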

Do you think that this temporary setup will work without frying the LEDs?

Best regards,

Philip

ElectroPhilip:
Do you think that this temporary setup will work without frying the LEDs?

No idea; I'm not familiar with the Meanwell drivers, and I wonder how they would respond to a load which is externally PWM-controlled, especially when they are in their constant-current mode. They may be perfectly happy with that, or it may drive them bonkers.

At this point, I think maybe the best thing to do is just try out your idea, keeping LED ratings in mind, and see what happens.

Hi,
Can you post a copy of your LED array circuit, either in CAD or as a picture of a hand-drawn circuit in jpg or png?

Tom..... 🙂