I have a set of LED bulbs with horrible chips that I'd like to replace with CREE XHP70.3 HI M2 2D WHITE 5700K CRI90 SMD 7070 LEDs. The original driver supplies a bit over 9 volts, which won't be enough, as the Crees have a forward voltage of 12V.
Is it possible to omit a current-limiting resistor and use a MOSFET to deliver the correct current? Finding a resistor that can handle 45 watts at the specific ohm value is sort of expensive.
I'd probably also need to use Zener diodes to limit the voltage, as this will be used in a car.
Almost certainly not. High-power LEDs normally need a constant-current driver. A simple resistor to limit current won't be suitable, because the forward voltage of the LED depends on temperature, so no fixed resistor value is going to protect your LEDs and give them a decently long life.
This is effectively what a constant-current circuit does: it uses a transistor, such as a MOSFET, as an automatically adapting series resistance that tracks the LED's changing forward voltage as it heats up.
That makes sense. I'd gladly get a driver off the shelf, but I could not find anything even approaching 45 watts. I got the LEDs from KaiDomain, but they don't offer suitable drivers.
Can you recommend some place to look for high power drivers?
The driver may not have to dissipate 45 watts. It's the LED that dissipates 45W. The driver only has to dissipate the power it needs in order to drop any excess voltage over and above what's needed to keep the current through the LED constant.
Suppose that 15A is flowing through the LED, controlled by the constant-current driver, the LED is up to its working temperature, and at that temperature its forward voltage is 3V. So the LED is dissipating 3 x 15 = 45W. Now let's suppose the supply voltage is 3.7V. The constant-current driver circuit has to reduce that 3.7V down to 3V to keep the current at 15A. So the driver circuit has to dissipate 0.7 x 15 = 10.5W.
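The arithmetic above is easy to sanity-check. Here it is as a quick Python sketch for a linear (non-switching) constant-current driver; the 15A/3V/3.7V numbers are just the illustrative values from the example, not ratings for your Cree:

```python
# Linear constant-current driver: the pass element drops the excess
# supply voltage, so it dissipates (Vsupply - Vf) * I as heat.
i_led = 15.0     # LED current in amps (illustrative)
v_f = 3.0        # LED forward voltage at working temperature
v_supply = 3.7   # supply voltage

p_led = v_f * i_led                   # power in the LED: 45.0 W
p_driver = (v_supply - v_f) * i_led   # heat in the driver: ~10.5 W
print(p_led, p_driver)
```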
Must have lost a word when editing my answer; I meant a current-limiting resistor able to handle 45 watts.
But using both your advice, I still can't find a driver that can supply ~3.75A at 12V. The ones at ledsupply.com peak at 2.1A... Is 12V a rarely used voltage? I don't get it
It might be. Each individual LED chip in white LED modules usually has a forward voltage around 3.0~3.2V. Your module has 4 chips wired in series, making the total forward voltage 12.0~12.8V. Any constant-current driver will need a little more input voltage to produce the required output voltage, so the input voltage probably needs to be 13~15V.
But I'm not an expert in this type of high power led, I'm just telling you the theory as I understand it.
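To put numbers on that voltage budget, here's a small Python sketch using the 3.0~3.2V-per-chip figure above; the 1V of driver headroom is just an assumed minimum, real drivers vary:

```python
# Series module: chip forward voltages add; the driver input must
# exceed the worst-case module voltage by some headroom.
chips = 4
vf_min, vf_max = 3.0, 3.2       # per-chip forward voltage range
v_module_min = chips * vf_min   # 12.0V
v_module_max = chips * vf_max   # 12.8V
headroom = 1.0                  # assumed minimum driver overhead (V)
v_in_min = v_module_max + headroom
print(v_module_min, v_module_max, v_in_min)  # 12.0 12.8 13.8
```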
Because very few people have, and are able to use, a 45-watt LED.
An LED of that power is usually mounted on a copper star base, which must be oven-soldered to a copper heatsink and then fan force-cooled. You say it's to upgrade a light bulb? A streetlight?
Leo..
LEDs are "current controlled" and high-power LEDs need a heatsink.
A constant-current switchmode LED driver theoretically doesn't have to dissipate any power. The amount of actual wasted energy/heat depends on the design. The same is true of a regular switchmode constant-voltage power supply or voltage regulator.
As PaulRB says, you usually need more than the rated voltage for the constant-current driver to work. But there may be some drivers that can step-up the voltage as-needed.
With a constant-current power supply/driver it doesn't matter if it's capable of higher voltage. If the current is correct for the LED, the voltage will "fall into place".
If the power supply has enough available voltage (and can supply the total wattage) you can wire multiple LEDs in series and they all get the same current. That's a common thing to do with high-power LEDs... The same concept as regular light bulbs in parallel with constant voltage.
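A rough sketch of that series-string budget in Python, using the 12V/3.75A numbers from this thread (two modules in series is purely illustrative):

```python
# Constant-current supply driving LEDs in series: the current is the
# same through every LED, and the forward voltages simply add.
vf_module = 12.0   # forward voltage of one 12V module
i_string = 3.75    # regulated string current in amps
n = 2              # number of modules in series (illustrative)

v_needed = n * vf_module        # minimum supply voltage: 24.0V
p_total = v_needed * i_string   # total LED power: 90.0W
print(v_needed, p_total)
```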
This is the "opposite" of a regular constant-voltage power supply, where it's OK to have excess current capability and the current "falls into place".
The resistor doesn't have to handle 45W, but it's not going to work anyway... With a 12V supply and 12V across the LED, there is (almost) no voltage left for the resistor, so (almost) no current will flow; the LED won't get 45W and it will be dim.
Again, you need "extra" voltage for the resistor. This works great for "regular little" LEDs. Usually the voltage is divided between the resistor and LED about equally, and the small amount of wasted power is OK. But with your high-power LED, that would mean roughly a 24V power supply with the LED and resistor each dissipating 45W, for 90W total. Something like 18V would mean the resistor only has to dissipate half as much, but it's still inefficient... That's why a "good design" uses a switchmode driver.
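For what it's worth, here's that resistor arithmetic as a Python sketch (12V LED at 3.75A, i.e. 45W; `resistor_budget` is just a throwaway helper name):

```python
# Resistor current limiting: the resistor drops whatever voltage is
# left over, and wastes that drop times the LED current as heat.
v_led, i_led = 12.0, 3.75   # LED forward voltage and target current

def resistor_budget(v_supply):
    v_r = v_supply - v_led   # voltage across the resistor
    r = v_r / i_led          # required resistance in ohms
    p_r = v_r * i_led        # power wasted in the resistor
    return r, p_r

print(resistor_budget(24.0))  # (3.2, 45.0) - wastes as much as the LED uses
print(resistor_budget(18.0))  # (1.6, 22.5) - half the waste, still poor
```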
That's worse because you need to control the current, not the voltage. Plus again, with 12V across the LED there is nothing left for the diode. And when there is a voltage drop across a diode, and current through it, it dissipates power just like a resistor.
It's an automotive headlight bulb, the copper base with the led is screw-mounted to a fan-cooled aluminium base with some thermal paste in between. I've seen this type of bulb draw 3.8 amps in a review, so the cooling should be at least adequate.
Thanks everybody for the theory. I'm starting to think I made a mistake and should have ordered the 6V version.
It's not a cheap bulb and I expected a lot more output from it. It has a driver included, but I figured that whatever current it puts out won't make full use of the 45W Cree LED, and wanted to get a proper driver before spending a ton of time fiddling with something that probably won't work... I did not expect the search to be this hard, though.