Running high-power LEDs without drivers

Direct drive works, but it's the most annoying way to drive an LED. The light output starts out very high, then declines rapidly (exponentially), yet full depletion of the battery takes a long time, and the endpoint isn't necessarily matched to the minimum voltage the battery can tolerate.

to the 4.2V that a fresh li-ion is capable of sourcing...

At that point, current is exactly 0 mA.

I've tried this "overclocking" method just out of curiosity. I mounted the LED to a copper heat pad (about 12 ga thick), mounted to a thermoelectric cooler, mounted to a large aluminum heatsink with two fans. I powered a 3.8 V UV LED with 5 V directly from a computer PSU and pushed 3.8 A through it (almost 11x the rated current and roughly 14x the power dissipation), and it worked fine for hours without me noticing it being any dimmer. Then I powered it at 350 mA next to a fresh one and realized it was 1/10 as bright at the same current; its light output had been permanently degraded. After wasting the $3 LED, I found there is no point in exceeding the limits, just like everyone else says.

Looking at the graphs again: I'm wondering if it might be possible to get more light from an LED by running it at less than rated current if your heatsink isn't perfect.

E.g., if you get 20% less light output at 70 °C, could you get the same light by running it at 450 mA and not heating it up so much? Most of these LEDs get far too hot to touch if you run them at rated current, so the loss due to temperature must be very real.

Possible but not likely.

The light output doesn't go down as much as you might think. For this particular device, light output drops 10% from 25 °C to 75 °C, and another 15-20% from 75 °C to 150 °C.
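Multiplying those derating factors together gives a feel for how much output survives at high junction temperature (the 10% and 15-20% figures are the ones quoted above; everything else is just arithmetic):

```python
# Relative light output, multiplying the quoted derating factors:
# 10% loss from 25 -> 75 C, then a further 15-20% loss from 75 -> 150 C.
out_25 = 1.00
out_75 = out_25 * (1 - 0.10)          # 0.90 of room-temperature output
out_150_worst = out_75 * (1 - 0.20)   # worst case at 150 C
out_150_best = out_75 * (1 - 0.15)    # best case at 150 C

print(f"75 C: {out_75:.0%}, 150 C: {out_150_worst:.0%}-{out_150_best:.0%}")
```

So even at a junction temperature of 150 °C, roughly three quarters of the light remains, which is why under-driving to run cooler rarely pays off.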

The real problem with high current in high-power LEDs is high temperature (from thermal runaway) and the resulting short lifetime (which means low light output in the long run).

Thanks for all the replies. I admit some of the info in the debates was a little over my head, but I realize now there's not much point in running the LEDs without a driver.

I'm still confused, though, why my numbers aren't adding up on this setup I made. I have some 3 W LEDs that say they are rated at 700 mA max current. I also bought a driver that is supposed to put out 2800 mA. Seems like a perfect fit: 4 x 700 = 2800 mA. I wired up the driver with four of these LEDs in parallel, powered by an 18650 battery, which is a single-cell lithium-ion. However, at full power, this setup only seems to be drawing about 700 mA for all four of them, not the 2800 I was expecting. I read specs on 18650 batteries; brands vary, but they all seem to put out a minimum of 3 amps, many putting out 9 amps.

Out of frustration I bypassed the driver and hooked the battery directly up to the four LEDs in parallel, and they were only pulling 1 A, again not the 2800 mA I was expecting. (Note: one person claims I damaged the LEDs doing this, but I only did it AFTER having hooked them up through the driver and seeing them draw only 700 mA.)

PS: I was expecting these four LEDs to total 12 W of power and be a touch brighter than my Cree XML-T6 flashlight, which uses a single 10 W LED, but as is, they are nowhere near as bright.

Which driver are you using? The behavior you're describing pretty much matches what I'd expect to see with a "buck" or step-down driver intended for 2*Li-Ion cells in series (8.4V input.) For example, this driver claims to put out 2800mA, and explicitly says that it is designed for 8.4V input. Driving a high-power LED from a single LiIon cell is actually somewhat difficult, because power supply circuits that output a voltage NEAR their input are atypical. (Of course, driving LEDs has become a popular application, and there ARE circuits that will do this, but you have to make sure you get the right one...)

The direct-connection behavior is harder to explain. But since white LEDs have a voltage requirement of about 3.7 V, and the battery has a nominal voltage of about 3.7 V (which might be expected to "sag" somewhat under heavy loads), it could be that the battery just isn't capable of driving the four parallel LEDs at full power. (That's not a good sign for running them from a driver, either, but it might work.)
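A crude way to see the sag argument, with assumed numbers (the open-circuit voltage and especially the internal resistance below are illustrative guesses, not measurements of this battery):

```python
# Terminal voltage of a Li-ion cell under load, modeled as an ideal
# source behind an internal resistance (both values are assumptions).
V_OC = 3.7    # open-circuit volts, mid-discharge
R_INT = 0.15  # ohms; plausible for a protected 18650 cell

def terminal_voltage(i_load):
    """Cell voltage at the terminals while sourcing i_load amps."""
    return V_OC - i_load * R_INT

print(terminal_voltage(0.7))  # one LED's worth of current
print(terminal_voltage(2.8))  # four LEDs at full rating
```

If the LEDs need roughly 3.5 V to reach full current, a cell sagging toward 3.3 V at 2.8 A simply can't push them there, which is consistent with the ~1 A observed.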

carl1864: I'm still confused, though, why my numbers aren't adding up on this setup I made. I have some 3 W LEDs that say they are rated at 700 mA max current.

The 18650 cells are available both with and without charge/discharge protection. If you have cells with built-in protection, this could explain why they max out at 1 A.

Hi,

Seems like a perfect fit: 4 x 700 = 2800 mA. I wired up the driver with four of these LEDs in parallel, powered by an 18650 battery, which is a single-cell lithium-ion. However, at full power, this setup only seems to be drawing about 700 mA for all four of them, not the 2800 I was expecting.

No two LEDs are exactly the same! One of the four LEDs you wired in parallel will start conducting at a slightly lower voltage than the others. This one will get most of the current and limit the voltage (and thus the current) for the other LEDs. If the difference between the LEDs is large enough, the conducting one can actually get a lot more current than it's rated for and become a SEDD (Smoke Emitting Dead Diode); after that, the other LEDs will follow one by one. Each LED MUST have its own current control wired in series. In your case, the forward voltages were close enough to prevent instant meltdown, but in the long run the LEDs would have failed one by one. If you want to drive four LEDs, you can wire them in series using one 700 mA driver (this requires a higher voltage or a step-up driver), or in parallel using four separate 700 mA drivers.
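The "slightly lower voltage" effect is dramatic because diode current is exponential in voltage. A sketch using the ideal-diode relation, with an assumed ideality factor (n ≈ 2 is plausible for power LEDs; all numbers here are illustrative, not from any datasheet):

```python
import math

# Thermal voltage at room temperature and an assumed ideality factor.
V_T = 0.02585  # volts at ~25 C
n = 2.0

def current_ratio(delta_vf):
    """Current ratio between two paralleled LEDs whose forward voltages
    differ by delta_vf (volts) at the same current, using the
    ideal-diode exponential I ~ exp(V / (n * V_T))."""
    return math.exp(delta_vf / (n * V_T))

for mv in (10, 50, 100):
    print(f"{mv} mV Vf mismatch -> {current_ratio(mv / 1000):.1f}x current imbalance")
```

Even a 50 mV production spread, which is small compared to a ~3.5 V forward voltage, is enough to make one LED carry several times its share.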

No two LEDs are exactly the same! One of the four LEDs you wired in parallel will start conducting at a slightly lower voltage than the others.

While this is true in theory, the reality is that a set of LEDs from the same batch seem to match Vf pretty closely, and the number of commercial products that parallel LEDs directly and still seem to work (with even brightness) pretty well is quite large.

To demonstrate the effect, try paralleling a red and green LED after the current limiting resistor.

Well, the driver is designed for 18650 batteries and specifies 4.2 V input, so input voltage should be fine. I don't have a direct link, but it's the same type typically used in flashlights to drive the 10 W Cree XML-T6 LEDs.

The battery does have protection on it, but when I experimented by hooking it directly up to an XML-T6 LED, it pulled 1.6 A, and I'm guessing it can pull more, so it's definitely capable of delivering more than the 700 mA-1 A that I was getting with the other LEDs.

Someone mentioned one of the weak LEDs taking all the current from the others. However, if they are all wired in parallel, I don't see how this could happen; each one should get as much current as it wants, right?

Am I really doing this the wrong way? Do they need to be in series instead? And if so, can anyone explain why they would need to be in series and parallel doesn't work? I don't exactly see why it would make a difference as long as they are being fed their correct voltage (for example, say there are four 3.5 V LEDs: either feed 3.5 V to them all in parallel, or feed 14 V to them in series; why would there be a difference?).

Am I really doing this the wrong way? Do they need to be in series instead?

Yes. By having only one constant-current driver, you have no defined control mechanism for equal current flow to each parallel LED; you're just assuming somehow that each will take its equal share, and while that may be your wish and desire, that doesn't make it so. Kirchhoff has defined how current and voltage work in series and parallel circuits, and it's worth a review: http://physics.about.com/od/electromagnetics/f/KirchhoffRule.htm

And if so, can anyone explain why they would need to be in series and parallel doesn't work?

Well, according to one of Kirchhoff's rules, in a series circuit the current flow is equal at all points in the circuit. So if three LEDs are wired in series and 700 mA is the circuit current, then of course 700 mA flows into and out of each LED, so they all operate at the same current. But since the desired current for your LEDs is 700 mA, your constant-current driver needs to output 700 mA only; you can't use one that outputs a constant 2800 mA.

I don't exactly see why it would make a difference as long as they are being fed their correct voltage (for example, say there are four 3.5 V LEDs: either feed 3.5 V to them all in parallel, or feed 14 V to them in series; why would there be a difference?).

As others have said, you need to push LED forward-voltage specifications to the background of your thinking and keep current control in the forefront. You don't operate or control an LED by manipulating the circuit's supply voltage; you control the current by whatever method you choose, be it a constant-current driver (which manipulates the applied voltage to maintain a desired current) or a series current-limiting resistor (which just sets the maximum current that can flow at a given applied voltage). The resistor approach is not recommended for high-power LEDs unless you run them at less than their maximum rated current.
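For completeness, sizing that series current-limiting resistor is just Ohm's law applied to the voltage left over after the LED's forward drop (the supply and Vf values below are examples, not from the thread's hardware):

```python
def ballast_resistor(v_supply, v_f, i_target):
    """Series resistor that limits LED current to i_target amps from a
    constant-voltage supply, given the LED forward voltage v_f.
    Returns (resistance in ohms, power dissipated in the resistor)."""
    r = (v_supply - v_f) / i_target
    p = (v_supply - v_f) * i_target
    return r, p

# Example: one 3.5 V LED at 700 mA from a 5 V supply.
r, p = ballast_resistor(5.0, 3.5, 0.7)
print(f"R = {r:.2f} ohm, dissipating {p:.2f} W")
```

Note that the resistor itself burns over a watt in this example, one reason resistors are a poor fit for high-power LEDs.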

And if so, can anyone explain why they would need to be in series and parallel doesn't work? I don't exactly see why it would make a difference as long as they are being fed their correct voltage (for example, say there are four 3.5 V LEDs: either feed 3.5 V to them all in parallel, or feed 14 V to them in series; why would there be a difference?).

If you wish to run LEDs in parallel, then each LED needs its own constant-current driver rated at the LED's operating current, 700 mA in your case. If you run the LEDs in series, then a single constant-current driver will work, with the one caveat that the driver must have a maximum output voltage of at least the sum of the forward voltage drops of all the LEDs in the series string. Lefty

Someone mentioned one of the weak LEDs taking all the current from the others.

Whether an LED is "weak" doesn't matter (how do you define "weak," anyway?). What matters is its Vfwd. The LED with the lowest Vfwd lights up first and takes the current; the shared voltage only rises enough to light the LED with the second-lowest Vfwd once the first is carrying much more than its share, and so on.

So paralleling different LEDs has the disadvantage of uneven current. But it provides better reliability than wiring the LEDs in series.

What people usually do with a large number of LEDs is either 1) drive them in series strings, each string with its own driver, or 2) parallel the LED strings, but with resistors to even out large current imbalances.

The first approach is best but adds complexity and cost; the second offers a good compromise.
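The reason a resistor per string "evens out" the current: with a ballast resistor R in series, a forward-voltage mismatch between strings appears across R, so the current mismatch is roughly ΔVf / R instead of being set by the LEDs' steep exponential curves. A sketch with illustrative numbers:

```python
# Current mismatch between parallel strings that differ by d_vf volts
# in total forward voltage, when each string has a ballast resistor.
# The 100 mV mismatch and resistor values are assumptions.
d_vf = 0.10  # volts of string-to-string Vf mismatch

for r_ballast in (0.5, 1.0, 2.0):
    mismatch_ma = d_vf / r_ballast * 1000
    print(f"R = {r_ballast} ohm -> ~{mismatch_ma:.0f} mA mismatch")
```

Bigger resistors give better matching but waste more power, which is the compromise being described.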

But it provides better reliability than wiring the LEDs in series.

Care to explain your thinking on that point?

Reliability in the sense that if one LED in a series chain fails, they all stop lighting. Basically you can do it three different ways. You can light 100 LEDs all in parallel (a lot of current at a lower voltage; most expensive in terms of material and wire, but if one fails it doesn't affect the others). You can light 100 LEDs all in series (higher voltage but less current; probably cheapest, depending on the ease of getting the higher voltage, but the drawback is that if one fails, all 100 go out). Or you can light ten in series and ten of those strings in parallel; that's the compromise: if one fails, only ten go out, but it's not as expensive as all in parallel.

You can light ten in series and ten of those strings in parallel; that's the compromise: if one fails, only ten go out, but it's not as expensive as all in parallel.

That's the approach they use in those LED-based traffic lights.
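The supply requirements for the three layouts above can be tabulated directly, using the illustrative per-LED figures from earlier in the thread (3.5 V, 700 mA):

```python
# Supply voltage and total current for 100 LEDs in three topologies.
V_F, I_LED = 3.5, 0.7  # per-LED forward voltage and drive current

configs = {
    "all parallel (1x100)": (1, 100),    # (LEDs per string, strings)
    "all series (100x1)":   (100, 1),
    "matrix (10x10)":       (10, 10),
}
for name, (per_string, strings) in configs.items():
    volts = per_string * V_F
    amps = strings * I_LED
    print(f"{name}: needs {volts:.1f} V at {amps:.1f} A")
```

The 10x10 matrix lands at a manageable 35 V and 7 A, versus 70 A of bus current for all-parallel or 350 V for all-series, which is why the compromise is so common.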

retrolefty: Am I really doing this the wrong way? Do they need to be in series instead?

Yes. By having only one constant-current driver, you have no defined control mechanism for equal current flow to each parallel LED; you're just assuming somehow that each will take its equal share, and while that may be your wish and desire, that doesn't make it so. Kirchhoff has defined how current and voltage work in series and parallel circuits, and it's worth a review: http://physics.about.com/od/electromagnetics/f/KirchhoffRule.htm

And if so, can anyone explain why they would need to be in series and parallel doesn't work?

Well, according to one of Kirchhoff's rules, in a series circuit the current flow is equal at all points in the circuit. So if three LEDs are wired in series and 700 mA is the circuit current, then of course 700 mA flows into and out of each LED, so they all operate at the same current. But since the desired current for your LEDs is 700 mA, your constant-current driver needs to output 700 mA only; you can't use one that outputs a constant 2800 mA.

This whole explanation was very helpful, and a lot more things are starting to make sense now. I think I must have been mistakenly assuming that there is current drop the same way there is voltage drop (thinking I have to add up the current of each LED in series the same way you add up voltage), but I guess this is wrong. If I understand correctly, there is a voltage drop across each LED but no current drop, so if I have 700 mA LEDs and a 700 mA driver, I can light up as many LEDs as I want in series, as long as the driver puts out enough voltage? For example, if I want 100 3.5 V LEDs, I would need a driver that outputs 350 V but only 700 mA of current?

So you say I cannot power a 700 mA LED with a 2800 mA driver? What would happen? I was thinking that as long as the driver puts out equal or more, it is fine, the same way you can use a 12 V 1 A AC adapter to power things that only use 100 mA, since they only take what they need. Or how, on the RC motors I use, I can run a 30 A speed controller on a motor that only draws 15 A, since the motor only takes what it needs. Are you saying things work completely differently when it comes to LEDs and drivers?

I can verify that I think I still have a defective green light... about 25% of the lights either flicker or are dead, which points to a series-parallel configuration. It's a big bugger, about 12-14 inches across, with a 14 x 14 grid of 5 mm green LEDs arranged in a circle. I saw a traffic maintenance guy replacing one, so I waited until he dropped the cherry picker down and asked him for the defective one. It has two PCBs: one for the display and one for a single-output constant-current driver powered directly from AC mains (no transformer). In series, each "string" sees a constant current, and the total current multiplies as you add more strings in parallel. The device is marked as having been manufactured in 2005. AC power is 117 VAC @ 12.6 W, which points to very high-efficiency LEDs operating at ~5-10 mA. I didn't want to connect it up and work on it when I first got it, because I didn't then own a 110 V 1:1 isolation transformer, and now it's a conversation piece...

Bob

Thanks; it sounded to me like you were advocating putting each bare LED in parallel, rather than putting each LED-plus-current-limiter combination in parallel.

So you say I cannot power a 700 mA LED with a 2800 mA driver? What would happen? I was thinking that as long as the driver puts out equal or more, it is fine, the same way you can use a 12 V 1 A AC adapter to power things that only use 100 mA, since they only take what they need. Or how, on the RC motors I use, I can run a 30 A speed controller on a motor that only draws 15 A, since the motor only takes what it needs. Are you saying things work completely differently when it comes to LEDs and drivers?

We are saying that there is a difference between a linear load and a non-linear load. Also, a constant-current supply will keep raising the voltage until its set current is reached. So running a 700 mA LED from a 2800 mA constant-current supply will fry the LED by putting far more current through it than it can stand.

Things work differently with a constant-current supply than with a constant-voltage supply.

carl1864:

retrolefty: Am I really doing this the wrong way? Do they need to be in series instead?

Yes. By having only one constant-current driver, you have no defined control mechanism for equal current flow to each parallel LED; you're just assuming somehow that each will take its equal share, and while that may be your wish and desire, that doesn't make it so. Kirchhoff has defined how current and voltage work in series and parallel circuits, and it's worth a review: http://physics.about.com/od/electromagnetics/f/KirchhoffRule.htm

And if so, can anyone explain why they would need to be in series and parallel doesn't work?

Well, according to one of Kirchhoff's rules, in a series circuit the current flow is equal at all points in the circuit. So if three LEDs are wired in series and 700 mA is the circuit current, then of course 700 mA flows into and out of each LED, so they all operate at the same current. But since the desired current for your LEDs is 700 mA, your constant-current driver needs to output 700 mA only; you can't use one that outputs a constant 2800 mA.

This whole explanation was very helpful, and a lot more things are starting to make sense now. I think I must have been mistakenly assuming that there is current drop the same way there is voltage drop (thinking I have to add up the current of each LED in series the same way you add up voltage), but I guess this is wrong. If I understand correctly, there is a voltage drop across each LED but no current drop, so if I have 700 mA LEDs and a 700 mA driver, I can light up as many LEDs as I want in series, as long as the driver puts out enough voltage?

You got it now.

For example, if I want 100 3.5 V LEDs, I would need a driver that outputs 350 V but only 700 mA of current?

Almost: you need a driver that can raise or lower its voltage up to at least 350 volts while maintaining a constant current of 700 mA. There may be times (say, at low temperature) when it actually has to lower the voltage somewhat to maintain the same constant 700 mA of flow. A current driver is really a current regulator, just as a voltage regulator maintains a constant output voltage even with variable load resistance and/or variable input voltage.
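The arithmetic for the string in question, as a sketch (100 LEDs at 3.5 V and 700 mA are the figures from the post above; the helper name is made up for illustration):

```python
def series_driver_requirements(n_leds, v_f, i_led):
    """Compliance voltage and regulated current a constant-current
    driver must supply for one series string of n_leds identical LEDs
    driven at i_led amps."""
    return n_leds * v_f, i_led

volts, amps = series_driver_requirements(100, 3.5, 0.7)
print(f"driver must reach {volts:.0f} V while regulating {amps * 1000:.0f} mA")
# -> driver must reach 350 V while regulating 700 mA
```

The current requirement never grows with string length; only the compliance voltage does.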

So you say I cannot power a 700ma led with a 2800ma driver? What would happen?

The constant-current driver will force 2800 mA through the LED, but only for a while, as the LED will soon melt or burn open. Remember, an LED cannot by itself control the current flowing through it; it's not like an incandescent lamp, which has a fixed resistance that controls the current at a given voltage per Ohm's law. LEDs don't obey Dr. Ohm. Once an LED is forward-biased by a voltage equal to or greater than its Vf spec, it acts almost like a direct short circuit and will self-destruct unless the current is controlled or limited by something external to the LED.
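The "acts like a direct short circuit" behavior falls out of the exponential diode law. A sketch with an idealized model anchored at 700 mA @ 3.5 V (the ideality factor and both anchor values are assumptions chosen for illustration):

```python
import math

# Ideal-diode model: I = I_NOM * exp((V - V_NOM) / (n * V_T)),
# anchored at 700 mA @ 3.5 V; n and the anchor point are assumed.
V_T, n = 0.02585, 2.0   # thermal voltage at ~25 C, ideality factor
I_NOM, V_NOM = 0.7, 3.5

def led_current(v):
    """LED current (amps) at forward voltage v under this model."""
    return I_NOM * math.exp((v - V_NOM) / (n * V_T))

for dv in (0.0, 0.1, 0.2, 0.3):
    print(f"{V_NOM + dv:.1f} V -> {led_current(V_NOM + dv):.2f} A")
```

A tenth of a volt past nominal multiplies the current roughly sevenfold in this model, so an uncontrolled voltage source, or an oversized current source, destroys the part almost immediately.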

I was thinking as long as the driver puts out equal or more that it is fine?

No, a constant-current driver puts out only a single value all the time: its rated value of, say, 2800 mA.

Same way you can use a 12 V 1 A AC adapter to power things that only use 100 mA, since they only take what they need.

That applies to constant-voltage sources, where the current flow is determined only by the load resistance per Dr. Ohm, up to the maximum current capacity of the voltage source.

Or how, on the RC motors I use, I can run a 30 A speed controller on a motor that only draws 15 A, since the motor only takes what it needs. Are you saying things work completely differently when it comes to LEDs and drivers?

No, I'm saying constant-voltage sources are different than constant-current sources. LEDs are current-operated and current-controlled devices (not voltage-controlled), so they are best driven from a constant-current source. If we must use a constant-voltage source to power an LED, we need to add something else to control/limit the current, normally a simple series resistor. Lefty