Running high-power LEDs without drivers

Which driver are you using? The behavior you're describing pretty much matches what I'd expect to see with a "buck" or step-down driver intended for 2*Li-Ion cells in series (8.4V input.) For example, this driver claims to put out 2800mA, and explicitly says that it is designed for 8.4V input.
Driving a high-power LED from a single LiIon cell is actually somewhat difficult, because power supply circuits that output a voltage NEAR their input are atypical. (Of course, driving LEDs has become a popular application, and there ARE circuits that will do this, but you have to make sure you get the right one...)

The direct-connection behavior is harder to explain. But since white LEDs have a forward voltage requirement of about 3.7V, and the battery has a nominal voltage of about 3.7V (which can be expected to "sag" somewhat under heavy loads), it could be that the battery just isn't capable of driving the 4 parallel LEDs at full power. (That's not a good sign for running them from a driver, either, but it might work.)

carl1864:
I'm still confused though why my numbers aren't adding up on this setup I made. I have some 3 W LEDs that say they are rated at 700 mA max current.

The 18650 cells are available both with and without charge/discharge protection. If you have cells with built-in protection, this could explain why they max out at 1 A.

Hi,

Seems like a perfect fit: 4 x 700 mA = 2800 mA. I wired up the driver with 4 of these LEDs in parallel, using an 18650 battery, which is a single-cell lithium-ion battery. However, at full power, this setup only seems to be drawing about 700 mA for all 4 of them, not the 2800 mA I was expecting.

No two LEDs are exactly the same! One of the four LEDs you wired in parallel will start conducting at a slightly lower voltage than the others. This one will get most of the current and limit the voltage (and thus the current) for the other LEDs. If the difference between the LEDs is large enough, the conducting one can actually get a lot more current than it's rated for and become a SEDD (Smoke Emitting Dead Diode); after that, the other LEDs will follow one by one. Each LED MUST have its own current control wired in series. In your case, the forward voltages were close enough to prevent instant meltdown, but in the long run the LEDs would have failed one by one. If you want to drive 4 LEDs, you can wire them in series using one 700 mA driver (this requires a higher voltage or a step-up driver), or in parallel using four separate 700 mA drivers.

No two LEDs are exactly the same! One of the four LEDs you wired in parallel will start conducting at a slightly lower voltage than the others.

While this is true in theory, in practice a set of LEDs from the same batch tends to match Vf pretty closely, and the number of commercial products that parallel LEDs directly and still seem to work pretty well (with even brightness) is quite large.

To demonstrate the effect, try paralleling a red and green LED after the current limiting resistor.
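The current-hogging effect described in this exchange can be sketched with an idealized Shockley diode model. All numbers here (the 50 mV effective thermal voltage, the 50 mV Vf mismatch, the 1400 mA shared driver) are illustrative assumptions, not datasheet values:

```python
import math

# Idealized Shockley model: I = Is * exp(V / (n*Vt))  (the -1 term is
# negligible at these currents). Two hypothetical white LEDs in parallel
# share one 1400 mA constant-current source; LED B's forward voltage is
# only 50 mV lower than LED A's at equal current.
N_VT = 0.050          # emission coefficient * thermal voltage, volts (assumed)
I_TOTAL = 1.4         # amps from the shared constant-current driver
DELTA_VF = 0.050      # LED B conducts at 50 mV lower voltage than LED A

# At any shared node voltage, the two currents split in a fixed ratio:
#   I_B / I_A = exp(DELTA_VF / N_VT)
ratio = math.exp(DELTA_VF / N_VT)
i_a = I_TOTAL / (1 + ratio)
i_b = I_TOTAL - i_a

print(f"current ratio B/A: {ratio:.2f}")
print(f"LED A: {i_a*1000:.0f} mA, LED B: {i_b*1000:.0f} mA")
```

A mere 50 mV mismatch is enough for one LED to take roughly e, about 2.7 times, the current of its neighbor, which is why each parallel branch needs its own current control.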

Well, the driver is designed for 18650 batteries and says 4.2 V input voltage, so the input voltage should be good. I don't have a direct link, but it's the same type typically used in flashlights to drive the 10 W Cree XM-L T6 LEDs.

The battery does have protection on it, but when I experimented by hooking it directly up to an XM-L T6 LED, it pulled 1.6 A, and I'm guessing it can pull more, so it's definitely capable of delivering more than the 700 mA-1 A that I was getting with the other LEDs.

Someone mentioned one of the weak LEDs taking all the current from the others. However, if they are all wired in parallel, I don't see how this could happen; each one should get as much current as it wants, right?

Am I really doing this the wrong way? Do they need to be in series instead? And if so, can anyone explain why they would need to be in series and parallel doesn't work? I don't exactly see why it would make a difference as long as they are being fed their correct voltage (for example, say there are 4 LEDs rated at 3.5 V each: either feed 3.5 V to them all in parallel, or feed 14 V to them in series; why would there be a difference?).

Am I really doing this the wrong way? Do they need to be in series instead?

Yes. By having only one constant-current driver, you have no defined control mechanism for equal current flow to each parallel LED; you're just assuming that each will somehow take its equal share, and while that may be your wish and desire, that doesn't make it so. Kirchhoff defined how current and voltage work in series and parallel circuits, and his laws are worth a review: Kirchhoff's Laws for Current and Voltage

And if so, can anyone explain why they would need to be in series and parallel doesn't work?

Well, according to one of Kirchhoff's rules, in a series circuit the current flow is equal at all points in the circuit. So if three LEDs are wired in series and 700 mA is the circuit current, then of course 700 mA is flowing into and out of each LED, so they all operate at the same current. But since the desired current for your LEDs is 700 mA, your constant-current driver needs to run at 700 mA output only; you can't use one that outputs a constant 2800 mA.

I don't exactly see why it would make a difference as long as they are being fed their correct voltage (for example, say there are 4 LEDs rated at 3.5 V each: either feed 3.5 V to them all in parallel, or feed 14 V to them in series; why would there be a difference?).

As others have said, you need to put LED forward-voltage-drop specifications into the background of your thinking and keep controlling the current to a desired value in the forefront. You don't operate or control an LED by manipulating the circuit's supply voltage; you control the current by whatever method you choose, be it a constant-current driver (which manipulates the applied voltage to maintain a desired current) or a series current-limiting resistor (which just sets the maximum current that can flow at a given constant applied voltage). The resistor method is not recommended for high-power LEDs unless you run them at less than their maximum rated current.

And if so, can anyone explain why they would need to be in series and parallel doesn't work? I don't exactly see why it would make a difference as long as they are being fed their correct voltage (for example, say there are 4 LEDs rated at 3.5 V each: either feed 3.5 V to them all in parallel, or feed 14 V to them in series; why would there be a difference?).

If you wish to run LEDs in parallel, then each LED needs its own constant-current driver rated at the LED's operating current, 700 mA in your case. If you run the LEDs in series, then a single constant-current driver will work, with the caveat that the driver must have a maximum output voltage capability of at least the sum of the forward voltage drops of all the LEDs in the series string.
Lefty
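As a quick sanity check of the series option with the thread's numbers (four LEDs, a nominal 3.5 V drop each, 700 mA), a minimal sketch:

```python
# Series-string sizing rule: the driver must reach the sum of the forward
# voltage drops while holding the target current. The 3.5 V per-LED drop
# is the nominal figure used in this thread, not a measured value.
led_vf = 3.5        # nominal forward voltage per LED, volts
n_leds = 4
i_drive = 0.700     # constant-current setpoint, amps

v_required = led_vf * n_leds          # driver must reach at least this voltage
p_total = v_required * i_drive        # total power delivered to the string

print(f"minimum driver output voltage: {v_required:.1f} V")
print(f"string power at {i_drive*1000:.0f} mA: {p_total:.1f} W")
```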

Someone mentioned one of the weak LEDs taking all the current from the others.

Whether an LED is "weak" doesn't matter (how do you define "weak" anyway?). What matters is its Vfwd. The LED with the lowest Vfwd will begin to light up, until its Vfwd reaches the level needed to light up the LED with the 2nd-lowest Vfwd, and so on.

So paralleling different LEDs has the disadvantage of uneven current, but it provides better reliability than lighting the LEDs in series.

What people usually do with large numbers of LEDs is 1) drive them in series, with individual drivers: each LED string has its own driver; or 2) parallel LED strings, but with resistors to even out large current imbalances.

The first approach is the best, but adds complexity and cost; the second approach offers a good compromise.

But it provides better reliability than lighting the LEDs in series.

Care to explain your thinking on that point?

Reliability in the sense that if one LED in the chain fails, they all stop lighting. Basically, you can do it three different ways:
you can light 100 LEDs in parallel (a lot of current at a lower voltage; most expensive in terms of material and wire, but whichever one fails doesn't affect any others)
you can light 100 LEDs in series (higher voltage but less current; probably the cheapest, depending on how easy it is to get the higher voltage; the drawback is that if one fails, all 100 go out)
you can light ten in series and ten of those strings in parallel; this is the compromise: if one fails, only ten go out, but it's not as expensive as all in parallel
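The three wiring options for 100 LEDs can be compared with a little arithmetic; the 3.5 V forward drop and 20 mA per LED are illustrative assumptions:

```python
# Rough comparison of the three layouts for 100 LEDs: all-parallel,
# all-series, and the 10x10 series-parallel compromise. Vf = 3.5 V and
# 20 mA per LED are assumed example values, not from a datasheet.
VF, I_LED, N = 3.5, 0.020, 100

def layout(leds_per_string):
    strings = N // leds_per_string
    return {
        "supply_v": VF * leds_per_string,     # voltage across one string
        "supply_a": I_LED * strings,          # string currents add in parallel
        "dark_on_failure": leds_per_string,   # one open LED kills its string
    }

for n in (1, 100, 10):   # all-parallel, all-series, 10x10 compromise
    print(n, layout(n))
```

As the thread says: all-parallel needs a lot of current at low voltage, all-series needs 350 V but loses everything on one failure, and the 10x10 grid loses only one string of ten.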

you can light ten in series and ten of those in parallel, now this is the compromise, one fails only ten go out but its not as expensive as all in parallel

That's the approach they use in those LED-based traffic lights.

retrolefty:
Am I really doing this the wrong way? Do they need to be in series instead?

Yes. By having only one constant-current driver, you have no defined control mechanism for equal current flow to each parallel LED; you're just assuming that each will somehow take its equal share, and while that may be your wish and desire, that doesn't make it so. Kirchhoff defined how current and voltage work in series and parallel circuits, and his laws are worth a review: Kirchhoff's Laws for Current and Voltage

And if so, can anyone explain why they would need to be in series and parallel doesn't work?

Well, according to one of Kirchhoff's rules, in a series circuit the current flow is equal at all points in the circuit. So if three LEDs are wired in series and 700 mA is the circuit current, then of course 700 mA is flowing into and out of each LED, so they all operate at the same current. But since the desired current for your LEDs is 700 mA, your constant-current driver needs to run at 700 mA output only; you can't use one that outputs a constant 2800 mA.

This whole explanation was very helpful, and a lot more things are starting to make sense now. I think I must have been mistakenly assuming that there is a current drop the same way there is a voltage drop, thinking I have to add up the current of each LED in series the same way you add up voltage, but I guess this is wrong. If I understand correctly, there is a voltage drop across each LED but no current drop, so if I have 700 mA LEDs and a 700 mA driver, I can light up as many LEDs as I want in series, as long as the driver is putting out enough voltage? For example, if I want 100 LEDs at 3.5 V each, I would need a driver that outputs 350 V, but only 700 mA of current?

So you're saying I cannot power a 700 mA LED with a 2800 mA driver? What would happen? I was thinking that as long as the driver puts out equal or more, it is fine, the same way you can use a 12 V 1 A AC adapter to power things that only use 100 mA, since they only take what they need. Or how on the RC motors I use, I can use a 30 A speed controller on a motor that only draws 15 A if I want, since the motor only takes what it needs. Are you saying things work completely differently when it comes to LEDs and drivers?

I can verify that I think I still have a defective green light... about 25% of the lights either flicker or are dead, which would point to a series-parallel configuration.
It's a big bugger... about 12-14 inches across, with a 14 x 14 grid of 5 mm green LEDs arranged in a circle. I saw a traffic maintenance guy replacing one, so I waited until he dropped the cherry picker down and asked him for the defective one. It has 2 PCBs: one for the display and one for a direct AC-mains-powered (no transformer) single-output constant-current driver. In series, each "string" sees a constant current, and the current multiplies as you add more "strings" in parallel.
This device is marked as having been manufactured in 2005. AC power is 117 VAC @ 12.6 W. This would point to very high efficiency LEDs operating at ~5-10 mA. I didn't want to connect it up and work on it when I first got it, because back then I didn't own a 110 V 1:1 isolation transformer, and now it's a conversation piece...

Bob

Thanks, it sounded to me like you were advocating putting each LED in parallel, not each LED-and-current-limiting-device pair in parallel.

So you're saying I cannot power a 700 mA LED with a 2800 mA driver? What would happen? I was thinking that as long as the driver puts out equal or more, it is fine, the same way you can use a 12 V 1 A AC adapter to power things that only use 100 mA, since they only take what they need. Or how on the RC motors I use, I can use a 30 A speed controller on a motor that only draws 15 A if I want, since the motor only takes what it needs. Are you saying things work completely differently when it comes to LEDs and drivers?

We are saying that there is a difference between a linear load and a non-linear load. Also, a constant-current supply will keep raising the voltage until that current is reached. So running a 700 mA LED from a 2800 mA constant-current supply will fry the LED by putting a lot more current through it than it can stand.

Things work differently with a constant-current supply than with a constant-voltage supply.

carl1864:

retrolefty:
Am I really doing this the wrong way? Do they need to be in series instead?

Yes. By having only one constant-current driver, you have no defined control mechanism for equal current flow to each parallel LED; you're just assuming that each will somehow take its equal share, and while that may be your wish and desire, that doesn't make it so. Kirchhoff defined how current and voltage work in series and parallel circuits, and his laws are worth a review: Kirchhoff's Laws for Current and Voltage

And if so, can anyone explain why they would need to be in series and parallel doesn't work?

Well, according to one of Kirchhoff's rules, in a series circuit the current flow is equal at all points in the circuit. So if three LEDs are wired in series and 700 mA is the circuit current, then of course 700 mA is flowing into and out of each LED, so they all operate at the same current. But since the desired current for your LEDs is 700 mA, your constant-current driver needs to run at 700 mA output only; you can't use one that outputs a constant 2800 mA.

This whole explanation was very helpful, and a lot more things are starting to make sense now. I think I must have been mistakenly assuming that there is a current drop the same way there is a voltage drop, thinking I have to add up the current of each LED in series the same way you add up voltage, but I guess this is wrong. If I understand correctly, there is a voltage drop across each LED but no current drop, so if I have 700 mA LEDs and a 700 mA driver, I can light up as many LEDs as I want in series, as long as the driver is putting out enough voltage?

You got it now.

For example, if I want 100 LEDs at 3.5 V each, I would need a driver that outputs 350 V, but only 700 mA of current?

Almost: you need a driver that can raise or lower its voltage up to at least 350 volts while maintaining a constant current of 700 mA. There may be a time (say, at low temperature) where it actually has to lower the voltage some to maintain the same constant 700 mA of flow. The current driver is really a current regulator, just as a voltage regulator works by maintaining a constant output voltage even with a variable load resistance and/or a variable input voltage to the regulator.
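That regulator behavior can be caricatured in a few lines: a toy proportional control loop nudges the output voltage until a Shockley-modelled 100-LED string draws the 700 mA setpoint. The diode parameters and loop gain are illustrative assumptions, not a real driver design:

```python
import math

# Toy constant-current "driver": a proportional controller adjusts the
# output voltage until the modelled 100-LED series string draws 700 mA.
IS, N_VT, N_LEDS = 1e-15, 0.05, 100    # assumed diode model parameters
SETPOINT = 0.700                        # target current, amps

def string_current(v):
    """Current through a series string of N_LEDS identical model diodes."""
    return IS * math.exp(v / (N_LEDS * N_VT))

v = 100.0                               # initial output voltage guess
for _ in range(500):                    # proportional control loop
    error = SETPOINT - string_current(v)
    v += 2.0 * error                    # gain chosen for stable convergence

print(f"settled at {v:.1f} V for {string_current(v)*1000:.0f} mA")
```

With these assumed parameters the loop settles around 171 V while holding 700 mA; the point is that the driver's output variable is voltage, but its controlled quantity is current.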

So you're saying I cannot power a 700 mA LED with a 2800 mA driver? What would happen?

The constant-current driver will force 2800 mA of current through the LED, but only for a while, as soon the LED will melt or explode open. Remember, an LED cannot by itself control the current flowing through it; it's not like an incandescent lamp that has a fixed resistance which controls the current at a given voltage per Ohm's law. LEDs don't obey Dr. Ohm. Once an LED is forward-biased by a voltage equal to or greater than its Vf spec, it acts like a direct short circuit and will self-destruct unless the current is controlled or limited by something external to the LED.

I was thinking as long as the driver puts out equal or more that it is fine?

No, a constant-current driver puts out only a single value all the time: its rated value of, say, 2800 mA.

Same way you can use a 12 V 1 A AC adapter to power things that only use 100 mA, since they only take what they need.

That applies to constant-voltage sources, where the current flow is determined only by the load resistance per Dr. Ohm, up to the maximum current capacity of the voltage source.

Or how on the RC motors I use, I can use a 30 A speed controller on a motor that only draws 15 A if I want, since the motor only takes what it needs. Are you saying things work completely differently when it comes to LEDs and drivers?

No, I'm saying constant-voltage sources are different than constant-current sources. LEDs are current-operated and current-controlled devices (not voltage-controlled), so they are best driven from a constant-current source. If we must use a constant-voltage source to power an LED, we need to add something else to control/limit the current, which is normally a simple series resistor.
Lefty
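The series-resistor method mentioned above is sized with R = (Vsupply - Vf) / Itarget. A minimal sketch, assuming a hypothetical 5 V supply and a 2 V, 20 mA indicator LED (example values, not the thread's high-power parts):

```python
# Sizing the series current-limiting resistor for a constant-voltage
# source: the resistor drops whatever voltage the LED doesn't, and that
# drop divided by R sets the current.
def series_resistor(v_supply, v_forward, i_target):
    """Resistor (ohms) that limits LED current to i_target amps."""
    return (v_supply - v_forward) / i_target

r = series_resistor(5.0, 2.0, 0.020)   # hypothetical 2 V LED from 5 V
p = 0.020**2 * r                        # power the resistor must dissipate
print(f"{r:.0f} ohms, dissipating {p*1000:.0f} mW")
```

At high-power currents the resistor's dissipation grows quickly, which is one reason this method is only recommended for LEDs run well below their maximum rating.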

Remember, an LED cannot by itself control the current flowing through it; it's not like an incandescent lamp that has a fixed resistance which controls the current at a given voltage per Ohm's law.

That's not true. Most (all?) LED drivers actually control the voltage applied to the LED string, based on a feedback voltage sensed from the current going through the string. So they are actually controlling the voltage.

As for incandescent lamps, they don't have a constant resistance, contrary to the common assumption. They are "stable" in the sense that their resistance has a positive temperature coefficient (vs. negative for LEDs).

carl1864:
Same way you can use a 12v 1A ac adaptor to power things that only use 100ma, since they only take what they need. Or how on RC motors I use, I can use a 30A speed controller on a motor that only draws 15A if I want, since the motor only takes what it needs. Are you saying things work completely differently when it comes to led's and drivers?

A motor draws "what it needs" because it has resistance. When you put a given voltage across a resistance, the number of amps that passes is governed by the laws of physics (a.k.a. Ohm's law).

LEDs also follow this law, but their resistance isn't constant. The resistance of an LED varies with the voltage across it: it goes down as the voltage goes up. That's why they need special drivers.

The resistance of an LED varies with the voltage across it - it goes down as the voltage goes up.

There are two "resistance" concepts for a non-linear device like an LED:

  1. Static resistance: the usual V/I concept. For an LED (and a diode in general), this resistance goes up as the current through it goes down.

  2. Dynamic resistance: the delta V / delta I concept. An LED's dynamic resistance also goes up as the current through it goes down.
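Both notions can be computed from an idealized Shockley model, V = n*Vt*ln(I/Is); the parameter values here are illustrative assumptions:

```python
import math

# Static vs. dynamic resistance of an idealized diode. For the Shockley
# model, static R = V/I = n*Vt*ln(I/Is)/I and dynamic r = dV/dI = n*Vt/I.
# Both rise as the current falls.
IS, N_VT = 1e-12, 0.05    # assumed saturation current and n*Vt

def static_r(i):          # V / I
    return N_VT * math.log(i / IS) / i

def dynamic_r(i):         # dV / dI
    return N_VT / i

for i in (0.001, 0.010, 0.100):
    print(f"{i*1000:5.1f} mA  static {static_r(i):8.1f} ohm  "
          f"dynamic {dynamic_r(i):6.2f} ohm")
```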

But none of that is why we need drivers for LEDs. LEDs exhibit a nasty thermal behavior: the forward voltage drop falls as the die heats up (negative delta V per degree C). So when you apply a constant voltage across an LED, a current goes through it. That current heats up the LED, driving down its Vfwd, which means more current goes through it -> more heat generated, lower Vfwd still, and still higher current.

The "current limiting resistor" many people talk about is really there to provide a negative feedback mechanism in an otherwise positive feedback loop. And LED drivers will adjust the voltage they apply to LEDs based on the current, to avoid that thermal instability.

If LEDs didn't exhibit such thermal behavior (see lateral MOSFETs, which don't), we wouldn't be having this discussion.
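That runaway-versus-stabilized behavior can be sketched with a toy fixed-point simulation. The diode model, temperature coefficient, and thermal resistance below are all illustrative assumptions:

```python
import math

# Toy simulation of the thermal feedback loop: the LED's forward voltage
# falls 2 mV per degC of die heating, and the junction warms 50 degC per
# watt (assumed values). A series resistor turns the positive feedback
# loop into a stable one.
NVT = 0.05        # emission coefficient * thermal voltage, volts (assumed)
VF0 = 3.3         # forward voltage at 20 mA and 25 degC (assumed)
I0 = 0.020        # current at VF0, amps
TEMPCO = 0.002    # Vf drop per degC, volts
RTH = 50.0        # thermal resistance, degC per watt
T_AMB = 25.0

def led_current(v_led, temp):
    vf = VF0 - TEMPCO * (temp - T_AMB)    # Vf falls as the die heats
    return I0 * math.exp((v_led - vf) / NVT)

def electrical_current(v_supply, r, temp):
    """Operating point of supply + series resistor + LED at a fixed temp."""
    if r == 0:
        return led_current(v_supply, temp)
    lo, hi = 0.0, v_supply / r            # LED current falls as i rises: bisect
    for _ in range(60):
        mid = (lo + hi) / 2
        if led_current(v_supply - r * mid, temp) > mid:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def settle(v_supply, r, steps=100):
    """Iterate the thermal loop; None means runaway (LED destroyed)."""
    temp, i = T_AMB, 0.0
    for _ in range(steps):
        i = electrical_current(v_supply, r, temp)
        if i > 10.0:
            return None                   # model "melted": thermal runaway
        temp = T_AMB + RTH * (v_supply - r * i) * i
    return i

print("direct 3.4 V drive:", settle(3.4, 0.0))    # runs away
print("3.9 V + 25 ohm    :", settle(3.9, 25.0))   # settles near 24 mA
```

Under these assumptions the bare-LED case diverges within a few iterations, while the resistor version settles at a modest current: the resistor's voltage drop grows with current, opposing the Vf droop.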

dhenry:

Remember, an LED cannot by itself control the current flowing through it; it's not like an incandescent lamp that has a fixed resistance which controls the current at a given voltage per Ohm's law.

That's not true. Most (all?) LED drivers actually control the voltage applied to the LED string, based on a feedback voltage sensed from the current going through the string. So they are actually controlling the voltage.

Well, I disagree with your statement. The driver manipulates its output voltage in an attempt to control the current. But perhaps I'm just disagreeing with your use of the term 'control' rather than with your understanding of the feedback control loop being used.
I would equate it to the terms used in classic PID control theory, where the 'setpoint' is the desired 700 mA current value, the measured current is the 'process variable', and the 'output' is the voltage adjusted to drive the error to zero until the process variable equals the setpoint. So in classic control theory it would be called a 'current controller', never a voltage controller.
Lefty

dhenry:
That's not true. Most (all?) LED drivers actually control the voltage applied to the LED string, based on a feedback voltage sensed from the current going through the string. So they are actually controlling the voltage.

Nope. The voltage is whatever comes out of the power supply; the driver has no control over that.

It achieves its function by varying its own resistance so that the resistance of the driver plus the resistance of the string of LEDs allows the target current to pass. It's therefore controlling resistance, not voltage.

This is all word games though...and pointless.

It's all really the same; you can't separate voltage from current, one needs the other.
However, you'd call the controller by what it was designed to focus on:
a voltage regulator attempts to keep the voltage stable, while a current regulator keeps the current stable.
And the motor example isn't exactly the best comparison, because there is a lot more than just "resistance" going on that affects its actual impedance, and hence the current it draws, which changes dynamically with the conditions applied to the motor. The only example that really works is a plain resistor, and even that has a temperature coefficient that can be applied.