Running high-power LEDs without drivers

Remember, an LED cannot by itself control the current flowing through it; it's not like an incandescent lamp, which has a fixed resistance that controls the current at a given voltage per Ohm's law.

That's not true. Most (all?) LED drivers actually control the voltage applied to the LED string based on a feedback voltage derived from the current going through the string. So they are actually controlling the voltage.

As for incandescent lamps, they don't have a constant resistance, contrary to common belief. They are "stable" in the sense that their resistance has a positive temperature coefficient (vs. negative for LEDs).
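For a feel of how big that positive tempco is, here is a rough back-of-envelope calculation; the bulb rating and the cold/hot ratio are assumed ballpark figures, not measurements:

```cpp
// Hot resistance follows from the power rating; a cold tungsten filament is
// commonly quoted as roughly a tenth of its hot resistance. Assumed numbers.
#include <cstdio>

int main() {
    const double mains_v = 230.0;                        // assumed supply voltage
    const double rated_w = 60.0;                         // assumed bulb rating
    const double r_hot   = mains_v * mains_v / rated_w;  // R = V^2 / P when hot
    const double r_cold  = r_hot / 10.0;                 // ballpark cold figure

    std::printf("hot filament:  %6.0f ohm -> running current %.2f A\n",
                r_hot, mains_v / r_hot);
    std::printf("cold filament: %6.0f ohm -> inrush current  %.2f A\n",
                r_cold, mains_v / r_cold);
    return 0;
}
```

The rising resistance throttles the current as the filament heats: a self-stabilizing load, which is exactly what an LED is not.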

carl1864: Same way you can use a 12V 1A AC adaptor to power things that only use 100mA, since they only take what they need. Or how on the RC motors I use, I can use a 30A speed controller on a motor that only draws 15A if I want, since the motor only takes what it needs. Are you saying things work completely differently when it comes to LEDs and drivers?

A motor draws "what it needs" because it has resistance. When you put a given voltage across a resistor, the number of amps that passes is governed by the laws of physics (aka Ohm's law).
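A minimal illustration of that point, with arbitrary numbers:

```cpp
// Ohm's law for a fixed resistance: the load "takes what it needs" because
// the current is pinned by I = V / R. Values here are arbitrary.
#include <cstdio>

int main() {
    const double r = 8.0;  // assumed fixed load resistance in ohms
    for (double v = 3.0; v <= 12.0; v += 3.0)
        std::printf("V = %4.1f V -> I = %.2f A\n", v, v / r);
    return 0;
}
```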

LEDs also follow this law but their resistance isn't constant. The resistance of an LED varies with the voltage across it - it goes down as the voltage goes up. That's why they need special drivers.

The resistance of an LED varies with the voltage across it - it goes down as the voltage goes up.

There are two "resistance" concepts for a non-linear device like an LED:

1) Resistance: that is the plain V/I concept. For an LED (and a diode in general), this resistance goes up as the current through it goes down.

2) Dynamic resistance: that's the delta V / delta I concept. An LED's dynamic resistance also goes up as the current through it goes down. (Both are illustrated in the sketch below.)
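A small sketch of both concepts, using an idealized Shockley diode model; the saturation current and ideality factor are assumptions picked so the forward drop lands near a white LED's ~3.3 V:

```cpp
// Static (V/I) and dynamic (dV/dI) resistance of an idealized diode,
// I = Is * exp(V / nVt). Is and nVt are assumed illustration values.
#include <cmath>
#include <cstdio>

int main() {
    const double Is  = 1e-30;          // assumed saturation current (A)
    const double nVt = 1.9 * 0.02585;  // assumed ideality factor * thermal voltage

    for (double i : {0.001, 0.01, 0.1, 1.0}) {
        double v         = nVt * std::log(i / Is);  // invert the diode equation
        double r_static  = v / i;                   // concept 1: V / I
        double r_dynamic = nVt / i;                 // concept 2: dV / dI
        std::printf("I = %5.3f A  V = %.2f V  V/I = %8.2f ohm  dV/dI = %7.3f ohm\n",
                    i, v, r_static, r_dynamic);
    }
    return 0;
}
```

Both numbers climb as the current falls, as described above.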

But none of that is why we need drivers for LEDs. LEDs exhibit a nasty thermal behavior: their forward voltage drops as they heat up (negative delta V / delta T). So when you apply a constant voltage across an LED, a current flows through it. That current heats up the LED, driving down its Vfwd, which lets more current through -> more heat generated, still lower Vfwd, and still higher current.
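Here is a toy simulation of that runaway loop under a constant applied voltage. The diode parameters, tempco, and thermal resistance are all assumed round numbers; real LEDs differ, but the direction of the spiral is the point:

```cpp
// Toy thermal-runaway model: a fixed voltage is applied, the junction heats,
// the I-V curve shifts left (~ -2 mV/degC), and the current ratchets up.
// All component values are assumptions for illustration only.
#include <cmath>
#include <cstdio>

int main() {
    const double Is = 1e-30, nVt = 1.9 * 0.02585; // assumed diode parameters
    const double tempco = 0.002;  // volts of Vf lost per degC (typical order)
    const double rth    = 50.0;   // degC per watt, junction to ambient (assumed)
    const double v_app  = 3.3;    // constant applied voltage (assumed)

    double t = 25.0;              // start at ambient
    for (int step = 0; step < 8; ++step) {
        // Curve shifted left by tempco*(t-25): same V yields more current when hot.
        double i = Is * std::exp((v_app + tempco * (t - 25.0)) / nVt);
        std::printf("step %d: I = %6.3f A  Tj = %6.1f C\n", step, i, t);
        if (i > 2.0) {
            std::printf("...thermal runaway: the LED cooks itself\n");
            break;
        }
        t = 25.0 + rth * v_app * i;  // new junction temp from dissipation
    }
    return 0;
}
```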

The "current limiting resistor" many peopke talk about is there really to provide a negative feedback mechanism in an otherwise positive feedback loop. And led drivers will adjust the voltage they apply to leds based on the current to avoid that thermal instability.

If LEDs didn't exhibit this thermal behavior (lateral MOSFETs, for example, don't), we wouldn't be having this discussion.

dhenry:

Remember, an LED cannot by itself control the current flowing through it; it's not like an incandescent lamp, which has a fixed resistance that controls the current at a given voltage per Ohm's law.

That's not true. Most (all?) LED drivers actually control the voltage applied to the LED string based on a feedback voltage derived from the current going through the string. So they are actually controlling the voltage.

Well, I disagree with your statement. The driver manipulates its output voltage in an attempt to control the current. But perhaps I'm just disagreeing with your use of the term 'control' rather than your understanding of the feedback control loop being used.
I would equate it to the terms used in classic PID control theory, where the 'setpoint' is the desired 700mA current value, the current is the measured 'process variable', and the 'output' is the voltage, adjusted until the process variable equals the setpoint (a toy version is sketched below). So in classic control theory it would be called a 'current controller', never a voltage controller.
Lefty
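A toy rendition of the loop Lefty describes, run against an assumed exponential diode model; the gain, starting voltage, and diode parameters are all made up, and a real driver is an analog circuit rather than a step-by-step loop:

```cpp
// The setpoint is 700 mA, the process variable is the sensed LED current, and
// the controller's output is the applied voltage. Pure integral action shown;
// nothing here reflects any real driver's internals.
#include <cmath>
#include <cstdio>

// Plant: current drawn by the LED at a given voltage (assumed Shockley model).
double led_current(double v) {
    const double Is = 1e-30, nVt = 1.9 * 0.02585;
    return Is * std::exp(v / nVt);
}

int main() {
    const double setpoint = 0.700;  // desired current (A)
    const double gain = 0.05;       // made-up integral gain (V per A of error)
    double v_out = 3.0;             // controller output: the applied voltage

    for (int k = 0; k < 20; ++k) {
        double pv    = led_current(v_out);  // measured process variable
        double error = setpoint - pv;       // error term
        v_out += gain * error;              // nudge the voltage toward zero error
        std::printf("k=%2d  V=%.4f V  I=%.4f A\n", k, v_out, pv);
    }
    return 0;
}
```

The quantity being regulated is the current; the voltage is merely the knob the loop turns, which is Lefty's point.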

dhenry:
That's not true. Most (all?) LED drivers actually control the voltage applied to the LED string based on a feedback voltage derived from the current going through the string. So they are actually controlling the voltage.

Nope. The voltage is whatever comes out of the power supply. The driver has no control over that.

It achieves its function by varying its own resistance so that the resistance of the driver + the resistance of the string of LEDs allows a target current to pass. It's therefore controlling resistance, not voltage.

This is all word games though…and pointless.

It's all really the same; you can't separate voltage from current, one needs the other. However, since you'd name the controller after what it was designed to focus on, a voltage regulator attempts to keep the voltage stable while a current regulator keeps the current stable. And the motor example isn't the best comparison, because there is a lot more than just "resistance" going on that affects its actual impedance, and hence the current it draws, which changes dynamically with the conditions applied to the motor. The only example that really works is a plain resistor, and even that has a temperature coefficient.

Nope. The voltage is whatever comes out of the power supply. The driver has no control over that.

It achieves its function by varying its own resistance so that the resistance of the driver + the resistance of the string of LEDs allows a target current to pass. It's therefore controlling resistance, not voltage.

The voltage out of the power supply has no relevance here. What a (linear) LED driver does is change its resistance so that the voltage across the LED string generates the right current level.
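A back-of-envelope view of what that means in numbers; the string voltage and target current are assumptions:

```cpp
// What a linear driver's pass element effectively does: present whatever
// resistance makes the target current flow, absorbing the difference between
// the supply and the string's forward voltage. Numbers are illustrative.
#include <cstdio>

int main() {
    const double i_target  = 0.700;  // assumed target current (A)
    const double vf_string = 9.9;    // assumed: three white LEDs at ~3.3 V each

    for (double v_sup : {12.0, 15.0, 24.0}) {
        double r_driver = (v_sup - vf_string) / i_target;  // pass element's V/I
        double p_wasted = (v_sup - vf_string) * i_target;  // heat in the driver
        std::printf("Vsup = %4.1f V -> driver looks like %5.2f ohm, burns %4.2f W\n",
                    v_sup, r_driver, p_wasted);
    }
    return 0;
}
```

Note how the excess supply voltage all ends up as heat in the driver, which is the linear approach's weakness.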

Switching-mode drivers work similarly, except that they generate a voltage directly across the LED string by stepping the supply voltage up or down.
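The idealized switching-mode equivalent of the example above: instead of burning the headroom as heat, an ideal buck scales the voltage down, trading voltage for current. This is ideal-converter math only; a real driver regulates on sensed current rather than a precomputed duty cycle:

```cpp
// Idealized buck-style driver: step the supply down so that (on average) just
// the string's forward voltage appears across the LEDs. Assumed numbers.
#include <cstdio>

int main() {
    const double vf_string = 9.9;    // assumed string forward voltage
    const double i_led     = 0.700;  // assumed LED current

    for (double v_in : {12.0, 15.0, 24.0}) {
        double duty = vf_string / v_in;          // ideal buck: Vout = D * Vin
        double i_in = i_led * vf_string / v_in;  // lossless: Pin = Pout
        std::printf("Vin = %4.1f V -> duty = %4.1f %%, input current = %.2f A\n",
                    v_in, duty * 100.0, i_in);
    }
    return 0;
}
```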

To my knowledge, there are no LED drivers, linear or switching mode, that control the current directly (i.e., shunt style), for obvious reasons.

A single 4.2V (fully charged) lithium battery will supply max brightness (in the flashlight world they say "direct drive"; most would say a short, I guess lol) …

BUT, that 4.2V soon becomes 4.1…3.8 in less than a few seconds, and your current drops as a consequence… the best way to keep that current up is to build a driver. Sadly, most people in electronics think of "super bright" LEDs as anything under 30mA lol… I'm dealing with 5W and 10W RGB and 15W white LEDs, which all run fine at 4.2V without a resistor (except for the 12/24VDC LEDs).

But you need a resistor if you want the Arduino to power it (unless you want to damage the pin/processor). To drive a big LED you need a suitable power supply, a power transistor or a MOSFET, and a big heatsink. You may also prefer to use a PWM pin and analogWrite with, say, a power transistor, and gradually dim/brighten it (as I do with a 12W 4.2V LED and a TIP31; for that I run it at around 0.9A or a full 1.3A, but I use a 300 ohm resistor on the base to protect the Arduino's pin; I could go a lot higher with another method).
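A minimal Arduino-style sketch of that kind of setup, assuming a PWM pin driving the transistor's base through the resistor; the pin number and timing are arbitrary:

```cpp
// Hedged sketch of the setup described above: a PWM pin through a ~300 ohm
// base resistor into a TIP31 (or a logic-level MOSFET gate), with the LED and
// its own supply on the collector side. Pin choice is an assumption.
const int LED_PWM_PIN = 9;  // any PWM-capable pin

void setup() {
  pinMode(LED_PWM_PIN, OUTPUT);
}

void loop() {
  // Ramp brightness up and back down; the transistor switches the LED current,
  // and the base resistor limits what the Arduino pin has to source.
  for (int duty = 0; duty <= 255; duty++) {
    analogWrite(LED_PWM_PIN, duty);
    delay(10);
  }
  for (int duty = 255; duty >= 0; duty--) {
    analogWrite(LED_PWM_PIN, duty);
    delay(10);
  }
}
```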

The power supply you plug into the wall does not behave like a battery; chances are, give the LED 4.7V and you will see smoke. A battery can only supply 4.2V, which is why manufacturer specifications are important.

A single 4.2V (fully charged) lithium battery will supply max brightness (in the flashlight world they say "direct drive"; most would say a short, I guess lol)

This is simply because the output impedance of the battery limits the current drawn. This would not necessarily apply to any voltage source of 4.2V.

I think this is what the OP is seeing when he says that the figures do not add up. By not taking into account the supply's output impedance, you get a false idea of the current you can draw. It is not that simple either: the output impedance changes with the current draw, that is, it is not a linear impedance, which is why the graph of voltage against current is not a straight line.
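A crude illustration of that output-impedance effect, treating the internal resistance as a constant even though, as noted above, it really isn't; every number here is an assumption:

```cpp
// Toy direct-drive estimate: the battery's internal resistance is the only
// thing limiting the current. Solved by fixed-point iteration against the
// same assumed diode model used earlier in the thread.
#include <cmath>
#include <cstdio>

int main() {
    const double Is = 1e-30, nVt = 1.9 * 0.02585; // assumed LED parameters
    const double v_oc  = 4.2;                     // fully charged Li-ion cell
    const double r_int = 0.15;                    // assumed internal resistance

    double i = 0.5;                               // initial guess
    for (int k = 0; k < 10; ++k) {
        double vf = nVt * std::log(i / Is);       // LED forward voltage at i
        i = (v_oc - vf) / r_int;                  // battery: Voc = I*Rint + Vf
    }
    std::printf("direct drive: about %.1f A at Vf = %.2f V\n",
                i, nVt * std::log(i / Is));
    return 0;
}
```

With these assumed values the cell dumps several amps into the LED, which is why "direct drive" works at all, and also why it sags so quickly.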

The voltage out of the power supply has no relevance here.

Apart from setting the maximum output voltage of the constant-current driver (minus a bit for the drop across the driver's internal components).

Some of us remember the constant-current output of mechanical teleprinters. They output a 20mA constant current into the next teleprinter. If this was a few feet away, the voltage needed to drive that current was small. But they could be connected a few miles apart, which required a much bigger voltage. The ultimate case was when the loop was disconnected: it never stood a chance of driving 20mA, so the output could rise to 180V, enough to give you a bit of a shock, but connected up it had a very low voltage on the terminals.