Myth - ALL LEDs require resistors, or...

the LED will die!

I'm fed up with people constantly telling others they must use a resistor with ALL LEDs. This is not true; once we start dealing with high-powered
LEDs, what's the point of the wasteful resistor?!

Take, for example, an LED that handles 1 A at, say, 4 V (your power supply is rated 2 A at 5 V). Now let's use a realistic
value, say 800 mA.

5 / 0.800 (yeah yeah, I'm not including the forward voltage drop for a reason; this LED does not exist, but please do add it to the calculation if you must - a 3 W resistor?)
= 6.25 Ω resistor

A 4 W resistor.
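
For what it's worth, here is that arithmetic spelled out as a small C++ sketch. The 4 V forward drop and 800 mA figure are just the hypothetical numbers from the example above, not data for any real part:

    #include <cstdio>

    int main() {
        const double v_supply  = 5.0;    // supply voltage from the example
        const double v_forward = 4.0;    // hypothetical forward drop of the imaginary LED
        const double i_led     = 0.800;  // target current, 800 mA

        // Naive sum, ignoring the forward drop (as in the rant above):
        double r_naive = v_supply / i_led;               // 6.25 ohm
        double p_naive = v_supply * i_led;               // 4 W if the resistor dropped the full 5 V

        // Sum including the forward drop:
        double r_real = (v_supply - v_forward) / i_led;  // 1.25 ohm
        double p_real = (v_supply - v_forward) * i_led;  // 0.8 W dissipated in the resistor

        printf("ignoring Vf:  R = %.2f ohm, P = %.2f W\n", r_naive, p_naive);
        printf("including Vf: R = %.2f ohm, P = %.2f W\n", r_real, p_real);
        return 0;
    }

Either way, a resistor sized for an 800 mA LED is dissipating real power, which is the "wasteful" part being complained about.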

Oh come on now, why?! All one has to do is look at the datasheet; you'll see a nice voltage/current curve, and you'll see there's plenty of scope, unlike with a regular 5 mm 20 mA LED...
Provided you set a precise voltage or limit the current (but not with a 4 W resistor),

the LED will be perfectly fine...

(Waits for the... "at the moment" remark.) It really has to stop. If you use 9 V on a 5 V appliance, would you expect magic smoke? Probably. So why would you expect every man and his dog
to overdrive an LED and supply a higher voltage than it can handle?! Would you supply 9 V directly to an ATmega? So why can't you trust a person supplying exactly 2.8 V or 3.2 V? Why obsess?!

  • Please go look up "direct drive": running high-powered LEDs directly from an unregulated 4.2 V lithium battery. They're all FINE...

So why why why why WHY must people KEEP telling others to use resistors with high-powered LEDs?! What next - oh, it's a 30 W LED, make sure to use a 15 W resistor to drive it at half power!

Calm down.

If your voltage is safely below the inflection point in the voltage/current curve of the diode (including all possible manufacturing deviations), then you don't need a resistor, no.

OTOH, if you have several LEDs and they all need the same brightness, then you need to control the current.

Volts won't do, with or without resistors.

Still, the current through the LED has to be limited, by whatever means, guy.

Last week you were posting about using an LM317 as a current regulator.
If the programming resistor is right, 125 Ω for 10 mA or 12 Ω for 100 mA, then you can put an LED without a resistor on the output, but it's not as though you're somehow thereby "unfettered", liberated from the tyranny of resistors as you imagine.
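
For reference, the LM317 trick boils down to one formula: the regulator holds about 1.25 V between its OUT and ADJ pins, so the programming resistor sets the current as I = 1.25 V / R. A rough sketch of that sum (the 1.25 V reference is nominal and varies a little from part to part):

    #include <cstdio>

    int main() {
        const double v_ref = 1.25;                  // nominal LM317 OUT-to-ADJ reference voltage
        const double currents[] = {0.010, 0.100};   // 10 mA and 100 mA, as above

        for (double i : currents) {
            double r_set = v_ref / i;               // programming resistor (125 ohm, 12.5 ohm)
            double p_set = v_ref * i;               // power dissipated in that resistor
            printf("I = %3.0f mA -> R = %6.1f ohm, %.3f W in R\n", i * 1000.0, r_set, p_set);
        }
        return 0;
    }

Note the irony: the "resistor-free" LM317 approach still uses a resistor; it has just moved into the current regulator.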

cjdelphi:
why would you expect every man and his dog
to overdrive an LED and supply a higher voltage than it can handle?!

Obviously you haven't spent much time around here.

cjdelphi:
so why can't you trust a person supplying exactly 2.8 V or 3.2 V ...

The world doesn't work that way.

cjdelphi:
I'm fed up with people constantly telling others they must use a resistor with ALL LEDs. This is not true; once we start dealing with high-powered
LEDs, what's the point of the wasteful resistor?!

I'm fed up with people not understanding the difference between "giving just enough information" and "giving too much."

cjdelphi:
So why why why why WHY must people KEEP telling others to use resistors with high-powered LEDs?!

Whatever you claim, I rarely see people recommending a plain current-limiting resistor in this case. You still need a current limiter, e.g. a constant-current supply, which is where the distinction from using just a resistor is usually made.

A diode (any type of diode) is a "current-driven device". We do not design a diode's circuit based on voltages, but on currents. So when you have a diode, you always talk about currents (not voltages).

The forward or reverse voltages are of secondary interest; they are good to know when considering the overall design the diode is used in.

cjdelphi:
so why can't you trust a person supplying exactly 2.8 V or 3.2 V? Why obsess?!

Because everybody around here has easy access to 5 V and almost nobody has exactly 2.8 V or 3.2 V.

There are at least two problems with driving raw LEDs from a constant-voltage supply:

  1. For a given current, there is a natural variation in forward voltage between different examples of the same LED. Therefore, if you drive an LED with a constant voltage that you haven't adjusted for that individual LED, the current that flows (and therefore the brightness) is rather unpredictable (see the rough sketch after this list).

  2. As the temperature increases, two things happen. The first is that the forward voltage of the P-N junction decreases, especially for red LEDs. The second is that the internal series resistance of the LED increases. These effects oppose each other, but the balance between them depends on the model of LED. If you are lucky, they will cancel out, or the increase in internal resistance will more than compensate for the drop in forward voltage, so that the current decreases with temperature. If you are unlucky, you will get thermal runaway and exceed the current rating of the LED.
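
To give a feel for how touchy a raw diode is, here is a rough numerical sketch using the ideal exponential diode model and ignoring the internal series resistance mentioned in point 2. The ideality factor and thermal voltage below are illustrative guesses, not data for any particular LED:

    #include <cstdio>
    #include <cmath>

    int main() {
        const double v_thermal  = 0.025;  // ~25 mV thermal voltage at room temperature
        const double n_ideality = 2.0;    // illustrative ideality factor; varies by LED

        // Relative current change when the voltage across the junction moves a little.
        const double deltas_mV[] = {10.0, 50.0, 100.0};
        for (double dv_mV : deltas_mV) {
            double ratio = exp((dv_mV / 1000.0) / (n_ideality * v_thermal));
            printf("%+4.0f mV on the junction -> roughly %.1fx the current\n", dv_mV, ratio);
        }
        return 0;
    }

A few tens of millivolts of mismatch between the supply and the LED's actual forward voltage can swing the current by a large factor, which is exactly why points 1 and 2 matter.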

Just because under certain carefully controlled situations you can get away without using a series resistor or constant-current circuit with an LED doesn't mean that it isn't a very bad idea in general.

In passing, I note that there are two kinds of person who post to this forum:

  1. Those who will design a circuit such that it is guaranteed to work, for all examples of the suggested components, and over reasonable variations in temperature, supply voltage etc.

  2. Those who take the attitude "I tried this and it worked, therefore it is OK". Maybe it works for you, but that doesn't mean that it will work for everyone else, or it will go on working, or it will still work when it is much hotter or much colder.

Guess which camp I belong to. I learned the hard way. Many years ago, I was asked to complete the design of an early British microcomputer (even though I was employed by the company as a software engineer). One of the problems I had to fix was that the real-time clock was getting corrupted at power down. So I added a circuit that prevented its /CS signal being activated when Vcc was below about 4.5V. The design was completed in summer, and manufacturing commenced. Unfortunately, when winter came, we started getting reports of non-functioning RTCs in the field, because I had omitted to allow for temperature variation in Vbe voltage of the transistor.

dc42:
I learned the hard way. Many years ago, I was asked to complete the design of an early British microcomputer (even though I was employed by the company as a software engineer). One of the problems I had to fix was that the real-time clock was getting corrupted at power down. So I added a circuit that prevented its /CS signal being activated when Vcc was below about 4.5V. The design was completed in summer, and manufacturing commenced. Unfortunately, when winter came, we started getting reports of non-functioning RTCs in the field, because I had omitted to allow for temperature variation in Vbe voltage of the transistor.

It reminds me of something that happened to me... many years ago too :grin:
I had to install a prototype in a steel plant, and it had to be done before the next Monday. I finished on Sunday, tried everything... well done, man, it works!
But on Monday, the machine went crazy: motors didn't stop when they were supposed to stop, etc... the rotary encoders gave weird readings!! It took me two hours to understand: there was interference from other machines. OK, I had all the grounds connected, but some of the cables were not big enough! I replaced them with large flat copper cables, all connected at the same point, and it was fixed 8)

So why why why why WHY must people KEEP telling others to use resistors with high-powered LEDs?! What next - oh, it's a 30 W LED, make sure to use a 15 W resistor to drive it at half power!

Absolutely no one of any sense advocates using a resistor with a power LED.
Absolutely no one of any sense advocates using no current control mechanism with a power LED.

running high-powered LEDs directly from an unregulated 4.2 V lithium battery, they're all FINE.

Yep, do you know why? Ring back when you have the answer, or when you have calmed down.

I don't think anyone has said that ALL LEDs are the same.

Limiting current flow is a fundamental part of electronic design. I am very clear that when people are talking about LEDs... the kind we get in "starter kits" is not the same as the kind we "light up rooms" with. The 20 mA indicator LED and the "blind-yo-mama" 8500-lumen 1 A LED are not created equal.

Playing with 1.2 V, 20 mA indicator LEDs... add a resistor and keep them happy. Skip the resistor? Learn how some parts can work for a while and then die young.

Playing with a Luxeon "wallet drainer"? Well, hey, it's your money... design a solution you can afford. They are designed for higher voltages and higher currents... Want to leave out resistors? Have fun... the design limits when using power LEDs are well known, but if the perfect solution goes wonky (the power supply acts up), then Murphy's law is surely going to avoid you, right? (sarcasm)

To me... if I spend that much money on a power LED (more than $3.00, let's say), with my luck something is going to go wrong... and with that extra current-limit protection removed I'd feel wrong. I'd toss a low-value resistor in just to feel safer. But hey, that's me. Not you. And the next guy should decide that for themselves as well.

luxeon-altilon-1.jpg

led1.jpg

cjdelphi:
I'm fed up with people constantly telling others they must use a resistor with ALL LEDs. This is not true; ...

We don't even say that for low-power LEDs. Take a look at this:

Circuit:

No current-limiting resistors. There is a "current set" resistor, but nothing in series with the LEDs.

But this circuit is designed for that. The advice normally handed out to newbies who "got my Arduino today", are "trying to make an LED blink", and "know nothing about electronics" - put a resistor in series with the LED - is reasonable.
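
For completeness, that standard newbie advice boils down to one sum. A rough sketch, assuming a 5 V Arduino pin and a typical ~2 V red indicator LED at 20 mA (swap in the forward voltage from your own LED's datasheet):

    #include <cstdio>

    int main() {
        const double v_supply  = 5.0;    // typical Arduino pin voltage (assumption)
        const double v_forward = 2.0;    // rough forward drop of a red indicator LED (assumption)
        const double i_led     = 0.020;  // 20 mA

        double r = (v_supply - v_forward) / i_led;   // 150 ohm for these numbers
        printf("R = %.0f ohm (round up to the next standard value you have on hand)\n", r);
        return 0;
    }

Round up rather than down; a slightly dimmer LED beats a slightly dead one.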

All MOSFETs should have a gate resistor :wink:
All LEDs should have a series resistor :wink:
All grounds should be connected together :wink:

On we go....

Yeah, we don't need that silly common ground among multiple power sources... I heard electrons come with wings now... yeah that's the ticket...

pwillard:
Yeah, we don't need that silly common ground among multiple power sources... I heard electrons come with wings now... yeah that's the ticket...

time for another tro... euh... thread, sorry :slight_smile:

Calm down?

I was the definition of calm when I posted it... my my

So then, not all LEDs need resistors (key word: resistor). We all agree that's true, then?

I was testing some LEDs with a variable voltage supply. I had 12 LEDs in series and turned the supply up until they all came on.
Looked great, so I turned off the supply. I then came back to see how much higher I could go; I turned on the supply and - a big bang. I now have a scar from
one of the LEDs. I must have hit the pot on the supply.

I gave up on the voltage idea for LEDs and went with a current-controlled supply.

cjdelphi:
So then, not all LEDs need resistors (key word: resistor). We all agree that's true, then?

What is your point (key word: point)?

The current through any LED must be limited, by whichever means appropriate.

It's like what James posted earlier:

Just trying to twist wise words into a trap for fools for lack of anything better to do?

It's a real tempest in a straw man's teapot, this.

I think this forum can claim to be the one most often visited by LED trolls. :smiley:

Lefty

If only diodes behaved like resistors, there wouldn't need to be such a discussion. Alas. Transistors, you're up next!