OK then... 5 mm white LED, 3.51 V

I've decided that I'm going to find out what all the fuss is about with voltage-driven LEDs alone.

I'm using a low-wattage 5 mm LED (since it can easily be monitored), and I have an oscilloscope on hand too (only good to 1 MHz, but that's not an issue here as it's a clean waveform from a cheap voltage regulator).

I started off at 1 V and slowly crept my way up while watching the current draw; at exactly 20 mA I stopped adjusting the voltage...

Now, what am I supposed to do to see it burn out and die? Please do tell. I have a hot air gun (which would melt it), and I'll even use the kitchen thermometer while I gently heat it from the 19°C it's at now up to around 80°C. Provided I don't melt it, what do you claim will happen?

The voltage will rise enough to kill it, will it?

You don't own all the tools to perform this test. Simply put, you're using your eyes as a "YEP, THAT'S STILL BRIGHT" measurement device, based on a "YEP, IT STILL WORKS" test platform.

Please continue, but I am planning a realistic test that involves more than an analog meter, an old Fluke knock-off, and your eyeball (the best low-pass filter ever), in a serious lab where we deal mostly with LED lighting, using tools specifically made for the trade to measure heat inside the part, voltage, current, luminosity, and color temperature to the 7th decimal point.

Half of which will be sampled at 1 Hz over the weekend (the upcoming weekend), before I have something serious to work on, like a car headlight using a single-package LED array... at 16 volts... at 85°C ambient... in 85% humidity, with less than 1% degradation over 1000 hours.

You have pushed me into a totally unfair fight: you have a scope available, while I have about 45 thousand dollars' worth of tools and software at my disposal just to measure LEDs and nothing else, plus two years' experience battling OEM customers over their totally random claims. I do this shi* for a living...

game on.

Funny, I ask how I'm going to see the LED destroyed to see what you mean...

If it's so easy to destroy an LED... you just can't tell me how. A knock-off Fluke or anything else has WHAT to do with an LED blowing...

Unless you tell me what to do to prove what you say... all you can do is insult... wow.

Honestly, is this a big enough deal to fight over like schoolboys?

Wikipedia has a good summary of the issue:

The current/voltage characteristic of an LED is similar to other diodes, in that the current is dependent exponentially on the voltage (see Shockley diode equation). This means that a small change in voltage can cause a large change in current. If the maximum voltage rating is exceeded by a small amount, the current rating may be exceeded by a large amount, potentially damaging or destroying the LED. The typical solution is to use constant-current power supplies, or driving the LED at a voltage much below the maximum rating. Since most common power sources (batteries, mains) are constant-voltage sources, most LED fixtures must include a power converter, at least a current-limiting resistor. However, the high resistance of 3 V coin cells combined with the high differential resistance of nitride-based LEDs makes it possible to power such an LED from such a coin cell without an external resistor.[105]
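To put rough numbers on that "small change in voltage can cause a large change in current" point, here is a minimal sketch of the Shockley equation. The saturation current and ideality factor are illustrative assumptions for a generic 5 mm LED, not values for any real part.

```python
import math

# Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1)
# Is (saturation current) and n (ideality factor) are illustrative
# assumptions for a generic 5 mm LED, not datasheet values.
Is = 1e-18       # A, assumed saturation current
n = 2.0          # assumed ideality factor
Vt = 0.02585     # V, thermal voltage kT/q at about 300 K

def led_current(v):
    """Ideal diode current at forward voltage v (no series resistance)."""
    return Is * (math.exp(v / (n * Vt)) - 1.0)

# Find the voltage that gives roughly 20 mA, then nudge it by +/- 100 mV.
v20 = n * Vt * math.log(0.020 / Is)
for dv in (-0.1, 0.0, 0.1):
    v = v20 + dv
    print(f"V = {v:.3f} V  ->  I = {led_current(v) * 1000:7.1f} mA")
```

With those assumed numbers, a 100 mV bump multiplies the current roughly sevenfold; a real part is softened somewhat by its internal series resistance, but the steepness is the point.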

I've had this LED running all day with no form of current limiting...

I'll bet it lasts just as long as an LED with a 300 ohm resistor. I can't understand why you'd need a resistor: provided the correct voltage is being output, the LED will not draw any more current as long as the voltage is static...

Keep that in mind, and one could argue that current IS controlled via voltage alone. But because of fluctuations I would not do it unless the supply is 100 percent free of ripple, e.g. a zener diode (unless a pass transistor was used).

Anyway, this LED is still drawing 20 mA with no resistor and it works fine... now I want to know how this LED will draw more current without touching the voltage.

The current is limited by something, obviously, and once you punch over that precarious knee, you're screwed.
Get a clue; you've been stumping for this cause of yours long enough.
How many threads have you started on this rot? [Several.]
Good grief.
There's cause and effect, and there's jumping to conclusions based on empiricism without regard to context or good sense.

... or ...

Hey, you're rejecting "science", man; good on you for sticking it to all those jackweed tool resistor ayatollah dopes!

cjdelphi:
Anyway, this LED is still drawing 20 mA with no resistor and it works fine... now I want to know how this LED will draw more current without touching the voltage.

It won't! As long as you don't increase the voltage. A resistor is not needed if you make sure that the power supply of each LED stays under its maximum! But if it goes a little over, the current will increase, as pointed out in JohnHoward's post!
Since it is easier to maintain a current between 10 and 30 mA using a resistor than to make a power source of exactly 2 V, a resistor is a good way to drive and protect an LED.

BTW, your power source contains... resistors :wink:
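To illustrate why the resistor route is the easier one, here is a minimal sketch of the usual sizing rule; the 5 V supply and 2.0 V nominal forward voltage are assumed example values.

```python
# Sizing a series resistor for a target LED current.
# The supply voltage and nominal forward voltage are assumed example values.
v_supply = 5.0      # V
vf_nominal = 2.0    # V, assumed forward voltage of a red LED near 20 mA
i_target = 0.020    # A

r = (v_supply - vf_nominal) / i_target
print(f"series resistor: {r:.0f} ohm")          # -> 150 ohm

# If Vf drifts by +/- 100 mV, the current only moves by about dVf / R:
for dvf in (-0.1, 0.1):
    i = (v_supply - (vf_nominal + dvf)) / r
    print(f"Vf = {vf_nominal + dvf:.2f} V  ->  I = {i * 1000:.2f} mA")
```

A 100 mV drift in forward voltage then only moves the current by about 0.7 mA, instead of landing directly across the exponential junction.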

cjdelphi:
Funny, I ask how I'm going to see the LED destroyed to see what you mean...

If it's so easy to destroy an LED... you just can't tell me how. A knock-off Fluke or anything else has WHAT to do with an LED blowing...

Unless you tell me what to do to prove what you say... all you can do is insult... wow.

You expect a major disaster when you should be looking for a gradual death, but you just don't get that: it may take days, or weeks. Oh, and all you do in every freaking thread you start on this topic is insult people, so buck up and take what you give.

A gradual death? From precisely a 20 mA current draw with a precise voltage...

Not only has this been running for almost 24 hours consuming exactly 20 mA, but it WILL work for a very long time...

So a regulated voltage IS safe... you keep banging on about conditions I'M NOT asking anyone to test...

I gave the facts... a precise voltage, a 20 mA draw... you say it needs a resistor but not WHY, when I have a constant voltage...

I asked you what needs to be done to see failure and you go on about a 16 V power supply; it's like you refuse to believe an LED is safe when voltage-controlled only... IT IS.

Where have I once said LEDs don't require current limiting? I AM limiting the current, by controlling the forward voltage (but still I get told this is unsafe because of time/heat).

If that's true, then how do I make this LED fail...

cjdelphi:
If that's true, then how do I make this LED fail...

When your “safe voltage control” deviates from its present, contrived trimming.

Besides, you are imagining that your voltage source is an ideal 0 Ω Zout.
It’s not.

Ohhh now we’re getting somewhere

So you accept NO resistor is needed then!! Your argument, like the others, stems from the idea that the resistor is there for an unknown cause... e.g. you don't need safety goggles to solder BUT they're recommended... NOT REQUIRED.

Yes or no question...

If you're able (which is tricky due to gain, tolerance, etc.) to supply the perfect voltage consistently every time, with no fluctuations...

The LED would operate normally?

Yes or No

You're setting up a straw man argument:
you go to great trouble and get things tweaked just so, taking "advantage" of an unreliable characteristic of some piece of equipment or other, and reckon that in so doing you've somehow proven that "resistors are unnecessary".

Model rockets outfitted with streamers don't prove that parachutes are unnecessary - or repeal the Law of Gravity.

cjdelphi:
A gradual death? From precisely a 20 mA current draw with a precise voltage...

The forward voltage of the LED changes with temperature and time; your precise voltage is good now, but it won't be later, or in different conditions.
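To make that concrete: an LED's forward voltage typically falls by roughly 2 mV per °C (the exact figure varies by part and is an assumption here). Hold the applied voltage fixed and the whole shift lands on the exponential term. A crude sketch, ignoring internal series resistance (so it overstates the drift of a real part, but the direction and rough scale are right):

```python
import math

# Effect of temperature on current when the APPLIED voltage is held fixed.
# The -2 mV/degC forward-voltage tempco and the diode slope (n * Vt) are
# assumed, typical-order-of-magnitude values, not data for a specific LED.
# No internal series resistance is modelled, so this overstates real drift.
n_vt = 0.0517      # V, assumed ideality factor * thermal voltage
tempco = -0.002    # V per degC shift in forward voltage
i_room = 0.020     # A, current trimmed "perfectly" at room temperature

for delta_t in (0, 10, 30, 60):
    # Vf falls as the die warms, so the effective overdrive grows by |tempco| * dT.
    i = i_room * math.exp(-tempco * delta_t / n_vt)
    print(f"+{delta_t:2d} degC  ->  roughly {i * 1000:6.1f} mA")
```

And the extra current heats the die further, which drops the forward voltage further still; that positive feedback is what people mean by thermal runaway.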

cjdelphi:
Not only has this been running for almost 24 hours consuming exactly 20 mA, but it WILL work for a very long time...

Prove it. You have not yet, and I aim to disprove it next weekend (I need lab time).

cjdelphi:
So a regulated voltage IS safe... you keep banging on about conditions I'M NOT asking anyone to test...

What conditions are you talking about: the fact that your precise voltage will not stay that way, or that your room temperature won't change?

cjdelphi:
I gave the facts... a precise voltage, a 20 mA draw... you say it needs a resistor but not WHY, when I have a constant voltage...

Because, for the umpteenth time, the forward voltage changes; your LED and everything around it FREAKING CHANGES.

cjdelphi:
I asked you what needs to be done to see failure and you go on about a 16 V power supply; it's like you refuse to believe an LED is safe when voltage-controlled only... IT IS.

What in God's holy name are you blathering about? I have not mentioned a 16 volt power supply, and I refuse to just believe, because I can measure it.

cjdelphi:
Where have I once said LEDs don't require current limiting? I AM limiting the current, by controlling the forward voltage (but still I get told this is unsafe because of time/heat).

If that's true, then how do I make this LED fail...

Give it time or accelerate it. For God's sake, you can run an LED off any voltage you please if you are doing it right, but please go ahead and continue riding a motorcycle on a piano wire.

I never once said resistors are not needed, so stop putting words in my mouth... I said it works under the right conditions (which would NOT be right if they were anything other than perfect).

You can't say I'm wrong... I'm running an LED right now using this method. If the voltage fluctuated and destroyed the whoppingly expensive 30-cent LED... I'd have to grieve over the loss!

You keep ignoring the fact that in an unknown situation I WOULD use some form of current limiter to protect it... but here I'm not (show me a single post I've made where I said NOT to use resistors with LEDs... you won't, I understand perfectly).

This is about the fact that it IS possible to drive an LED safely using voltage alone... if you say that's not possible, I'll get an LM317, hook this LED up to its own external power source, and see how long it runs for... I think I'll get bored of waiting long before it dims or dies...

Voltage changes over time/temperature...

If that's your only argument, it's pretty weak... you seem to fail to understand the job of a voltage regulator...

OMG, it's the forward voltage of the diode that changes. Why do you not understand this?

OK, so I did a little benchtop experiment: a 5 mm red LED, brand new off the tape, hooked up to my benchtop power supply while I slowly turned up the voltage.

At 2.314 volts I measured the current, using a Tektronix multimeter, at 20.09 mA.

I let it sit for 5 minutes on the bench and the current measured 20.31 mA; right away there's a couple tenths of a milliamp of drift just sitting there in open air with the AC on in the room. Is it a big deal? No, but that doesn't mean it's not happening!

I then held it in my fist; the current draw instantly started increasing, and after 5 minutes it had crept up to 21.36 mA.

I then held it next to the CFL bulb in my desk lamp, which is not as hot as it was outside today, and it went up to 27.44 mA. So yeah, if I set it to the "perfect" voltage inside my apartment and then took it outside today at noon, it would start drawing about 7.4 mA more.

Finally, I set it back on my desk, took a can of compressed air, held it upside down and gave it a little squirt. Guess what: the current consumption dropped to 12.60 mA.

Never once did the input voltage vary from 2.3 volts. So whoopee, you know what a voltage regulator is! What voltage do you set it to? Because in just 20 minutes of putzing around, I measured, from one extreme to the other, about a 15 mA difference on the same LED using a constant-voltage source at different temperatures.

For my next magic trick, I took the same LED and attached a 150 ohm 5% simple crappy RadioShack resistor to it, powered at 5.0 volts.

At room temperature I measured 17.98 mA.

After 5 minutes I measured 17.99 mA.

Holding it in my hand for 5 minutes, I measured 18.04 mA.

Holding it next to my lamp for 5 minutes, I measured 18.15 mA.

Using the compressed air to frost it, I measured 17.19 mA.

Yeah, I will just plop in the resistor and take the 0.96 mA drift over a 15 mA drift any day, and almost all of that is the temperature of the resistor.
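Those two sets of readings line up with a quick back-of-the-envelope check: with a series resistor, a shift in forward voltage only moves the current by about ΔVf / R, while at constant voltage the same shift lands on the exponential. The 15 mV forward-voltage drop for the "held next to the lamp" case and the diode slope below are assumptions chosen for illustration, not values fitted to this LED.

```python
import math

# Back-of-the-envelope comparison of the two drive schemes measured above.
# The 15 mV forward-voltage drop and the n*Vt slope are assumptions,
# chosen only to show the relative sensitivity of each scheme.
dvf = -0.015     # V, assumed Vf drop as the LED warms next to the lamp
n_vt = 0.0517    # V, assumed ideality factor * thermal voltage

# Constant-voltage drive: the whole shift lands on the exponential.
i_cold = 0.02009                       # A, the 20.09 mA starting point above
i_hot_cv = i_cold * math.exp(-dvf / n_vt)
print(f"constant voltage: 20.09 mA -> about {i_hot_cv * 1000:.1f} mA")

# 150 ohm resistor from 5 V: the shift just moves the drop across R.
r = 150.0
i_hot_res = 0.01798 - dvf / r          # A, from the 17.98 mA starting point
print(f"150 ohm resistor: 17.98 mA -> about {i_hot_res * 1000:.1f} mA")
```

The rough outputs (about 26.9 mA and 18.1 mA) happen to land near the 27.44 mA and 18.15 mA readings above, which is about as much agreement as a one-parameter guess deserves.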

Lol

You need to check your voltage input.....

Right then, finally...

I just got the heat gun out, and as you claimed, the current began to rise...

BUT.

My argument stands. I "selected" the voltage to determine the "current" at the given temperature. If it had been 30 degrees warmer, I'd still have selected a "20 mA" draw, so then answer this...

"If i built a circuit which decreased/increased voltage based on temperature" all it's doing is modifying the VOLTAGE of to the LED based on it's temperature.... tell me how that method is NOT current controlled - don't put words in my mouth again, this thread is about ANSWERS, another method would be to monitor's it current and decrease/increase it's Voltage.

Current can be controlled through voltage under stable conditions...
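For what it's worth, that "monitor its current and decrease/increase its voltage" idea is exactly what a constant-current driver does; the feedback loop just runs continuously instead of being trimmed once. Here is a toy software sketch of the idea; the diode model, sense resistor, and loop gain are made-up illustration values, not a real driver design.

```python
import math

# Toy simulation of "measure the current, then nudge the voltage":
# a crude software model of a constant-current feedback loop.
# The diode parameters, sense resistor, and loop gain are assumptions
# for illustration only, not a real driver design.
IS = 1e-18        # A, assumed saturation current
N_VT = 0.0517     # V, assumed ideality factor * thermal voltage
R_SENSE = 10.0    # ohm, assumed current-sense resistor
I_TARGET = 0.020  # A, the 20 mA setpoint
GAIN = 2.0        # V per A of error, assumed loop gain

def led_current(v_applied, vf_shift):
    """Solve for I in: v_applied = Vf(I) + vf_shift + I * R_SENSE (bisection)."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2.0
        v = N_VT * math.log(mid / IS + 1.0) + vf_shift + mid * R_SENSE
        if v < v_applied:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

v_set = 2.2                                      # initial applied-voltage guess
vf_shift = 0.0
for step in range(200):
    vf_shift = -0.030 * min(step / 100.0, 1.0)   # die warms up: Vf slowly falls 30 mV
    i = led_current(v_set, vf_shift)
    v_set += GAIN * (I_TARGET - i)               # trim the voltage toward the setpoint
print(f"final: {led_current(v_set, vf_shift) * 1000:.2f} mA at {v_set:.3f} V applied")
```

The interesting part is that the set voltage stops mattering: the loop keeps nudging it wherever it has to go to hold 20 mA, which is the whole argument for current control in one line.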