So what (grin) resistor should I use for this LED then? :D

I'm not entirely sure how it would dim?...

The LED's max current exceeds anything the cheap switching regulators can provide; the regulator would kill itself before this 100 W LED would. You might peak at 30 W, but the regulator will shut down
before these LEDs will. I should know: I have several "direct drive" lights which run straight off 4.2 V lithium batteries, an SSC P7 and a Cree Q5, both perfectly fine after thousands of hours...

If your regulator can handle a max of 50 W, just how would it kill an LED that can handle 100 W? Even if you set your regulator to the same voltage as your input, 12 V or more, you'll find the regulator struggling like hell, because the LED can go much higher provided it's adequately heatsinked....

Well, there's no such thing as a 100 W LED; you have a panel of LEDs.

And all LEDs dim over time, but hey, you've got it figured out, so why argue, right?

Heck, why did you post?

You're avoiding the 'how' it will dim....

But that's ok.... we're not living in 1999 anymore; providing the voltage is constant, so will the current draw be.... it matches the datasheet's data.

cjdelphi:
we're not living in 1999 anymore; providing the voltage is constant, so will the current draw be.... it matches the datasheet's data.

Yes, that's right.
Congratulations on having exposed the "Resistor Conspiracy", a hoax manufactured and perpetrated by the hide-bound "EE" Theocrats. Create Your Own Reality!

The OP is basing this on the total LEDs looking like a 10 Ω load: 30 volts at 3 amps is 10 Ω.

You set it as the LEDs came on, and it worked. It will for now, but it may not on the next set.

I played with current-driving LEDs and with the voltage idea you're using.

I did it like this: I hooked 10 red LEDs in series and supplied 1.5 to 25 volts from an LM317 supply. As I turned the pot, LED 1 came on; turn it up, LED 9 came on; went for LED 10, and on it came.

I sat there and said "cool" till they popped like popcorn. But my supply could put out 3 amps; yours is limiting the current.

You're avoiding the 'how' it will dim....

They dim just by being lit up; all LEDs will. Go look at a 20-year-old alarm clock.

But that's ok.... we're not living in 1999 anymore; providing the voltage is constant, so will the current draw be.... it matches the datasheet's data.

Here is a little snippet from a chart I did for work. These were not Cree LEDs, but it should illustrate what I am saying. As the heat in the LEDs increased, so did the current; after 87 hours or so it was over 50 mA more. Is this significant in your design and parts? I can't tell you that, but I can tell you that it was a constant 3.3 volts and the current was not constant for many hours. It did eventually reach a point where the line went more or less flat, but that was days after first being lit, about 80 mA more than where they started.

chart.jpg

Hey, he reached equilibrium before disaster.

I've read that new LEDs and new solar cells lose about 10% in less than a year with normal use, but the loss rate slows down after that.

Osgeld:

You're avoiding the 'how' it will dim....

They dim just by being lit up; all LEDs will. Go look at a 20-year-old alarm clock.

But that's ok.... we're not living in 1999 anymore; providing the voltage is constant, so will the current draw be.... it matches the datasheet's data.

Here is a little snippet from a chart I did for work. These were not Cree LEDs, but it should illustrate what I am saying. As the heat in the LEDs increased, so did the current; after 87 hours or so it was over 50 mA more. Is this significant in your design and parts? I can't tell you that, but I can tell you that it was a constant 3.3 volts and the current was not constant for many hours. It did eventually reach a point where the line went more or less flat, but that was days after first being lit, about 80 mA more than where they started.

80 mA?

80 mA is barely enough to read a book by...

These take 8000 mA easy.

An 80 mA increase; is that 80 mA MORE than the LED can handle? In my case, no.

The point is that the LED was fed a constant voltage and its current consumption increased over time. You would see that if you bothered to read what I said instead of bragging about your LEDs.

its current consumption increased over time,

It is less a function of time than of rising temperature while being driven by a constant voltage source. Even driven by a voltage source, the rise in temperature will eventually stop as the device reaches thermal equilibrium. The question really is whether that equilibrium is still within the device's performance envelope.
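The exponential diode I-V curve is why this matters so much under constant-voltage drive. A minimal sketch of the effect (the ideality factor of 2 and the -2 mV/°C forward-voltage tempco are typical ballpark assumptions, not from any datasheet in this thread):

```python
import math

# At a fixed applied voltage, LED current scales roughly as
# exp(V / (n * Vt)) per the Shockley diode equation.  As the junction
# heats, its forward voltage drops, so a fixed supply voltage pushes
# exponentially more current through it.
VT = 0.026        # thermal voltage at room temperature, volts
N = 2.0           # ideality factor (assumed typical for an LED)
TEMPCO = -0.002   # forward-voltage shift, volts per degC (assumed)

def current_ratio(delta_temp_c):
    """Factor by which current grows when the junction heats by
    delta_temp_c degC while the applied voltage stays fixed."""
    delta_vf = TEMPCO * delta_temp_c       # Vf drops as the die heats
    return math.exp(-delta_vf / (N * VT))  # lower Vf -> more current

for dt in (10, 25, 50):
    print(f"{dt:3d} degC rise -> current x{current_ratio(dt):.2f}")
```

With these assumed numbers a 50 °C rise multiplies the current almost sevenfold, which is the runaway loop the chart above was showing: more current, more heat, lower Vf, more current again, until equilibrium (or failure).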

Yes, temperature is what I meant. But more importantly: OMG, someone understands!

True Or False?

  1. LEDs are current-controlled, not voltage-controlled. True or false?
  2. Regulators have a maximum limit they can supply before thermal regulation steps in. True or false?

if (Answer1 == TRUE && Answer2 == TRUE)
{
    // LED is safe.....
}

Is that logic incorrect? I know my cheap switching regulators can supply 15-30 W (if the heatsink is adequate)... how
will a 100 W LED ever "dim" from a max of a 30 W power supply?....
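The power-budget argument can be put in back-of-envelope numbers. A quick sketch (the 30 V forward voltage for the array is an illustrative assumption, not a figure from the thread):

```python
# Arithmetic behind the "a 30 W supply can't hurt a 100 W LED" claim:
# the supply's power cap limits how much current the array can draw.
SUPPLY_MAX_W = 30.0    # regulator's limit before thermal shutdown
LED_RATED_W = 100.0    # array's rated dissipation
VF = 30.0              # assumed array forward voltage, volts

rated_current = LED_RATED_W / VF     # current at the array's full rating
max_current = SUPPLY_MAX_W / VF      # most this supply can deliver at Vf
print(f"rated: {rated_current:.2f} A, supply-limited: {max_current:.2f} A")
print(f"drive level: {max_current / rated_current:.0%} of rating")
```

Note this assumes the regulator actually limits gracefully at its cap rather than overshooting; it also ignores that Vf sags at lower currents, so the real operating point shifts somewhat.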

I'm trying to explain why in some, yes SOME, instances it's safe not to use a resistor... provided you have perfect
regulation of the voltage, with no deviation whatsoever from heat or otherwise.

You'll find plenty of lithium DD flashlights/torches on the market; they don't dim after 2 days, lol. The point is, anyone who claims "EVERY LED" needs some kind of current-limiting device, e.g. a resistor or a
current limiter (via a semiconductor), is wrong. People in the flashlight world do it all the time, providing you don't supply > 4.2 V (if that's possible, then sure, add a limiter) on some LEDs.

Be it 1 lumen from a 5 mm LED or 50,000 from a monster-sized array of emitters, as long as they don't go over that magical voltage number, I don't see the point of a resistor on HIGH POWER LEDs.
If your LED requires a max of 50 mA and anything over will kill it, yes, give it a 300 Ω resistor and protect it. But not if your LED requires 100 W and your power supply can only dish out 50 W at MAX.
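For the small-LED case, the resistor math is just Ohm's law on the voltage left over after the LED. A minimal sketch (the 5 V supply and 2.0 V forward voltage are illustrative assumptions for a red 5 mm LED):

```python
# Size a series resistor: drop the difference between supply voltage
# and LED forward voltage across the resistor at the target current.
def led_resistor(v_supply, v_forward, i_amps):
    """Return (resistance in ohms, resistor dissipation in watts)."""
    r = (v_supply - v_forward) / i_amps
    return r, i_amps ** 2 * r

r, p = led_resistor(5.0, 2.0, 0.015)   # 15 mA through the LED
print(f"R = {r:.0f} ohm, dissipating {p * 1000:.1f} mW")
```

The same formula explains why the resistor tames the thermal drift: if Vf sags 100 mV as the LED heats, the extra voltage lands across the resistor and the current only moves by 0.1/R amps, instead of growing exponentially.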

Let's say we had a CV supply of 12 V out (15 V in via the regulator; 30 W MAX before thermal shutdown kicks in):

12 V in → +(3.8 V LED)− → GND

POOF.

Dead, if it were a 5mm LED.

BUT, after all, LEDs are "current controlled"...

In this instance, let's use a 3.8 V LED array where each LED can handle 5 amps easily; the entire array, let's assume, takes 100 W.
It sees 12 V and does the same as the little 5 mm one: it tries to take as much current as possible... the regulator complies and provides its full 50 W
to the LED.

Now we have the LED running at half its power. It's not dead; it's emitting very brightly and getting very, very hot, but we have a nice big 500 W heatsink: a huge finned
one I ripped out of a very old unit more heavily armored than a tank, doing my back in getting it into the car... so let's now take a look at the regulator.

For argument's sake it does manage to supply 50 W, and it's not oscillating between low and high power because it's maxed out...

The voltmeter will show 2-3.8 V. Since the supply is at 50% capacity, I'd guess the voltage across the + and − terminals to be around 2.8-3.0 V: a huge voltage drop, because the supply can't deliver enough current.
As a result the voltage would sag to whatever the LED is comfortable emitting at. But sure enough that LED will be alive; unless that semiconductor peaks at > 100 W, I really
can't imagine it dying.

But I'd love to see myself proved wrong on these cheapy regulators. If you show me one going POOF from a really expensive regulator (> $100) that supplies > 100 W, then fair enough, go ahead and limit the current,
because I would..... I'd also limit the current if I connected it directly to a 12 V battery...

I just don't see the need to protect it from something that could never supply enough current to kill it.... Take a lithium 4.2 V battery and connect a > 3 W LED: it should run perfectly fine, with next to no dimming, for thousands of hours. LEDs fade, and I'm not against using resistors; I always do use them, always. What I can't accept is people telling me ALL LEDs, regardless, need some kind of current limiting. No, not always: if you're careful enough, all you need is a voltage-controlled regulator and nothing more, and you can be totally safe even with a 5 mm LED.

Set 2 V via an adjustable regulator, connect it in series with an ammeter, adjust until it hits 15-20 mA, then stop and read the voltage....

Even a 5 mm LED is not going to dim if all you supply is a precise voltage. You know the regulator would never produce enough heat to drift off
the desired voltage, so just how is the LED going to dim without a resistor, then?

Set 2 V via an adjustable regulator, connect it in series with an ammeter, adjust until it hits 15-20 mA, then stop and read the voltage....

And that number will be wrong in a matter of time; the Vf of an LED changes with heat. What part of that are you not grasping?

even a 5 mm is not going to dim if all you supply is a precise voltage

If you're going to the trouble of providing a precise voltage for a moving target, why not just do it right in the first place?

And bull; any LED will dim just from being used. If it's abused it will crap out much quicker than one kept in absolutely perfect conditions, but it will still dim.

Nah, the OP is using the regulator as a current source, and if you read the datasheet, most of them lower the voltage to hold their output current constant. I posted this before, but some people don't read datasheets and think that A plus B is C. Nah, A can be A, LOL.

Osgeld:

And bull; any LED will dim just from being used. If it's abused it will crap out much quicker than one kept in absolutely perfect conditions, but it will still dim.

OK, so if I set a spare regulator to a mid-range level until it reads, say, 15 mA, I'll leave it at 2.1 V or 2.3 V or whatever it is. I'll have an identical LED that I won't touch (only powering it up once to compare brightness), leave the first one on for a week, and take before-and-after pictures... Don't expect me to do it right now, but I will tomorrow... and I'll see how much visible difference there is.

There are voltage regulators and there are current regulators.

Characterizing the Thermal Resistance Coefficient of LEDs

http://www.lrc.rpi.edu/programs/solidstate/cr_thermalresistance.asp

Do LED bulbs weaken/lose brightness over time?

But why ask people who have spent more than a little time actually checking these things when you can simply misinterpret a datasheet?

Hmmmm, LED brightness drops off over the lifetime (50,000-80,000 hours?) by perhaps 30% before it dies. Then the loss is 100%.

So getting old sucks for LEDs too.

someone understands!

Surprisingly, few in the "you have to have a resistor" crowd understand the reason you articulated for why a resistor is needed.

cjdelphi:

Osgeld:

And bull; any LED will dim just from being used. If it's abused it will crap out much quicker than one kept in absolutely perfect conditions, but it will still dim.

OK, so if I set a spare regulator to a mid-range level until it reads, say, 15 mA, I'll leave it at 2.1 V or 2.3 V or whatever it is. I'll have an identical LED that I won't touch (only powering it up once to compare brightness), leave the first one on for a week, and take before-and-after pictures... Don't expect me to do it right now, but I will tomorrow... and I'll see how much visible difference there is.

A week? Try 3 months, maybe more, to get a real drop.

I wonder if LEDs kept colder than room temperature, say in a walk-in freezer, would last appreciably longer; more than 10-12%?

Or what using an LED as a light sensor does to its lifetime?

Be sure to record your current as well over that time. I doubt you will be able to see much in a week without using a lux meter... but then I tend to design stuff that will last more than a week anyway.

Let me ask: do you also drive your car always in first gear, because it works fine and you see no immediate damage?

LOL, you ask about driving a stick in this day and age?
Not that I don't have an abiding dislike for automatics, but I'm approaching 60.