As recently as a year ago I bought a light for scuba diving that was HID, about 700 lumens. I have a similar HID light for riding my bike at night. Neither is a viable product today, since LED lights that produce 600-800 lumens are a little cheaper and turn on instantly, rather than taking a couple of minutes for the gas to reach its optimum temperature.
LED lights to replace incandescent or fluorescent bulbs in the home devote most of their weight and bulk to heat dissipation. UK dive lights have a little heat fin on the front of the lens http://www.uwkinetics.com/technology/lights-technology/lumen-booster which keeps the temperature of the LED close to that of the surrounding water and improves the efficiency of the LED itself by keeping it from heating up.
When I read the Wikipedia article on LEDs, I learned about two issues. The one I disregard is that "white" high power LEDs are blue LEDs with a yellow phosphor. I don't think the yellow phosphor is the source of heat, since it seems more or less equivalent to the coating on the inside of a fluorescent tube's glass, and those are cool. The main issue is that the epoxy coating the LED has a high refractive index (>4), so light that hits it at any angle much different from 90 degrees is reflected internally. The edges of the LEDs are at right angles, so the reflected light bounces around inside forever, or until it becomes heat.
But if the epoxy coating were spherical, all of the light would hit the surface at a right angle on the first try. And if the epoxy coating were something like the dome shape we are all familiar with from low-power through-hole LEDs, the light would all get out within the first few bounces. And then the LED would not get hot, and the heat sink would not be needed?
Obviously smart people at Cree etc have thought of all this. What is it that I don't understand?
Well, a typical incandescent lamp is about 5% efficient, whereas an LED can approach 90% efficiency. However, even at 90%, a 10 watt LED will still dissipate 1 watt of heat in a very small package. That heat has to be removed so that the LED operates below its maximum package temperature. The manufacturer's datasheet will give the package temperature limits and the thermal resistance from the device package to a heat sink. The math is not terribly complicated, but suffice it to say that the user/designer of an LED must ensure they provide the heat-sinking needed to keep the LED from destroying itself.
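A minimal sketch of that math, with made-up but plausible numbers (the thermal resistances and dissipated power below are illustrative assumptions, not from any particular datasheet):

```python
# Rough junction-temperature estimate for an LED on a heat sink.
# All numbers below are illustrative assumptions, not datasheet values.

P_DISS = 1.0          # watts of heat to get rid of (e.g. a 10 W LED at ~90% efficiency)
T_AMBIENT = 25.0      # degrees C, air around the heat sink

# Thermal resistances in degrees C per watt (hypothetical figures):
R_JUNCTION_TO_CASE = 8.0    # inside the LED package, from the datasheet
R_CASE_TO_SINK = 1.0        # thermal grease / pad between package and heat sink
R_SINK_TO_AIR = 20.0        # how well the heat sink sheds heat to the air

def junction_temp(p_diss, t_ambient, *r_thermal):
    """Junction temperature = ambient + dissipated power times the sum of
    the thermal resistances in the path from junction to ambient."""
    return t_ambient + p_diss * sum(r_thermal)

t_j = junction_temp(P_DISS, T_AMBIENT,
                    R_JUNCTION_TO_CASE, R_CASE_TO_SINK, R_SINK_TO_AIR)
print(f"Estimated junction temperature: {t_j:.0f} C")
# With these numbers: 25 + 1.0 * (8 + 1 + 20) = 54 C, comfortably below a
# typical ~150 C maximum.  Drop the heat sink (a bare package in still air
# can be hundreds of C/W to ambient) and the same watt of heat pushes the
# junction past its limit.
```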
I built a pair of RGBW "spotlights" with six watts per color of 1 W LEDs for Halloween. Worked well. All I did was attach them with standard heatsink compound to a 3x4x1/8" chunk of aluminum that was part of a CD drive frame at one point. With all of the LEDs on, after a few minutes of operation the aluminum was warm, but still comfortable to hold. The temperature was not enough to soften the hot-melt glue I used to mount them mechanically.
I use a 12 watt (10 watt output) LED can light in my photography and I love it. I've also made a 10 W LED spotlight with a CPU heatsink from a 486 handling the dissipation of both the LED and the current regulator. So cooling is needed, but in my experience you really don't need to go crazy. I do sometimes think certain things are engineered to LOOK engineered, rather than actually being designed the way they are for a purpose. Having worked with 10 watters, I can confidently tell you that if there's any thermal link between the back of that LED and the surrounding water at all - even just through a bit of heatsink compound to the aluminum housing or whatever - you're going to keep it well within operating temps, in my opinion... [standard disclaimer as it cooks his forehead to a nice medium-well, 200 feet down...]
The same reason the sun is hot.
The process of converting energy into visible light is not 100% efficient. All energy is transmitted in packets; you can only have full packets, and only so many for each given wavelength at a given time, so the rest gets spread out both up and down the energy spectrum.
Not! Either one.
I think the best blue LEDs are about 41% efficient, if you measure actual "energy of light photons leaving" vs "electrical energy in" (apparently sometimes called "quantum efficiency.")
"System Efficiency" seems to be measured in Lumens/Watt, which is pretty much a made-up number if you want to start talking about the mathematical definition of "efficiency." I mean, "Lumens" has a definition based on the physiological characteristics of the human eye (so 10W of "green" has many more lumens than 10W of "red.") Higher is better, but the best white LEDs are already produced more than 100Lm/W, and you know about "efficiencies" greater than 100%, right?
So for each watt of power you pump into a blue LED chip, you get about 0.4 W of light out, and about 0.6 W of heat that warms up the chip. A bit more of that 0.4 W is lost in the phosphor, in the chip itself, and in the lens and reflectors involved in getting the (now "white") light out of the package. And it all goes to heating it up...
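A minimal sketch of that loss chain, assuming illustrative loss fractions for the phosphor and optics (they are not measured values):

```python
# Rough power budget for 1 W of electrical input to a phosphor-converted
# white LED.  The blue-chip efficiency uses the ~41% figure above; the
# phosphor and optics loss fractions are illustrative assumptions.
P_IN = 1.0                # electrical watts into the blue chip

BLUE_EFFICIENCY = 0.41    # fraction of electrical power leaving as blue light
PHOSPHOR_LOSS = 0.20      # assumed fraction of blue light lost converting to "white"
OPTICS_LOSS = 0.05        # assumed fraction lost in lens / reflectors / package

blue_out = P_IN * BLUE_EFFICIENCY
white_out = blue_out * (1 - PHOSPHOR_LOSS) * (1 - OPTICS_LOSS)
heat = P_IN - white_out   # everything that doesn't leave as light ends up as heat

print(f"Light out of the package: {white_out:.2f} W")   # ~0.31 W
print(f"Heat to get rid of:       {heat:.2f} W")        # ~0.69 W
# Roughly two thirds of the input power has to leave through the package
# and heat sink rather than as light.
```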
Look at it this way. The filament of an incandescent light bulb is literally "white hot", upwards of 1700 K! (That goes for any size of lightbulb: a 4 W nightlight bulb will be uncomfortably hot, and 300 W halogen bulbs became infamous for starting house fires.) A white LED will have "ceased to function" (and probably "failed dramatically") at about 1/4 that temperature (425 K is about 150 C).
The LED has a voltage drop across itself and a current through it when it is working. The V*I product is the total power it takes in, ~~then I^2*R is the heat lost across the diode. A diode is a piece of doped semiconductor, so just like conductors it has resistance, just not defined as V/I but as dV/dI, the slope of the V vs. I curve. The heat component (I^2*R) is a major part of the power loss, while V*I - I^2*R is what turns into light.~~ The resistance might be big at high temperature, so cooling it helps lower the heat lost. My high power LEDs in my lab all have large heatsinks.
I'll have to check how to calculate resistive heat lost on diodes. Maybe you can only measure it?!
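For what it's worth, here's a sketch of how you might estimate it from measurements rather than calculate it; the V-I points and the measured optical output below are made-up illustrative numbers:

```python
# Dynamic (slope) resistance of a diode, r_d = dV/dI, estimated numerically
# from two nearby measured operating points.  The voltage/current pairs are
# made-up illustrative numbers, not from a real LED.
v1, i1 = 3.20, 0.60   # volts, amps at one operating point
v2, i2 = 3.35, 0.90   # volts, amps at a nearby operating point

r_dynamic = (v2 - v1) / (i2 - i1)
print(f"Estimated dV/dI near this operating point: {r_dynamic:.2f} ohms")

# The total heat is easier to pin down if you can also measure the light:
# heat ~= electrical power in minus optical power out.
p_optical_measured = 1.0          # assumed measured radiant output, watts
p_heat = v2 * i2 - p_optical_measured
print(f"Heat dissipated at ({v2} V, {i2} A): ~{p_heat:.2f} W")
```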
Thanks for the insight. I think I understand better now.
Bill gave me more insight into what a strange measure 'lumen' is than I had any idea about!
It seems like a bunch of the available (white) devices produce about 60 lumens/watt, while 100-105 lumens/watt is mentioned some places (more as coming soon than available now). That certainly implies more of the energy going to heat (in currently available devices) than I thought at first. It also suggests that future LEDs might be cooler and brighter. My impression from some of the graphs I've looked at is that overall efficiency of converting power into light is not much different for fluorescents and LEDs. I guess that might make some sense since conceptually similar quantum phenomena are used to produce photons from electric energy and then a phosphor to shift the spectrum to a more pleasing one. The fluorescent devices are physically larger and don't seem to have the heat dissipation issue because they don't concentrate the energy all in a tiny space.
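As a rough sanity check on that, here's a sketch that back-converts lumens-per-electrical-watt into a light-vs-heat split; the ~300 lm per optical watt figure for a typical white spectrum is an assumed ballpark, not a measured property of any particular device:

```python
# Estimate what fraction of the electrical input actually leaves as light,
# given a luminous efficacy rating in lumens per electrical watt.
# ~300 lm per *optical* watt is an assumed ballpark for a typical white LED
# spectrum; the true value depends on the spectrum (warm vs. cool white).
LM_PER_OPTICAL_WATT = 300.0

def wall_plug_efficiency(lm_per_electrical_watt):
    """Fraction of electrical power that emerges as visible light."""
    return lm_per_electrical_watt / LM_PER_OPTICAL_WATT

for rating in (60, 100):
    eff = wall_plug_efficiency(rating)
    print(f"{rating} lm/W -> ~{eff:.0%} light, ~{1 - eff:.0%} heat")
# 60 lm/W  -> ~20% light, ~80% heat
# 100 lm/W -> ~33% light, ~67% heat
```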
overall efficiency of converting power into light is not much different for fluorescents and LEDs.
Yes, that matches my understanding. There are a number of modifying factors. For example, LEDs perform better in cold environments. There was an article a year or so ago touting how much energy Walmart was saving by putting LED-based illumination in their freezer cases. LEDs scale downward to small sizes much better.
conceptually similar quantum phenomena are used to produce photons from electric energy and then a phosphor
Very nice observation! I never thought of it that way, but it seems pretty accurate.
There's a nice forum for LED discussions at http://www.candlepowerforums.com/
This is nominally a site for discussing flashlights, but the "LED" subforum gets pretty technical, and there is much discussion of practical aspects (heatsinking, where to buy, etc) as well.
Someone there pointed out a while ago that even a 100% efficient LED would still "feel" hot, because when you put your finger on it, your skin will happily convert the visible light emitted back into heat!
BTW, in my experience (grr) many of the failure modes of fluorescent lamps are heat-related, especially in CFLs, where they're trying to cram the electronics and the tube itself into a tiny space.
It's funny that most people thought vacuum tube technology was useless, and the compact fluorescent light (CFL) is one such tube that is still kicking! After reading some Wikipedia I was reassured that LEDs and CFLs work on similar principles, quite different from incandescent light. The two still have some slight differences, if you care to know:
*While LEDs have conduction electrons moving inside a semiconductor, CFLs have free electrons moving in a partial vacuum.
*While conduction electrons excite semiconductor bonds in LEDs, free electrons excite gas molecules in a CFL.
*While excited semiconductor bonds decay and emit light in LEDs, excited gas molecules in a CFL emit UV light, which strikes the fluorescent powder, which turns the UV into visible light.
*There is electrical resistance against electron motion in semiconductors but none in vacuum tubes; on the other hand, the fluorescent powder does waste energy converting UV into visible light.
I guess that's why these trade-offs made them similar in efficiency.