Can I use a buck converter instead of a resistor to set the proper LED voltage?

Nearly every basic circuit I've come across that involves LEDs uses resistors to drop the voltage, typically 5 V or higher, to the correct level for each LED, typically 1.8-3.0 V, so it doesn't burn out or die prematurely. Perhaps more advanced circuits use voltage regulators. But both waste energy as heat, which is inefficient.

So I was wondering if it's possible and advisable to use a small buck converter, which uses more efficient switching circuitry, to bring a higher voltage down to an LED's proper voltage level, so that less energy is wasted. This would seem to be especially useful in a remote device that runs on batteries or solar/wind, like a standalone indoor night light or outdoor security light.

If the answer is yes, then what about boost converters, so you can use a single AA or AAA battery, or even 1.5V coin or button battery?

It's not the voltage that needs controlling but the current. Say you have an LED that gives the proper brightness with 15 mA flowing; a constant-current controller would then hold the current at 15 mA whatever the supply voltage, as long as it was higher than the minimum required and lower than the maximum the circuit could tolerate.

No, you cannot.

You're trying to hold the current through the LED within the rated operating range - typically 1~20 mA for an indicator LED. However, in the regime of interest, the I-V curve of an LED is practically vertical - a tiny difference in applied voltage makes the difference between the LED being too dim and the LED burning out. The forward voltage of the LED also depends on temperature.
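To see just how vertical that curve is, here's a quick sketch using the Shockley diode model. The saturation current, ideality factor, and thermal voltage below are illustrative assumptions, not datasheet values for any real LED:

```python
import math

def led_current(v_f, i_s=1e-18, n=2.0, v_t=0.026):
    """Shockley diode model: I = Is * (exp(Vf / (n * Vt)) - 1).
    Is, n, and v_t are assumed example values, not real datasheet data."""
    return i_s * (math.exp(v_f / (n * v_t)) - 1)

i_low = led_current(1.90)   # a bit under nominal forward voltage
i_high = led_current(2.00)  # just 100 mV more
print(i_high / i_low)       # roughly 7x the current for a 0.1 V change
```

With these (made-up but plausible) parameters, a 0.1 V increase multiplies the current by about seven - which is exactly why setting a fixed voltage across an LED is so fragile.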

So yeah, you still need a resistor.
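For completeness, the standard resistor calculation is just Ohm's law across the resistor; the supply and forward voltages below are assumed example values:

```python
v_supply = 5.0     # assumed supply voltage
v_forward = 2.0    # assumed LED forward voltage (check the datasheet)
i_led = 0.015      # target current, 15 mA

# The resistor drops whatever voltage the LED doesn't: R = (Vs - Vf) / I
r = (v_supply - v_forward) / i_led
p_resistor = (v_supply - v_forward) * i_led  # power burned as heat

print(r)           # about 200 ohms
print(p_resistor)  # about 45 mW wasted in the resistor
```

Note the resistor here dissipates 45 mW while the LED itself only uses 30 mW - which is the inefficiency the original question is about.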

Exceptions to needing a resistor on an LED:

  • With a bench power supply, if your hands aren't shaky and the voltage-adjust pot is clean and smooth, you can slowly ramp up the voltage while watching the current, and run an indicator LED off a constant voltage - at least as long as it stays at the current temperature (the forward voltage of LEDs drops as they get hot, meaning a positive feedback cycle...).
  • With high-power LEDs (i.e., ~1 W dies), such as the panels with a bunch of dies in them (making up the 10~150 W LED panels), the I-V curve is less vertical. If your cooling is solid (to prevent positive feedback as they heat up and the forward voltage drops), and you're really careful setting it up and give it some safety margin, you can run them off a constant voltage without added ballast resistors. This is not the best driving scheme, but it can be made to work reliably. High-power LEDs are significantly more forgiving of abuse (as long as you keep them cool) than indicator LEDs are.
  • There exist constant current buck/boost converters of various descriptions, as well as linear constant current drivers. These are the standard and "correct" solution for driving higher power LEDs.
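As a rough sketch of how such constant-current drivers are typically configured: many parts regulate the voltage across an external sense resistor to a fixed feedback voltage, so that one resistor programs the current. The 0.1 V feedback voltage below is an assumed example - real parts vary, so check the datasheet:

```python
v_feedback = 0.1   # assumed feedback/sense voltage of a driver IC
i_target = 0.35    # desired LED current, e.g. 350 mA for a 1 W LED

# The driver adjusts its output until V(sense) = v_feedback,
# so the sense resistor directly sets the regulated current.
r_sense = v_feedback / i_target
p_sense = v_feedback * i_target  # loss in the sense resistor

print(round(r_sense, 3))  # about 0.286 ohms
print(p_sense)            # only ~35 mW lost at the sense resistor
```

Because the sense voltage is small compared to the LED's forward voltage, the loss in the sense resistor is a tiny fraction of what a ballast resistor would burn at the same current.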

Thanks for the responses. I guess I underestimated how complex driving LEDs actually is, or can be in real-world applications, and how they actually operate. I always assumed that the resistor was there to keep the voltage down to a level that the LED(s) could tolerate, but also high enough to activate the LED, and that the LED itself would "regulate" the current, which is why you can power an LED with say a 9V switching adapter provided you used the proper resistor value, without it burning out, even if the adapter can put out several amps.

So, let me rephrase my question. Are there LED driver circuits that, like a resistor, keep the voltage within levels the given LED(s) can tolerate, but do so more efficiently, wasting less power for a given input voltage, while also "regulating" the current, so as to get the desired light output and maximal battery life?
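To put rough numbers on that efficiency question: with a ballast resistor, everything above the LED's forward voltage is burned as heat, so efficiency is roughly Vf/Vsupply, while a switching constant-current driver converts at some more or less fixed efficiency. The supply, forward voltage, and converter efficiency below are assumed examples:

```python
v_supply = 9.0    # assumed supply, e.g. the 9 V adapter mentioned above
v_forward = 2.0   # assumed LED forward voltage
i_led = 0.020     # 20 mA

# Resistor ballast: LED power vs total power drawn from the supply
p_led = v_forward * i_led
p_total_resistor = v_supply * i_led
eff_resistor = p_led / p_total_resistor   # about 22% from 9 V

# Hypothetical switching driver at an assumed 90% conversion efficiency
eff_switcher = 0.90
p_total_switcher = p_led / eff_switcher

print(round(eff_resistor, 3))      # about 0.222
print(round(p_total_switcher, 4))  # about 44 mW drawn, vs 180 mW
```

So the higher the supply voltage relative to the forward voltage, the bigger the win from switching - from 9 V the resistor wastes nearly four fifths of the power.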

What you are describing is a constant-current switching driver. Not impossible, but for a single 20 mA LED it's a bit pricey, and they are quite rare to get hold of.

Grumpy_Mike:
What you are describing is a constant-current switching driver. Not impossible, but for a single 20 mA LED it's a bit pricey, and they are quite rare to get hold of.

The application I was thinking of would have a bunch of LEDs, not just one. If it were just one, then a resistor is fine, if inefficient. But 5, 10, 15 or more LEDs, on for long periods of time, seem to call for a more efficient power-delivery method. I wonder why there are no self-contained switching circuits on a chip. Is it because the caps and inductors involved are just too big and put out too much heat or EM noise?
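For what it's worth, with a bunch of LEDs there's also a middle ground that needs no switcher: wire them in series so a single resistor drops only a small share of the supply. The voltages below are assumed examples:

```python
v_supply = 12.0   # assumed supply voltage
v_forward = 3.0   # assumed forward voltage per LED
i_led = 0.020     # 20 mA through the whole string

n_series = 3      # three LEDs in series leave only 3 V for the resistor
v_drop = v_supply - n_series * v_forward
r = v_drop / i_led
efficiency = (n_series * v_forward) / v_supply

print(r)           # about 150 ohms
print(efficiency)  # 0.75, vs 0.25 for a single LED on the same supply
```

The same 20 mA now lights three LEDs, and three quarters of the power ends up in the LEDs instead of the resistor - provided the supply voltage comfortably exceeds the summed forward voltages.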

Use a driver like the MAX7219, which can current-control up to 64 individual LEDs (all at the same current).
It has 16 levels of brightness control too.
There are other similar chips, such as the TLC5940, where you can control brightness individually per output.

Or you can go with the WS2812B, which is RGB and gives each color 8-bit PWM control.
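For reference, WS2812B-type pixels take 24 bits of color data each, green byte first. A minimal sketch of packing one pixel's color into the wire order:

```python
def ws2812_frame(r, g, b):
    """Pack an 8-bit-per-channel RGB color into the 24-bit
    GRB byte order that WS2812B pixels expect on the data line."""
    for c in (r, g, b):
        if not 0 <= c <= 255:
            raise ValueError("channel out of 8-bit range")
    return bytes([g, r, b])

print(ws2812_frame(255, 0, 128).hex())  # 00ff80 - green byte leads
```

The per-color 8-bit value mentioned above is exactly this byte; the LED's internal driver PWMs each channel from it.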

You can find all kinds of constant current drivers for LEDs.
https://www.digikey.com/products/en/integrated-circuits-ics/pmic-led-drivers/745?FV=1f140000%2Cffe002e9&mnonly=0&ColumnSort=1000011&page=2&stock=1&pbfree=0&rohs=0&cad=0&datasheet=0&nstock=0&photo=0&nonrohs=0&newproducts=0&k=led+driver&quantity=&ptm=0&fid=0&pageSize=25&pkeyword=led+driver
Most require external components to set the current level, so now you're going from a simple 10-cent (or less) resistor for one LED to a chip plus several components.

For 20mA, it's hardly worth it for individual LEDs if they are being driven from a 5V source.

I'm thinking of trying the same thing, but I'm not worried about voltage. They make these RGB LED strips; 1 m can have 144 LEDs, each drawing 60 mA (20 mA for each color), and they sell various power supplies for the number of strips you want to use, but they are a little pricey: $10.00 to $15.00 apiece. I want to buy 4 strips of 144 LEDs each, so you would need a 5 V x 20 A power supply. But the problem is I can only buy one strip at a time, and I want to use the 5 V x 20 A power supply. So I was thinking of using a current splitter so I can use one strip at a time until I get all 4 strips. But I was wondering if the circuit can hold 5 A on each leg without a load on the other 3 legs. I think this is what you're asking, but on a smaller scale. With a current splitter you can make different amperage amounts for each LED. Maybe this idea can give you some ideas you might want to try. The circuit is simple to make and only uses resistors, and using multiple resistors would keep the heat down and solve your problem. Well, good luck with your endeavor.

I think this is what you're asking, but on a smaller scale. With a current splitter you can make different amperage amounts for each LED

Can you now?
I've never come across such a circuit; pray tell how this magic works. It is clear you know little about Ohm's law. A 20 A power supply rating means it can supply up to 20 A, not that it pushes 20 A through a circuit. There is no need for this mythical current splitter; physics does it for you.
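To illustrate that point with the strip numbers above - the load sets the current, and the supply rating is just a ceiling. Treating each pixel as roughly 60 mA at full white (an assumed typical figure):

```python
i_per_pixel = 0.060    # assumed worst case: all three colors at 20 mA
pixels_per_strip = 144

i_one_strip = pixels_per_strip * i_per_pixel
print(i_one_strip)     # 8.64 A drawn by one strip at full white

# A 20 A-rated supply powering one strip simply delivers 8.64 A;
# nothing "pushes" the unused capacity into the load, so no
# splitter is needed to run fewer strips than the supply could feed.
supply_rating = 20.0
assert i_one_strip <= supply_rating

print(4 * i_one_strip)  # four strips at full white draw well over 20 A
```

Incidentally, this arithmetic suggests four full strips at full white would exceed a 20 A rating - worth checking before sizing the supply.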

I wonder why there are no self-contained switching circuits on a chip. Is it because the caps and inductors involved are just too big and put out too much heat or EM noise?

Yes, but also there are much better ways of controlling the LEDs, like the WS2812, which has a linear regulator inside the LED package. You can't put big values of inductors and capacitors on an IC chip; they would have to be external, and that would spoil the whole thing.

habanero:
I wonder why there are no self-contained switching circuits on a chip.

There are self-contained DC-DC converters in single packages, integrating the inductor, capacitors, switching elements and controller. https://www.digikey.com/products/en/power-supplies-board-mount/dc-dc-converters/922

These are just spiffier, more expensive, more compact versions of the DC-DC converter modules you can get from Pololu or eBay.

Grumpy_Mike:
Can you now?
Never come across such a circuit, pray tell how this magic works. It is clear you know little about ohms law.

Surely it's Kirchhoff's laws, not Ohm's law, that are relevant here.