Efficiencies of power supplies with PWM

Datasheets of power supplies usually give efficiencies for different load levels. Is it possible to make a general statement about the efficiencies when PWM switching the load? So, if I e.g. power an LED at 50% PWM, which, when ON, puts 90% load on the power supply, do I have to look at the efficiency for 45%, or for 90% and 0%?

In some way it certainly depends on the relationship between the PWM frequency and the switching frequency of the power supply. The latter one doesn't seem to be in the datasheets, though, so I do not have an intuition there.

I also have no idea how it works with DC-input CC drivers.

If general statements are not possible, I would be interested in the following devices:
- the LDB-300L
- the PLC-45-12

The efficiency of switching LEDs is pretty different from that of switch-mode regulators...

And for switching an LED, what do you call efficient? If you drive it via a transistor with linear regulation, for example a resistor, do you count the energy dissipated in the resistor as a loss? Or are you only interested in the loss in the MOSFET?

And are you interested in the switching losses of the MOSFET? That would all make it more complex.

But basically, the efficiency doesn't really change when you dim: the switching losses are not very high (in a good design). So that leaves you with the steady-state efficiency.

What will change is the power it draws. Is that what you're interested in? If you have a setup which is 90% efficient (whether it is a switch-mode constant-current driver or whatever) for a 3W LED and you dim it to 50%, the input power you need is
50% × (3W / 90%) = 0.5 × 3W / 0.9 ≈ 1.7W, and the output power is 50% × 3W = 1.5W.
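(Just to spell that arithmetic out, here is a minimal Python sketch of the same calculation; the 3W, 90% and 50% figures are only the example values from above, and the driver efficiency is assumed constant.)

```python
# Example: power drawn by a dimmed LED behind a driver of constant efficiency.
# All values are just the example numbers from the paragraph above.
led_power = 3.0     # W, LED power when fully on
efficiency = 0.90   # driver efficiency, assumed constant over load here
duty_cycle = 0.50   # PWM duty cycle

output_power = duty_cycle * led_power    # 1.5 W delivered to the LED
input_power = output_power / efficiency  # ~1.67 W drawn from the source

print(f"output: {output_power:.2f} W, input: {input_power:.2f} W")
```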

But what is your real concern?

What I mean is: Imagine I have a 10W LED lamp that, for the sake of simplicity, I drive directly from a 10W-output power supply, with ideal PWM switches (I do not want to consider the loss in the MOSFET).
The full-load efficiency of the supply may be 83%, the 50%-load efficiency may be 50%, and the no-load power consumption may be 1W.

I am now driving my LED with a 50% duty cycle via PWM with frequency f.
What will my power consumption then be?

  1. 50% × 1W + 50% × 10W / 83% ≈ 6.5W, that is, the mean of the two consumptions, or
  2. 5W / 50% = 10W?

Obviously, when f is very low, then it is just an LED that is sometimes switched on, and sometimes switched OFF. That is clearly case 1.
When f gets very high, though, at some point the processes in the supply will not be able to resolve the load change and case 2 applies.
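(To make the two bounding cases concrete, here is a minimal Python sketch with the example figures from above; the 83%, 50% and 1W values are only my assumed numbers, not data for any particular supply.)

```python
# Two bounding estimates for the input power of a supply feeding a PWM-dimmed load.
# All figures are the example values from the post above.
p_led = 10.0     # W, load while the PWM is ON
duty = 0.50      # PWM duty cycle
eff_full = 0.83  # efficiency at full load
eff_half = 0.50  # efficiency at 50% load (example value)
p_noload = 1.0   # W, no-load consumption of the supply

# Case 1: the supply resolves the PWM, alternating between full load and
# no load, so the input power is the average of the two states.
case1 = duty * (p_led / eff_full) + (1 - duty) * p_noload  # ~6.5 W

# Case 2: the supply cannot follow the PWM and effectively sees the mean
# load, so the 50%-load efficiency applies.
case2 = (duty * p_led) / eff_half  # 10 W

print(f"case 1: {case1:.1f} W   case 2: {case2:.1f} W")
```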

Now the questions are:

  1. Is that consideration correct?
  2. Where does that transition actually happen? Is a common, real power supply in case 1 or 2?
  3. Is there a similar view of the processes in a CC driver?

The PLC-45 is not dimmable so PWM is not relevant.*

For the LDB-300L, they give you efficiency graphs for varying voltages (in and out) but not for varying current. I assume that's at 100%, and that should represent worst-case power-loss.

If you want to know the efficiency at 50% PWM, you'd probably have to measure it. (Of course it's zero percent efficient at zero percent PWM, and the specs show it consumes 10mA with "no load".)

ElCaron:
Obviously, when f is very low, then it is just an LED that is sometimes switched on, and sometimes switched OFF.

The specs call for 100-1000Hz PWM; I assume below that you begin "blinking" it.

  • Like all switching supplies there is internal PWM.

DVDdoug:
The PLC-45 is not dimmable so PWM is not relevant.*

But I can put a MOSFET between the power supply and the LED. I even kind of explicitly mentioned that in my last post.

DVDdoug:
For the LDB-300L, they give you efficiency graphs for varying voltages (in and out) but not for varying current. I assume that's at 100%, and that should represent worst-case power-loss.

Of course they don't give you that for varying currents, because there shouldn't be any variation with a CC supply. The various voltages also do not help me. They tell me the difference between the efficiencies of powering one 5W LED or two 5W LEDs, but not for running a 10W LED with a 50% duty cycle.

DVDdoug:
If you want to know the efficiency at 50% PWM, you'd probably have to measure it.

I am pretty sure one could do some theoretical considerations. I tried it above. One of the things I am missing is the common frequency that switching supplies are switched at.

DVDdoug:
(Of course it's zero percent efficient at zero percent PWM, and the specs show it consumes 10mA with "no load".)

That is why I gave a no-load consumption in my example calculations above :)

ElCaron:
Obviously, when f is very low, then it is just an LED that is sometimes switched on, and sometimes switched OFF. That is clearly case 1.

Yeah.... but remember that when the PWM frequency is super low.... e.g. 0.1 Hz with a 50% duty cycle.... then the load will be powered for 5 seconds and unpowered for 5 seconds. So it's going to be off for quite a long time. This is basically a DC consideration, and the efficiency can be considered in DC terms.

If considering power efficiency, you just need to model (or measure) the losses in the source (the power supply) and know the amount of power delivered to the load.

For relatively high frequency, the power into the load will depend on frequency... and depends on what the load is of course... like...resistive...or a mix of resistive and reactive. The efficiency would also depend on what's inside the power source. Switch mode power supply efficiency is relatively high due to the reactive components (L and C) it uses.

ElCaron:
One of the things I am missing is the common frequency that switching supplies are switched at.

No such thing. Switching frequency could be anything from 20kHz to 2MHz. It is usually kept above audible frequencies.

It depends on how an individual LED power supply accomplishes PWM. Some are going to actually turn current to the LEDs on and off, some will adjust the constant current according to duty cycle of the incoming PWM.

There is no one single design that every LED constant current switch mode power supply uses. So you have to rely on the specs and your own testing.

BTW, the datasheet for the LDB-L LED CC regulator gives the switching frequency that they use, but I don't see how that will help you calculate efficiency.

Also, page 3 of that datasheet would seem to indicate that it regulates current to a constant value based on the duty cycle of the incoming PWM, not by shutting the LEDs on and off with the incoming PWM.

Ok. It seems to come down to measuring …

I have the problem that I want to dim RGB strips with MOSFETs. I want to illuminate the whole wall around the room, about 16 meters in total. The issue is that I do NOT want to do that with the full, almost 200W (let's say 12W/m) of light that 16m of usual RGB strip puts out. From my experience with strips as indirect lighting, <20W will probably be enough, so a duty cycle of about 10%.
However, during the ON part of the duty cycle the full 192W are drawn, so I would not feel good just putting in a small power supply because the mean draw is lower.
On the other hand, if I install a sufficiently powerful supply like this, I wonder if I will have terrible efficiency because I am at a mean of 10% load, or good efficiency, because full power is drawn during the ON time.
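(For reference, the rough numbers behind that, as a small Python sketch; the 12W/m figure and the 10% duty cycle are only my own estimates.)

```python
# Rough numbers for the strip: peak draw while ON vs. mean draw at 10% duty.
strip_length = 16.0  # m of RGB strip
power_per_m = 12.0   # W/m, my estimate for a usual 12V RGB strip
duty = 0.10          # intended dimming duty cycle

peak_power = strip_length * power_per_m  # ~192 W while the PWM is ON
mean_power = duty * peak_power           # ~19 W on average

print(f"peak: {peak_power:.0f} W, mean: {mean_power:.0f} W")
```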

ElCaron:
However, during the ON part of the duty cycle the full 192W are drawn, so I would not feel good just putting in a small power supply because the mean draw is lower.

Probably overthinking things. If you use a MOSFET switch, you usually get very high efficiency. Driving LEDs will just be a matter of making sure that the power source is able to output the required power. The LEDs will be using that power.

A smaller power supply will have relatively less power output capability. The 'draw' is basically how much power or current the load draws from the power supply.

You are using "dumb" RGB LED strips, with a +12V input and then R, G, B for control lines?
Probably better off with the room in multiple segments, with a power supply per segment.
12V, 10A supplies are not expensive, and 3 N-channel MOSFETs per segment to sink current thru the LEDs.

Good DC-DC converters will have an efficiency vs. load graph. You always have to size the supply to handle the peak possible current anyway, so I think you are pretty much resigned to running most of the time below maximum efficiency (which is usually fairly near the maximum power output). However, good supplies still have reasonable efficiency across most of the output current range.

Thanks for the further inputs. I'll look for a sufficiently powered one then. I'll report what efficiency I measure.

CrossRoads:
You are using "dumb" RGB LED strips, with a +12V input and then R, G, B for control lines?
Probably better off with the room in multiple segments, with a power supply per segment.
12V, 10A supplies are not expensive, and 3 N-channel MOSFETs per segment to sink current thru the LEDs.

I will segment anyway, one wall, one controller, so 12 MOSFETs overall. I don't see how multiple power supplies would help me, though. They will still be in the low load range.

ElCaron:
On the other hand, if I install a sufficiently powerful supply like this, I wonder if I will have terrible efficiency because I am at a mean of 10% load, or good efficiency, because full power is drawn during the ON time.

  1. Forget efficiency, you are totally misusing the word, and it is not relevant, in any way to your project.

  2. If you only have a 10% draw from your LEDs then you only need a power supply capable of supplying 10% of the maximum current ( plus the relevant safety margin of say another 10% to 20% ). Sure the peak current will be the same as if you had the LEDs running at full power but that would be smoothed out by the large capacitor you would put on the output of the power supply. This capacitor would keep up the voltage during the ON peaks of the PWM.

Grumpy_Mike:

  1. Forget efficiency, you are totally misusing the word, and it is not relevant, in any way to your project.

Could you elaborate that? Since we are talking about the "EFFICIENCY vs LOAD" graphs in the datasheets, I don't quite understand how that is a misuse of the word.

Grumpy_Mike:
2) If you only have a 10% draw from your LEDs then you only need a power supply capable of supplying 10% of the maximum current ( plus the relevant safety margin of say another 10% to 20% ). Sure the peak current will be the same as if you had the LEDs running at full power but that would be smoothed out by the large capacitor you would put on the output of the power supply. This capacitor would keep up the voltage during the ON peaks of the PWM.

That would be an option. However, I would still be interested in my question. Why would I go and implement a software limit for the dimmer if I don't have to?

Should just figure out how much power your set of LEDs require..... such as "P" Watts.

Then choose a switch mode power supply that can output....say...

Psupply(max) = P * (1/0.7) = P * (10/7) W, so the LEDs draw about 70 percent of the max output power of the supply.
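(A minimal sketch of that rule of thumb in Python; the 70% loading factor is just the suggestion above, not a hard rule.)

```python
# Rule of thumb: size the supply so the load sits at about 70% of its rating.
def supply_rating(load_power_w, loading=0.70):
    """Minimum supply rating so that load_power_w is `loading` of the rating."""
    return load_power_w / loading

print(supply_rating(10.0))  # a 10 W load -> ~14.3 W, so e.g. a 15 W supply
```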

Well, without the capacitor that Grumpy_Mike mentioned, I am still not sure if that is a wise idea. During the ON part of the PWM cycle, the strip will draw 20A. Do you really think that it is safe (or even possible) to take that from a power supply that is perhaps built for 4A, only because the 20A are only drawn 10% of the time at 1kHz?

To store the charge for one full cycle at 1kHz (20A for 1ms), the capacitance would have to be 1.7mF. But the capacitor voltage drops as it discharges, so it would have to be much bigger or the voltage from the power supply would sag considerably in each cycle, which doesn't sound too healthy.
I would need 10mF if the voltage may drop to 10V, 20mF for 11V and 40mF for 11.5V.
So would I actually put in something like this?: http://www.mouser.de/ProductDetail/Kemet/ELH229M025AS4AA/?
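(For reference, a minimal Python sketch of that estimate, treating the strip as a roughly constant-current load and sizing for one full 1kHz period at 20A; those figures are only the assumptions from above.)

```python
# Hold-up capacitor estimate: C = Q / dV, with Q = I * t and a roughly
# constant-current discharge. Figures are the assumptions used above.
i_peak = 20.0   # A drawn by the strip while the PWM is ON
t = 1e-3        # s, one full period at 1 kHz (worst-case assumption)
q = i_peak * t  # 20 mC of charge per period

for v_min in (10.0, 11.0, 11.5):
    dv = 12.0 - v_min  # allowed sag below the 12 V rail
    c = q / dv         # required capacitance in farads
    print(f"drop to {v_min} V -> {c * 1e3:.0f} mF")
```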

No point in doing anything until you can figure out how much real power your full set of LEDs require.

If 10 Watts.... try a 15 Watt supply.

No point in doing anything until you can figure out how much real power your full set of LEDs require.

One of us is not understanding the point.

I don't know how much MEAN power my full set of LEDs will consume when dimmed to a sufficient level. But I know one thing: it will be MUCH less than the roughly 200-240W that it will draw during the ON part of the cycle.
It doesn't matter if the mean is 10W, 20W or 50W; my concerns about a 15W, 30W or 75W power supply that I stated above stand.

Did you read anything that I said?