Power LED regulation with PWM

I am trying to avoid ordering more parts and use what I have to hand, namely an ATmega328 with Arduino firmware. Is it feasible to run a 3W LED via a transistor and a PWM pin to control the apparent voltage the LED sees?

Basically the LED needs 10.2V and I have 12V. If I set the PWM to about 85% duty, would that effectively control the LED without killing it?
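For what it's worth, the 85% figure does line up with the ratio of the two voltages. A quick back-of-envelope check (the 0-255 scale is the standard Arduino `analogWrite()` range), with the caveat the replies below make clear: PWM only sets the *average* voltage, and the LED still sees the full 12V peak during each on-pulse.

```python
# Back-of-envelope check of the duty cycle the OP proposes.
# NOTE: this is average voltage only; the peak is still the full 12V.

def pwm_duty(v_target: float, v_supply: float) -> float:
    """Duty cycle needed for a given average voltage."""
    return v_target / v_supply

def analog_write_value(duty: float) -> int:
    """Arduino analogWrite() argument on the 8-bit (0-255) scale."""
    return round(duty * 255)

duty = pwm_duty(10.2, 12.0)
print(f"duty = {duty:.2%}")                               # 85.00%
print(f"analogWrite value = {analog_write_value(duty)}")  # 217
print(f"average voltage = {duty * 12.0:.1f} V")           # 10.2 V
```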

I have all the parts to hand to make this, but if I have to I'll just buy a proper DC-DC converter.

Thanks for any suggestions :-)

What sort of LED? (datasheet)

What transistor?

Do you have some current-limiting set up?

Basically the LED needs 10.2V and I have 12V. If I set the PWM to about 85% duty, would that effectively control the LED without killing it?

Basically PWM does not control the voltage as such; it gives a pulsed signal. See this: http://www.thebox.myzen.co.uk/Tutorial/PWM.html So it will always send out the peak voltage. Whether this is too much for your LEDs is hard to say, but generally you need a constant-current supply for high-power LEDs. It depends on the peak power rating of the LED and whether there is something that will limit the current.

Hi,

In this case a DC-to-DC converter isn't necessary and probably isn't worth the effort.

The LED has a forward voltage of 10.2V. You plan to use a transistor to switch the current; check its datasheet for the exact voltage drop, or measure it in a circuit. For this example, let's estimate a drop of about 0.5V. The excess voltage in this configuration is then (12V supply - 10.2V LED - 0.5V transistor) = 1.3V.

Adding a resistor to this circuit to take up the 1.3V and limit the current to the LED is a common practice. To calculate the resistor value you will need the peak forward current of your LED from its datasheet; that also lets you calculate the required power rating for the resistor. There are numerous resources on the web that explain how to calculate and select the proper series resistor value.
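As a worked example of that calculation: the 300 mA figure below is a guess for a typical 3W LED at 10.2V (3W / 10.2V ≈ 0.29A), not something from the thread, so substitute the forward current from your actual datasheet.

```python
# Series-resistor sizing for the 1.3V of excess voltage worked out above.
# ASSUMPTION: the LED's forward current is ~300 mA (3W / 10.2V ~= 0.29A);
# replace with the datasheet figure for your part.

def series_resistor(v_excess: float, i_forward: float) -> tuple[float, float]:
    """Return (resistance in ohms, power dissipated in watts)."""
    r = v_excess / i_forward
    p = v_excess * i_forward
    return r, p

r, p = series_resistor(1.3, 0.300)
print(f"R = {r:.2f} ohm")  # ~4.33 ohm; nearest standard value is 4.3 ohm
print(f"P = {p:.2f} W")    # ~0.39 W; use at least a 1/2 W resistor
```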

Let's look at the overall power efficiency of this option versus a DC-to-DC converter. Say you purchase a converter that claims to be 90% efficient. In this example the resistor burns 1.3V of the 12V, which works out to roughly 89% power efficiency, so it's very close.
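The ~89% figure can be reproduced like this. Note it counts only the resistor as a loss; if you also count the 0.5V transistor drop (assumed above) as a loss, the LED itself gets 85% of the supply power.

```python
# Rough efficiency of the resistor approach. The same current flows
# through every series element, so voltage ratios equal power ratios.

V_SUPPLY = 12.0
V_RESISTOR = 1.3  # excess voltage burned in the resistor (from above)
V_LED = 10.2

# Counting only the resistor as a loss (the accounting used above):
eff_resistor_only = (V_SUPPLY - V_RESISTOR) / V_SUPPLY
print(f"{eff_resistor_only:.2%}")  # 89.17%

# Counting the transistor drop as a loss too (only the LED is "useful"):
eff_led_only = V_LED / V_SUPPLY
print(f"{eff_led_only:.2%}")       # 85.00%
```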

Is it a feasible idea to run a 3W LED via a transistor and a PWM pin...?

It requires a few more parts than that. As Mike says, high-power LEDs (1W or more) are usually run from a special constant-current LED power supply.

These circuits can be tricky to build, and most people just buy a constant-current supply. There are special LED-driver ICs, and some switching voltage-regulator chips can be used to make a switching current regulator.

Constant-current supplies normally do use PWM, but there's a bit more to it... There is an inductor to "smooth out" the current from the PWM pulse, and there is feedback "monitoring" the current and adjusting the pulse-width to keep the current constant.
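The feedback idea can be illustrated with a toy simulation. Everything here is invented for illustration (the averaged "plant" model, the gain, and the 300 mA target are all arbitrary, not a real converter design), but it shows the essence: measure the current, then nudge the pulse width toward the target.

```python
# Toy model of a current-regulating PWM loop (NOT a real converter model).
# ASSUMPTIONS (all invented): the inductor-averaged plant behaves like
# I = (duty * V_IN - V_F) / R_SENSE, the controller is simple integral
# action with gain K, and the target current is 300 mA.

V_IN, V_F, R_SENSE = 12.0, 10.2, 0.5
I_TARGET, K = 0.300, 0.01

def measured_current(duty: float) -> float:
    """Average LED current for a given duty cycle in this toy plant."""
    return max(0.0, (duty * V_IN - V_F) / R_SENSE)

duty = 0.0
for _ in range(2000):
    i = measured_current(duty)
    duty += K * (I_TARGET - i)       # nudge pulse width toward target
    duty = min(max(duty, 0.0), 1.0)  # duty cycle must stay in [0, 1]

print(f"duty settles at {duty:.4f}, "
      f"current = {measured_current(duty) * 1000:.1f} mA")
```

With these made-up numbers the loop settles at a duty of about 0.86, i.e. just above the OP's 85% guess, because the regulator finds whatever duty cycle yields the target current rather than a target voltage.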

Adding a resistor to this circuit to handle the 1.3V as a load and limit the current to the LED is a common practice.

That's NOT a common practice with high-power LEDs, and it's BAD PRACTICE when most of the voltage (~10V out of 12V) is dropped across the LED. Since LEDs are "constant voltage" devices, any change in power-supply voltage, or in the LED's forward voltage (due to temperature, etc.), ends up entirely across the resistor. For example, if the power-supply voltage were to rise by 10%, the voltage across the resistor would almost double, and therefore so would the current through the LED and resistor!
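That 10% example in numbers (using the 0.5V transistor drop assumed earlier in the thread):

```python
# Sensitivity of the series-resistor approach to supply variation.
# The LED clamps ~10.2V, so every extra supply volt lands on the resistor.

V_LED, V_TRANSISTOR = 10.2, 0.5  # transistor drop assumed as above

def resistor_voltage(v_supply: float) -> float:
    return v_supply - V_LED - V_TRANSISTOR

v_nominal = resistor_voltage(12.0)         # 1.3 V
v_high = resistor_voltage(12.0 * 1.1)      # supply up 10% -> 2.5 V
print(f"resistor voltage: {v_nominal:.1f} V -> {v_high:.1f} V")
print(f"current scales by {v_high / v_nominal:.2f}x")  # ~1.92x: nearly double
```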

You can get around that problem by using a higher-voltage supply (maybe 24V), but then the resistor needs to dissipate more than 1W. So the "standard solution" is to use a constant-current switching regulator, which is much more efficient.
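To put a number on the 24V option (again assuming ~300 mA forward current and the 0.5V transistor drop from above, neither of which was specified in the thread):

```python
# Resistor dissipation if the same LED is run from a 24V supply.
# ASSUMPTIONS: ~300 mA forward current, 0.5V transistor drop (as above).

I_F = 0.300
v_resistor = 24.0 - 10.2 - 0.5   # 13.3 V now lands on the resistor
p_resistor = v_resistor * I_F
print(f"resistor must dissipate {p_resistor:.2f} W")  # ~3.99 W
```

Roughly 4W in the resistor versus ~0.4W in the 12V case, and over half the input power wasted, which is why the switching regulator wins here.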

A regulated power supply won't vary by 10% (when operating normally), but the point is that the effect of any variation is magnified. And actually, you can often get away with this stuff in a one-off hobby project, because you can adjust the resistor value by trial and error to compensate for component variations. (That's assuming you don't blow up too many parts during the trial-and-error testing. ;) )

DVDdoug, just wanted to clarify a couple of things and explain my last recommendation a bit more. I did say it was common practice because it is; I agree that it's not best practice, though. I see this method of using a resistor for high-power LED current limiting employed in commercial products all the time. The most recent example is a high-powered LED flashlight I bought from Costco.

Your advice is true as it pertains to developing an ideal solution, especially if the OP is using an unregulated power supply, which wasn't specified; I assumed, perhaps incorrectly, that they weren't. I recommended a resistor over other solutions simply because their primary requirement specifically asked for cost effectiveness. Your recommendations are without question the best solution, but as you point out, it can be done successfully with a resistor if proper care is given to the design.

I personally wouldn't like to implement such a simple driver in one of my projects, but I would if I trusted the stability of my power source and was constrained by cost. I would also try to drive the LED under its maximum ratings.

Yes, you see a lot of very poor designs for sale commercially. In the case of a flashlight, part of the current-control mechanism is the battery's internal impedance, so it does not translate to a mains power supply. The point is that with a power LED, the constant-current-like behaviour you get from a series resistor does not work, because the resistor's value is so low. The forward voltage also fluctuates more as temperature and age affect it.