
Topic: LED dimming using Atmega328, generating 1V to 10V



My client needs a way to dim LEDs using an MCU (ATmega328 or ESP8266).
I am designing a module that can provide the dimming voltage to the LED driver.

The LED driver uses a 1V to 10V input to dim from 10% to 100% intensity, and my hardware must provide that voltage using the ATmega328. I have a 5V power supply. What are my viable options to step up from 5V so I can generate a variable 1V to 10V?
I am thinking of PWM through a GPIO, then using an optocoupler to switch a 10V rail at that duty cycle to get the 1V to 10V the driver needs.
An LM2577- or LM2596-based circuit can boost from 5V to 10V.

Please suggest anything better.

Thanks and Regards,


Most optocouplers are slow, so use a low PWM frequency and suitable RC filtering after the optocoupler.

Or use a high-speed optocoupler...


The (Meanwell?) LED driver might already provide that 10V internally.
You just need to switch that (DIM+ and DIM-) with e.g. an optocoupler.


I don't have details of the actual LED driver the client is using. I am required to provide 1V to 10V. For now I have used a 12V supply after the PWM and an RC filter to get a DC voltage from the PWM.

RC values are 2.2kΩ and 10µF.
This gives a cutoff frequency of about 7Hz.
The PWM frequency is 200Hz.
Will the driver work with the output impedance of the RC filter?


If you want 0-10V (or 1-10V) DC from a 5V PWM signal, there are basically two ways.

1) Amplify the 5V PWM to 10V PWM, and use an RC filter.
2) Use an RC filter to smooth the 5V PWM, and amplify the DC 2x with an op-amp.

#2 could be better if the load is unknown (#1 can't be loaded without voltage drop).
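Option 2 can be checked with the usual non-inverting gain formula. The resistor values below are illustrative, not from the thread; equal feedback and ground resistors give the 2x gain mentioned:

```c
/* Non-inverting op-amp stage: gain = 1 + Rf/Rg. With Rf = Rg the
 * smoothed 0-5V PWM DC becomes 0-10V. Resistor values illustrative. */
static double noninv_gain(double rf_ohm, double rg_ohm)
{
    return 1.0 + rf_ohm / rg_ohm;
}

static double vout(double v_in, double rf_ohm, double rg_ohm)
{
    return v_in * noninv_gain(rf_ohm, rg_ohm);
}
```

The op-amp supply must sit comfortably above 10V (e.g. the 12V rail already mentioned, ideally with a rail-to-rail-output part) or the output will clip before reaching the top of the dimming range.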
