PWM -> Linear Dimmer?

Hello,

I am working on a circuit for use with a dimming LED driver that takes a 0-10V linear input. I have worked with a variety of averaging/filtering circuits before, but I would like to hear opinions on the most cost-effective way to take the Arduino's 5V PWM output and convert it to a 0-10V analog signal corresponding to the duty cycle. The least amount of ripple possible is always desired, but I'd like to stay with a cost-effective method.

Also, when filtering out the high-frequency content to get an "averaged" signal, I would assume it's best to filter after amplifying to 10V; is this correct? Any advice or suggestions would be much appreciated.
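For concreteness, here is a sketch of the duty-cycle-to-voltage mapping I have in mind, assuming the Arduino's 8-bit analogWrite range, a 5V PWM high level, and an ideal 2x gain stage after the filter (the function names are just for illustration):

```python
# Sketch of the intended PWM -> 0-10V mapping (assumes an ideal
# 2x gain stage after filtering; values are illustrative only).

VCC = 5.0        # Arduino PWM high level, volts
GAIN = 2.0       # amplifier gain needed to reach 10 V full scale
PWM_MAX = 255    # 8-bit analogWrite resolution

def duty_for_target(v_out):
    """Return the analogWrite value whose average reaches v_out after the 2x stage."""
    duty = v_out / (VCC * GAIN)          # duty cycle, 0..1
    return round(duty * PWM_MAX)

def average_output(pwm_value):
    """Filtered-and-amplified DC level for a given analogWrite value."""
    return (pwm_value / PWM_MAX) * VCC * GAIN
```

So a target of 10V corresponds to full duty (255) and 5V to roughly half duty, which is the linearity I'm counting on.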

Thanks & Regards,

LP

I would assume it's best to filter after amplifying to 10V, is this correct?

Yes.

I would use a transistor to take it up to 10 or 12V, followed by a low-pass filter.
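Roughly, a single RC stage after the transistor could be sized like this. This assumes the Arduino's default ~490 Hz PWM on most pins; the 10k / 10 uF values are just a starting point, not a tested recommendation:

```python
import math

# Single-pole RC low-pass sizing sketch for smoothing ~490 Hz Arduino PWM.
# R and C below are illustrative guesses, not tested component choices.

F_PWM = 490.0     # default Arduino PWM frequency on most pins, Hz
R = 10_000.0      # ohms
C = 10e-6         # farads (10 uF)

def cutoff_hz(r, c):
    """-3 dB corner frequency of a single RC stage."""
    return 1.0 / (2.0 * math.pi * r * c)

def attenuation_at(f, r, c):
    """Magnitude response |H(f)| of the RC low-pass at frequency f."""
    return 1.0 / math.sqrt(1.0 + (f / cutoff_hz(r, c)) ** 2)

fc = cutoff_hz(R, C)                         # about 1.6 Hz for 10k / 10 uF
ripple_factor = attenuation_at(F_PWM, R, C)  # how much of the 490 Hz fundamental survives
```

With those values the corner sits around 1.6 Hz, so the 490 Hz fundamental is attenuated by roughly a factor of 300. The trade-off is a slow response to dimming changes; you can tune that by shrinking R or C, or by raising the PWM frequency.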

jengil:
I am working on a circuit for use with a dimming LED driver that takes a 0-10V linear input ... take the Arduino's 5V PWM output and convert that to a 0-10V analog signal corresponding to the proper duty cycle.

Hi LP, I'm not clear what question you are asking. To dim an LED, you need a PWM signal. You cannot dim an LED properly with an analog voltage.

Paul

I assumed, since he said this:

use with a dimming LED Driver that takes a 0-10V linear input.

That he wanted to condition a signal to feed into this driver.

Sounds good Mike, I will give that a try. And Paul, yes, I have a driver that uses an analog input voltage as its dimming control.

So you want to take a PWM signal from the Arduino, turn it into an analog voltage, feed that into your LED driver, which then turns the analog voltage back into a PWM signal again... hmm... I see a potential simplification there!

But if the driver in question is a constant-current power driver, the scope for simplification is limited.