So, I'm in a predicament. I'm in the process of building my home theater and I want to use 12 volt LED tape for accent lighting (behind some acoustic panels and downlighting some steps). The AC lighting I installed will be on Z-Wave dimmers controlled through a Logitech Harmony Ultimate, which lets me program lighting levels into the Harmony's scenes (lights dim for the movie, come up during pause, etc.).
I found some dimmable power supplies whose AC input can be dimmed with a standard dimmer, and they then output a proportionally dimmed 12 volts to the LED strips. I'm hesitant about this for a few reasons: (1) the manufacturer doesn't guarantee compatibility with any of the Z-Wave 'low-voltage, magnetic' dimmers I found; (2) I need this to be flicker-free, especially at very low levels; and (3) they're expensive. I'd be paying around $600 just for the power supplies...
I had an idea, though. Could I 'read' the AC voltage level (assuming the AC dimmer actually reduces the voltage) with a basic voltage divider or something similar feeding an Arduino analog input, scale the reading to 0-255, and output that value as PWM on a digital pin driving a hefty transistor or SSR? That way I could power the LEDs from a single constant 12 volt, 600 watt power supply.
Would this idea work? Are there any reasons why it wouldn't? What would the circuit look like to read the AC? Any help would be really appreciated. I'd love to put the money towards something else in my theater!