I've got an electronic dimmable ballast with 0-10V control; its light output range is 1% to 100%.
Measuring the voltage on the ballast's control pins gives 10V, so the ballast itself is sourcing the current and the light output is at full power.
I've got a circuit using an LM324N op amp:
It receives the +5V PWM signal from the Arduino on pins 3 and 2
It receives +12V on pins 4 and 11
It outputs 0V to +12V on pin 1
The signal from the op amp then passes through a v-reg that brings it down to +10V.
With the Arduino PWM pin at value 0, measuring the v-reg output pin gives me 0V.
When I connect the ballast's control PIN+ to the v-reg output and PIN- to the common circuit ground, the ballast automatically dims the light to a certain value, but it's definitely more than 1% because the light is still intense.
If I measure the v-reg output pin I read something like +1.5V, even though the Arduino PWM value is still 0.
I assume the ballast is injecting 1.5V back into the circuit, and that's why I cannot dim fully down.
If I start increasing the PWM value, the ballast brightens up to full power.
I've got two questions for the HW gurus out there.
(The most important question) Is this the correct way to dim an electronic ballast using the 0-10V control?
How do I bring that +1.5V down to 0V when the ballast is connected?
(The most important question) Is this the correct way to dim an electronic ballast using the 0-10V control?
No.
How do I bring that +1.5V down to 0V when the ballast is connected?
Have a lower impedance driving the input pins.
It looks like the input pins need to sink current.
Could you post a schematic of the op amp and voltage regulator? From the description it sounds very wrong.
The voltage regulator on the end is just wrong. It takes a variable input and produces a fixed output, which is exactly what you don't want. What's more, when you give it less than its required input voltage it no longer acts as a voltage regulator at all.
Basically, ditch the voltage regulator and feed the output of the op amp directly into the base of a transistor, with the collector to 12V and the emitter through a 470R resistor to ground. Also connect the input pin of your controller to that resistor.
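A rough numeric sketch of how that emitter-follower sink would behave. The component values and the ballast current below are assumptions for illustration, not measurements from this thread:

```python
# Hedged sketch of the suggested emitter-follower current sink.
# Assumed values: ~0.7 V base-emitter drop, 470R emitter resistor,
# and a ballast control pin sourcing a constant 2 mA.

V_BE = 0.7         # approximate base-emitter drop, volts
R_E = 470.0        # emitter resistor, ohms
I_BALLAST = 0.002  # assumed ballast control-pin source current, amps

def control_pin_voltage(v_base):
    """Voltage at the emitter node, where the ballast control pin sits.

    The follower holds the node near v_base - 0.7 V, but it can only
    pull the node up; with the transistor off, the floor is set by the
    ballast's own current flowing through the emitter resistor."""
    return max(v_base - V_BE, I_BALLAST * R_E)

print(control_pin_voltage(0.0))   # floor of about 0.94 V with these values
print(control_pin_voltage(10.7))  # about 10.0 V, i.e. full brightness
```

Note the floor with these assumed values (~0.94V) is close to the 1V lower limit of the dimming range, which is why knowing the real control current matters for picking the resistor.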
Maybe I'm thinking about this wrong, but the v-reg in this case acts as protection for the ballast: since the op amp is fed 12V, the regulator won't let more than 10V through, but will pass anything below 10V.
What would be the role of the transistor in the scenario you've described?
We're talking about a Helvar EL239sc with two pins for the 0-10V control.
I was able to get the following info from Helvar:
The computer must have an analog output 0 - 10 VDC. Buffer should be a "sink"-type buffer. The current should be taken from
the HFC-ballast. The HFC-ballast is a current source.
Active light control range is 1 - 10 VDC. These are the limits when the light level is changing.
Maybe I'm thinking about this wrong, but the v-reg in this case acts as protection for the ballast,
Yes, your thinking is wrong.
Buffer should be a "sink"-type buffer. The current should be taken from
the HFC-ballast. The HFC-ballast is a current source.
That says it all: you don't need to supply a voltage to this controller at all, you need to sink current from it. Sinking current means draining it to ground.
What would be the role of the transistor in the scenario you've described?
To act as a low-impedance sink. An op amp can only sink a small amount of current; the transistor acts as a buffer that allows the controller to sink current.
(A buffer is something that sits between two signals. In this case it lets you turn a high-impedance output into a low-impedance one.)
The transistor would allow current to flow out of the controller pin until the voltage across the emitter resistor (plus the ~0.7V base-emitter drop) equals the voltage on the base, thus giving control of the current sink. For a proper design we would need to know what amount of current the controller sinks; that way we could pick exactly the right value for the emitter resistor.
@BillHo: the dimmable circuit was taken from that post, but the user never said whether it actually worked, nor mentioned the voltage injection from the ballast.
For a proper design we would have to know what amount of current the controller sinks. That way we could get the emitter resistor the exact right value.
I can do a real-world measurement later today, but I think I've read somewhere that the control current is 2mA at most.
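If that 2mA figure is right, Ohm's law gives an upper bound on the emitter resistor: with the transistor off, the full control current flows through the resistor, and the resulting voltage must stay below the 1V minimum-light threshold. A quick sanity check (the 2mA and 1V figures are taken from this thread, not a datasheet):

```python
# Hedged sizing check for the emitter resistor: with the transistor off,
# the ballast's control current flows through the resistor, so
# I * R must stay below the 1 V (1% light) threshold.

I_CONTROL_MAX = 0.002  # assumed maximum ballast control current, amps
V_FLOOR_TARGET = 1.0   # control voltage at minimum (1%) light, volts

def max_emitter_resistor(i_sink, v_floor):
    """Largest resistor keeping i_sink * R at or below v_floor (Ohm's law)."""
    return v_floor / i_sink

r_max = max_emitter_resistor(I_CONTROL_MAX, V_FLOOR_TARGET)
print(r_max)  # 500.0 ohms
```

With these assumed numbers the limit comes out at 500R, so the suggested 470R sits just under it; a real measurement of the control current would confirm whether that margin is enough.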
@Grumpy_Mike: So basically it will reduce the ripple effect?
Thanks for the patience and also for the contribution, but I'm a newb at electronics.
Did I get it right this time?
I'm now wondering about the working principle; after reading the Wikipedia article on transistors, does the following sentence describe it?
"In a grounded-emitter transistor circuit, (...), as the base voltage rises the base and collector current rise exponentially, and the collector voltage drops because of the collector load resistor."
From my understanding, when the op amp signal on the base starts rising, the ballast control voltage connected to the collector (+10V when base = 0V) will start to drop.