We're supposed to set the brightness of our light bulb to a value that corresponds to a given illuminance in lux.
Suppose we're in a laboratory that has to have 300 lux. To satisfy that, we need to set the bulb's brightness to whatever analog value (0 to 255) is equivalent to 300 lux. Does anyone know the relationship between them?
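For context, this is roughly how we're driving the bulb at the moment. It's a minimal sketch, assuming an Arduino-style PWM pin and a dimmer module that accepts an 8-bit level; the pin number and the level value are placeholders:

// Minimal sketch: drive the dimmer with an 8-bit level (0-255).
// The mapping from this level to lux is exactly what we don't know yet.

const int DIMMER_PIN = 9;        // placeholder: any PWM-capable pin
int levelFor300Lux = 128;        // placeholder: the value we're trying to find

void setup() {
  pinMode(DIMMER_PIN, OUTPUT);
}

void loop() {
  analogWrite(DIMMER_PIN, levelFor300Lux);  // 0 = off, 255 = full brightness
}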
No... You'll have to get a lightmeter and do the calibration yourself.
The brightness depends on the bulb (its wattage rating and efficiency), the average/RMS voltage* applied to it, the light fixture, the distance from the bulb, the reflectivity of the room, windows or other sources of light, etc.
* An incandescent light is dimmed by turning off the power for part of the AC cycle (similar to PWM dimming). But depending on the software, there is no guarantee that a value of 126 corresponds to half-voltage or half-power (not the same thing), and in fact it may not be controlled by a 0-255 value in software.
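As a rough idea of how a calibration run could look, here is a hedged sketch, assuming an Arduino-style PWM output on a hypothetical DIMMER_PIN; at each step you would read the lux value off your meter and write it down next to the printed level:

// Calibration sweep: step through the 0-255 range, hold each level for a few
// seconds, and record the lux reading from the meter at each step.

const int DIMMER_PIN = 9;   // placeholder: whatever pin drives the dimmer

void setup() {
  pinMode(DIMMER_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  for (int level = 0; level <= 255; level += 5) {
    analogWrite(DIMMER_PIN, level);
    Serial.print("level = ");
    Serial.println(level);    // note the lux reading for this level
    delay(5000);              // give the bulb and the meter time to settle
  }
  while (true) { }            // stop after one sweep
}

From the resulting table you can pick the level whose measured lux is closest to 300, or interpolate between the two nearest readings.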
DVDdoug:
No... You'll have to get a lightmeter and do the calibration yourself.
I forgot to mention that I have a lux meter.
If I have to do the calibration myself, would you mind elaborating? I'm a beginner and I haven't the slightest idea what to do. Thank you very much, sir.