Hello:
I am making a prototype Uno-based controller to drive/monitor an array of 3 W LEDs for an indoor grow light.
I plan to power the Arduino and LEDs with some cannibalized 12 V supplies from old computers (the Arduino switches power on/off according to the desired photoperiod, maybe irrigation later...).
The LEDs are “full-spectrum, 3W” LEDs mounted on chips (small PCBs with solder lands on them). There is VERY little documentation with them other than an “apparently nominal” voltage rating of 3.2 volts printed on the package; that is all the documentation I have.
I have been experimenting by driving three of these LEDs in series and setting the string current with a single IRF510 MOSFET, with the LEDs in the drain circuit. So the circuit is just: Vcc to the 3-LED series string, string to the IRF510 drain, source grounded. That’s it.
The issue is: how best to bias each MOSFET-controlled string (and I plan to use a LOT of these 3-LED strings), with the Arduino turning each string on/off through shift registers. I know I could set up a feedback loop to monitor the current in each string, but I want this simple and relatively cheap.
Plan A: Just use some plain old JFETs in parallel, each with gate shorted to source, to act as a constant-current source. The ones I have been using are good for about 50 mA each, so I have been paralleling 4 to 5 of them, which puts about 200-225 mA through the string without anything warming up at all. I am quite sure they could handle more current, but I am still experimenting and want to stress the LEDs as little as possible while still emitting “substantial” light.
This method DOES keep the LED currents stable and constant; the only issue is that changing the LED current/illumination means hardware changes. I anticipate adjusting each string’s drive current to find a “sweet spot” of illumination, heat generation, and LED longevity.
Plan B: Use a single IRF510 (or another type) MOSFET to set each string’s current. Right now I am just using a variable resistor in a voltage divider to bias the gate for the desired drain current. The problem is that I can’t control the current very precisely; it drifts up and down (I am monitoring the string current DIRECTLY with a Fluke 77).
I like the MOSFET-controlled idea because it keeps the parts count down and could more easily evolve into something more elaborate with feedback control and some additional circuitry, but for now I am just trying to get the LED currents stable using the MOSFET alone.
Thanks for any ideas!