I want to learn how to do the math to pick the right LED drivers to run LED lights. These are the specs that I am working with:
3W Royal Blue LED: DC Forward Voltage: 3.6V~3.8V, Forward Current: 700mA
3W Red LED: DC Forward Voltage (VF): 2.1V-2.4V DC, DC Forward Current (IF): 350mA~1000mA
100W Full Spectrum LED: Current: 3A, Voltage: 30V-36V
Specs of the constant-current LED driver:
100W, Input Voltage: 85V-265V, Output Voltage: 32-34V, Current: 3000mA, Efficiency: 88%, Power Factor: 0.98, Operating Temp: -20°C to 80°C
If I am understanding correctly, I could use one 100W LED driver with the specs above to drive one 100W full spectrum LED: one end to the AC outlet, the other end wired directly onto the LED. Correct?
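To sanity-check that pairing, here is the comparison I'm making between the driver's output and the full spectrum LED's specs, written out as a little Python sketch (the numbers are just the specs quoted above):

```python
# Full spectrum LED specs (from above)
led_v_min, led_v_max = 30.0, 36.0   # forward voltage range (V)
led_current = 3.0                   # rated current (A)

# Driver specs (from above)
driver_v_min, driver_v_max = 32.0, 34.0   # output voltage range (V)
driver_current = 3.0                      # constant current (A)

# My understanding: the driver's output window should fall inside the LED's
# voltage range, and its constant current should not exceed the LED's rating.
voltage_ok = led_v_min <= driver_v_min and driver_v_max <= led_v_max
current_ok = driver_current <= led_current

print(f"Voltage window fits the LED range: {voltage_ok}")
print(f"Driver current within the LED rating: {current_ok}")
print(f"Power at the midpoint: {((driver_v_min + driver_v_max) / 2) * driver_current:.0f} W")
```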
Now, the rest is where I get confused, and when I try to research it I come up with totally different stuff, so I know I am using the wrong lingo. Hopefully you guys can help me out in simpler terms, or give me helpful keywords to Google. I don't have much experience with electronics, but I don't mind research and reading.
So one 100W LED driver should be able to power 30 of the 3W LEDs; since that only adds up to 90W, I would have 10W to spare. From what I've read it is not that straightforward, especially since the specs of the actual LEDs are different.
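Just so it's clear what naive budget I mean, this is the watts-only arithmetic I'm doing (a tiny sketch; I realize it ignores voltages and currents entirely):

```python
driver_watts = 100
led_watts = 3

max_leds_by_watts = driver_watts // led_watts   # 33, going by wattage alone
used_watts = 30 * led_watts                     # 90 W for 30 LEDs
spare_watts = driver_watts - used_watts         # 10 W left over
print(max_leds_by_watts, used_watts, spare_watts)
```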
Let's start with the blues. The LED driver has 30V to 34V output; I will use 32V for the math since it uses some for itself. So 32V / 3.7V = 8.65, which I will round up to 9. I can connect 9 blue LEDs in series, then connect them to the driver output of around 32V.
W = A × V = 9 × (0.7A × 3.7V) = 23.31 watts
Does that mean I can parallel 4 sets of 9 in series, and it would be 23.31 × 4 = 93.24 watts of usage?
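Here is that string math written out as a sketch, using 3.7V as the assumed midpoint of the blue LED's 3.6V-3.8V spec and the 32V driver voltage I picked above:

```python
vf = 3.7           # assumed forward voltage per blue LED (midpoint of the 3.6V-3.8V spec)
i_string = 0.7     # rated current of one blue LED, shared by a whole series string (A)
driver_v = 32.0    # the driver voltage I am using for the math

leds_per_string = driver_v / vf            # 32 / 3.7 = 8.65, which I rounded up to 9
string_voltage = 9 * vf                    # 9 LEDs in series drop 33.3 V
string_power = string_voltage * i_string   # 33.3 V x 0.7 A = 23.31 W per string
total_power = 4 * string_power             # 4 strings: 93.24 W
total_current = 4 * i_string               # 2.8 A total, if each parallel string really carries 0.7 A

print(f"LEDs per string (before rounding): {leds_per_string:.2f}")
print(f"One 9-LED string: {string_voltage:.1f} V, {string_power:.2f} W")
print(f"Four strings in parallel: {total_power:.2f} W, {total_current:.1f} A")
```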
Using an online LED calculator with 32 volts as the source, 3.7V VF, 700mA current, and 30 LEDs, I get a total circuit current consumption of 2499.5mA. But if I put in 31 LEDs, the current consumption for the circuit goes down to 2456mA, which is counterintuitive, since I figured the more LEDs, the more current consumption.
It seems I am having a hard time grasping the concept of amps. Watts = Amps × Volts. Since the LED driver is in the range of 30 to 36V, right at 33.33V at 3A it equals the 100W they advertise. So is the 3A per volt, or is there only 3A available even at 35V?
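Here is what I mean, just rearranging Watts = Amps × Volts with the advertised numbers (a sketch, nothing more):

```python
rated_watts = 100.0
rated_current = 3.0   # A

# Rearranging Watts = Amps x Volts: the voltage where 3 A gives exactly 100 W
v_at_full_power = rated_watts / rated_current
print(f"{v_at_full_power:.2f} V x {rated_current:.0f} A = {v_at_full_power * rated_current:.0f} W")

# What the same 3 A would mean at other voltages in the 30-36 V range
for v in (30, 33.33, 35, 36):
    print(f"{v} V x {rated_current:.0f} A = {v * rated_current:.0f} W")
```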
The other side of the amps question is on the LEDs themselves. Blue LEDs are 3.7V and 0.7A current. If I only have 3A total available, and each LED uses 0.7A, does that mean I could only light up 4 or 5 LEDs before I max out on amps? I think the red LEDs use 1A; does that mean I could only light up 3 of those LEDs with a 100W unit?
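This is the arithmetic behind that worry, simply dividing the driver's 3A by each LED's rated current, which is how I'm currently picturing it (whether that picture is right is exactly my question):

```python
driver_current = 3.0      # A, the driver's constant current

blue_led_current = 0.7    # A per blue LED
red_led_current = 1.0     # A per red LED (top of its 350mA-1000mA spec)

# If every LED took its own 0.7 A or 1 A slice out of the 3 A
# (the way I am currently imagining it), this is how many could run:
print(f"Blue LEDs: {driver_current / blue_led_current:.1f}")  # about 4.3
print(f"Red LEDs:  {driver_current / red_led_current:.1f}")   # 3.0
```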
All the examples I've found online do the math and use a resistor to make up the difference in volts, but I haven't found anything that goes into more detail on how the math works for the current and the amps.
For example, if I have a 12V source and LEDs with a 2V VF, I could put 6 in series and plug them straight into the source, or I could use 6 resistors and connect the 6 LEDs in parallel. But what is going on with the amps and the current? Going back to my blue 9-LED series string: the driver is 32V at 3A. If I connect one series string of 9, would the whole string get the 3A? If I parallel two strings, would each string then only get 1.5A?
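For completeness, here is that last bit of arithmetic as a sketch: the 12V example (with a hypothetical 20mA LED current for the resistor case, since I didn't spec one) and the even split of the driver's 3A across parallel strings that I'm guessing at. The resistor line uses the standard R = (Vsource − VF) / I formula from the online examples:

```python
# 12 V source with 2 V forward-voltage LEDs
source_v = 12.0
vf = 2.0
i_led = 0.02   # hypothetical 20 mA LED current, just so the resistor math has a number

leds_in_series = source_v / vf            # 6 LEDs in series drop the whole 12 V
print(f"LEDs in series on 12 V: {leds_in_series:.0f}")

# One LED plus one resistor per parallel branch: R = (Vsource - VF) / I
r = (source_v - vf) / i_led
print(f"Resistor per parallel LED: {r:.0f} ohms")

# Back to the blue strings on the 32 V / 3 A driver:
driver_current = 3.0
for strings in (1, 2, 4):
    per_string = driver_current / strings   # assuming the 3 A splits evenly between identical strings
    print(f"{strings} string(s): {per_string:.2f} A each")
```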