Help understanding Amps, Current, Watts, Volts, etc. in layman's terms

I want to learn how to do the math to use the right LED drivers to run LED lights. These are the specs that I am working with:

3W Royal Blue LED: DC Forward Voltage: 3.6V~3.8V, Forward Current: 700mA

3W Red LED: DC Forward Voltage (VF): 2.1V-2.4V DC, DC Forward Current (IF): 350mA~1000mA

100W Full Spectrum LED: Current: 3A, Voltage: 30V-36V

Specs of the constant-current LED driver:

100W, Input Voltage: 85V-265V, Output Voltage: 32-34V, Current: 3000mA, Efficiency: 88%, Power Factor: 0.98, Operating Temp: -20 to 80°C

If I am understanding correctly, I could use one 100W LED driver with the specs above to drive one 100W full spectrum LED. One end to the AC outlet, the other end directly onto the LED, correct?

Now, the rest is where I get confused, and when I try to research I come up with totally different stuff, so I know I am using the wrong lingo. Hopefully you guys can help me out in more layman's terms, or give me helpful keywords to Google. I don't have much experience with electronics, but I don't mind researching and reading.

So one 100W LED driver should be able to power 30 3W LEDs; since that only adds up to 90W, I would have 10W to spare. From what I've read it is not that straightforward, especially since the specs of the actual LEDs are different.

Let's start with the blues. The LED driver has 30V to 34V output; I will use 32V for the math since it uses some for itself. 32V / 3.7V = 8.65, which I will round up to 9. I can connect 9 blue LEDs in series, then connect them to the driver output of around 32V.

W = A × V, so 9 × (0.7A × 3.7V) = 23.31 watts

Does that mean I can parallel 4 sets of 9 in series, and it would be 23.31 × 4 = 93.24 watts of usage?
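
Here is that arithmetic as a quick Python sketch, assuming a nominal 3.7V forward voltage (real parts vary a bit with temperature and binning):

```python
# Sanity check of the string math above (nominal values only).
vf = 3.7                 # assumed forward voltage per blue LED, volts
i = 0.7                  # drive current, amps
leds_per_string = 9
strings = 4

watts_per_string = leds_per_string * vf * i   # 9 * 3.7 * 0.7 = 23.31 W
total_watts = watts_per_string * strings      # 23.31 * 4 = 93.24 W
total_amps = i * strings                      # each parallel string adds 0.7 A

print(f"{watts_per_string:.2f} W per string, {total_watts:.2f} W total, "
      f"{total_amps:.1f} A from the driver")
```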

Using an online LED calculator with 32 volts as the source, 3.7 VF, 700mA current, and 30 LEDs, I get a total circuit current consumption of 2499.5mA. But if I put in 31 LEDs, the current consumption for the circuit goes down to 2456mA, which is counterintuitive, since I figured more LEDs would mean more current consumption.

It seems I am having a hard time grasping the concept of amps. Watts = Amps × Volts. Since the LED driver's range is 30 to 36V, right at 33.33V at 3A it hits the 100W they advertise. So is the 3A per volt, then, or is there only 3A available even at 35V?

The other side of the amps is on the LEDs. The blue LEDs are 3.7V and 0.7A. If I only had 3A total available, and each LED uses 0.7A, does that mean I could only light up 4 or 5 LEDs before I max out on amps? I think the red LEDs use 1A; does that mean I could only light up 3 of those with a 100W unit?

All the examples that I've found online do the math and use a resistor to make up the difference in volts, but I haven't found anything that goes into more detail about how the math works for the current and the amps.

For example, if I have a 12V source and LEDs with a 2V VF, I could put 6 in series and plug them straight into the source, or I could use 6 resistors and connect 6 LEDs in parallel. But what is going on with the amps and the current? Going back to my blue 9-LED series: the driver is 32V at 3A. If I connect one series of 9, the whole series would get the 3A; if I parallel two series, would each series then only get 1.5A?
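
Here is the idealized picture I have in my head, as a sketch (assuming perfectly identical strings, which I gather real parts never are):

```python
# Idealized current split from a constant-current source.
# In series, the SAME current flows through every LED; it is not "used up".
# In parallel, the source current divides between the strings.
source_current = 3.0   # amps from the driver

for strings in (1, 2, 3):
    per_string = source_current / strings   # only true for identical strings
    print(f"{strings} string(s): {per_string:.2f} A through each LED")
```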

The rules change for LEDs once the current rises above 100mA. Below this you can use resistors to limit the current. Above it you have to use a constant-current supply. Basically, a constant-current supply can only control one LED, or, if there is enough voltage, several LEDs in series. That means that all those LEDs in series have to want/need the same current. You cannot mix and match LEDs on a single constant-current drive.
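
As a rough illustration of that rule, here is a check you could run before wiring a string to a constant-current driver; the 18-36V window below is a made-up example, so check your driver's datasheet:

```python
# Hypothetical pre-wiring check for a series string on a CC driver.
# Rule: every LED must want the driver's current, and the summed
# forward voltages must fit inside the driver's output window.

def check_string(leds, driver_amps, v_min, v_max):
    """leds is a list of (forward_voltage, rated_amps) tuples."""
    if {amps for _, amps in leds} != {driver_amps}:
        return "FAIL: all LEDs must share the driver's current rating"
    total_vf = sum(vf for vf, _ in leds)
    if not v_min <= total_vf <= v_max:
        return f"FAIL: string needs {total_vf:.1f} V, outside {v_min}-{v_max} V"
    return f"OK: {total_vf:.1f} V string at {driver_amps} A"

blue = (3.7, 0.7)                      # the blue LED from the first post
print(check_string([blue] * 9, 0.7, 18.0, 36.0))
```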

I think you are mixing things up.

Yes, by all means, I do realize I am the one who has it all mixed up; once the math is understood it all does add up, for the most part :wink:

The LED driver I mentioned in my post is constant current: 100W, 30 to 34V output, 3A... So the constant-current part means the 3A is the constant? So I can power the 100W LED that runs on 30 to 36V at 3A, correct?

Now, as for the other LEDs: since they call for 0.7A and 1A, and the 100W LED driver is at 3A, it wouldn't work for those? Would I then need one LED driver that runs 0.7A for my blue LEDs, and another LED driver that runs 1A for the red LEDs?

There is no easy way to break the 3A down to run 1A LEDs?

The LED driver I mentioned in my post is constant current: 100W, 30 to 34V output, 3A... So the constant-current part means the 3A is the constant?

Yes.

So I can power the 100W LED that runs on 30 to 36V at 3A, correct?

One LED is not going to take 30V; it will typically take 3 to 4V if it is a single-element LED. That voltage rating is the maximum you need to be able to supply. A 3A constant-current drive could be powered by a 5V supply, so long as it could supply 3A. What a constant-current drive does is automatically adjust the voltage output until 3A flows. If it can do that with a few volts, there is no need for a power supply capable of more voltage.
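
To make "adjusts the voltage output until 3A flows" concrete, here is a toy model; the LED curve is invented purely for illustration (real LEDs follow an exponential diode curve):

```python
# Toy constant-current driver: raise the output voltage until the
# target current flows, capped at the driver's compliance maximum.

def led_current(volts, vf=3.4, slope=10.0):
    """Crude LED model: no current below vf, steep linear rise above."""
    return max(0.0, (volts - vf) * slope)

target_amps = 3.0
compliance_max = 36.0    # assumed maximum voltage the driver can swing to

v = 0.0
while led_current(v) < target_amps and v < compliance_max:
    v += 0.01            # the driver nudging its output upward
print(f"Driver settled at {v:.2f} V to push {led_current(v):.2f} A")
```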

Now, as for the other LEDs: since they call for 0.7A and 1A, and the 100W LED driver is at 3A, it wouldn't work for those?

Correct; they would burn and die.

Would I then need one LED driver that runs 0.7A for my blue LEDs, and another LED driver that runs 1A for the red LEDs?

Yes.

There is no easy way to break the 3A down to run 1A LEDs?

Correct. I am not sure if there is a complicated way either.

So I keep looking at LED drivers. The biggest one that I found that outputs a constant current of 0.7A goes up to 40V, letting me hook up to 11 blue LEDs. If I wanted to light up 40 of these, I could purchase 4 of these LED drivers.

If I look at "bigger" drivers with more watts, the current is also higher. Is there no way to run more than 11 of these blue LEDs at the same time with only one driver? Is my best solution basically to purchase many of these smaller drivers?

Close, but no cigar.

If your LED is looking for 100 watts, you want to pick a 100-watt driver.
You will find them in 20V, 32V, 38V, 42V, 48V, etc.

The voltage range often spans 50% of maximum up to the maximum, meaning you cannot use them to power loads below 50% of the maximum voltage. Some are more like 10 volts minimum, some a little less.
The idea is that there has to be some juice left over to use to measure!

As was stated, your LED is probably using 3 to 6 volts.
Let's use 6 volts: if you have 6 LEDs at 6 volts, the total voltage drop would be 36 volts.
A 38V driver would probably have a lower voltage limit of 18 volts, meaning you would have to use at least 3 LEDs, but no more than 6.
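
In Python, picking the string length for that example looks like this (the 18V lower limit is the guess from above; real datasheets state it explicitly):

```python
# String-length window for a CC driver with a limited output range.
import math

v_led = 6.0      # assumed volts per LED, per the example above
v_max = 38.0     # driver's maximum output voltage
v_min = 18.0     # assumed minimum output voltage (roughly 50% of max)

min_leds = math.ceil(v_min / v_led)    # 3: below this the driver can't regulate
max_leds = math.floor(v_max / v_led)   # 6: above this it runs out of voltage
print(f"Wire between {min_leds} and {max_leds} LEDs in series")
```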

Also, you do not want to run near or at maximum. Typically a safe area is up to, but not more than, 80%.

The more you spend on the driver, the closer to maximum you can push your circuit. I have had some of the cheap Chinese versions fail at a 70% load. The Fulham drivers and the ThoroLED can be pushed a lot closer to maximum, but cost 3 to 4 times more.

Check the Mean Well data sheets:
http://www.meanwellusa.com/product/led/LED.html
50% minimum voltage, 350mA constant current

Fulham data sheets:

10 volt minimum, to ???, 350mA constant current

Robertson data sheet:
7 to 27 volts, 350mA

OK, so I think I've done my homework; now I'm not sure how to do the math to make sure I don't burn the whole house down.

I plan to create 4 "plates", for lack of a better word, out of aluminum with proper heat sinks and fans.

Two of them will each have:

1 × 100W full spectrum LED
40 × 3W blue LEDs, 3.7 VF, 700mA
5 × 3W deep red LEDs, 2.3 VF, 700mA
5 × 3W red LEDs, 2.3 VF, 1000mA

The other two:

1 × 100W full spectrum LED
10 × 3W blue LEDs, 3.7 VF, 700mA
20 × 3W deep red LEDs, 2.3 VF, 700mA
20 × 3W red LEDs, 2.3 VF, 1000mA

To drive all these LEDs I plan to use:

4 × 100W 3000mA drivers to run one 100W LED each
2 × 40W 1000mA drivers to run 20 red LEDs each
2 × 10W 1000mA drivers to run 5 red LEDs each

1 × 40W 700mA driver to run 15 blue LEDs

13 × 30W 700mA drivers: 9 would run 10 blue or red LEDs each, and 4 of them would run 10 blue and one red LED

That is a total of 22 AC-to-DC adapters; can I just plug all this into a power strip?

Is this where the W = A × V formula applies?

33V × 3A × 4 units = 396W for the 4 100W full spectrum LEDs
2.3V × 1A × 50 units = 115W for 50 red LEDs
2.3V × 0.7A × 50 units = 80.5W for 50 deep red LEDs
3.7V × 0.7A × 100 units = 259W for 100 blue LEDs

That is a total of 850.5 watts.

So 120V AC with a 15-amp breaker would give me up to 120V × 15A = 1800 watts, which would mean I am well inside the safe zone? I would run the lights on their own breaker.
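
Here is that power budget as a sketch; note these are DC-side watts, so the actual wall draw will be somewhat higher once driver losses are included (the 100W driver above claims 88% efficiency):

```python
# Whole-rig DC power budget vs. breaker capacity.
loads = [
    (33.0, 3.0, 4),    # 4 x 100 W full spectrum units
    (2.3, 1.0, 50),    # 50 red LEDs
    (2.3, 0.7, 50),    # 50 deep red LEDs
    (3.7, 0.7, 100),   # 100 blue LEDs
]
total_w = sum(volts * amps * count for volts, amps, count in loads)
breaker_w = 120 * 15               # 1800 W on a 15 A, 120 V circuit
print(f"{total_w:.1f} W of LED load on a {breaker_w} W circuit")
```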

This is why I LOVE forums!

Not sure if it's OK to post links, but I found this driver at Mouser: Mean Well Driver

Operation Mode: Constant Current (CC)
Output Current-Channel 1: 700 mA
Output Voltage-Channel 1: 357 V
Output Power: 249.9 W

Does that mean I could run all 45 deep red and blue LEDs in one series string: 3.7V × 40 plus 2.3V × 5, for a total of 159.5 volts at 700mA? The driver seems to handle up to 357V. If my math is correct, the watts used would be 111.65W, so again within the range of the driver, correct?
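
Checking that string against the driver's numbers (one caveat I'd verify on the datasheet: constant-current drivers also have a minimum output voltage, so the low end matters too):

```python
# 45-LED series string vs. the 700 mA Mean Well driver above.
string_v = 3.7 * 40 + 2.3 * 5        # 148 + 11.5 = 159.5 V
string_w = string_v * 0.7            # about 111.7 W at 700 mA
driver_v_max, driver_w_max = 357.0, 249.9

fits = string_v <= driver_v_max and string_w <= driver_w_max
print(f"String: {string_v:.1f} V, {string_w:.1f} W, fits driver: {fits}")
```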

Assuming I have the LED drivers figured out, how do I calculate how much AC I use and have available?
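
My best guess at the AC side, reusing the efficiency and power factor from the 100W driver spec in the first post (each driver's datasheet will have its own numbers):

```python
# DC watts out -> AC watts in -> AC amps drawn from the wall.
dc_watts = 850.5           # total LED load from the tally above
efficiency = 0.88          # from the 100 W driver spec
power_factor = 0.98        # ditto
mains_v = 120

ac_watts = dc_watts / efficiency                 # real power from the wall
ac_amps = ac_watts / (mains_v * power_factor)    # line current including PF
print(f"~{ac_watts:.0f} W from the wall, ~{ac_amps:.1f} A on {mains_v} V")
```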