Power supply required for a LOT of LED's??

G'Day gang,

I think I've been on here discussing my project before. I have a MIDI keyboard here, and I've written all the software so that when I play the keyboard, 12 LEDs turn on and off, with the brightness depending on how hard you hit the key and which LED lights depending on which note you press (or two LEDs if you hit two notes). Thx Arduino Mega!! :smiley:

I'm more of a software guy just kind of wandering into the hardware side of things now to help a friend with her art project. The problem now is this: These LED's are going in lanterns, the LED's we're going to buy can be found here: http://cgi.ebay.com/100-pcs-5mm-White-LED-Lamp-18000mcd-Free-Resistors_W0QQitemZ160356124113QQcmdZViewItemQQptZLH_DefaultDomain_0?hash=item2555f845d1&_trksid=p3286.c0.m14

My friend has a vision of 48 lanterns, 4 lanterns for each semitone (between C to B inclusive). With 4 of those LED's in each lantern that would be 192 LED's!! (that's a lot)

About the LED's:
The listing gives: Forward Voltage: 3.0-3.2V DC; Maximum Current: 20mA continuous, 50mA peak at 10% pulse width.

If I'm going to be using PWM to control the brightness, and the player hits every key at once at full strength, this could drive all the LEDs at full duty cycle, requiring 50mA (because that's the peak)

So that would be 50mA at 3.2V DC... 3.2 × 0.05 = 0.16W per LED
0.16 × 192 = 30.72W of electricity required.
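For the record, here's that worst-case arithmetic as a quick Python sketch (the 50mA and 3.2V figures are from the eBay listing above):

```python
# Worst-case power draw: every LED driven at its 50 mA peak rating.
n_leds = 192          # 48 lanterns x 4 LEDs each
vf = 3.2              # forward voltage (V), top of the listed 3.0-3.2 V range
i_peak = 0.050        # peak current (A)

p_per_led = vf * i_peak        # 3.2 * 0.05 ≈ 0.16 W per LED
p_total = p_per_led * n_leds   # 0.16 * 192 ≈ 30.72 W total
```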

Would this mean I could get away with purchasing a 12VDC power supply at 4A current? (providing me with 48W to play with)

I don't know much about the resistors I'd need for this, but I'm assuming a 470 ohm resistor in series with each LED would be fine? Will putting resistors in the circuit change the 30.72W power requirement I calculated??
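A quick Ohm's-law check on that 470 ohm guess (assuming a 12V supply and the 3.2V forward drop from the listing): the resistor actually sets the current, and 470 ohms would hold each LED well under 50mA:

```python
# Current through one LED with a 470 ohm series resistor on a 12 V supply.
supply_v = 12.0
vf = 3.2              # LED forward voltage (V)
r = 470.0             # series resistor (ohms)

i = (supply_v - vf) / r            # (12 - 3.2) / 470 ≈ 0.0187 A, i.e. ~19 mA
p_resistor = (supply_v - vf) * i   # ≈ 0.165 W burned in the resistor alone
```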

Also, each note on the MIDI keyboard is going to power 4 lanterns (with 4 LEDs in each), so 16 LEDs. At 50mA each, that would be 800mA, so could I use a BC337 transistor to switch these groups on and off??
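The per-note group current is easy to tot up. One thing worth noting: the BC337's continuous collector-current rating is 800mA, so at 50mA per LED you'd be sitting right at its limit, while at the 20mA continuous rating you'd have comfortable headroom:

```python
# Current one transistor has to switch for a single note's group of LEDs.
leds_per_note = 16        # 4 lanterns x 4 LEDs
bc337_ic_max = 0.800      # BC337 continuous collector current rating (A)

i_group_peak = leds_per_note * 0.050   # 0.8 A at 50 mA per LED: right at the limit
i_group_cont = leds_per_note * 0.020   # 0.32 A at 20 mA per LED: plenty of margin
```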

Once again, I'm not really an engineer and help on putting this together would be greatly appreciated!! :smiley:

Thanks so much,
Scotty

IMHO, the best approach would be to use a power supply close to the LED's forward voltage, but not under it - i.e. a 5v supply, not a 12v. If you use a 12v supply, you'll be burning off a lot of excess power in the series resistors to drop down to ~3v.

The .16w per LED you calculated is the power dissipated by the LED itself. You also need to consider power dissipated by the resistor - if you are pulling 50mA and using a resistor to drop the other 9v (12 - 3), the resistor would be dissipating .45w each!

Better to use a 5v supply, in which case each resistor would need to drop ~2v at 50mA, which would only be .1w for each resistor. You'd need roughly a 47 ohm resistor in this application.

All that said, I would design for 20mA current, since that's the continuous rating. If you plan on 20mA, each LED would dissipate .06w, and each resistor only .04w. So .1w for each LED-resistor circuit, times 192, is 19.2w worst case. 5v supplies are cheap; get something like a 5A (25 watt) supply and you'll still have headroom even with every LED lit at once.
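Spelling those 20mA numbers out for all 192 LED-resistor circuits on a 5v supply (using 3.0v, the low end of the listed forward-voltage range):

```python
# Design-for-continuous-rating numbers: 20 mA per LED on a 5 V supply.
n_leds = 192
supply_v = 5.0
vf = 3.0              # low end of the listed 3.0-3.2 V range
i = 0.020             # 20 mA continuous rating

p_led = vf * i                        # 0.06 W in each LED
p_res = (supply_v - vf) * i           # 0.04 W in each series resistor
p_total = (p_led + p_res) * n_leds    # 0.1 W x 192 ≈ 19.2 W worst case
i_total = i * n_leds                  # ≈ 3.84 A if every LED is lit at once
```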

requiring 50mA (because that's the peak)

That is the peak current it can stand, not the peak current you get when you switch it on. An LED will always draw the same current in any given circuit; I think you are mixing this up with things like motors, which take an initial surge of current before they settle down.

Yep, I sure was!! :smiley:

I've been talking on ##electronics a fair bit and was introduced to http://led.linear1.org/led.wiz by theBear :). He was nice.

It turns out to be cheaper if I use a 15V power supply (at 2A) and wire them in my groups of 4 (four LEDs in series with one 100 ohm resistor per string). That way my circuit would only draw around 1A anyway :slight_smile:
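Checking that 15V series-string arrangement (four LEDs in series plus one 100 ohm resistor per string, taking the 3.2V forward-voltage figure), the numbers do come out to roughly 1A for the whole circuit:

```python
# One string: 4 LEDs in series with a 100 ohm resistor on a 15 V supply.
supply_v = 15.0
vf = 3.2                   # forward voltage per LED (V)
leds_per_string = 4
r = 100.0                  # series resistor per string (ohms)
n_strings = 48             # 192 LEDs / 4 per string

v_drop = supply_v - leds_per_string * vf   # 15 - 12.8 ≈ 2.2 V across the resistor
i_string = v_drop / r                      # ≈ 0.022 A, i.e. 22 mA per string
# Note: 22 mA is a touch over the 20 mA continuous rating;
# a 120 ohm resistor would bring it down to ~18 mA.
i_total = i_string * n_strings             # ≈ 1.06 A with every string lit
```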

Thanks