Powering Lots of LEDs (or anything really) With "Excess" Voltage

I have been lurking for a good while, but this is my first actual post, so hopefully I haven't missed a million answers to my question(s) due to ineffective searching/forum scanning. While I am asking before finalizing the layout of a project I am currently working on, I am extremely new to DIY electronics, so I'd like to know the answer for the sake of knowledge as well. This will actually be my first electronics project, assuming I finish it!

So, to lay out the current project as a starting point: I am modding a PC case and want to toy around with various lighting effects using LEDs. My current estimate is something to the effect of 40 5mm RGB LEDs (the cheapy versions I am using for design/testing/prototyping are common cathode, with forward voltages estimated around 2.2/3.5/3.5V and forward current of 20mA). I am using an Uno for my initial testing, just throwing together various configs on a breadboard to see what works out best. The power supply for these will be the 12V PSU that will be powering the PC itself (840W, 70A on a single +12V rail).

Some additional information: I do not need individual control of each LED, though I may set up separate "zones" of several LEDs. I am, however, controlling the color channels individually, and I may add a "dimming" function for the whole setup (though this will probably be done in software as a multiplier/offset on the individual color channel adjustments). While prototyping is being done with an Uno for ease of use, I will probably use a Pro Mini or something of that size for the final design.

Currently with my prototyping layout I am doing high side switching using 3 PWM outputs on the Uno to drive 3 NPN transistors that drive 3 PNP transistors that are switching 12VDC from a small "brick" style power supply to the anodes for each color channel. Each LED color channel is linked in parallel with like channels with an appropriately sized resistor on each anode (i.e. 5 LEDs = 3 color channels, 15 individual channels and 15 resistors in 3 sets of 5 linked in parallel). Everything is using the ground from the 12VDC PSU. In the final solution, I would essentially use the same design with slightly larger transistors, and possibly resistor networks instead of individual resistors to save on space.
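For reference, the per-channel resistor values fall straight out of Ohm's law across the series resistor. A quick sketch, assuming the 2.2/3.5/3.5V forward voltages and 20mA quoted above (actual values should be rounded up to the nearest standard resistor):

```python
# Per-channel current-limiting resistor for a 12 V supply,
# using the forward voltages and 20 mA estimated above.

def series_resistor(v_supply, v_forward, i_forward):
    """Ohm's law across the resistor: R = (Vsupply - Vf) / If."""
    return (v_supply - v_forward) / i_forward

V_SUPPLY = 12.0
I_F = 0.020  # 20 mA per channel

for colour, vf in [("red", 2.2), ("green", 3.5), ("blue", 3.5)]:
    r = series_resistor(V_SUPPLY, vf, I_F)
    print(f"{colour}: {r:.0f} ohms")  # red: 490, green/blue: 425
```

In practice you'd pick the next standard value up (e.g. 510 and 430 ohms), which also gives a little margin on the estimated forward voltages.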

NOW, after over-explaining all of that, my actual question is how I "should" be powering these LEDs (or any device of a similar voltage/current, for that matter) vs. how I intend to.

For instance, instead of pushing 12V to my LEDs and adding resistors to drop it to an acceptable level for them to function, would it be better for me first to regulate the voltage in some way to something closer to what they are rated for across the board, and then switch that to them (with smaller resistors)?

I have looked around for answers to this, and in most cases it seems like the answer is a) use resistors because they are inexpensive and simple, and b) the wasted power is so minimal that it's not a bother.

If cost were not an issue what would be the "best" way to do it in terms of power efficiency, generated heat, efficient use of PCB space, etc? In my current project the power efficiency is not an issue, but I am limited on space and would prefer to generate as little heat as possible beyond what the LEDs already add to the equation. Future projects may require battery power and efficiency would be a HUGE plus there.
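To put rough numbers on the heat question, here is a sketch assuming all 120 channels are on at full brightness with the figures above:

```python
# Rough heat/efficiency estimate for 120 LED channels fed from 12 V
# through series resistors, at the 20 mA per-channel figure above.

I_F = 0.020  # 20 mA
channels = [("red", 2.2, 40), ("green", 3.5, 40), ("blue", 3.5, 40)]

p_led = sum(vf * I_F * n for _, vf, n in channels)     # useful power in the LEDs
p_total = 12.0 * I_F * sum(n for _, _, n in channels)  # power drawn from the 12 V rail
p_resistors = p_total - p_led                          # burned off as heat

print(f"LEDs:      {p_led:.2f} W")
print(f"Resistors: {p_resistors:.2f} W")
print(f"Total:     {p_total:.2f} W")
```

With everything at full white, roughly three quarters of the power ends up as heat in the resistors, so the "wasted power is minimal" answer mostly applies to a handful of LEDs, not 120 channels.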

Hopefully this makes sense and I am asking the question correctly! Thanks!

It would be much simpler to use a common anode device; then all you need is a single transistor to pull it down to ground. To get down from 12V to the 5V you need, use a switching regulator for best efficiency.

Grumpy_Mike:
It would be much simpler to use a common anode device; then all you need is a single transistor to pull it down to ground. To get down from 12V to the 5V you need, use a switching regulator for best efficiency.

Thank you for your response. I'm indifferent on high- vs. low-side switching; I just happened to buy one of those "50 RGB LEDs for $7" sort of deals for the learning/prototyping phase, and it was easier to pick up three 50-cent PNP transistors locally than to find a bunch of replacement LEDs at 10X the cost. I will probably go with common anode in the final design for the reason you mentioned. I may also end up using a TLC5940 or something, or go with something completely different; I have found all sorts of possibilities on the logic side of things that I may toy around with before it's all said and done. I just included that part of the post because people typically ask "what is your specific application" when suggesting solutions.

The only question I really had was about the most efficient way to power 40 RGB LEDs, with each being 3 separate channels, or the rough equivalent of 120 2.2-3.5V 20mA LEDs. Using a switching regulator did "sound" best from my inexperienced perspective for dropping the voltage from 12V to something lower. I just wasn't sure if there was any advantage in dropping the voltage before the resistors on the LED side of things.

From what little I understand, a switching regulator will waste power as well, just less than a linear regulator, at the expense of providing "noisier" power on the low-voltage end? Would it be better to drop the voltage to the highest point needed (3.5V for green/blue) and then drop it again with a second regulator in parallel for the red channel (i.e. down to, say, 2.3V), or to just drop everything to 3.5V and use bigger resistors on the red channel? Or the opposite: drop it to the lowest voltage for red and then amplify it slightly for the higher-voltage blue and green? Or just drop to 5V and use resistors down to 2.2/3.5V?
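For a rough comparison of those options, here is a sketch of overall efficiency for each candidate rail. The 85% regulator efficiency is an assumed, typical figure, not a measured one:

```python
# Comparing overall efficiency of the supply options discussed above:
# the raw 12 V rail versus a switching regulator (assumed ~85% efficient,
# a typical figure, not measured) feeding a lower rail plus resistors.

I_F = 0.020                                    # 20 mA per channel
CHANNELS = [(2.2, 40), (3.5, 40), (3.5, 40)]   # (Vf, count) for R/G/B

P_LED = sum(vf * I_F * n for vf, n in CHANNELS)  # useful power in the LEDs

def system_efficiency(v_rail, reg_eff=1.0):
    """LED power divided by input power; series resistors burn the rest."""
    p_rail = v_rail * I_F * sum(n for _, n in CHANNELS)
    return P_LED / (p_rail / reg_eff)

for label, v, eff in [("12 V direct", 12.0, 1.0),
                      ("5 V regulated", 5.0, 0.85),
                      ("4 V regulated", 4.0, 0.85)]:
    print(f"{label}: {system_efficiency(v, eff):.0%} overall")
```

Even after paying the regulator's own losses, dropping to a 4-5V rail roughly doubles the overall efficiency compared to running the resistors straight off 12V, because the resistors no longer have to burn off 8.5-9.8V per channel.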

Honestly, in the current project I could pull 3.3V directly from the PSU for the red channel and 5V for the blue and green; this PSU has something like 180W (30A) available on the +3.3/+5V connectors (I had mostly been planning on the 12V for the convenience of not having to run another cable). What I am mostly looking to get out of this post is some good basic ideas for efficiently powering general "simple" devices (lights, switches, relays, etc.) when my source power is too high to use directly.

Thanks again for your quick reply. I apologize for any confusion or if I am over complicating the situation. I think my issue is as much not knowing how to ask the right question as anything haha.

You need a resistor or some other current-limiting circuit when you drive an LED, full stop. You cannot just regulate to a voltage.
Forget looking for a solution that wastes no power; it does not exist. Switching regulators can be 80 to 90% efficient. The most efficient way is to have a constant-current regulator for each LED, but that gets expensive when you have a lot of LEDs. Most LED driver chips require common anode LEDs.
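A per-LED constant-current regulator can be as simple as an LM317 wired as a current source, where the regulator holds its 1.25V reference across a single resistor. A quick sizing sketch (note the 317 also needs a few volts of headroom, so it suits the 12V rail better than a 5V one):

```python
# Sizing an LM317 wired as a constant-current source, one way to get
# per-string current regulation.  The LM317 holds 1.25 V between its
# OUT and ADJ pins, so the set resistor is R = 1.25 / I.

V_REF = 1.25  # LM317 internal reference voltage

def lm317_current_resistor(i_target):
    """Resistor between OUT and ADJ for a target current in amps."""
    return V_REF / i_target

print(f"{lm317_current_resistor(0.020):.1f} ohms for 20 mA")  # 62.5 ohms
```

As the reply says, though, one regulator per LED adds up fast; this mostly makes sense for a few high-current strings rather than 120 small channels.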

You are best off switch-regulating down to 4 or 5 volts and then using a resistor to limit the current.
I think you misunderstand the concept of an amplifier. It is for signals, not power rails, and it needs a supply of at least the maximum output voltage, so it is just as inefficient as a linear regulator. What you describe as an amplifier is a step-up, or boost, regulator, much the same as your step-down, or buck, regulator.

Do not worry about noisy output voltages from a switching regulator.

Since you mention it, the obvious approach is to use the 5V supply for the LEDs, as you already have it as the lowest practical voltage and it is already regulated by an efficient switch-mode supply. (In most cases the 12V supply from a PC is not separately regulated but approximately tracks the 5V supply.)

Note that switch-mode regulators are more efficient at higher output voltages, as a major source of loss is the fixed voltage drop of the rectifiers. 5V is an excellent compromise in this case.

As Grumpy points out, you regulate to the LEDs by current using resistors calculated according to the different voltage drops (and possibly tweaked for white balance), not by voltage. It will most likely be cheaper - and thus more practical - to use the common cathode LEDs plus two transistors per channel, and of course you could not use common anode LEDs with different supply voltages anyway.
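To put rough numbers on the 5V-rail approach, here is a sketch using the 2.2/3.5V forward voltages and 20mA from earlier in the thread (white-balance tweaks would just scale the per-colour currents slightly):

```python
# Series resistors and total draw when everything runs from the PC's
# 5 V rail, using the forward voltages and 20 mA quoted earlier.

I_F = 0.020  # 20 mA per channel
CHANNELS = [("red", 2.2, 40), ("green", 3.5, 40), ("blue", 3.5, 40)]

total_current = 0.0
for colour, vf, n in CHANNELS:
    r = (5.0 - vf) / I_F              # resistor drops the rail down to Vf
    total_current += I_F * n
    print(f"{colour}: {r:.0f} ohm resistor per channel ({n} channels)")

print(f"total at full white: {total_current:.1f} A from the 5 V rail")
```

A couple of amps at full white is comfortably within the 30A quoted for the PSU's low-voltage connectors.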

Another consideration is whether you wish to multiplex the LEDs by banking common cathodes onto N-channel FETs (or NPN transistors). For construction ease, you would indeed want to look into driver array ICs.

And yes, a PC (and its power supply) is full of noise, so whatever you do is unlikely to contribute to it to any meaningful degree. Such transient "noise" is, on the other hand, pretty much irrelevant to the LEDs.