
Topic: 12V Power Supply?


Hey guys,

I picked up a string of RGB LED's, and I need a 12v power supply to feed it. Can I use just about any AC wall wart I have laying around?



depends on how much current you will be drawing


If you're talking about a premade strip with resistors designed for use with 12V, then yes, almost any will work, as long as it can supply the current you need.
It has to be DC, unless you want to rectify it yourself.


The old standby "non-regulated" wallwarts actually put out 4-5V above their rated voltage
when unloaded or under small loads. The output only drops to the rated voltage at the
device's rated current. So your LEDs may be a little brighter if the 12V wallwart
has a high current rating.


Ah, thanks for the advice.

Is there something that indicates if it is regulated?

I'm going to go find one. I'm guessing an old cellphone charger would have enough power?

Like, I have one here (that's not powerful enough) that says "12V 2A MAX".

Would that be appropriate for a 1 metre LED Strip? He only said it was 12V... It looks like it has resistors between each LED.

It was expensive, so I don't want to burn it out : )

Thanks again for the help!


The higher-current [more than 1A] old-style wallwarts were heavy bricks. I have one
that says 12V, 1.6A, and it weighs a hefty 25 oz. Its no-load output
voltage is 16.3V.

I should imagine that regulated wallwarts will give a no-load output voltage close to
the rated value, and they'll be fairly small and light, since they use switching power supplies
rather than large transformers. OTOH, being a lot of complicated electronics, they may
also be less reliable.


Just search "12V power supply" on eBay, and plan to load it to no more than about 60% of its rated power.


Hmm. I found an 18-watt and an 8-watt one... I'm guessing neither of those is "close enough"?

Neither of them had more than 1A. What should I be more worried about? Amperage or wattage?


Power (watts) = volts * amps

You really worry about both at the same time.

12V x 1.5A = 18W
12V x 2/3A = 8W
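A quick sketch of the volts-times-amps arithmetic above, using the two supplies from this thread:

```python
def power_watts(volts, amps):
    """Power (W) = voltage (V) * current (A)."""
    return volts * amps

print(power_watts(12, 1.5))    # the 18W supply
print(power_watts(12, 2 / 3))  # the 8W supply
```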
Designing & building electrical circuits for over 25 years.  Screw Shield for Mega/Due/Uno,  Bobuino with ATMega1284P, & other '328P & '1284P creations & offerings at  my website.


Nov 25, 2012, 09:57 pm
Ok, now I understand what's going on here. Thanks a lot CrossRoads. For some reason, it all clicked into place with that post.

The guy at the store told me a number - I'm sure he said 12 Volts. But given the context, that doesn't make sense - that's only half the equation.

How can I determine the wattage of the lights?

Is the ratio truly flexible? Say, for example (this is not what I'm doing yet), I'm trying to light a 100W incandescent light bulb - as long as I get a total of 100W to the bulb, it'll light, correct? And why is everything high voltage but low amps - is it just because amps are the "dangerous" part?

More on topic, if I find a 6V 2A DC power supply, would that work? Or am I seriously confusing my terms here?


I picked up a string of RGB LED's,

You need to first know how much current these things draw.


I picked up a string of RGB LED's,

You need to first know how much current these things draw.

The guy said 12V.

I'm pretty sure they are the same as these ones:


Thanks again for the help guys. I knew I should have just bought a power supply from him.


current is not the same as voltage

The Adafruit ones you linked to say 60mA per segment, so if yours are similar: 0.060A x (number of segments per strip) x (number of strips) = the minimum current you will need to supply.


The guy said 12V.

Yeah, and he still doesn't know the current.


You need 60mA of current at 12V for every 3-LED segment of your strip.
If you have 10 segments connected, then you need 10 x 60mA = 600mA.
600mA x 12V = 7.2W
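The segment math above can be sketched like this, assuming the 60mA-per-segment and 12V figures quoted in this thread (your strip's actual draw may differ):

```python
SEGMENT_CURRENT_A = 0.060  # 60 mA per 3-LED segment (Adafruit-style 12V strip)
SUPPLY_VOLTS = 12

def strip_requirements(num_segments):
    """Return (total current in A, total power in W) the supply must deliver."""
    current = num_segments * SEGMENT_CURRENT_A
    return current, current * SUPPLY_VOLTS

current, watts = strip_requirements(10)
print(round(current, 3), round(watts, 1))  # 10 segments -> 0.6 A, 7.2 W
```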

Most things are just rated for volts & current, though.

The LED strips will work at lower voltage as well, they just won't be as bright.

Here's why: The LEDs have a voltage drop across them when they turn on. Say it's 3.2V. The resistor then dissipates the rest of the voltage.
So if you're running from a 12V source, and you took some measurements, you would see 3.2V across the LED, and (12-3.2) = 8.8V across the resistor.
If the resistor was sized to limit current flow to 20mA, then its value would be 8.8V/.02A = 440 ohm.

Say you decided to run on a 9V supply.
9V -3.2V = 5.8V across the resistor. Current is then limited to 5.8V/440ohm = 13.2mA.

Actually, it's even a little less than that, because the transistor that switches the strip on & off also has some voltage drop, say 0.7V for an NPN transistor:
(9V - 3.2V - 0.7V)/440 ohm = 11.6mA.

So your supply could be as low as ~4 to 4.5V and the LEDs would still light; they would just get dimmer and dimmer due to the current-limit resistor that is built into the strip.
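A sketch of the resistor-sizing and dimming math above, using the assumed 3.2V LED drop and 0.7V NPN drop from this post:

```python
LED_DROP_V = 3.2  # assumed LED forward voltage (from the post)

# Size the series resistor to limit current to 20 mA from a 12 V source:
resistor_ohms = (12 - LED_DROP_V) / 0.020  # 8.8 V / 20 mA = 440 ohm

def led_current_ma(supply_v, npn_drop_v=0.7):
    """Current through one segment after the LED and NPN transistor drops."""
    v_across_resistor = supply_v - LED_DROP_V - npn_drop_v
    return max(v_across_resistor, 0) / resistor_ohms * 1000  # in mA

print(round(resistor_ohms))           # 440 ohm
print(round(led_current_ma(9), 1))    # ~11.6 mA at 9 V, matching the post
```

Dropping the supply voltage only shrinks the voltage left over for the resistor, which is why the strip dims rather than dies.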

If you use a low-Rds N-channel MOSFET, then instead of a fixed voltage drop it will have a fixed on-resistance, perhaps 50 mOhm (0.05 ohm) if you use a good one.
So now the equation changes to:
(12V - 3.2V)/(440 + 0.05) = 20.0mA at full brightness
(9V - 3.2V)/(440 + 0.05) = 13.2mA - a couple of mA more per segment than with the NPN transistor.

But now your supply can go right down to just above the LED's voltage drop, around 3.2 to 3.6V (many blue LEDs have a forward voltage of 3.2 to 3.6V, with red & green the same or lower), since there is no 0.7V drop across an NPN transistor, just the MOSFET's small on-resistance instead.
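The NPN-vs-MOSFET comparison above, as a small sketch with the same assumed values (3.2V LED drop, 440 ohm resistor, 0.7V NPN drop, 50 mOhm MOSFET on-resistance):

```python
LED_DROP_V = 3.2        # assumed LED forward voltage
RESISTOR_OHMS = 440     # series resistor built into the strip
NPN_DROP_V = 0.7        # typical drop across an NPN switching transistor
MOSFET_RDS_OHMS = 0.05  # on-resistance of a good low-Rds N-channel MOSFET

def current_npn_ma(supply_v):
    """Segment current when switched by an NPN transistor (fixed drop)."""
    return (supply_v - LED_DROP_V - NPN_DROP_V) / RESISTOR_OHMS * 1000

def current_mosfet_ma(supply_v):
    """Segment current when switched by a MOSFET (small series resistance)."""
    return (supply_v - LED_DROP_V) / (RESISTOR_OHMS + MOSFET_RDS_OHMS) * 1000

for v in (12, 9):
    print(v, "V:", round(current_npn_ma(v), 1), "mA (NPN) vs",
          round(current_mosfet_ma(v), 1), "mA (MOSFET)")
```

The MOSFET wins a couple of mA per segment at every supply voltage, and it lets the supply drop almost to the LED forward voltage before the strip stops lighting.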

Hope this helps.
