12V Power Supply?

Hey guys,

I picked up a string of RGB LEDs, and I need a 12V power supply to feed it. Can I use just about any AC wall wart I have lying around?

Thanks!

Depends on how much current you will be drawing.

If you're talking about a premade strip with resistors designed for use with 12V, then yes, almost any one, so long as it is rated for the current you need.
It has to be DC unless you want to rectify it yourself.

The old standby "non-regulated" wallwarts actually put out 4-5V above their rated voltage
when unloaded or under small loads. The output only drops to the rated voltage at the
rated current for the device. So your LEDs may be a little brighter if the 12V wallwart
has a high current rating.

Ah, thanks for the advice.

Is there something that indicates if it is regulated?

I'm going to go look for one. I'm guessing an old cellphone charger would have enough power?

Like, I have one here (that's not powerful enough) that says "12V 2A MAX".

Would that be appropriate for a 1 metre LED Strip? He only said it was 12V... It looks like it has resistors between each LED.

It was expensive, so I don't want to burn it out : )

Thanks again for the help!

The higher-current [more than 1A] old-style wallwarts were heavy bricks. I have one
that says 12V, 1.6 Amps, and it weighs a hefty 25 oz. Its no-load output
voltage = 16.3V.

I should imagine that regulated wallwarts will give a no-load output voltage close to
the rated value, and they'll be fairly small and light, due to using switching power supplies
rather than large transformers. OTOH, being a lot of complicated electronics, they may
also be less reliable.

Just search "12V power supply" on eBay, and plan on using only about 60% of the rated power.

Hmm. I found an 18W and an 8 watt... I'm guessing neither of those is "close enough"?

Neither of them had more than 1A. What should I be more worried about? Amperage or wattage?

Power (watts) = volts * amps

You really worry about both at the same time.

12V x 1.5A = 18W
12V x 2/3A = 8W
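If it helps to see it as code, here's that same arithmetic as a minimal C++ sketch (the values are just the two supplies mentioned above):

```cpp
// Sanity check of P = V * I for the two supplies discussed above.
#include <cstdio>

int main() {
    const double volts = 12.0;
    std::printf("%.0fV x 1.50A = %.0fW\n", volts, volts * 1.5);         // 18W supply
    std::printf("%.0fV x 0.67A = %.0fW\n", volts, volts * (2.0 / 3.0)); // 8W supply
    return 0;
}
```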

Ok, now I understand what's going on here. Thanks a lot CrossRoads. For some reason, it all clicked into place with that post.

The guy at the store told me a number - I'm sure he said 12 Volts. But given the context, that doesn't make sense - that's only half the equation.

How can I determine the wattage of the lights?

Is the ratio truly flexible? Say, for example (this is not what I'm doing yet), if I'm trying to light a 100W incandescent light bulb, then so long as I get a total of 100W to the bulb, it'll light, correct? Why is everything high voltage but low amps - is it just because amps are the "dangerous" part?

More on topic, if I find a 6V 2A DC power supply, would that work? Or am I seriously confusing my terms here?

I picked up a string of RGB LEDs,

You need to first know how much current these things draw.

oric_dan(333):

I picked up a string of RGB LEDs,

You need to first know how much current these things draw.

The guy said 12V.

I'm pretty sure they are the same as these ones:

Thanks again for the help guys. I knew I should have just bought a power supply from him.

Current is not the same as voltage.

The Adafruit ones you linked to say 60mA per segment, so if that is close to yours: 0.060A x (number of segments you are using) x (number of strips) = the minimum current you will need to supply.

The guy said 12V.

Yeah, and he still doesn't know the current.

@redlazer,
You need 60mA of current at 12V for every 3-LED segment of your strip.
If you have 10 segments connected, then you need 10 x 60mA = 600mA.
600mA x 12V = 7.2W
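If you want to plug in your own numbers, here's a minimal C++ sketch of that math. The 60mA-per-segment figure comes from the Adafruit listing; your strip may differ, so treat it as an assumption:

```cpp
// Strip current and power: segments x 60mA per segment, then P = V * I.
#include <cstdio>

int main() {
    const double ampsPerSegment = 0.060; // 60mA per 3-LED segment (Adafruit figure)
    const double volts = 12.0;
    const int segments = 10;             // example count from the post above
    const double totalAmps = segments * ampsPerSegment; // 0.6A
    const double watts = totalAmps * volts;             // 7.2W
    std::printf("%d segments -> %.0fmA, %.1fW at %.0fV\n",
                segments, totalAmps * 1000.0, watts, volts);
    return 0;
}
```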

Most things are just rated for volts & current tho.

The LED strips will work at lower voltage as well, they just won't be as bright.

Here's why: the LEDs have a voltage drop across them when they turn on. Say it's 3.2V. The resistor then dissipates the rest of the voltage.
So if you're running from a 12V source, and you took some measurements, you would see 3.2V across the LED, and (12-3.2) = 8.8V across the resistor.
If the resistor was sized to limit current flow to 20mA, then its value would be 8.8V/.02A = 440 ohm.

Say you decided to run on a 9V supply.
9V -3.2V = 5.8V across the resistor. Current is then limited to 5.8V/440ohm = 13.2mA.

Actually, it's even a little less, because the transistor that switches on & off to control it also has some voltage drop, say 0.7V for an NPN transistor.
So (9V - 3.2V - 0.7V)/440 ohm = 11.6mA.

So your supply could be as low as ~4 to 4.5V and the LEDs would still light; they would just be dimmer and dimmer due to the current-limit resistor that is built into the strip.
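Here's that calculation as a small, self-contained C++ sketch. The 3.2V LED drop, 440-ohm resistor, and 0.7V NPN drop are the example values assumed above, not measured values for any particular strip:

```cpp
// Per-segment current vs. supply voltage, using the example values above.
#include <cstdio>

// Current through one LED + resistor segment: the resistor sees whatever
// voltage is left after the LED and the switching transistor take their drops.
double segmentCurrent(double supplyV, double ledDropV,
                      double resistorOhms, double switchDropV) {
    const double acrossResistor = supplyV - ledDropV - switchDropV;
    if (acrossResistor <= 0.0) return 0.0; // below the LED's turn-on voltage
    return acrossResistor / resistorOhms;  // Ohm's law: I = V / R
}

int main() {
    const double ledDrop = 3.2;  // assumed LED forward voltage, as above
    const double r = 440.0;      // resistor sized for 20mA at 12V, as above
    const double npnDrop = 0.7;  // assumed NPN switch drop, as above
    const double supplies[] = {12.0, 9.0, 5.0, 4.0};
    for (double v : supplies) {
        std::printf("%4.1fV supply -> %5.1f mA per segment\n",
                    v, segmentCurrent(v, ledDrop, r, npnDrop) * 1000.0);
    }
    return 0;
}
```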

If you use a low-Rds N-channel MOSFET, then instead of a fixed voltage drop it will have a fixed on-resistance, perhaps 50 mOhm (0.05 ohm) if you use a good one.
So now the equation changes to
(12V - 3.2V)/(440 + 0.05) = 19.99mA at full brightness
(9V - 3.2V)/(440 + 0.05) = 13.2mA - so a couple of mA more per segment.

But now your supply can be right down to just above the LEDs' voltage drop, so 3.2 to 3.6V (many blue LEDs have a forward voltage of 3.2 to 3.6V, with red & green the same or lower), since there is no 0.7V drop across an NPN transistor, just a small resistance across the MOSFET instead.
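And the same sketch with the MOSFET's on-resistance in place of the NPN's fixed drop (again using the assumed 0.05-ohm Rds(on) and example LED/resistor values from above):

```cpp
// Per-segment current with a low-Rds(on) MOSFET switch: no fixed 0.7V drop,
// just a tiny series resistance added to the current-limit resistor.
#include <cstdio>

int main() {
    const double ledDrop = 3.2;  // assumed LED forward voltage, as above
    const double r = 440.0;      // series resistor from the example above
    const double rdsOn = 0.05;   // assumed MOSFET on-resistance (50 mOhm)
    const double supplies[] = {12.0, 9.0, 4.0};
    for (double v : supplies) {
        double mA = (v - ledDrop) / (r + rdsOn) * 1000.0; // I = V / (R + Rds(on))
        if (mA < 0.0) mA = 0.0;  // below the LED's turn-on voltage
        std::printf("%4.1fV supply -> %5.2f mA per segment\n", v, mA);
    }
    return 0;
}
```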

Hope this helps.

Thanks for the detailed response Crossroads.

I just don't understand why he would tell me they're 12V - it seems like I'd need to know the current, or both the voltage and the amperage, for it to be useful information.

Ah well. Now that I have the real information, let's move on. I do, unfortunately, have more questions:

I managed to find an old wall-wart, and it's rated 12V, 2A. It's a current-limiting power supply as well - I seem to recall this being good!

However, 2A seems like too much, given the requirements mentioned. Am I correct? I have a 1 metre strip, which has 20 segments on it.

So, if I have 20 segments, that's 20 x 60mA = 1200mA, or 1.2A.

If I supply it with 2A, is that too much? Will I burn out my LEDs?

I'm excited to finally get this ball rolling!

The LEDs will only draw the current allowed by the current limit resistors.
Could be a 12V car battery, capable of supplying hundreds of amps to turn an engine over - the current drawn will only be what the resistors allow.

Yes, a circuit will only draw what it needs. When it comes to current, it's safer to be over the requirement than under.

Really, with LEDs the voltage doesn't matter so much; it's how much current passes through them. You could shoot 100VDC through them, and if the current is acceptable, it's fine... but you're going to have a hard time finding something to actually accomplish that (with a basic understanding of things). It's just an example.

Aha! That's what I thought.

Well then, time to sacrifice this adaptor in the name of science!

Thanks again!

Speaking of sacrifice, you're more likely to blow the LEDs than the adaptor if you've
done anything wrong [expect this 50% of the time when "first" hooking something up].
So play it safe:

  1. connect just one load first, and see how it goes.
  2. if it works ok, and no smoke, then do the rest.
  3. if smoke comes out, order more parts.