
Topic: Understanding Current - Too many amps?

UncleMoki

Trying to learn more about electronics, specifically current.  Does it hurt a device if you use a power supply with an output rating where the current (amperage) exceeds what is needed by your project?

  • Using the Arduino Uno as an example - I read statements like this that essentially say it needs 9-12V and 250mA.  So I find an old power brick that says it has an output of 9V and 2.5A.  That's 10x more amperage than the Arduino needs ... but I guess that doesn't matter, as long as the minimum amount is available, and it's ok to use?
  • A more specific project example - I'm playing with some EL Wire which generally needs 100VAC @ 2kHz and "draws" (if that's the right term) about 10mA per meter.  To get the high-frequency AC output needed, I use a 12V inverter (DC to AC).  I have some old 12V power supplies that range from 800mA to 5A and I don't know whether it matters which I use.  It sounds like it doesn't matter, as long as the power supply can output enough for the amount of EL Wire in my project?  What about the inverter ... shouldn't it have a limit too?


BTW - I have a basic understanding that the "load" (as required by your components) determines how much amperage is needed but I guess I always thought that you could damage something if you provided more than needed. 

Note:  As I was typing this I glanced over at the power supply for some networking equipment and noticed it says "input 100-240VAC ~ 0.5A".  Well I KNOW that more than 500mA are available from my wall socket ... so ... hmmm.  I guess it really doesn't matter?

DVDdoug


Ohm's Law describes the relationship between voltage, resistance, and current.

More voltage = more current.  More resistance = less current.

...A car battery can put out hundreds of amps to power the starter.   But if you hook up a regular LED with a 1K resistor, you'll get about 2V across the LED and 10V across the resistor, and that works out to about 10mA.   And if you connect the LED without a resistor it will burn out (maybe even explode).
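
Put numbers on that with a quick Arduino-style sketch (my own illustration - the 2V forward drop is a typical red-LED figure, and the 1K resistor is the one from the example):

Code: [Select]
// Ohm's Law check for an LED on a 12V battery through a 1K resistor.
const float supplyVolts = 12.0;
const float ledForwardVolts = 2.0;   // typical red-LED forward drop
const float resistorOhms = 1000.0;

void setup() {
  Serial.begin(9600);
  // The resistor sees whatever the LED doesn't drop: 12V - 2V = 10V.
  float resistorVolts = supplyVolts - ledForwardVolts;
  // I = V / R -> 10V / 1000 ohms = 0.010A, i.e. about 10mA.
  Serial.print("LED current (mA): ");
  Serial.println(resistorVolts / resistorOhms * 1000.0);
}

void loop() {}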

tinman13kup

Look at it this way:
  • Power supplies (whether a battery, USB charger, switch-mode supply, adjustable supply, etc.) all have 2 factors: how much voltage they put out, and the MOST current they can supply before the unit is damaged (usually by heat).
  • The modules, LEDs, or MCUs you attach to that power supply require 2 things to operate correctly: the correct voltage input, and a supply capable of providing AT LEAST the max current they will draw.

Think of it this way: a car has a 12V battery in it.  It can spin the starter and energize all the relays, computers, and lights.  That's a lot of current, but still 12VDC.

Now take 8 D-cell batteries and put them in series.  That's 12VDC.  Energizers or Duracells, it doesn't matter: they won't even light up the headlights, let alone the starter, but they will readily light up some LEDs on the workbench.

Those LEDs will also happily work if you take them out and hook them up to the car battery.

Both supplies are 12VDC, but rated at vastly different current levels.  On the other hand, you also really don't need a car battery to light up a few LEDs when those D-cells will handle the load.  You also don't want to pull the max current a supply can deliver on a regular basis and expect it not to go belly up after a short lifespan.

  If you have a 5VDC, 500mA module or load, then use a supply that will handle 800mA or more.  While a 100A supply would work just fine, it's a bit overkill and a waste of money.
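
That rule of thumb is easy to put numbers on; a minimal sketch (the 1.6x margin is my own illustration matching the 500mA -> 800mA example above, not a hard spec):

Code: [Select]
// Hypothetical helper: size a supply with headroom over the worst-case load.
// The 1.6 factor mirrors the 500mA -> 800mA example; it's a rule of thumb.
float recommendedSupplyAmps(float maxLoadAmps, float margin) {
  return maxLoadAmps * margin;
}

void setup() {
  Serial.begin(9600);
  // A 0.5A load with 1.6x headroom calls for at least a 0.8A supply.
  Serial.println(recommendedSupplyAmps(0.5, 1.6));
}

void loop() {}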
Tom
It's not a hobby if you're not having fun doing it. Step back and breathe

MarkT

Go back to the good old water analogy: if I have a ten-foot-high water drum tank, the water comes out at the same pressure as from a ten-foot-high dam on a large reservoir.  I can connect my water tap to the bottom of either and not know the difference, but if I connect a massive metre-diameter pipe, the reservoir will cope and the tank won't.

Think of voltage as pressure acting on charge; current is the flow of charge.
[ I DO NOT respond to personal messages, I WILL delete them unread, use the forum please ]

DrAzzy

That all said, there exist old non-regulated wall-wart style supplies that will output a much higher voltage than they're rated at under light load.  These are pretty uncommon now, but you should always measure the output voltage with no load when repurposing an old supply, before hooking it up to something that matters.
ATTinyCore and megaTinyCore for all ATtiny, DxCore for DA/DB-series! github.com/SpenceKonde
http://drazzy.com/package_drazzy.com_index.json
ATtiny breakouts, mosfets, awesome prototyping board in my store http://tindie.com/stores/DrAzzy

rogertee

One thing to remember: as you increase current, an accidental short may melt your wires, so gauge the wire size to match the current.

ron_sutherland

A good power supply should current limit a little past its rating (e.g. 110 to 130%) without being damaged.  That can be used to guard against an overload.  For example, if I supply the ATmega328P from a regulator rated at 150mA, it would be difficult to overload, since the MCU is rated for 200mA (though I could still damage individual pins).  Anyway, that current limit can be useful for making things more bulletproof (should that be desired).

ChrisTenone

Simple answer: Your circuit will draw as much power as it needs, as long as your power supply can supply enough current. Having excess capability to supply current is not a problem. Your re-think about the wall socket is a good way to comprehend it.
What, I need to say something else too?

gpsmikey

My preference is to use a supply rated at double the requirements.  Running a supply at its full rating tends to make it run hot (they don't typically have a lot of design margin in them).  I have also found that running them at the limit tends to put more ripple on the supply (yes, you could add another capacitor to help with that).  I just prefer not to run things close to the limits - I find I have fewer failures that way  :)
mikey
-- you can't have too many gadgets or too much disk space !
old engineering saying: 1+1 = 3 for sufficiently large values of 1 or small values of 3

UncleMoki

Thanks all for the replies ... that helps.  

More voltage = more current.  More resistance = less current
... if you connect the LED without a resistor it will burn out (maybe even explode).
From what I'm understanding, if you hooked up an LED straight to the battery it would burn out, but only because the car battery's 12V is about 2x the max input voltage rating of an LED and about 6x what the LED needs in order to light up.  Adding an appropriately sized resistor is important in order to lower the voltage to within the LED's "safe" range.   But my question is more about the importance of current ratings.  I don't know the typical current rating of a car battery or what's needed by a single LED, but going with that example: assuming I used appropriately sized resistors in my circuit in order to lower the voltage, would it be safe (for the LED) to use a car battery to power the LED, even though the car battery's current rating is WAY higher than needed by the LED?  From this thread, it sounds like the answer is yes.

... on the other hand, you also really don't need a car battery to light up a few LEDs when [8 D-cells] will handle the load.
LOL - thanks ... yes ... good point and very much appreciated (seriously).  Reminds me of the kids in my old neighborhood pulling Red Rider wagons with homemade "boom boxes" made out of plywood, car stereo parts (presumably stolen, possibly salvaged) and a car battery.  Yes, you CAN do that but SHOULD you ;) ?



Go back to the good old water analogy ... Think of voltage as pressure acting on charge; current is the flow of charge.
Thanks.  I keep trying to fall back on the water analogy but for some reason it only "almost" sticks for me.  I guess I let it "sink-in" (no pun intended) wrong a few times because I keep getting it backwards ... current=pressure (wrong?) and voltage=flow (wrong?).  But if as you say, current=flow and voltage=pressure, I guess that makes more sense.

 
... there exist old non-regulated wallwart style supplies ... you should always measure the output voltage with no load ....
Ah, good point.  The concept of an "unregulated" power supply is something I was reading about.  It would be nice if they were easier to identify without a multimeter or without breaking one open.



One thing to remember: as you increase current, an accidental short may melt your wires, so gauge the wire size to match the current.
A good power supply should current limit a little past its rating (e.g. 110 to 130%) without being damaged.  That can be used to guard against an overload. ... that current limit can be useful for making things more bulletproof.
... Running a supply at its full rating tends to make it run hot ... tends to put more ripple on the supply ...  I just prefer not to run things close to the limits - I find I have fewer failures that way  :)
Ah, thanks for that ... very practical advice.



Simple answer: Your circuit will draw as much power as it needs, as long as your power supply can supply enough current. Having excess capability to supply current is not a problem. Your re-think about the wall socket is a good way to comprehend it.
Thanks!  I appreciate the affirmation.  I feel I've gained a little freedom and understanding. 

gpsmikey

Quote
"From what I'm understanding, if you hooked-up an LED straight to the battery it would burn-out but only because the car battery's 12V is about 2x the max input voltage rating of an LED and about 6x what the LED needs in order to light-up.  Adding an appropriately sized resistor is important in order to lower the voltage to within the LED's "safe" range. "
Not exactly - an LED is basically just another diode.  Look at the voltage vs. current plot for a diode: the current is almost nothing until the voltage hits the Vf (forward voltage) of the diode, at which point the current rises VERY steeply for very little rise in voltage.  An LED needs to be thought of as a current-operated device more than "what voltage does it want".  The resistor limits the current to the LED (or even a regular diode, for that matter).  Take a look at this link http://www.electronics-tutorials.ws/diode/diode_8.html where they show the curves for various color LEDs (the forward voltage differs depending on what material is used to make the LED for the different colors).

Yes, you could operate an LED without a resistor, but you would have to provide exactly the right voltage to hit the spot on the curve (and the current changes fairly quickly for small variations in voltage in that area of the curve).  Using a significantly higher voltage than the LED wants (12V for example) lets you use a resistor to limit the current, and small changes in either the LED or the supply are basically ignored because of the linear characteristic of the resistor that is limiting the current.

For example: you have a 12V source and an LED that takes 2V at the desired current of 20mA.  You calculate the resistor needed to limit to 20mA by 12V - 2V = 10V (the voltage you need to "drop").  Now you want 20mA at 10V, so using Ohm's Law you have R = E/I, or 10/0.020 = 500 ohms.  So 10V through 500 ohms gives you 20mA.  If the battery voltage went up to 14V, you would now be dropping 12V (14 - 2) across that resistor, which gives still only 24mA - a relatively small increase.  If you were driving the LED from a voltage source directly, look at the I-V curve and see what a 2V change would do to the current: from bright to "poof".
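
To see that insensitivity in numbers, here's a quick sketch (same illustrative figures as above: a 2V LED and a 500 ohm resistor):

Code: [Select]
// Show how little the LED current changes when the supply moves from
// 12V to 14V, because the 500 ohm resistor dominates the circuit.
const float ledVf = 2.0;           // assumed LED forward voltage
const float resistorOhms = 500.0;

float ledCurrentMilliamps(float supplyVolts) {
  return (supplyVolts - ledVf) / resistorOhms * 1000.0;  // I = (Vs - Vf) / R
}

void setup() {
  Serial.begin(9600);
  Serial.println(ledCurrentMilliamps(12.0));  // 20 mA
  Serial.println(ledCurrentMilliamps(14.0));  // 24 mA - a modest increase
}

void loop() {}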
mikey
-- you can't have too many gadgets or too much disk space !
old engineering saying: 1+1 = 3 for sufficiently large values of 1 or small values of 3

UncleMoki

Ah, I think I see your point.  So it's not so much that I'm using a resistor to "drop the voltage" to within range.  Instead, I should think of it like this: take what I know about the "load" (LED specs) and my power source (12V) and use them to calculate the resistor that fits the circuit.

  • First - I can ignore current for a sec and use the "voltage loop" rule (I don't recall the law's name) to determine the voltage needed across the resistor.  With a 12V source, and knowing that the LED's forward voltage is 2.2V (LED data sheet), I can calculate how much voltage should go across the resistor ... 12 - 2.2, or 9.8V.
  • Second - Using Ohm's Law, and knowing that the LED's optimal current is 20mA (LED data sheet), I can calculate the size of the resistor I need ... V=I*R or 9.8=.02*R ... solve for R ... R=9.8/.02=490.  So I need about a 500 ohm resistor.


Sound right?

Been almost 20 years since I've done any calculus, but the diode's curve sounds exponential.  Graphing all of this sounds interesting (points of intersection, rates of change and derivatives, etc.) for understanding how they relate to each other and their characteristics.  But I'll have to let the calculus simmer a bit ... and see if it clicks at some point.

CrossRoads

Yes, Kirchhoff's Voltage Law,
https://en.wikipedia.org/wiki/Kirchhoff's_circuit_laws
If you know the LED's Vf, and the current you want to flow, then V=IR is simple Ohm's Law to calculate the resistors.
(Vs - Vf)/current = Resistor
(12V - 2.2V)/.02A = 490 ohm
Vr = Vs - Vf = 9.8V.
Pw = I^2 * R = .02A*.02A*490ohm = 196mW, so 1/4W resistors will do.

From this you can also see how multiple LEDs in series can 'use' the same 20mA for more efficient use of power.
Say 3 in series:
(12V - 2.2V - 2.2V - 2.2V)/0.02A = 270 ohm
Pw = .02 * .02 * 270 = 108mW

Only works when LEDs can be wired in series - for smart LED strips, you need to look at the control chip's datasheet to see how 12V is handled for chips normally powered from 5V.
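
A quick sketch generalizing that arithmetic to N identical LEDs in series (values taken from the worked example above):

Code: [Select]
// Series resistor and its dissipation for N identical LEDs on one supply.
// Numbers match the worked example: 12V supply, 2.2V per LED, 20mA.
float seriesResistorOhms(float supplyV, float ledVf, int count, float amps) {
  return (supplyV - ledVf * count) / amps;  // (Vs - N*Vf) / I
}

void setup() {
  Serial.begin(9600);
  float r1 = seriesResistorOhms(12.0, 2.2, 1, 0.02);   // 490 ohm
  float r3 = seriesResistorOhms(12.0, 2.2, 3, 0.02);   // 270 ohm
  Serial.println(r1);
  Serial.println(r3);
  Serial.println(0.02 * 0.02 * r1 * 1000.0);  // P = I^2*R: 196 mW
  Serial.println(0.02 * 0.02 * r3 * 1000.0);  // 108 mW for three in series
}

void loop() {}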
Designing & building electrical circuits for over 25 years.  Screw Shield for Mega/Due/Uno,  Bobuino with ATMega1284P, & other '328P & '1284P creations & offerings at  my website.

gpsmikey

Yes, you have it correct (that is just basic math, not calculus).  Once you calculate the "exact" value (500 ohms in this case), you look at the standard resistor tables to find the nearest standard value that will do what you want - in this case, 510 ohms.  Next you need to look at how much power the resistor needs to dissipate: power = current * voltage across the resistor, which for this example is 0.020A * 10V = 0.2W, so you could use either a 1/4W or 1/2W resistor (my choice being whichever I have in my junk drawer, or whichever is cheaper to buy 100 of  :)  )
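
Picking that nearest standard value can be automated too; a sketch assuming the E24 (5%) series, rounding up so the current stays at or below the target:

Code: [Select]
// Round a calculated resistance up to the next E24 (5%) standard value.
const float E24[] = {1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4,
                     2.7, 3.0, 3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2,
                     6.8, 7.5, 8.2, 9.1};

float nextE24(float ohms) {
  float decade = pow(10.0, floor(log10(ohms)));  // e.g. 500 -> 100
  for (int i = 0; i < 24; i++) {
    if (E24[i] * decade >= ohms) return E24[i] * decade;
  }
  return 10.0 * decade;  // wrap to the next decade
}

void setup() {
  Serial.begin(9600);
  Serial.println(nextE24(500.0));  // 510 - matches the hand-picked value
}

void loop() {}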
mikey
-- you can't have too many gadgets or too much disk space !
old engineering saying: 1+1 = 3 for sufficiently large values of 1 or small values of 3

UncleMoki

Yes, you have it correct (that is just basic math, not calculus).    
I don't recall graphing multidimensional equations in basic math ... but hey, I grew up in California so maybe we just "did math" differently  8)  :-\  :'(.  Seriously though, thanks ... that helps.


...you can also see how multiple LEDs in series can 'use' the same 20mA for more efficient use of power.
Say 3 in series:
(12V - 2.2V - 2.2V - 2.2V)/0.02A = 270 ohm
Pw = .02 * .02 * 270 = 108mW

Only works when LEDs can be wired in series - for smart LED strips, you need to look at the control chip's datasheet to see how 12V is handled for chips normally powered from 5V.
Good example, thanks.  Right now the closest thing I have to a project is to create some kind of audio visualizer with some flavor of microcontroller and some combination of EL Wire and/or LEDs ... so good info to keep in mind if any part of my visuals includes LEDs in series.
