Thanks all for the replies ... that helps.
More voltage = more current. More resistance = less current
... if you connect the LED without a resistor it will burn out (maybe even explode).
From what I'm understanding, if you hooked an LED straight up to the battery it would burn out, but only because the car battery's 12 V is about 2x the LED's maximum voltage rating and about 6x what the LED needs in order to light up. Adding an appropriately sized resistor matters because it drops the excess voltage (and limits the current) so the LED stays within its "safe" range.

But my question is more about the importance of current ratings. I don't know the typical current rating of a car battery or what a single LED needs, but going with that example: assuming I used appropriately sized resistors in my circuit, would it be safe (for the LED) to power it from a car battery, even though the battery's current rating is WAY higher than the LED needs? From this thread, it sounds like the answer is yes.
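To check my own numbers, here's a quick back-of-the-envelope calculation. The 2 V forward drop and 20 mA target are typical values I'm assuming for a generic red LED; check the datasheet for a real part:

```python
# Series resistor for an LED, from Ohm's law: R = (V_supply - V_forward) / I
# The forward voltage and target current below are assumptions for a
# generic red LED; check the datasheet for a real part.
V_SUPPLY = 12.0   # car battery, volts
V_FORWARD = 2.0   # typical red-LED forward drop, volts
I_LED = 0.020     # target LED current, amps (20 mA)

resistance = (V_SUPPLY - V_FORWARD) / I_LED   # ohms needed to drop the excess
power = (V_SUPPLY - V_FORWARD) * I_LED        # watts dissipated in the resistor

print(f"Series resistor: {resistance:.0f} ohms")  # -> 500 ohms
print(f"Resistor power:  {power:.2f} W")          # -> 0.20 W, so use a 1/2 W part
```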
... on the other hand, you also really don't need to use a car battery to light up a few LEDs when [8 D-cells] will handle the load.
LOL - thanks ... yes ... good point and very much appreciated (seriously). Reminds me of the kids in my old neighborhood pulling Red Ryder wagons with homemade "boom boxes" made out of plywood, car-stereo parts (presumably stolen, possibly salvaged), and a car battery. Yes, you CAN do that, but SHOULD you? ;)
Go back to the good old water analogy ... think of voltage as pressure acting on charge; current is the flow of charge.
Thanks. I keep trying to fall back on the water analogy, but for some reason it only "almost" sticks for me. I guess I let it "sink in" (no pun intended) wrong a few times, because I keep getting it backwards: current = pressure (wrong) and voltage = flow (wrong). But if, as you say, current = flow and voltage = pressure, that makes more sense.
... there exist old non-regulated wall-wart style supplies ... you should always measure the output voltage with no load ...
Ah, good point. The concept of an "unregulated" power supply is something I've been reading about. It would be nice if they were easier to identify without a multimeter or without breaking one open.
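The way I understand it now, an unregulated supply acts roughly like an ideal source behind an internal resistance, which is why the no-load voltage sits above the label. The numbers here are made up purely for illustration:

```python
# Rough first-order model of an unregulated supply: ideal source plus an
# internal resistance. All values below are made-up illustrations.
V_OPEN = 13.5      # no-load (open-circuit) voltage, volts
R_INTERNAL = 3.0   # effective internal resistance, ohms

for load_ma in (0, 100, 500):
    v_out = V_OPEN - (load_ma / 1000) * R_INTERNAL  # voltage sags under load
    print(f"{load_ma:4d} mA load -> {v_out:.1f} V at the output")
# 0 mA -> 13.5 V, 100 mA -> 13.2 V, 500 mA -> 12.0 V
```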
one thing to remember: as you increase current, an accidental short may melt your wires, so match the wire gauge to the current
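That prompted me to sketch out wire sizes. The ampacity numbers below are rough chassis-wiring figures I'm assuming for illustration; real ratings depend on insulation, bundling, and run length, so check an actual chart for anything serious:

```python
# Pick a wire gauge with headroom for the expected current.
# Ampacities are rough chassis-wiring figures (an assumption for
# illustration); real ratings vary with insulation, bundling, and length.
AMPACITY = {  # AWG -> approximate max amps
    22: 7, 20: 11, 18: 16, 16: 22, 14: 32, 12: 41, 10: 55,
}

def pick_gauge(current_amps, margin=1.5):
    """Return the thinnest AWG whose ampacity covers current * margin."""
    needed = current_amps * margin
    for awg in sorted(AMPACITY, reverse=True):  # thinnest (highest AWG) first
        if AMPACITY[awg] >= needed:
            return awg
    raise ValueError("Current too high for this table")

print(pick_gauge(0.02))  # a single LED: 22 AWG is plenty
print(pick_gauge(15))    # a 15 A load: 14 AWG with the 1.5x margin
```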
A good power supply should current-limit a little past its rating (e.g. 110 to 130%) without being damaged. That can be used to guard against an overload. ... that current limit can be useful for making things more bulletproof.
... Running a supply at its full rating tends to make it run hot ... tends to put more ripple on the supply ... I just prefer not to run things close to the limits - I find I have fewer failures that way :)
Ah, thanks for that ... very practical advice.
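Putting that headroom advice into numbers (the 75% loading target is just my own takeaway from this thread, not a hard rule):

```python
# Size a supply so the steady load sits comfortably below its rating.
# The 0.75 loading target is an assumption based on the advice above.
def minimum_supply_rating(load_amps, max_loading=0.75):
    """Smallest supply current rating that keeps loading under max_loading."""
    return load_amps / max_loading

load = 2.0  # amps the circuit actually draws (example value)
print(f"Choose a supply rated >= {minimum_supply_rating(load):.1f} A")  # ~2.7 A
```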
Simple answer: your circuit will draw only as much current as it needs, as long as your power supply can supply enough. Having excess capability to supply current is not a problem. Your rethink about the wall socket is a good way to comprehend it.
Thanks! I appreciate the affirmation. I feel I've gained a little freedom and understanding.
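And to prove the original point to myself: the same LED-plus-resistor circuit draws the same current no matter how beefy the supply is. The supply ratings here are made-up examples:

```python
# The load sets the current; the supply's current rating is only a ceiling.
# Supply ratings below are made-up examples, not real specs.
V_SUPPLY, V_FORWARD, R = 12.0, 2.0, 500.0  # same circuit as above

for rating in (1.0, 45.0, 600.0):  # wall-wart, bench supply, car battery (amps)
    draw = (V_SUPPLY - V_FORWARD) / R  # Ohm's law: current through the LED
    ok = "OK" if draw <= rating else "supply too small!"
    print(f"{rating:6.0f} A supply -> LED draws {draw * 1000:.0f} mA ({ok})")
# All three lines print 20 mA: the rating never changes the draw.
```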