Amps, volts, and power supplies

Why can a power supply never supply too much amperage, but can supply too much voltage and fry your "thing"? If a device only pulls the amount of amps it needs, why isn't the same true for volts?

And my second question: what's the cheapest way to step down 12V DC from a battery to 6V DC with a max draw of 4 amps? I could use a through-hole chip or a laptop-charger type thing with an in/out connector.

Until you do some reading on BASIC electricity... It's free on Wiki too. Not even electronics... perhaps the simplest answer is they're like cups: they can only hold so much before the magic smoke escapes.

Doc

I've done the basic reading, but when googling I couldn't find the water flow vs. pressure comparison I usually see :wink:
Yes, the magic smoke is what I have trouble containing.

jointtech:
Why can a power supply never supply too much amperage, but can supply too much voltage and fry your "thing"? If a device only pulls the amount of amps it needs, why isn't the same true for volts?

The simple Ohm's law formula (V = I × R) explains the relationship and interdependence between voltage, current, and load resistance. Finding a short primer on Ohm's law should help you on your journey of understanding on this topic.
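A rough sketch of that relationship, with made-up example values just to show how current follows from voltage and resistance:

```python
# Ohm's law: V = I * R, so I = V / R.
# The load resistances below are invented purely for illustration.

def current_from_load(volts, ohms):
    """Current drawn by a purely resistive load at a given supply voltage."""
    return volts / ohms

# The same 12 V supply pushes very different currents through different loads:
for ohms in (3.0, 12.0, 120.0):
    amps = current_from_load(12.0, ohms)
    print(f"12 V across {ohms:6.1f} ohm -> {amps:5.2f} A ({12.0 * amps:5.1f} W)")
```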

And my second question: what's the cheapest way to step down 12V DC from a battery to 6V DC with a max draw of 4 amps? I could use a through-hole chip or a laptop-charger type thing with an in/out connector.

That's a max power output of 24 watts, which isn't something you're going to get by with cheaply. A switching voltage regulator module is probably the best avenue to look at.
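To see why a switching regulator rather than a simple linear one, here's a back-of-the-envelope comparison; the 90% switcher efficiency is an assumed ballpark figure, not a number from any specific part's datasheet:

```python
# Stepping 12 V down to 6 V at 4 A.
v_in, v_out, i_load = 12.0, 6.0, 4.0

p_out = v_out * i_load                        # power delivered to the load: 24 W

# A linear regulator drops the full 6 V difference at the load current,
# so it burns another 24 W as heat -- 50% efficiency and a big heatsink.
p_linear_loss = (v_in - v_out) * i_load

# A switching (buck) regulator at an assumed ~90% efficiency wastes only a few watts.
eff_switcher = 0.90
p_switcher_loss = p_out / eff_switcher - p_out

print(f"Load power:            {p_out:.1f} W")
print(f"Linear regulator loss: {p_linear_loss:.1f} W")
print(f"Buck converter loss:   {p_switcher_loss:.1f} W (assuming {eff_switcher:.0%} efficiency)")
```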

OK, so we have a power supply - its job is to deliver a steady, constant voltage to the load, and it only fails if the load starts to demand more current than the supply can provide. A regulated power supply is extremely good at this, holding the voltage constant to within 1% whatever the load (within the specified current range) - it's this good because it's engineered exactly for this task.

The important realization is that when you have a constant-voltage source such as a power supply, the current that flows depends on the load - the supply doesn't care what the current is at all (unless it goes too high, in which case the supply fails to provide enough current and then cannot sustain the constant voltage - the power supply is overloaded).

So you need to know what the load does when given that voltage - if it takes more current than the supply can provide, you have the wrong power supply. Otherwise it will simply work.

If your load can only take 5V safely then you obviously don't want to connect it to a 12V supply - you will damage the load.

If your load wants 5V and will use under 1A you can connect it to a 5V 1A supply or a 5V 2A supply or a 5V 100A supply - it will still get 5V and take the same current (it can't sense the current limit of the supply until it overloads it).
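A small sketch of that point, with a hypothetical 10-ohm load standing in for "something that draws 0.5 A at 5 V":

```python
# The load sets the current; the supply's current rating only sets a ceiling.
v_supply = 5.0
r_load = 10.0                   # hypothetical load: draws 0.5 A at 5 V

i_load = v_supply / r_load      # 0.5 A, regardless of which supply is used

for supply_rating_amps in (1, 2, 100):
    ok = i_load <= supply_rating_amps
    print(f"5 V {supply_rating_amps:>3d} A supply -> load still draws {i_load:.2f} A "
          f"({'fine' if ok else 'overloaded'})")
```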

If your load is a short circuit then it will draw as much current as the supply can provide and what happens depends on the current limit of the supply (heat, smoke, bang!). But this is a fault condition - supplies should not be overloaded.

Hope that makes more sense now?