I understand that the recommended voltage for the power source for an Arduino Uno is between 7 and 12 volts. I'm trying to understand what this translates into, in practical terms, for using a "wall wart" as the power source.
I've read that the stated voltage for most wall transformers is a minimum, and that they can actually run significantly higher than the stated voltage (in some cases, as much as twice as high). This is consistent with the results of a (very) small amount of testing I did on a few such transformers.
If I use a 9V wall wart, it will be right in the recommended range if it runs near the bottom of its range, but significantly above the recommended 7-12 volts if it runs near the high end. Alternatively, if I move down to a 6V unit, it will be OK near the high end but will provide inadequate voltage if it runs at its minimum (rated) 6V.
Presumably, one could buy (or build) a better power supply with more accurate voltage control, but that gets expensive and seems to defeat the whole point of having the 5V regulator built into the Arduino board itself.
So what's a poor Arduinoist to do?
You could just test one with a multimeter. Alternatively, use a USB-style power adapter, which supplies a regulated 5V; note that 5V should be fed to the board's 5V pin rather than Vin, since the onboard regulator's dropout would leave the output below 5V. Beware of excessively cheap wall warts, as they may not meet safety standards.
A 7.5V DC adapter would work under all conditions, and I think it is best suited for powering an Arduino through its power connector.
There are two common types of power supplies: the older, "unregulated" style that uses a big, heavy power transformer, and the newer, smaller, lighter switching power supply (like a cell phone charger).

With the older style (power transformer), the voltage changes drastically (20-30%) as the current draw increases. A transformer rated at 12v 1 amp will commonly put out more than 15v open circuit, at no load; only at a full 1 amp of current draw will it put out 12v. A regulated switching power supply, on the other hand, will output its rated voltage across a wide range of load currents.

So the answer to your question depends on what type of power supply you use. You can use a 12v power transformer to drive a 9 volt regulator like an LM7809 to produce a 9v 1 amp supply. This is inefficient, because the regulator burns off the excess voltage as heat, but it will work. Any well-regulated 7-12v power supply will work.

You should probably Google "voltage regulation" and also read the FAQ page on powering an Arduino. When in doubt, check it with a voltmeter.
I use 5V wall warts to power a lot of my designs, and ditch the onboard 5V regulator.
7.5V, 9V, and 12V units are available too if you still want to use the onboard regulator.
There are two common types of power supplies: the older, "unregulated" style that uses a big heavy power transformer; and the newer, smaller, lighter switching power supply (like a cell phone charger). With the older style (Power transformer), voltage changes drastically (20-30%) as the current increases.
Not strictly true...
The older style with a transformer can come regulated: a lot of the cheaper ones have just the transformer and output AC; more expensive ones add a bridge rectifier; and the most expensive have a regulated output.
As long as the switch-mode version has a high-frequency transformer, I'm happy; it's the transformerless, capacitor-fed supplies that worry me.