Has anyone else noticed that there are a ton of constant current power supplies out there for extremely low prices that are very small and quite powerful? I've got a couple I'm experimenting with that supply 6-11 volts at 350 mA. I take it that means they'll go as low as 6 V and as high as 11 V while trying to supply 350 mA to a fancy LED or string of LEDs.
Well, why can't we hook one of those up to an Arduino and have it run just fine? There's already a regulator on the Arduino that will take the incoming voltage and hold it at 5 V for the board, so it shouldn't matter that the CCPS (constant current power supply, I just made that up) is varying its voltage as the current load changes.
A constant current source would (theoretically) supply infinite voltage if it were open-circuited. Of course, real-world current sources are limited by their design; the maximum open-circuit voltage is called the "compliance" voltage.
If you tried to power a voltage regulator (like the input to an Arduino board) with a constant current source, one of two things would happen:
(1) If the Arduino needed more current than the CC source could supply, the voltage would sag too low and the Arduino would not even boot up.
(2) If the current source was set to a higher current than the Arduino required, the current source would swing right up to its maximum voltage (the compliance voltage).
Since both outcomes are worthless, you can see that you don't want to power an Arduino (or most anything else) with a constant current source.
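To make those two failure modes concrete, here's a rough Python sketch. The behavior is idealized (the case-1 voltage collapse is simplified to zero volts), and the 6-11 V / 350 mA numbers are just the ones from the original post:

```python
# Rough model of a constant-current supply driving a fixed-current load,
# using the 6-11 V / 350 mA numbers from the post (illustrative only).

SUPPLY_CURRENT = 0.350   # A, the current the CC supply tries to force
COMPLIANCE_V   = 11.0    # V, maximum voltage the supply can swing to

def cc_supply_voltage(load_current_demand):
    """Voltage the CC supply settles at for a load drawing a fixed current.

    Case 1: load wants MORE than 350 mA -> the supply can't deliver it,
            so the voltage collapses and the regulator never starts.
    Case 2: load wants LESS than 350 mA -> the supply raises its output
            trying to force 350 mA and rails at the compliance voltage.
    """
    if load_current_demand > SUPPLY_CURRENT:
        return 0.0          # voltage collapses (idealized to zero)
    return COMPLIANCE_V     # supply swings to its maximum

for demand in (0.500, 0.100):  # a hungry load vs. a light one, in amps
    v = cc_supply_voltage(demand)
    print(f"Load wants {demand * 1000:.0f} mA -> supply sits at {v:.1f} V")
```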
What they ARE good for, however, is powering LEDs and LASER DIODES. These devices have a nominal forward voltage drop, but it varies from part to part and with temperature. You NEVER want to power an LED or laser diode with a constant VOLTAGE source.
A constant current source is like an "electronic spring". It will flex - give and take a little bit - to accommodate the LED or laser. The nominal voltage drop across the LED or laser, times the constant current, equals the input power to the device. As the device warms up, its forward drop will change slightly, and the current source will simply adjust itself and keep providing the correct current to the device.
Without the electronic "spring" to take out fluctuations in operating parameters, the device could work fine one minute and burn out the next.
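Here's a quick numerical sketch of that self-adjustment in Python. The forward-voltage values are made-up illustrations, not datasheet figures:

```python
# Sketch: a constant-current driver holds LED current fixed while the
# forward voltage drifts with temperature, so power changes only slightly.

DRIVE_CURRENT = 0.350  # A, held constant by the driver

for vf in (3.4, 3.3, 3.2):  # forward drop falling as the LED warms up (V)
    power = vf * DRIVE_CURRENT  # P = Vf * I
    print(f"Vf = {vf:.1f} V -> current stays {DRIVE_CURRENT * 1000:.0f} mA, "
          f"power = {power:.2f} W")
```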
That's why, in simpler setups, you always use a resistor in series with an LED... to limit the current. The resistor acts like an electronic "spring".
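And here's a small sketch of the resistor version of the "spring", assuming a hypothetical 5 V supply, a roughly 2.0 V red LED, and a 20 mA target current (all illustrative numbers):

```python
# Sketch: sizing a series resistor for an LED, then watching how little the
# current moves as the forward voltage drifts. Numbers are assumptions:
# 5 V supply, ~2.0 V nominal forward drop, 20 mA target.

V_SUPPLY = 5.0    # V
VF_NOM   = 2.0    # V, nominal forward drop
I_TARGET = 0.020  # A

# Ohm's law across the resistor: R = (Vs - Vf) / I
r = (V_SUPPLY - VF_NOM) / I_TARGET
print(f"Series resistor: {r:.0f} ohms")

# The "spring": if Vf drifts, the current shifts only a little,
# because the resistor absorbs most of the change.
for vf in (1.9, 2.0, 2.1):
    i = (V_SUPPLY - vf) / r
    print(f"Vf = {vf:.1f} V -> I = {i * 1000:.1f} mA")
```

Run it and you'll see the current only moves by a fraction of a milliamp either side of 20 mA as the forward drop wanders, which is exactly the give-and-take the "spring" analogy describes.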
Did all this make sense?