Hello all, this is my first post, although I've searched and lurked the forum for a while. My electronics experience is very rusty and rudimentary, so I'm going to ask some very dumb questions. Please feel free to mock.
I'm trying to understand current by experimenting with my Arduino Uno and a digital multimeter, and there are a couple of things I don't understand:
1) If I switch my meter to A/mA and connect it to the 5V or 3.3V power pins on the Uno, I get constantly fluctuating numbers - so fast that I couldn't even tell you where the decimal point is. Is this because the multimeter is varying the drawn current to remove its own internal resistance from the equation? If I include a resistor in series, it behaves more like I expect: a 270-ohm resistor causes my meter to show 18.5 mA on the 5V pin and 12.2 mA on the 3.3V pin (I've spelled out the math below). From a straight-equation standpoint, I = V/R with no resistor in series would give I = 5/0 = undefined. But that can't possibly be true in the real world - can it?
(edit: the spec sheet says the 3.3V pin can supply up to 50 mA, which doesn't clear up my mysterious current readings at all)
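For what it's worth, with the resistor in place the readings do line up with Ohm's law, at least if I assume the meter itself adds next to nothing on top of the 270 ohms:

I = V / R = 5 / 270 ≈ 0.0185 A ≈ 18.5 mA
I = V / R = 3.3 / 270 ≈ 0.0122 A ≈ 12.2 mA

So that part at least makes sense to me; it's the no-resistor case that has me lost.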
2) The spec sheet on the Uno says the digital output pins put out 5V at 40 mA. But if I write a simple sketch that sets a pin HIGH (I'm using #7, but it shouldn't matter), my multimeter reads 5V at something like 87 mA. What gives? To be precise, the sketch sets the pin HIGH for 10 seconds, then LOW for 10 seconds, in a loop. The multimeter readings swing between 87 mA and roughly 0.15 mA.
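In case it matters, here's (more or less) the sketch I'm running - nothing fancy, just toggling the pin:

void setup() {
  pinMode(7, OUTPUT);        // using digital pin 7 as an output
}

void loop() {
  digitalWrite(7, HIGH);     // pin high for 10 seconds
  delay(10000);
  digitalWrite(7, LOW);      // pin low for 10 seconds
  delay(10000);
}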
As you may have guessed, I'm still having trouble wrapping my head around current vs voltage, and how it all works together in a circuit.