I am working through the first stage of the arduino starters manual and came across the following section:

"In this case Digital Pin 10 is outputting 5 volts DC at (according to the Atmega datasheet) 40mA (milliamps) and our LEDs require (according to their datasheet) a voltage of 2v and a current of 20mA. We therefore need to put in a resistor that will reduce the 5v to 2v and the current from 40mA to 20mA if we want to display the LED at its maximum brightness. If we want the LED to be dimmer we could use a higher value of resistance. To work out what resistor we need to do this we use what is called Ohm's law which is I = V/R where I is current, V is voltage and R is resistance. So to work out the resistance we arrange the formula to be R = V/I which is R = 3/0.02 which is 150 Ohms. V is 3 because we need the Voltage Drop, which is the supply voltage (5v) minus the Forward Voltage (2v) of the LED (found in the LED datasheet) which is 3v. We therefore need to find a 150 Ω resistor. So how do we do that?"
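The calculation that paragraph describes can be sketched in a few lines of Python (the 5V, 2V and 20mA figures are the ones quoted from the datasheets above):

```python
# Resistor value for an LED on a 5V pin, using Ohm's law (R = V / I).
SUPPLY_V = 5.0         # Arduino pin voltage when HIGH
LED_FORWARD_V = 2.0    # LED forward voltage (from the LED datasheet)
LED_CURRENT_A = 0.020  # desired LED current, 20 mA (from the LED datasheet)

drop_v = SUPPLY_V - LED_FORWARD_V    # voltage the resistor must absorb: 3 V
resistance = drop_v / LED_CURRENT_A  # R = V / I

print(drop_v)      # 3.0
print(resistance)  # 150.0
```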

I understand all of this except where the 0.02 comes from.
Is that the current the LED requires (20mA), or is it the difference between the Arduino's output current and the LED's required current (40mA - 20mA = 20mA)?

Also, I think they could have worded that section better. The output pin does not output 40mA in any sense; that's actually the absolute maximum current you should ask of it. A pin can produce a lot more current than that (and you will likely damage the chip in the process). The current that flows depends on the pin voltage and the load connected to it; it's not a property of the pin.

The output pin is basically acting as a voltage source, but with some series resistance (roughly 40 ohms) due to the characteristics of the output transistors on the chip. Most of the time you can simply think of it as being at 5V when HIGH and 0V when LOW, and use that as a basis for your calculations.

So pin at 5V, LED wants 2V, difference (across the resistor) will thus be 3V, so R = V/I = 3.0 / 0.02 = 150 ohms.

A somewhat more accurate result would take account of the output resistance of the pin (about 40 ohms, as I said), so a 110 ohm resistor would be more likely to give the 20mA spot on (though it usually doesn't matter at all).
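That correction can be sketched the same way. Note the 40 ohm output resistance is a rough working figure, not a datasheet guarantee:

```python
# Subtract the pin's approximate output resistance from the ideal value,
# since the pin's internal resistance already drops part of the 3 V.
drop_v = 5.0 - 2.0  # 3 V shared between the resistor and the pin's output resistance
target_a = 0.020    # 20 mA target current
pin_r = 40.0        # rough output resistance of the pin (an approximation)

ideal_r = drop_v / target_a      # 150 ohms, ignoring the pin
compensated_r = ideal_r - pin_r  # 110 ohms

print(compensated_r)  # 110.0
```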

Incidentally, once you've connected your LED and resistor and set the pin HIGH, you can try measuring the voltage at the pin. I think it will turn out to be about 4.4V rather than 5V; the exact value depends on the actual output transistor's performance.
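A back-of-the-envelope check of that prediction, assuming a 150 ohm external resistor and the rough 40 ohm output resistance mentioned above (the real value varies from chip to chip):

```python
# Estimate the voltage actually seen at the pin with the LED circuit connected.
supply_v = 5.0
led_v = 2.0        # LED forward voltage
external_r = 150.0 # the series resistor we chose
pin_r = 40.0       # assumed pin output resistance

# The 3 V excess is dropped across BOTH resistances in series.
current = (supply_v - led_v) / (external_r + pin_r)  # about 15.8 mA
pin_v = supply_v - current * pin_r                   # about 4.37 V

print(round(pin_v, 2))  # 4.37
```

So the measured ~4.4V is consistent with a ~40 ohm output resistance, and it also shows why the real current ends up a little under 20mA with a 150 ohm resistor.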

Thanks for your help. Just to double-check: does the 0.02 (20mA) come from the difference between 40mA and 20mA? To look back at a specific section of the manual:

We therefore need to put in a resistor that will reduce the 5v to 2v and the current from 40mA to 20mA if we want to display the LED at its maximum brightness.

The Voltage I use in my calculation is 3V (5V - 2V = 3V).
So does the current I use in my calculation, 0.02A, come from (0.04A - 0.02A = 0.02A)?

No, the 20mA comes from your requirement. You need to limit the current flow to 20mA to avoid damaging the LED.
You know the source voltage, 5V.
You know the LED's forward voltage, which the datasheet says is 2V in this case.
The voltage left to dissipate across the resistor is 3V.
The current flow from the pin is effectively unlimited, and so is the current flow through the LED.
The only current flow you have control over is through the resistor.
If you know the voltage across the resistor is 3V, and you want 20mA to go through it, then you can apply Ohm's Law to calculate it:
V = IR, or R = V/I. 3V / 0.02A = 150 ohms.

and our LEDs require (according to their datasheet) a voltage of 2v and a current of 20mA. We therefore need to put in a resistor that will reduce the 5v to 2v and the current from 40mA to 20mA if we want to display the LED at its maximum brightness.

The resistor is there to limit the current, nothing more. The voltage drop is entirely induced by the diode. All diodes induce a voltage drop, even the light-emitting kind. This is a result of the electrons having to push through the depletion layer created by the PN junction in the device. This forward voltage drop is the minimum required voltage to push through that layer and allow current to flow. Below that voltage, you basically have an open circuit. Above it, current rises very steeply, almost like a short circuit, which is exactly why the series resistor is needed to limit it.
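A minimal sketch of that idealized on/off diode model (real diodes follow a smooth exponential curve, but this approximation is what the resistor calculation relies on):

```python
def led_current(supply_v, forward_v, resistor_ohms):
    """Idealized LED model: no current below the forward voltage;
    above it, the series resistor alone sets the current."""
    if supply_v <= forward_v:
        return 0.0  # below the forward voltage: effectively an open circuit
    # above it: the resistor absorbs the excess voltage and limits the current
    return (supply_v - forward_v) / resistor_ohms

print(led_current(1.5, 2.0, 150))  # 0.0 (below forward voltage, LED is off)
print(led_current(5.0, 2.0, 150))  # 0.02 (20 mA, set entirely by the resistor)
```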

Just to reinforce what the other guys said, voltage sources can "supply" a lot of current [eg, 120VAC mains can supply 15-20 AMPs], but the amount that actually goes into the load is determined by what the load is [eg, a 15W bulb draws a lot less current than a 150W bulb, since the resistance of the former is much greater than that of the latter].

Similar for any other voltage source and load.
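The bulb example can be checked with P = V²/R: rearranged, R = V²/P, so the lower-wattage bulb has the higher resistance and draws less current (treating the bulbs as simple resistors and ignoring the filament's temperature dependence):

```python
# At a fixed mains voltage, the load's resistance determines the current drawn.
mains_v = 120.0

r_15w = mains_v**2 / 15.0    # 960 ohms  (R = V^2 / P)
r_150w = mains_v**2 / 150.0  # 96 ohms

i_15w = mains_v / r_15w      # 0.125 A   (I = V / R)
i_150w = mains_v / r_150w    # 1.25 A

print(r_15w, r_150w)  # 960.0 96.0
print(i_15w, i_150w)  # 0.125 1.25
```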

In contrast, a current source will try to force its specified current into a load, but most devices we deal with are essentially voltage sources, not current sources. Eg, batteries, AC mains, Arduino pins.

markandersonaudio:
Thanks for your help. Just to double-check: does the 0.02 (20mA) come from the difference between 40mA and 20mA? To look back at a specific section of the manual:

We therefore need to put in a resistor that will reduce the 5v to 2v and the current from 40mA to 20mA if we want to display the LED at its maximum brightness.

The Voltage I use in my calculation is 3V (5V - 2V = 3V).
So does the current I use in my calculation, 0.02A, come from (0.04A - 0.02A = 0.02A)?

Others have answered already but I wanted to make sure you are not confused. The 20mA comes from the specification of the diode. It has nothing to do with the 40mA limit of the pin. If the pin could output a maximum of 100mA, you would still be using the 20mA in the calculation as that is the current that will be going through the resistor and the diode. Is this clear?