So I've been trying to get my head around why you need a resistor in series with an LED (the simplest circuit), and I've come to this conclusion: a digital output pin has an absolute maximum current of 40 mA. The output voltage is 5 V, and let's say the LED has a resistance of 50 Ohms (I have no idea what the typical LED resistance is). Using the equation I = V/R, the current would be 100 mA, which is way over the absolute maximum, and therefore you need a resistor in series. Using the values in this example, I guess a 100 Ohm resistor would do, which brings the total resistance to 150 Ohms. Now the current equals 33.33 mA, which is under the maximum, and therefore this is acceptable?
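To double-check my arithmetic, here's a quick sketch of the numbers above (assuming, probably wrongly, that the LED really does act like a plain 50 Ohm resistor):

```python
# Quick check of my reasoning. The 50-ohm "LED resistance" is just my guess.
V = 5.0          # output pin voltage (V)
I_MAX = 0.040    # absolute maximum pin current (A)

r_led = 50.0     # guessed LED resistance (ohms)
i_led_only = V / r_led          # I = V/R
print(i_led_only)               # 0.1 A = 100 mA, over the 40 mA limit

r_total = r_led + 100.0         # add a 100-ohm series resistor
i_with_resistor = V / r_total
print(i_with_resistor)          # ~0.0333 A = 33.3 mA, under the limit
```

So by these numbers the bare LED would draw 100 mA and the series resistor brings it down to about 33 mA.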

Correct me if I'm wrong; I'd just like to know if it has finally clicked. Also, what actually is the typical LED resistance? And what is this 'forward voltage drop' people mention when talking about LEDs?

Thanks