Basic resistance question

So I've been trying to get my head around why you need a resistor in series with an LED (the simplest circuit), and I have come to this conclusion:
Basically, a digital output pin has an absolute maximum current of 40mA. Now, the output voltage is 5V, and let's say the resistor has a resistance of 50 Ohms (I have no idea what the typical LED resistance is).
Using the equation I = V/R, the current would be 100mA, which is way over the absolute maximum, and therefore you need a resistor in series. Using the values in this example, I guess a 100 Ohm resistor, which brings the total resistance to 150 Ohms; now the current equals 33.33mA, which is under the maximum, and therefore this is acceptable?

Correct me if I am wrong; I just would like to know if it has finally clicked. Also, what actually is the typical LED resistance? And what is this 'forward voltage drop' when talking in terms of an LED?

Thanks

You're close, and forward drop is the missing piece of the puzzle.

Forward voltage drop is the voltage the LED eats. You subtract that from the supply voltage to find the voltage for the Ohm's law calculation of resistance. So you'd use 3.8V in the resistor calculation for a 5V supply and a 1.2V forward drop diode.
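
To make that concrete, here's a minimal desk-check in C++, reusing the 100 Ohm resistor from your example and assuming a 1.2V red LED (both are just illustration numbers):

```cpp
// A quick check of the point above: Ohm's law is applied to the resistor
// alone, with only the leftover voltage (supply minus drop) across it.
#include <cstdio>

int main() {
    const double vSupply  = 5.0;   // output pin voltage (V)
    const double vForward = 1.2;   // assumed forward drop of a red LED (V)
    const double r        = 100.0; // series resistor (ohms)

    // Wrong: pretending the whole 5 V appears across the resistor.
    printf("Ignoring the drop:    %.1f mA\n", vSupply / r * 1000.0);               // 50.0 mA
    // Right: the LED "eats" 1.2 V, so only 3.8 V drives the current.
    printf("Subtracting the drop: %.1f mA\n", (vSupply - vForward) / r * 1000.0);  // 38.0 mA
    return 0;
}
```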

-br

If you look at an LED datasheet, it doesn't give a resistance value. Instead it gives the Forward Voltage as the LED's characteristic and the Max Forward Current as the LED's maximum rating.

The forward voltage is the "voltage drop" across the LED when it is lit (i.e. operating in forward bias... an LED after all is a diode).
It varies with LEDs: some are 2V or lower, some are 4V (like blue LEDs).

Subtract the "LED voltage drop/forward voltage" from your voltage source.
You'll come up with a value; let's call it E.

"I" will be the forward current you want to use for your LEDs. Some LEDs max at 30mA, but you don't necessarily have to use 30mA. If you want your LEDs a bit dimmer, you can use a lower forward current. (Some high-efficiency LEDs can be bright with very little forward current, due to it's design, lens, etc.)

Using Ohm's Law, R = E/I, compute R.

That's the resistor you need... though typically the exact value you calculate is not available. Resistors are made in a limited set of values (E96, for example), so pick the next highest available value.
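
If it helps, here's that recipe as a small C++ sketch. The 2V forward voltage and 18mA chosen current are assumed example figures, and I've used the coarser E24 series (just the 100-1000 Ohm decade) to keep the table short:

```cpp
// The steps above in code: E = Vsource - Vforward, R = E / I, then round
// up to the next standard value. The LED figures here are assumed examples.
#include <cstdio>

int main() {
    const double vSource  = 5.0;    // voltage source (V)
    const double vForward = 2.0;    // forward voltage from the datasheet (V)
    const double iForward = 0.018;  // chosen forward current (A), below the max

    double e = vSource - vForward;  // voltage left over for the resistor
    double r = e / iForward;        // Ohm's law: R = E / I

    // E24 standard values in the 100-1000 ohm decade.
    const double e24[] = {100, 110, 120, 130, 150, 160, 180, 200, 220, 240,
                          270, 300, 330, 360, 390, 430, 470, 510, 560, 620,
                          680, 750, 820, 910, 1000};
    double pick = e24[sizeof e24 / sizeof e24[0] - 1];  // fallback: largest
    for (double v : e24) {
        if (v >= r) { pick = v; break; }  // next highest available value
    }

    printf("Ideal R = %.1f ohms -> use %.0f ohms\n", r, pick);  // 166.7 -> 180
    return 0;
}
```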

So say the forward voltage drop is indeed 1.2V; instead of a 100 Ohm resistor you could get away with a 50 Ohm resistor, as the total resistance would be 100 Ohms (assuming the LED still has a resistance of 50), and 3.8/100 is 38mA, just under the max?

It is a misconception to say an LED has a fixed resistance, and one cannot model it this way. You account for its entire effect with the voltage drop, not your made-up 50 ohms.

Vasquo has laid it all out nicely.

To work it for a 1.2 volt red LED and a desired current of 20 mA:

5V - 1.2V = 3.8V
3.8V / 0.020A = 190 ohms

The closest available standard resistor ("E24") is 200 ohms.
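
And a two-line check of what that rounding does to the actual current (same numbers as above):

```cpp
// Effect of rounding 190 ohms up to the standard 200: the current lands
// just under the 20 mA target, which is the safe direction to miss in.
#include <cstdio>

int main() {
    const double vResistor = 5.0 - 1.2;  // 3.8 V left after the LED's drop
    printf("With 190 ohms: %.1f mA\n", vResistor / 190.0 * 1000.0);  // 20.0 mA
    printf("With 200 ohms: %.1f mA\n", vResistor / 200.0 * 1000.0);  // 19.0 mA
    return 0;
}
```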

-br

Edit: inserted the words "a fixed" and changed ma to A.

Ok got it, so you generally want to run LEDs a tad below their max current?

and 3.8/100 is 38mA, just under the max?

Yeah sure, but that's typically over the max. rating of LEDs (30mA).

Do you want to end up with burnt out LEDs or shorter lifespan LEDs?

Turns out fortune favors those who design with safety margins. Who would have guessed? 8)

I think you'd be surprised, upon experimentation, how little perceived difference in brightness you get for the extra 10mA of current.

Good luck with your project,

-br

The power-on LED on your Arduino board is probably running at 3mA and is perfect as a power-on indicator. Why people feel they have to always run their LEDs at their maximum continuous rated value of 20mA is beyond me. Help keep the planet green: cut back on the LED current to what is actually needed for the purpose of the LED. :wink:

Lefty

...let's say the resistor has a resistance of 50 Ohms (I have no idea what the typical LED resistance is)

The resistance of an LED is non-constant and nonlinear. At low voltages, the resistance is relatively high, and at high voltages, it's very low. (A diode or LED basically "turns off" when the voltage across it is low, and it "turns on" when the voltage across it is high-enough.)

Ohm's Law is a physical law. It's ALWAYS TRUE* and you CAN calculate the resistance under the particular conditions if you know the EXACT voltage/current characteristics. If the LED is rated 20mA at 2V, that's 2V / 0.020A = 100 Ohms.

But, those voltage/current characteristics vary from part-to-part and with temperature. So if you apply 2V, you might get a lot more than 20mA (the resistance might be lower than 100 Ohms) and the LED might burn-up, or you might get a lot less than 20mA, and the LED will be too dim. That's why we use something else (typically a resistor) to control the current, rather than applying a controlled voltage.

  * In AC circuits with inductors and/or capacitors, there can be phase differences between the current and the voltage. So, if you measure the voltage and current it can seem like Ohm's Law isn't true. But, if you measure voltage and current at any instant in time, you'll find that the law holds true.
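
To put a number on why we control the current with a resistor rather than applying a controlled voltage, here's a small C++ sketch sweeping an assumed part-to-part spread of forward voltages (1.8V to 2.2V for a nominal 2V LED) and watching how little the current moves:

```cpp
// Why a series resistor tames part-to-part variation: sweep the LED's
// forward voltage over a plausible spread; the current barely changes.
#include <cstdio>

int main() {
    const double vSupply = 5.0;    // supply voltage (V)
    const double r       = 150.0;  // series resistor (ohms), sized for ~20 mA
    // Assumed spread of forward voltages for a nominal 2 V LED.
    const double vf[] = {1.8, 2.0, 2.2};

    for (double v : vf) {
        double iMa = (vSupply - v) / r * 1000.0;
        printf("Vf = %.1f V  ->  I = %.1f mA\n", v, iMa);
    }
    // Output: 21.3, 20.0, 18.7 mA -- a small, safe variation, unlike
    // applying a "controlled" 2 V directly, where the resulting current
    // is unpredictable from part to part.
    return 0;
}
```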