Simple question about resistor for LED

Hi...

I am trying to understand basic electronics so I don't have to just copy projects I see on the internet, but can also create my own.
So, my problem is that I am trying to figure out how to find the right resistor to use with LEDs.

I know about Ohm's law, V = IR, and I thought I should be able to use it to find the R value.
But my problem is that I have had these LEDs for a really long time and I have no idea about any of their values.

I have often read that an average LED draws about 20 mA.
So that would give something like:
V = I R
R = V / I
R = 5 / 0.020
R = 250 ohms

right?

But what if I don't know the LED's I? How can I find that out?

Also, I read about this formula for calculating the resistor for an LED:
R = (Es - Eled) / Iled

Es - source voltage (V)
Eled - voltage drop across the LED (V)
Iled - current through the LED (A)

Would it be better to use this formula?

Thanks...
=)

This site has a nice calculator:

http://led.linear1.org/1led.wiz

They have some helpful suggestions. One is, if you don't know, guess 20 mA for the current.

Would it be better to use this formula?

Yes, because it takes the voltage drop across the LED into account. So if the drop is 2V you want:

R = (5 - 2) / 0.020 = 150 ohms
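A quick sketch of that formula in Python (the 2 V drop and 20 mA are assumed example values, not measured ones):

```python
def led_resistor(v_supply, v_led, i_led):
    """Series resistor (ohms) needed so roughly i_led amps flow through the LED."""
    return (v_supply - v_led) / i_led

# 5 V supply, assumed 2 V LED drop, 20 mA target current
print(led_resistor(5.0, 2.0, 0.020))  # 150.0
```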

"right?" Wrong !

Your simple equation ignores the voltage dropped across the LED - as Nick details.

The best way to find the voltage drop of an LED is to wire it up with a 1K resistor and measure it. Then use that figure in your calculations: subtract this voltage from the LED supply voltage and apply Ohm's law to find the resistance. You decide what current, I, to give it; it is not a function of the LED, except in so much as you could give it too much.
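That measure-then-calculate procedure, sketched in Python (the 2.0 V "measured" drop and the 10 mA chosen current are example numbers; E12 is the standard resistor value series, and rounding up to the next standard value keeps the current at or below your target):

```python
E12 = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820]

def resistor_for(v_supply, v_led_measured, i_chosen):
    # Subtract the measured LED drop from the supply, then apply Ohm's law.
    return (v_supply - v_led_measured) / i_chosen

def next_standard(r, series=E12):
    # Pick the next standard value >= r so the current never exceeds i_chosen.
    return min((v for v in series if v >= r), default=series[-1])

r = resistor_for(5.0, 2.0, 0.010)   # 300.0 ohms for a 10 mA target
print(next_standard(r))             # 330
```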

Questions about LEDs and resistors are never simple...

So, if I do that, am I finding out the maximum current?
And is that the lowest-value resistor I can use?

Would you say that a value of 220 ohms would be safe to use, or should I always leave a margin and use a bigger resistor?

boguz:
So, if I do that, am I finding out the maximum current?

Do you need the maximum current? If you're not building a flashlight then maximum is usually too much.

boguz:
And is that the lowest-value resistor I can use?

Would you say that a value of 220 ohms would be safe to use, or should I always leave a margin and use a bigger resistor?

At a 5V supply, 220 ohms is safe for most LEDs, it's safe for Arduino pins, and you'll easily be able to see whether the LED is on or not.

If you really do want maximum current then you can't use a resistor; you have to use a circuit that fixes the current at a particular value independently of voltage, e.g. 20 mA. This is what LED controller chips do.

So, if I do that, am I finding out the maximum current?

No.

The only way to find the maximum current is to look in the datasheet. Otherwise you guess what you think the maximum current might be. 20 mA is a good guess, but it is not always right.

By doing that you are measuring the forward voltage drop, which allows you to calculate a resistor that gives the current you choose. Whether that current is the maximum or not comes down to the characteristics of the LED.

fungus:
Do you need the maximum current?

Hmmm, no, I don't need maximum current. I was asking because if I could find the maximum, then I could bring it down to a good level. It was a kind of theoretical question.

So, I did some tests on the breadboard...
I plugged in an LED with a 220 ohm resistor.
The voltage from +5 to ground was 4.89 V and across the LED it was 3.08 V.
But I was already using a resistor, so I am guessing that is affecting my readings...
If I use Ohm's law like this, I get about 6 mA.
Could this be right?

Also, when I tried measuring the current, I found out that I CAN'T!!! =/
I thought I knew how, but it turns out I don't.
I was measuring in line (with the multimeter leads on 20A and COM): the +5 V from the Arduino was going to the + lead of the multimeter, and the - lead was going to the LED, which then went to the resistor and then to ground. But still I was reading nothing!

ai, ai, ai... what a newbie!!!! :~

with the multimeter leads on 20A

So you expect to see 6 mA on a 20 A full-scale meter? How many digits does it have?
Put the range down to 200 mA and then you stand a chance.

boguz:
i plugged a LED with a 220Ohm resistor.
The voltage from the +5 to ground was 4,89V and across the LED it was 3,08V.
But i was already using a resistor, so i am guessing that is affecting my readings...
But if i use the Ohm's law like this then i get 6mA.
Could this be right?

If there's 3.08 V across the LED then there's 1.81 V across the resistor (4.89 - 3.08).

1.81 V / 220 ohms ≈ 8.2 mA
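That arithmetic, checked with the measured figures from the post:

```python
v_supply, v_led, r = 4.89, 3.08, 220  # measured volts and the resistor used
i_ma = (v_supply - v_led) / r * 1000  # same current flows in LED and resistor
print(round(i_ma, 1))  # 8.2
```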

Grumpy_Mike:

with the multimeter leads on 20A

So you expect to see 6 mA on a 20 A full-scale meter? How many digits does it have?
Put the range down to 200 mA and then you stand a chance.

You'd need a 6-digit multimeter to see a decent reading...

boguz:
So, my problem is that i am trying to figure out how to find the right Resistor to use with LEDs.

LEDs behave differently depending on how much voltage is dropped across them. The simple explanation is that LEDs are "open" (high resistance, low current flow) until their forward voltage is reached, at which point they turn into a "short" (low resistance, high current flow). The resistor you select limits the current allowed to flow once the forward voltage is applied.

Most LEDs have a maximum forward current of around 20 mA, which is why that number is so popular. This does not mean, however, that you must use 20 mA; that's just the maximum you should let flow through the LED if you want reasonable life. (Note that you can push more current through, but at the cost of a significantly shorter life.)

The forward voltage depends on the color of the LED. No matter what voltage you try to drop across an LED, it will ONLY drop its forward voltage. Again, that's why you need the resistor, and it's the basis of how the two work together.

If you put 5 V across an LED and a resistor, the LED will only drop its forward voltage; the rest of the voltage is dropped across the resistor. Using Ohm's law, you can determine the current that will flow through the two.

Multimeters with a diode-test option can usually tell you the forward voltage of the LED. Or, as Mike pointed out, you can always use a really high value resistor to find out. (Red is around 1.6 V, green is around 1.8 V, and blue is around 2 V.)
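Those ballpark forward voltages can be dropped straight into the resistor formula (the drops below are the rough per-colour numbers from this thread, not datasheet values):

```python
FORWARD_V = {"red": 1.6, "green": 1.8, "blue": 2.0}  # rough guesses per colour

def resistor_for_colour(colour, v_supply=5.0, i_led=0.020):
    # Same formula as before: drop the LED's forward voltage, Ohm's law the rest.
    return (v_supply - FORWARD_V[colour]) / i_led

for c in FORWARD_V:
    print(c, round(resistor_for_colour(c)))  # red 170, green 160, blue 150
```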

The forward voltages are closely related to the energy of the photons of light of the relevant colour, which can be calculated from the wavelength w (in nm) as 1240/w (in volts). This leads to values of about 1.9, 2.45 and 2.6 for 'normal' hues of red, green and blue.
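The 1240/w rule of thumb in code (the 650, 505 and 470 nm wavelengths are my assumed 'normal' hues, not values from the post):

```python
def photon_energy_volts(wavelength_nm):
    # E(eV) ~= 1240 / wavelength(nm), since hc/e ~= 1240 eV*nm
    return 1240 / wavelength_nm

for name, w in [("red", 650), ("green", 505), ("blue", 470)]:
    print(name, round(photon_energy_volts(w), 2))  # ~1.91, 2.46, 2.64
```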

Measuring an RGB LED I have gave 1.8, 2.6 and 2.6 V - rough agreement. (The real situation is more complicated: the band-gap voltage of the semiconductor material is also very important. Here I believe both the green and blue LEDs are gallium nitride, hence the same forward voltage; the colour can be tuned by clever design that affects the quantum mechanics of the junction.)

fungus:
Questions about LEDs and resistors are never simple...

...

MarkT:
The forward voltages are closely related to the energy of photons of light of the relevant colour, which can be calculated from the wavelength w (in nm) as 1240/w (in volts).

Looks like fungus is right. We are moving on to physics. :slight_smile:

Grumpy_Mike:

with the multimeter leads on 20A

So you expect to see 6mA on a 20A full scale meter? How many digits does it have.
Put the range down to 200mA and then you stand a chance.

Sorry, maybe I didn't explain it right. On my multimeter I need to change where I plug the + lead according to what I am measuring (resistance and voltage are in one place, and current is in another). So I just meant that the lead was connected to the "plug" where it reads 20A (and I also tried the "plug" where it reads "mA").
Then on the multimeter I tried 200 mA, 20 mA, ...

= =
Ai, ai, ai... LEDs LOOK like such a simple thing, but it's like we need a full year's course just to understand them! This is one of the things I like about electronics: you can always dig deeper... :wink:
= =

Thanks guys. Tonight when I get home I'll try a couple more tests and post any results I have!

(and also tried the "plug" where it reads "mA")

That should have worked. Since it didn't, you might have blown the fuse in your multimeter. The current input is often fused to protect the circuits, and those fuses blow easily.

Grumpy_Mike:
you might have blown the fuse in your multimeter. The current input is often fused to protect the circuits but they do blow easily.

This ^^

Those fuses blow really easily; they're only rated around 200 mA.

Oh yes, the fuse in the multimeter was blown. And I am guessing it has been blown for a looooooong time!!! :sleeping:

As for the resistors, I think I will go with 240 ohms this time.
Thanks, you've given me much to think about...
:wink: