150 / 220 / 330 Ohm Resistor

I know how to apply Ohm's law to calculate the resistor for an LED. However, I read some books and found that some circuits use 220 ohms instead of 150 ohms (150 ohms being the calculated value). Is that a big problem?

I found that even if I use 330 ohms the circuit still seems to work.

Thanks

It will just change the current that flows through the diode.

So:

330 ohms gives around 11 mA
220 ohms gives around 17 mA
150 ohms gives around 25 mA

The formula is:

(supply voltage - LED forward voltage) / R = current

I use 5 volts delivered by the Arduino and 1.2 volts needed by the LED. So: (5 - 1.2) / 330 ≈ 0.0115 A, or about 11.5 mA.
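As a quick sketch of that formula in code (the 5 V supply and the 1.2 V forward voltage are just the values assumed above; substitute your own LED's datasheet figures):

    // Current through the LED for a few resistor values: I = (Vsupply - Vf) / R.
    #include <cstdio>

    int main() {
        const double vSupply = 5.0;   // Arduino output pin, assumed 5 V
        const double vForward = 1.2;  // LED forward voltage assumed in the post above
        const double resistors[] = {330.0, 220.0, 150.0};  // ohms

        for (double r : resistors) {
            double milliamps = (vSupply - vForward) / r * 1000.0;  // A -> mA
            std::printf("%.0f ohms -> %.1f mA\n", r, milliamps);
        }
        return 0;
    }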

A larger resistor is just about always OK, and may even be desirable if the aim is for the circuit to use less power, but of course the LED may not be as bright. At some point, with a large enough resistor, the LED will not light at all. Using a 1000 Ω resistor with small 3 mm LEDs and a 5 V supply is not uncommon, and they're plenty bright for pilot lamps and such.

If you don't have a multimeter, I'd recommend getting one; it's a very basic instrument to have, and you will get no end of use out of it. A little time playing with an LED, a meter, and a few resistors is instructive. Check the voltage and current specs on the LED's datasheet if you have it. There are some pretty nice meters available for well under $100 that may be all you'll ever need.

Hope this helps!

I have some red water-clear LEDs that light happily with a 10 kΩ resistor at 5 V (OK, not very bright).

I had a follow-up question related to this, but I can start a new thread if that is preferable:

In the "Fade" tutorial example, the instructions say to use a 220-ohm resistor between the LED and the digital output pin. However, the LED I purchased has a forward voltage of 3.0V and forward current of 20mA. According to the calculation I've done by hand and using two online tools, the correct resistor should be 100-ohms.

Is it just a safety precaution that the tutorial advises 220 instead, or am I missing something in my calculation?

the correct resistor should be 100 ohms.

Yes: R = (source voltage - diode voltage drop) / 0.020 amps.
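A minimal sketch of that calculation, assuming the 5 V Arduino supply from the examples above; the 3.0 V figure is from the question, while the ~2.0 V red-LED forward voltage is a typical assumed value, not something the tutorial states. The ledResistor helper is hypothetical, for illustration only:

    // R = (Vsupply - Vforward) / Itarget.
    #include <cstdio>

    // Hypothetical helper, for illustration only.
    double ledResistor(double vSupply, double vForward, double targetAmps) {
        return (vSupply - vForward) / targetAmps;
    }

    int main() {
        // The 3.0 V LED from the question, at 20 mA -> 100 ohms:
        std::printf("3.0 V LED: %.0f ohms\n", ledResistor(5.0, 3.0, 0.020));
        // A typical ~2.0 V red LED (assumed Vf), at 20 mA -> 150 ohms:
        std::printf("2.0 V red LED: %.0f ohms\n", ledResistor(5.0, 2.0, 0.020));
        return 0;
    }

The second line shows why 150 ohms (or the next common value up, 220 ohms) keeps turning up in tutorials written around ordinary red LEDs.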

Is it just a safety precaution that the tutorial advises 220 instead, or am I missing something in my calculation?

No, the tutorial didn't know you were going to be using an LED with a 3.0 VDC forward voltage drop. Standard red LEDs have a much lower Vf rating, so they need a higher-value resistor.

Lefty

Interesting! Thank you! I picked up a handful of LEDs from RadioShack to test out these example sketches, and I was just assuming they were standard.

and I was just assuming they were standard.

Assuming anything in electronics has cost most of us lots of wasted time and money in the past, and that is a fact you can take to the bank. :wink:

PS: And so far I've found that assuming things on the software side is a time sink, too.

Lefty

For all but white LEDs (which use a phosphor coating), the light output of an LED is fairly accurately proportional to forward current. Select the current depending on how bright you want it to be (within safe limits, of course).

High-efficiency LEDs require less current for the same brightness (1 mA or less will usually do), which allows extended battery life (or a brighter LED, or both). Twenty-five years ago LEDs needed 30 mA just to light up fairly weakly; they can be about 1000 times brighter these days, so there is usually no need to put 30 mA through them unless you are using the LED for illumination.

Having said that, not all LEDs are high efficiency these days, so it's useful to know which kind you are working with.
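As a rough worked example, using the same assumed figures as above (5 V supply, ~2.0 V red-LED forward voltage): running at 1 mA instead of 20 mA means R = (5 - 2.0) / 0.001 = 3000 Ω rather than 150 Ω.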