
Topic: How to calculate the value of resistor to protect LED and Arduino Pin

sudar

In most of the getting-started tutorials for Arduino (like this Blink example) it is recommended that we place a 220 Ohm resistor in series with the LED. I know that it is calculated using Ohm's Law, but I am interested to know how this particular value is arrived at.

Initially I thought it was to protect the LED, so I followed this guide from Evil Mad Scientist. The article explained that a typical red LED has a voltage drop of 1.8 V and a current of about 25 mA, and that an Arduino pin outputs 5 V. Using these values and the formula from the article, the calculation is

V = (power source) - (voltage drop) = 5 V - 1.8 V = 3.2 V
I = 25 mA

so R = V/I

R = 3.2 V / 0.025 A = 128 Ohms.

Which is not equal to 220 Ohms.

Then I came to know that we should also protect the microcontroller pin. I saw in the ATmega328P datasheet that the maximum DC current per I/O pin is 40 mA. So, using this value,

V = 5 V and I = 40 mA

R = V/I = 5/0.040 = 125 Ohms.

Which is also nowhere near 220 Ohms.
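
To double-check both calculations, here is a tiny C++ program I put together (the 1.8 V drop and 25 mA are just the typical red-LED figures from that article; a different LED will have different numbers):

Code:
#include <cstdio>

int main() {
    const double supply  = 5.0;    // Arduino pin output (V)
    const double vf      = 1.8;    // typical red LED forward drop (V)
    const double iLed    = 0.025;  // LED current from the article (A)
    const double iPinMax = 0.040;  // ATmega328P absolute max per pin (A)

    double rLed = (supply - vf) / iLed;  // resistor for the LED's rated current
    double rPin = supply / iPinMax;      // minimum resistor to protect the pin,
                                         // worst case (no LED drop at all)

    printf("R for 25 mA through the LED: %.0f Ohms\n", rLed);  // 128
    printf("R to stay under 40 mA/pin:   %.0f Ohms\n", rPin);  // 125
    return 0;
}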

I did some research on this, and the more I read, the more confused I got. Can someone help me understand how this value is calculated? Thanks for your time.

Latency

These are really good questions Sudar. I'm sure many people will consider them to be basic, but when you are just starting out (like me) it is confusing when people say, "Don't worry... 220 Ω will be fine." Ok, but how do you *know* that? :)

-LT

jackrae

Your first set of calculations is correct.  However, many people simply round up to a common value, which means a little less current flows through the LED.  In most cases this is irrelevant, unless you actually want to push the brightness to the maximum rated value.  Many of us would be satisfied with something like 470 Ω; the LED current would then only be around 7 mA, but that is probably bright enough to indicate that the output is active.

Remember that an LED is not a linear device, so any change in forward voltage produces a non-linear change in current flow.  Using a larger-than-calculated limiting resistor is a prudent measure.

But others will disagree - it is a public forum.
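
If you want to see how fast the current drops off with bigger resistors, here's a quick sketch (it assumes the forward drop stays fixed at 1.8 V, which, as I said above, it won't exactly):

Code:
#include <cstdio>

int main() {
    const double supply = 5.0;  // pin voltage (V)
    const double vf     = 1.8;  // assumed constant red-LED forward drop (V)

    // a few values people commonly reach for
    const int values[] = { 128, 220, 470, 1000 };

    for (int r : values) {
        double mA = (supply - vf) / r * 1000.0;
        printf("%4d Ohms -> %4.1f mA\n", r, mA);  // 25.0, 14.5, 6.8, 3.2
    }
    return 0;
}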

runaway_pancake

220 Ω is a "standard" value.

5 V / 220 Ω ≈ 23 mA, and that is the worst case, ignoring the LED's voltage drop entirely.

That 40 mA max is to be avoided.

The 20mA LED thing is kind of bogus.  Most LEDs are plenty bright with 10 mA and aren't that much brighter at 20 mA.

5 V - 1.8 V = 3.2 V
3.2 V / 10 mA = 320 Ω  --  the closest "standard" value is 330 Ω
3.2 V / 330 Ω = 9.7 mA
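
Same arithmetic in code, if that helps (1.8 V is the typical red-LED drop again):

Code:
#include <cstdio>

int main() {
    const double vAcross = 5.0 - 1.8;        // voltage across the resistor (V)

    double rIdeal = vAcross / 0.010;         // 10 mA target -> 320 Ohms
    double rStd   = 330.0;                   // closest "standard" value above it
    double mA     = vAcross / rStd * 1000.0; // what you actually get

    printf("%.0f Ohms ideal, use %.0f Ohms -> %.1f mA\n", rIdeal, rStd, mA);
    return 0;
}
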
"Who is like unto the beast? who is able to make war with him?"
When all else fails, check your wiring!

Latency

Thanks for the explanation guys. That makes sense to me - hopefully it clears things up for Sudar too.

-LT

michael_x

Quote
128 Ohms.
Which is not equal to 220 Ohms.

Sure it's not equal, but it's just right.
If you have a choice of { 100, 220, 470, 1000, 2200, 4700, 10000 }, then 100 is a bit risky, 1000 might still work, but 2200 or above is definitely too big.
Everything else is fine. There's no need to go to the next electronics shop for a 128 Ohm resistor or a 0.1% precision part.

Engineers are happy as long as the order of magnitude is right (something between 100 Ω and 1 kΩ).
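
And if you want the drawer-picking done for you, a little sketch (same list of values as above; it rounds up so the current never exceeds the target):

Code:
#include <cstdio>

int main() {
    // the values in your drawer, sorted ascending
    const double drawer[] = { 100, 220, 470, 1000, 2200, 4700, 10000 };

    double rCalculated = (5.0 - 1.8) / 0.025;  // 128 Ohms for 25 mA

    // take the first (smallest) value >= the calculated one,
    // so the actual current stays at or below the target
    for (double r : drawer) {
        if (r >= rCalculated) {
            printf("calculated %.0f Ohms -> use %.0f Ohms\n", rCalculated, r);
            break;  // prints: calculated 128 Ohms -> use 220 Ohms
        }
    }
    return 0;
}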
 

sudar

Thanks for the explanation, guys; that clears it up. So it is more about practical convenience than theoretical exactness.

Just to summarize (for the benefit of future readers): 220 Ohms is used instead of 128 Ohms (the calculated value) because

- 220 Ohms is a standard value and is easily available.
- 128 Ohms would drive the LED at its maximum rated current, which is not desirable since these components can have ±5 to 10% tolerance.
- The difference in brightness between 220 Ohms and 128 Ohms is not significant.

Once again, thanks everyone; your explanations made this easy for me to understand.
