How to calculate the value of a resistor to protect an LED and an Arduino pin

In most of the getting started tutorials for Arduino (like this Blink example) it is recommended that we place a 220 Ohm resistor in series with the LED. I know that it is calculated using Ohm's Law, but I am interested to know how this value is arrived at.

Initially I thought it was to protect the LED. I followed this guide from Evil Mad Scientist. The article explained that a typical red LED has a voltage drop of 1.8 V and a current of around 25 mA, and that an Arduino pin outputs 5 V. Using these values and the formula from the article, the calculation is

V = (supply voltage) - (LED voltage drop) = 5 V - 1.8 V = 3.2 V
I = 25 mA

so R = V/I

R = 3.2/0.025 = 128 Ohms.

Which is not equal to 220 Ohms.

Then I came to know that we should also protect the microcontroller pin. I saw in the ATmega328P datasheet that the absolute maximum DC current per I/O pin is 40 mA. So using this value

V = 5V and I = 40mA

R = V/I = 5/0.040 = 125 Ohm.

Which is also not close to 220 Ohms.
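In code, the two calculations I tried look like this (a quick Python sketch using the nominal values quoted above; real LEDs and supplies will vary):

```python
# Ohm's law sizing from the two calculations above.

def led_resistor(supply_v, led_forward_v, led_current_a):
    """Resistor that sets the LED current to led_current_a."""
    return (supply_v - led_forward_v) / led_current_a

# Red LED: 1.8 V forward drop, 25 mA target, 5 V output pin
r_led = led_resistor(5.0, 1.8, 0.025)   # 128 ohms

# Worst case for the pin: the resistor sees the full 5 V (e.g. LED shorted),
# against the ATmega328P absolute maximum of 40 mA per pin
r_pin_min = 5.0 / 0.040                 # 125 ohms

print(r_led, r_pin_min)
```

Both come out well under 220 Ohms, which is exactly what prompted the question.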

I did some research on it, but the more I read, the more confused I got. Can someone help me understand how this value is calculated? Thanks for your time.

These are really good questions, Sudar. I'm sure many people will consider them basic, but when you are just starting out (like me) it is confusing when people say, "Don't worry... 220 Ω will be fine." OK, but how do you know that? :)


Your first set of calculations is correct. However, many people simply round up to a common value, which means a little less current flows through the LED. In most cases this is irrelevant, unless you actually want to push the brightness to the maximum rated value. Many of us would be satisfied with something like 470 Ω; the LED current would then only be around 7 mA, but that is probably bright enough to indicate that the output is active.

Remember that an LED is not a linear device, so any change in forward voltage produces a non-linear change in current flow. Using a larger-than-calculated limiting resistor is a prudent measure.

But others will disagree - it is a public forum.

220 Ω is a “standard” value.

5 V / 220 Ω ≈ 23 mA, even with no LED drop at all.

That 40 mA max is to be avoided.

The 20mA LED thing is kind of bogus. Most LEDs are plenty bright with 10 mA and aren’t that much brighter at 20 mA.

5 V - 1.8 V = 3.2 V
3.2 V / 10 mA = 320 Ω. The closest “standard” value is 330 Ω.
3.2 V / 330 Ω = 9.7 mA
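If it helps, here is that 10 mA sizing step by step as a small Python sketch (the numbers are the ones quoted above, and the 330 Ω "nearest standard value" is taken as given):

```python
# Sizing for ~10 mA instead of the 20 mA maximum (values from the post above).
supply_v, led_v = 5.0, 1.8
target_i = 0.010                             # 10 mA is plenty for an indicator

r_exact = (supply_v - led_v) / target_i      # 320 ohms
r_standard = 330.0                           # closest common value above 320
i_actual = (supply_v - led_v) / r_standard   # ~9.7 mA

print(round(r_exact), round(i_actual * 1000, 1))
```

Rounding the resistor up rather than down means the actual current lands just below the target, which is the safe direction.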


Thanks for the explanation guys. That makes sense to me - hopefully it clears things up for Sudar too.


128 Ohms. Which is not equal to 220 Ohms.

Sure, it's not equal, but it's just right. If you have a choice of { 100, 220, 470, 1000, 2200, 4700, 10000 }, then 100 is a bit risky, 1000 might work, but 2200 or above is definitely too big. Everything else is fine. There's no need to go to the electronics shop for a 128 Ohm resistor or a 0.1% precision part.

Engineers are happy to get the magnitude right (something between 100 Ohms and 1k).
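You can check each of those readily available values yourself (a sketch using the same 5 V / 1.8 V red LED as above; the "risky"/"too dim" cutoffs here are my own rough guesses, not hard limits):

```python
# Trying each readily available resistor value against the same LED.
candidates = [100, 220, 470, 1000, 2200, 4700, 10000]
supply_v, led_v = 5.0, 1.8

verdicts = {}
for r in candidates:
    i_ma = (supply_v - led_v) / r * 1000
    if i_ma > 25:
        verdicts[r] = "risky"       # above the LED's rated current
    elif i_ma < 2:
        verdicts[r] = "too dim"     # probably not visibly lit
    else:
        verdicts[r] = "ok"
    print(f"{r:>6} ohm -> {i_ma:5.1f} mA  {verdicts[r]}")
```

Only the extremes fall out; everything from 220 up to about 1k gives a perfectly usable indicator current.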

Thanks for the explanation guys, that explains it. So it is more a matter of practical convenience than theoretical exactness.

Just to summarize (for the benefit of future readers) 220 Ohm is used instead of 128 Ohm (the calculated value) because

  • 220 Ohm is a standard value and is easily available.
  • 128 Ohm would drive the LED at its maximum rated current, which is not desirable since these components may have ±5 to 10% tolerance.
  • The difference in brightness is small when using 220 Ohm instead of 128 Ohm.

Once again, thanks everyone for the explanation. Your explanations made it easy for me to understand.


Hi, I am newbie and have some questions on choosing a resistor.

My understanding is that a 5 V source across a 220 Ω resistor would give a current of about 0.02 A:

I = V/R = 5 V / 220 Ω ≈ 0.02 A

A red LED with a forward voltage of 1.7 V in a circuit carrying 0.02 A would require a 165 Ω resistor:

5 V - 1.7 V = 3.3 V

R = V/I = 3.3 V / 0.02 A = 165 Ω

Therefore you use a 220 Ω as it is a 'close match'.

Can anyone confirm if this is correct?
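Here is the same arithmetic written out as a quick Python check (the 0.02 A figure comes from rounding 5 V / 220 Ω; the exact values are below):

```python
# Checking the question's arithmetic (5 V supply, 1.7 V red LED, 220 ohms).
i_no_led = 5.0 / 220                 # ~22.7 mA if the resistor saw all 5 V
r_for_20ma = (5.0 - 1.7) / 0.020     # 165 ohms for exactly 20 mA
i_with_220 = (5.0 - 1.7) / 220       # ~15 mA actually flowing with 220 ohms

print(round(i_no_led * 1000, 1), round(r_for_20ma), round(i_with_220 * 1000))
```

Note the last line: once the LED's 1.7 V drop is in the circuit, 220 Ω passes about 15 mA, not 20 mA.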


The actual point is that there is simply no reason to try to deliver the "whole" 20 mA. The difference between 20 mA and 15 mA with a 1.7 V drop across the LED will not be visible, and you are just being conservative with your ratings. :grinning:

An interesting point, by the way: if you look at the ATmega328 datasheet, Figures 35-22 and 35-24, you will note that the output loses about half a volt either way when sourcing or sinking 20 mA, so it effectively adds 25 Ohms to the circuit.
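That pin effect is easy to fold into the calculation (a sketch; the ~0.5 V at 20 mA figure is taken from the datasheet discussion above, and actual pin behavior varies with load):

```python
# Including the pin's effective internal resistance in the current estimate.
v_supply, v_led = 5.0, 1.7
r_external = 220.0
r_pin = 0.5 / 0.020                  # ~25 ohms effective series resistance

i_ideal = (v_supply - v_led) / r_external             # ~15.0 mA
i_actual = (v_supply - v_led) / (r_external + r_pin)  # ~13.5 mA

print(round(i_ideal * 1000, 1), round(i_actual * 1000, 1))
```

So the real current is a little lower than the simple Ohm's law figure, which again errs on the safe side.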


I have been trying to understand Project 3 in the starter kit.

In this there are three resistors in parallel. Using three 220 Ω resistors as stated in the guide did not work. So I used three 560 Ω resistors instead and it works.

Is anyone able to confirm if my understanding and calculations are correct (I have been going through a physics book to try and work it out)?

Using three 220 Ω resistors, I have worked out the equivalent resistance to be 73.3 Ω:


(5 V - 1.7 V) / 73.3 Ω = 45 mA

Therefore, the current is too great for the LEDs (the Arduino starter kit book states the max is 23 mA per LED).

Using three 560 Ω resistors, I work out an equivalent resistance of 187 Ω:


(5 V - 1.7 V) / 187 Ω = 17.7 mA, which is OK for the LEDs, and the project worked as intended.
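For what it's worth, here is the equivalent-resistance arithmetic above as a Python sketch, assuming (as I did) that the three LED/resistor pairs really were in parallel on one 5 V source:

```python
# Verifying the parallel-equivalent arithmetic from the post above.

def parallel(*resistors):
    """Equivalent resistance of resistors in parallel."""
    return 1 / sum(1 / r for r in resistors)

r_220 = parallel(220, 220, 220)      # ~73.3 ohms
r_560 = parallel(560, 560, 560)      # ~186.7 ohms

i_220 = (5.0 - 1.7) / r_220          # ~45 mA
i_560 = (5.0 - 1.7) / r_560          # ~17.7 mA

print(round(r_220, 1), round(i_220 * 1000))
print(round(r_560, 1), round(i_560 * 1000, 1))
```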

Any advice would be appreciated!


Using three 220 Ω resistors as stated in the guide did not work.

That should have worked - I'm going to guess you are reading the resistor values wrong. Do you have a multimeter* to measure the resistance?

Those resistors/LEDs are not exactly in parallel. Each LED-resistor pair is connected to a different output pin. With about 2 V across the LED and 3 V across the resistor, you'd have about 14 mA (3 V / 220 Ohms) coming out of each pin and through each LED-resistor pair, and that's OK.
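Sketched out, the per-pin picture looks like this (assuming the ~2 V LED drop mentioned above; each pair is sized independently, with no equivalent-resistance step):

```python
# Per-pin sizing: each output pin drives its own LED + resistor pair.
v_pin, v_led = 5.0, 2.0

i_220 = (v_pin - v_led) / 220        # ~13.6 mA per pin
i_560 = (v_pin - v_led) / 560        # ~5.4 mA per pin

print(round(i_220 * 1000, 1), round(i_560 * 1000, 1))
```

Either value keeps every pin comfortably under the 40 mA absolute maximum.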

  • If this is going to be your hobby, I strongly recommend getting a meter. A cheap meter is better than no meter, and it's the one essential piece of test equipment. (You'll also want a soldering iron and a few other electronic hand tools like diagonal cutters, needle-nose pliers, etc.)

And a logic analyzer.

Thanks for the replies. I only got the starter kit last week and haven't got a multimeter, but I'd already decided to order one - will do that ASAP and look into logic analyzers as well.

But until then I'll set it all up again and double check I was using the correct resistors. I've looked again at the schematic in the book and see what you mean about the resistors/LEDs not being exactly in parallel. So to be clear, each resistor/LED pair can be calculated separately (i.e. I didn't need to try and calculate equivalent resistance as I did in my previous post)?

Also, could you confirm that I would have followed the correct procedure if the LEDs were in parallel?

Thanks again!

sudar: In most of the getting started tutorials for Arduino (like this Blink example) it is recommended that we place a 220 Ohm resistor. I know that it is calculated using Ohm's Law, but I am interested to know how this value is calculated?

I think it's just chosen because it's large enough for the output to be driven to ground safely (i.e. even without an LED), and it's an E3 resistor value. It's not critical, and most modern LEDs are blindingly bright at 25 mA; a 1k resistor is actually more sensible.

E3 values go 10/22/47/100/220/470/1k/2k2.... For digital electronics you have no need for anything more exact!
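Picking the next E3 value above a calculated resistance can be sketched like this (E3 base values 10/22/47 repeated each decade, as above; note the finer E12/E24 series would offer in-between values like 330):

```python
# Next E3 preferred value at or above a computed minimum resistance.

def next_e3(r_min):
    values = sorted(b * 10 ** d for b in (10, 22, 47) for d in range(6))
    return next(v for v in values if v >= r_min)

print(next_e3(128))   # 220
print(next_e3(320))   # 470
```

So the 128 Ω answer from the original question lands on 220 Ω purely by rounding up within the coarsest standard series.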

I concur with MarkT's statements. Also, the perceived brightness of an LED is not linearly related to the current driving it. At 50% of the rated current (i.e. 10 mA instead of 20 mA) they look about 75-80% as bright as they do at 20 mA. Running them gently like this will also extend their lifetime.