LEDs in series, resistor and voltage drops

Hi!

I'm a programmer with no experience whatsoever in electronics, so I thought... well, what could be wrong with learning electronics with Arduino?

So far everything is fine; I've also made some simple and cool projects, learning both practice and theory. I bought a multimeter to do some tests.

Recently I experimented with the difference between LEDs in series and LEDs in parallel; I could get two LEDs in series to light, but not three. I cannot understand why, though.

From my understanding, the Arduino 5 V pin I'm using has a 200 mA limit. Well, if we count that the average red LED uses 20 mA, one could power more than three LEDs without any problem.
Obviously, I use a resistor, specifically a 220 Ω one, so almost 22 mA should get to a single LED (?).

The problem is that, in simulations and in reality, this kind of circuit works fine (both LEDs turn on):

But this one, with three LEDs, does not:

My first thought was about the voltage drop each LED has, so I used my multimeter to test (on the circuit which turns on, with two LEDs).
It seems that each LED has a voltage drop of ~1.95 V when tested with the multimeter. That's a lot.
What I noticed is that the side of the resistor that receives 5 V actually shows (almost) 5 V on the multimeter, but the other side shows 3.9 V! Almost 1 V was dropped across the resistor, it seems.

Is this supposed to happen? Isn't 1 V too much for a 220 Ω resistor? Initially, I thought that resistors only reduced current, but I read that they also have a voltage drop (in some way). Am I right?

So, testing with the multimeter, it seems that the circuit receives 5 V, which becomes 3.9 V after the resistor. The first LED receives that 3.9 V, and then the next one receives almost 2 V because of the first LED's voltage drop.

I am doing all these measurements by connecting the multimeter's ground lead to Arduino's GND and the mAVΩ lead to the resistor or the LED's anode (depending on what I want to measure). Is that how it's supposed to be done?

So, what actually prevents me from having 3 LEDs in series? I think what happens is that the second LED, which receives 2 V and has a 1.95 V drop of its own, can't pass enough voltage on to the third. If I'm right, the limit of LEDs I can have in series is probably two: not because of current, but because of voltage.

I know these questions have probably already been answered on the forum, but searching today and yesterday I didn't find any clear explanation. I also had to ask about the multimeter, since I'm not sure I am using it correctly.

Thanks for the help and sorry for the long thread.

Place the LEDs in parallel ‘with’ each LED having its own 560Ω resistor.

larryd:
Place the LEDs in parallel 'with' each LED having its own 560Ω resistor.

Thanks.
My question is more regarding the theoretical and "learning" aspect of things, mostly to understand how resistors and voltage drops work. Obviously, when creating something serious and useful, I'd place LEDs in parallel.

Let’s assume each LED has a voltage drop of 2 V.

If you have two LEDs in series, this leaves 5 − 4 = 1 V for the resistor.

If you have three LEDs in series, that’s 6 V of drops, but you only have a 5 V supply :cry:
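The voltage-budget arithmetic above can be sketched in a few lines of Python (assuming a nominal 2.0 V drop per red LED, close to the ~1.95 V measured earlier, and the 220 Ω resistor from the original circuit):

```python
# Voltage budget for N red LEDs in series on a 5 V supply.
# Assumes a nominal 2.0 V forward drop per LED and a 220 ohm resistor.
SUPPLY_V = 5.0
LED_DROP_V = 2.0
RESISTOR_OHMS = 220.0

def series_led_current_ma(n_leds):
    """Return the LED current in mA, or None if the combined forward
    drops exceed the supply (the LEDs won't light)."""
    v_left_for_resistor = SUPPLY_V - n_leds * LED_DROP_V
    if v_left_for_resistor <= 0:
        return None
    return 1000.0 * v_left_for_resistor / RESISTOR_OHMS

print(series_led_current_ma(1))  # ~13.6 mA
print(series_led_current_ma(2))  # ~4.5 mA
print(series_led_current_ma(3))  # None: 6 V of drops > 5 V supply
```

This also shows why the two-LED circuit is dimmer than a single LED: only 1 V is left for the resistor, so the current is about 4.5 mA rather than 13.6 mA.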

Okay, understood! Thanks. So it was like I thought, a problem of voltage.

May I ask: is it normal that a resistor reduces voltage? Before the resistor I have 5 V, after I have ~4 V. Isn't it supposed to reduce only the current (by Ohm's Law)?

Measure the voltage across the resistor, what do you read ?

larryd:
Measure the voltage across the resistor, what do you read ?

Almost 1 V, so I guess that's right.
Thanks, didn't know I could do this.

Two things you need to learn are Ohm's law and Kirchhoff's circuit laws. Usually I post links to them but I am using my phone so I don't have them to hand. Search on Wikipedia. Those laws explain what you have found, they are fundamental to electronics.
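Kirchhoff's voltage law is exactly what the measurements in this thread are showing: the drops around a closed loop add up to the supply voltage. A quick check with the numbers measured above (~1 V across the resistor, ~1.95 V per LED in the two-LED circuit):

```python
# Kirchhoff's voltage law (KVL): the drops around a closed loop
# sum to the supply voltage. Values are the measurements from
# this thread's two-LED circuit.
supply = 5.0
drops = [1.0, 1.95, 1.95]  # resistor, LED 1, LED 2
print(sum(drops))          # 4.9, within meter tolerance of 5 V
```

The ~0.1 V discrepancy is well within the tolerance of a hobby multimeter and the resistor itself.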

May I ask: is it normal that a resistor reduces voltage? Before the resistor I have 5 V, after I have ~4 V. Isn't it supposed to reduce only the current (by Ohm's Law)?

You are correct that resistance is "the resistance to current flow", i.e. higher resistance means less current (at the same voltage).

The same current flows through all series components.

Two (or more) resistors in series make a voltage divider, where the voltage across each resistor is proportional to its resistance. For example, two 500 Ω resistors in series make 1 kΩ, so if you apply 5 V you get 5 mA. Applying Ohm's Law to each resistor, 5 mA × 500 Ω is 2.5 V across each resistor.
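The divider example above, worked through in Python:

```python
# Voltage divider: two resistors in series carry the same current,
# so the supply voltage splits in proportion to resistance.
R1 = 500.0   # ohms
R2 = 500.0   # ohms
V_IN = 5.0   # volts

current_a = V_IN / (R1 + R2)   # 5 V / 1 kOhm = 0.005 A = 5 mA
v_across_r1 = current_a * R1   # Ohm's law applied to R1
v_across_r2 = current_a * R2   # Ohm's law applied to R2

print(current_a * 1000)  # 5.0 (mA)
print(v_across_r1)       # 2.5 (V)
print(v_across_r2)       # 2.5 (V)
```

The LED circuit in this thread is the same idea, except one "resistor" (the LED) holds its voltage nearly constant instead of scaling with current.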

LEDs are non-linear (like all diodes). That means the effective resistance changes as the voltage changes: the resistance of an LED goes down as the voltage goes up. That makes it tricky (and usually not that useful) to apply Ohm's Law directly to an LED, although Ohm's Law still holds at any instant (it's a law of nature with man-made units of measure).
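A sketch of that non-linearity, using the standard Shockley diode equation with made-up illustrative parameters (not taken from any datasheet): current grows exponentially with voltage, so the effective resistance V/I collapses as V rises.

```python
import math

# Illustrative Shockley diode model: I = Is * (exp(V / (n*Vt)) - 1).
# Both parameters below are invented for illustration only.
I_S = 1e-18   # saturation current in amps (illustrative)
N_VT = 0.052  # ideality factor x thermal voltage in volts (illustrative)

def led_current(v):
    """Diode current in amps at forward voltage v."""
    return I_S * (math.exp(v / N_VT) - 1.0)

for v in (1.8, 1.9, 2.0):
    i = led_current(v)
    # A 0.2 V increase multiplies the current ~50x, and the
    # effective resistance V/I drops accordingly.
    print(f"V={v} V  I={i * 1000:.2f} mA  V/I={v / i:.0f} ohm")
```

This is why a series resistor is needed: a small change in applied voltage produces a huge change in LED current, and the resistor takes up the slack.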

...In most circuits, the voltage is constant or "controlled" and the current depends on resistance/impedance. For example, a 12 V, 5 A power supply puts out an (approximately) constant 12 V, and 5 A is its maximum current capability. If nothing is connected, no current flows. If you connect a regular little LED and a current-limiting resistor, you'll get a few milliamps. If you connect something that "pulls" more than 5 A, the power supply might burn out.

This topic was automatically closed 120 days after the last reply. New replies are no longer allowed.