I've been learning a lot about electronics since I bought an Arduino last month, but this issue puzzles me.
Yesterday I received a bag of 5mm LEDs. They seemed to work just fine in series, but when I add two in parallel only one lights up. The same thing happens when using an external power source (a 9V battery) and/or changing the current.
I've been playing around with a handful of 3mm LEDs that came with the Arduino kit, and they work fine in both parallel and series. I sense I'm missing some fundamental knowledge here?
Try to get a 3 volt power supply (no Arduino, no extra stuff) and try your parallel LEDs again. If one lights and the other doesn't, reverse the polarity of the LEDs. If the second now lights and the first doesn't, then one of the LEDs is backwards.
The problem with wiring LEDs in parallel with just one series current-limiting resistor is that if they have even a slight difference in Vf (forward voltage drop, which they will), then one will turn on before the other, robbing all or most of the current. Because you can't guarantee equal current sharing between the LEDs, you should not wire them in parallel unless each has its own series current-limiting resistor.
And if you are playing around with wiring your LEDs to a 9 VDC battery or to an Arduino output pin without using current-limiting resistors (you didn't state one way or the other), you are in danger of damaging the LEDs and/or the Arduino output pins.
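For example (Vf and current assumed just for illustration): LEDs with Vf around 2 V run at .020 amps from your 9V battery would each want their own R = (9 - 2) / .020 = 350 ohms in their parallel branch, so every LED is forced to carry about the same current despite small Vf differences.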
when I add two in parallel only one lights up.
They may have dissimilar forward voltage drops: the LED with the lowest Vf lights up first and takes nearly all the current. The LED with the higher Vf only starts to light as the voltage across the first LED increases sufficiently.
Hi guys, sorry for the late reply, we had to go abroad for a while. I appreciate your responses.
It makes sense that dissimilar voltage drops could explain the issue, but I wonder why that is not the case with my 3mm LEDs. I'm always using resistors by the way.
Anyhow, I tried wiring the 5mm LEDs in series, connecting them to the 9V battery and adding a multimeter between the setup and the positive pole of the battery, but a strange thing happens. With every added LED they all light more dimly, which is to be expected (if I understand the theory correctly), as they can only draw the current allowed through by the (220 ohm) resistor. But at the same time the multimeter shows a lower number with each added LED.
1st LED: 27.7 mA
2nd LED: 13.6 mA
3rd LED: 6.2 mA
4th LED: 0.04 mA
Shouldn't the current drawn remain at a steady 9/220 = 40.9 mA? Isn't the multimeter telling me that the setup is drawing less current for every added LED? I really feel there's something this social science graduate doesn't get.
Shouldn't the current drawn remain at a steady 9/220 = 40.9 mA? Isn't the multimeter telling me that the setup is drawing less current for every added LED?
No, it should not remain a steady current value as you add more series-wired LEDs. That is because the formula you used (9/220 = 40.9 mA) is not correct for this situation: it ignores the voltage dropped by the LEDs themselves. Rather, you have to calculate the resistance required for the current you want to drive the LEDs at, knowing beforehand how many LEDs you are going to wire in series and what their Vf rating is. So assuming you want to drive four red LEDs in series, each with a Vf rating of 1.5 VDC, and you want to run them at .020 amps, the formula is R = (9V - (Vf1+Vf2+Vf3+Vf4)) / .020 = 150 ohms. Whereas driving just a single LED at .020 amps from 9 VDC gives R = (9 - 1.5) / .020 = 375 ohms.
So each added series LED requires a different resistance to maintain the same desired operating current.
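If it helps, here is the same calculation as a throwaway C++ snippet (the function name and the Vf/current values are just the example numbers from above, nothing standard):

```cpp
#include <cstdio>

// Series resistor for n identical LEDs from one supply:
// R = (Vsupply - n * Vf) / I
double seriesResistor(double vSupply, double vf, int nLeds, double amps) {
    return (vSupply - nLeds * vf) / amps;
}

int main() {
    // Red LEDs with Vf = 1.5 V, driven at 20 mA from a 9 V battery
    printf("1 LED : %.0f ohms\n", seriesResistor(9.0, 1.5, 1, 0.020)); // 375
    printf("4 LEDs: %.0f ohms\n", seriesResistor(9.0, 1.5, 4, 0.020)); // 150
    return 0;
}
```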
Aha, that makes perfect sense. I think I'm getting it now, and I see why Ohm's law is so important.
I ran one blue 5mm LED with a 330 ohm resistor. The multimeter measured 17 mA, so I calculated its Vf rating to be 3.4 V. I then calculated that two similar LEDs would require (9-(3.4*2))/.02 = 110 ohms. A 100 ohm resistor showed around 22 mA, as it should.
Here I initially wrote a follow-up question, but I think I came up with an answer while formulating it. The multimeter shows 22 mA rather than 2*22 because in series the same current flows through both LEDs; the circuit "draws" a higher voltage, not a higher current(?) Thus the current reading tells me the current through each LED in the chain, not some total?
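To double-check the arithmetic, here is the same math as a quick C++ snippet (the variable names are just mine, the values are from my measurements):

```cpp
#include <cstdio>

int main() {
    // One blue LED on a 330 ohm resistor measured 17 mA, so:
    double vf = 9.0 - 0.017 * 330.0;      // Vf = 9 - 5.61 = 3.39 V (about 3.4 V)
    printf("Vf = %.2f V\n", vf);

    // Two such LEDs in series at 20 mA would want:
    double r = (9.0 - 2.0 * vf) / 0.020;  // about 111 ohms, so 100 is close
    printf("R  = %.0f ohms\n", r);

    // Expected current with the 100 ohm resistor actually used:
    double i = (9.0 - 2.0 * vf) / 100.0;  // about 22 mA, matching the reading
    printf("I  = %.1f mA\n", i * 1000.0);
    return 0;
}
```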
Have a quick google for 'Ohms law parallel series circuits'
In a series circuit the current through each component is equal and the voltage across each is different. All voltage drops add up to the total voltage provided by the power supply.
In a parallel circuit the voltage across each component is the same and the current through each is different. All branch currents add up to the total current from the power supply.
Combinations of series/parallel circuits follow the same rules, but the voltage drops and currents are calculated in each section according to Ohm's law. By breaking the circuit up into series and parallel segments you can simplify and calculate all the values accordingly.
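As a rough illustration (resistor values picked arbitrarily), the two rules come down to this kind of arithmetic:

```cpp
#include <cstdio>

int main() {
    // Series: same current through both, voltage drops add up.
    double i = 0.020, r1 = 220.0, r2 = 330.0;
    printf("Series voltage: %.1f V\n", i * r1 + i * r2);  // 4.4 + 6.6 = 11.0 V

    // Parallel: same voltage across both, branch currents add up.
    double v = 9.0;
    printf("Parallel current: %.1f mA\n",
           (v / r1 + v / r2) * 1000.0);  // 40.9 + 27.3 = 68.2 mA
    return 0;
}
```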
Shouldn't the current drawn remain at a steady 9/220 = 40.9 mA?
That's correct only if the voltage drop across the 220 ohm resistor is the full 9 V. When powered by a 9 V source, the resistor shares the voltage drop with the LED string: the more LEDs you put in there, the less voltage is dropped over the resistor, and so the lower the current through the whole thing.
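To put rough numbers on it (assuming each of your LEDs drops about 2.9 V at these currents): one LED gives (9 - 2.9) / 220 = 27.7 mA, two give (9 - 5.8) / 220 = 14.5 mA, and three give (9 - 8.7) / 220 = 1.4 mA. Real Vf also falls a bit at lower currents, which is why your measured values don't drop quite that fast.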
I did a search on the phrase Tack mentioned, watched a really good video on YouTube, and thought a lot about this subject and your feedback. I really get it now! It's not an easy subject to grasp, at least not for someone with no technical background, but it really is obvious once it clicks. It's beautifully logical, and it's a great feeling to be able to predict the multimeter readings before I connect the meter to a project.
Thank you all for putting up with my beginner questions! You truly helped me get much further in the world of electronics.