
Topic: Two LED in parallel

HomerS

#5
Dec 16, 2012, 01:58 am Last Edit: Dec 16, 2012, 02:11 am by HomerS Reason: 1
Hi guys, sorry for the late reply, we had to go abroad for a while. I appreciate your responses.

It makes sense that dissimilar voltage drops could explain the issue, but I wonder why that is not the case with my 3mm LEDs. I'm always using resistors by the way.

Anyhow, I tried wiring the 5mm LEDs in series, connecting them to the 9v battery and adding a multimeter between the setup and the positive pole of the battery, but a strange thing happens. With every added LED they all light more weakly, which is to be expected (if I understand the theory correctly) since they can only draw the current allowed through by the (220 ohm) resistor. But at the same time the multimeter shows a lower reading with each added LED.
1st LED: 27.7 mA
2nd LED: 13.6 mA
3rd LED: 6.2 mA
4th LED: 0.04 mA

Shouldn't the current drawn remain at a steady 9/220 = 40.9mA? Isn't the multimeter telling me that the setup is drawing less amps for every added LED? I really feel there's something this social science graduate doesn't get.

retrolefty

#6
Dec 16, 2012, 02:25 am Last Edit: Dec 16, 2012, 02:27 am by retrolefty Reason: 1
Quote
Shouldn't the current drawn remain at a steady 9/220 = 40.9mA? Isn't the multimeter telling me that the setup is drawing less amps for every added LED? I really feel there's something this social science graduate doesn't get.


No, it should not remain a steady current value as you add more series-wired LEDs. That is because the formula you used (9/220 = 40.9mA) is not correct for this situation. Rather, you have to calculate the resistance required for the current you want to drive the LEDs at, and to do that you need to know beforehand how many LEDs you are wiring in series and what their Vf rating is. So assuming you want to drive four red LEDs in series, each with a Vf rating of 1.5vdc, and you want to run them at .020 amps, the formula is R = (9v - (Vf1+Vf2+Vf3+Vf4)) / .020 = 150 ohms. Whereas driving just a single LED at .020 amps from 9vdc is R = (9 - 1.5) / .020 = 375 ohms.

So each added series led requires a different size resistance to maintain the same operating current desired.
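Lefty's formula can be sketched as a small calculation, with a quick sanity check that the LED drops don't exceed the supply. The function name is my own, not anything from the thread:

```python
# Size the series resistor for a chain of LEDs:
# R = (Vsupply - sum of forward drops) / desired current.

def series_resistor(supply_v, vf_list, current_a):
    drop = sum(vf_list)
    if drop >= supply_v:
        raise ValueError("not enough headroom: LED drops exceed the supply")
    return (supply_v - drop) / current_a

# Four red LEDs (Vf = 1.5 V each) at 20 mA from 9 V:
print(series_resistor(9, [1.5] * 4, 0.020))  # -> 150.0 ohms
# A single LED under the same conditions:
print(series_resistor(9, [1.5], 0.020))      # -> 375.0 ohms
```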

Lefty

HomerS

Aha, that makes perfect sense, I think I'm getting it now and why Ohm's law is so important.
I ran one blue 5mm LED with a 330 ohm resistor. The multimeter measured 17 mA, so I calculated its Vf rating to be 3.4 V. I then calculated that two similar LEDs would require (9 - (3.4*2))/.02 = 110 ohms. A 100 ohm resistor showed around 22mA, as it should.

Here I initially wrote a follow-up question, but I think I came up with the answer while formulating it. The multimeter shows 22mA rather than 2*22 because the circuit is "drawing" a higher voltage, not a higher current(?) Thus the current reading itself is the useful indicator of what each LED in the chain is consuming?

tack

Have a quick google for 'Ohms law parallel series circuits'

In a Series circuit the current through each component is equal and the voltage across each is different. All volt drops add up to the total voltage provided by the power supply.

In a Parallel circuit the voltage across each component is the same and the current through each is different. All currents add up to the total current from the power supply.

Combinations of Series/Parallel circuits follow the same rules, but the volt drops and currents are calculated in each section according to Ohms Law. By breaking the circuit up into series and parallel segments you can simplify it and calculate all the values accordingly.
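The two rules above can be checked numerically. A minimal sketch, assuming two plain resistors on a 9 V supply (values picked for illustration only):

```python
SUPPLY = 9.0
R1, R2 = 220.0, 330.0

# Series: one current flows through both; the drops add up to the supply.
i_series = SUPPLY / (R1 + R2)
drops = [i_series * R1, i_series * R2]
assert abs(sum(drops) - SUPPLY) < 1e-9

# Parallel: both see the full supply; the branch currents add up.
i1, i2 = SUPPLY / R1, SUPPLY / R2
i_total = SUPPLY / (R1 * R2 / (R1 + R2))  # via equivalent resistance
assert abs((i1 + i2) - i_total) < 1e-9

print(f"series: {i_series*1000:.2f} mA, drops {drops[0]:.2f} V + {drops[1]:.2f} V")
print(f"parallel: {i1*1000:.2f} mA + {i2*1000:.2f} mA = {i_total*1000:.2f} mA")
```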

dhenry

Quote
Shouldn't the current drawn remain at a steady 9/220 = 40.9mA?


That's correct only if the voltage drop across the 220ohm resistor is 9v. When powered by a 9v source, the resistor shares its voltage drop with the LED string: the more LEDs you put in there, the less voltage drop across the resistor -> lower current through the whole thing.
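This effect can be modeled roughly. Assuming a constant Vf of about 3.0 V per blue LED (my guess, not a measured value from the thread), each added LED takes another forward-voltage bite out of the 9 V, leaving less across the fixed 220 ohm resistor:

```python
SUPPLY, R, VF = 9.0, 220.0, 3.0  # 9 V battery, 220 ohm resistor, assumed Vf

# Current left over after n forward drops, clamped at zero, in mA:
currents = [max(0.0, (SUPPLY - n * VF) / R) * 1000 for n in range(1, 5)]

for n, i_ma in enumerate(currents, start=1):
    print(f"{n} LED(s): {i_ma:.1f} mA")
```

The model gives about 27.3 mA and 13.6 mA for one and two LEDs, close to the 27.7 and 13.6 mA measured above. It predicts zero from three LEDs on, while the real readings only approach zero, because an LED's actual Vf sags at low current rather than staying constant.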
