Current in series and parallel LEDs.

Sorry for the simple question, but there are a few small things I don't completely understand.

First scenario: I have a 12V power supply and 3 LEDs mounted in series, each rated at 3-3.2V and drawing 20mA. I also add a 150 Ohm resistor to limit the current. The question is: how much current will each LED get? How about the whole circuit? And what about the voltage across each LED and across the circuit? And most important of all, why?

Second scenario: I have a 3.7/4.0V power supply and 2 LEDs mounted in parallel (the LEDs have the same specs as the ones above: 3-3.2V, 20mA). What resistor value must I use to keep the LEDs safe and bring the voltage down to 3-3.2V? How much current will the whole circuit draw? And a single LED? What about the voltage of the circuit and of a single LED? And why?

As I wrote, I also need to understand why (so some theory and, most importantly, the logic behind it), so that I can learn it once and then apply that knowledge to my future circuits.

Thanks to all!

Have a look at these articles first.

http://www.evilmadscientist.com/2012/resistors-for-leds/

http://led.linear1.org/why-do-i-need-a-resistor-with-an-led/

https://www.sparkfun.com/tutorials/219

Thanks for the great articles! But they only talk about LED current-limiting resistors. What about the other questions?

The parallel LEDs both can draw 20mA, so 40mA together. The voltage dropped across both of them would be 3V. So with a 4V power supply you have an 'extra' volt to drop across a resistor in series with the LEDs.

Using Ohm's law: 1V/40mA = 25ohm

A single LED will have the same voltage, but only the 20mA, so you want to limit the current to 20mA instead of 40mA.

Ohm's law: 1V/20mA = 50ohm
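For reference, the same arithmetic as a tiny sketch (a rough illustration only, assuming the nominal 3V forward drop and a single shared resistor):

supply_v = 4.0
led_vf = 3.0                       # nominal forward drop of the parallel LEDs
v_resistor = supply_v - led_vf     # the 'extra' 1V the resistor must drop

def resistor_for(total_current_a):
    # Ohm's law: R = V / I
    return v_resistor / total_current_a

print(resistor_for(0.040))   # two LEDs sharing one resistor -> 25.0 ohm
print(resistor_for(0.020))   # limiting to a single LED's 20mA -> 50.0 ohm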

In any series circuit exactly the same current runs through every component so the resistor and each LED will all see roughly 20mA.

The LEDs set their own voltage, as those links explain, so that will be around 3V each, leaving about 3V across the resistor (which is presumably why you chose 150 Ohms, to give you 20mA).
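A quick sketch of those series numbers (nominal 3V per LED assumed; the real drop will be somewhere in the 3-3.2V range):

supply_v = 12.0
led_vf = 3.0            # nominal forward drop per LED
num_leds = 3
resistor_ohms = 150.0

# Whatever voltage the LEDs don't drop must appear across the resistor.
v_resistor = supply_v - num_leds * led_vf     # 12 - 9 = 3V
current_a = v_resistor / resistor_ohms        # 3V / 150 ohm = 0.02A

print(f"{v_resistor:.1f} V across the resistor")
print(f"{current_a * 1000:.0f} mA through the resistor and through every LED")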

Steve

evanmars:
The parallel LEDs both can draw 20mA, so 40mA together. The voltage dropped across both of them would be 3V. So with a 4V power supply you have an 'extra' volt to drop across a resistor in series with the LEDs.

Using Ohm's law: 1V/40mA = 25ohm

A single LED will have the same voltage, but only the 20mA, so you want to limit the current to 20mA instead of 40mA.

Ohm's law: 1V/20mA = 50ohm

This will only work if the voltage drop across both diodes is the same. This is almost never the case. Using two LEDs from the same manufacturer and the same colour may work, but using e.g. a red and a green one will not. Parallel LEDs should each have their own resistor.
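To see how strongly a small forward-voltage mismatch skews the split, here is a rough sketch using an idealized diode model; the 50mV slope and 100mV mismatch are assumptions for illustration, not datasheet values:

import math

# Parallel LEDs share one terminal voltage, and LED current grows roughly
# exponentially with that voltage, so the part with the lower forward
# voltage hogs the current.
n_vt = 0.050        # assumed ideality factor * thermal voltage, in volts
delta_vf = 0.100    # assumed forward-voltage mismatch between the two parts

ratio = math.exp(delta_vf / n_vt)     # current ratio between the two LEDs
total_ma = 40.0                       # total current set by the shared resistor
i_low_vf = total_ma * ratio / (1.0 + ratio)
i_high_vf = total_ma - i_low_vf

print(f"Current ratio ~ {ratio:.1f}:1")
print(f"Lower-Vf LED: {i_low_vf:.1f} mA, higher-Vf LED: {i_high_vf:.1f} mA")

With those assumed numbers one LED ends up carrying roughly 35mA and the other roughly 5mA, which is why a resistor per LED is the safer design.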

Second scenario: I have a 3.7/4.0V power supply and 2 LEDs mounted in parallel (the LEDs have the same specs as the ones above: 3-3.2V, 20mA).

This is a "bad design".

In parallel, the voltage drop across both LEDs will be (about) the same as one LED, which means the voltage across the resistor is the same as with one LED, and the TOTAL current is the same as with one LED, with that current split between the two LEDs.

The problem is... Since LEDs are non-linear (their resistance changes with voltage) and there are part-to-part variations we can't be sure the current will split equally. With resistors (which are of course linear) a 1% difference in resistance would result in a 1% difference in current. With LEDs, the difference can be "magnified" and you might see a difference in brightness.

You might get away with it, but for a proper design each LED should have its own resistor.

Eternyt:
Thanks for the great articles! But they only talk about LED current-limiting resistors. What about the other questions?

What other question? You mean the second scenario, LEDs in parallel? It was clearly covered in the first article. Did you actually read it? They even had a neat diagram.

olf2012:
Parallel LEDs should each have their own resistor.

DVDdoug:
each LED should have its own resistor.

That's exactly what those articles said. OP should have taken the time to read them.

DVDdoug:
The problem is... Since LEDs are non-linear (their resistance changes with voltage) and there are part-to-part variations we can't be sure the current will split equally.

Visually explained here using Falstad's circuit simulator, so OP can see what happens.

I completely understand the limiting-resistor part now. Thanks to all, the visual demonstration was great!

And perhaps also the parallel one: so I will pick a resistor for each LED (4V - 3.2V = 0.8V -> R = 0.8V/0.020A = 40 Ohm), and then each LED will get 3.2V and 20mA. And so the entire circuit draws 40mA at 4V.
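A short sketch of that per-LED-resistor arithmetic (3.2V nominal drop assumed):

supply_v = 4.0
led_vf = 3.2
led_current_a = 0.020
num_leds = 2

# One resistor per LED: each resistor only drops the leftover voltage.
r_per_led = (supply_v - led_vf) / led_current_a   # 0.8V / 0.020A = 40 ohm

# Parallel branch currents add up at the supply.
total_current_a = num_leds * led_current_a        # 40 mA

print(f"Resistor per LED: {r_per_led:.0f} ohm")
print(f"Total supply current: {total_current_a * 1000:.0f} mA at {supply_v:.1f} V")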

But in the series one, with a 12V power source, 3 LEDs and a 150 Ohm resistor, each LED will get about 3V, but how much current will a single LED get? If it is 20mA for the first, what will the second get? And the third?

Thanks

If it is 20mA for the first, what will the second get? And the third?

They all get 20 mA.

In a series connection, the current is the same in all components. Think of current as water flowing in a pipe: water flow in = water flow out.

So all three will get the FULL 20mA? And consequently they will light up in exactly the same way as the two in the parallel circuit (speaking of lumens per LED)?

Which part of "In any series circuit exactly the same current runs through every component so the resistor and each LED will all see roughly 20mA." (from post #4) did you have trouble with?

It's quite dispiriting trying to help people, only to find they apparently don't bother reading your answers.

Steve

It's quite dispiriting trying to help people, only to find they apparently don't bother reading your answers.

You can see that the OP is struggling. Maybe it takes 5 repetitions before the idea starts to sink in.

Politicians have learned that if you repeat anything often enough, some people will eventually believe you.

Eternyt:
So all three will get the FULL 20mA? And consequently they will light up in exactly the same way as the two in the parallel circuit (speaking of lumens per LED)?

Different LED brands/makes/casings/colours etc. may not have the same luminosity at the same current.
Check the datasheets for a graph of lumens vs current.

For example, Vishay's 510 series in different colours:
http://www.vishay.com/docs/81346/tlcx510.pdf

It was all clear; it's just a silly logic problem I have. I simply can't figure out how, if the first LED consumes some power, the second will have the same power as the first. Again, I understand the theory well, thanks also to everyone's patience.
Many thanks to all the people who responded to my questions, and sorry for my last silly question; it was just a mental block, and I will take the answer as is without asking more questions.

I simply can't figure out how, if the first LED consumes some power, the second will have the same power as the first.

The second almost always won't consume the same power. Power is voltage times current, so each LED consumes power according to the voltage drop across it times the current through it.

The current through each component in a series connection is the same, but voltage drops across the components are almost always different, so the power consumed by each component is different.
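As a worked example for the series string from the first scenario (the three slightly different forward voltages are assumptions, just to show that the powers come out different even though the current is shared):

# Same current everywhere in a series string, but each part drops its own
# voltage, so each part dissipates a different power (P = V * I).
supply_v = 12.0
resistor_ohms = 150.0
led_vf = [3.0, 3.1, 3.2]    # assumed forward voltages of the three LEDs

v_resistor = supply_v - sum(led_vf)        # 12 - 9.3 = 2.7 V
current_a = v_resistor / resistor_ohms     # 2.7 / 150 = 0.018 A, shared by all

for i, vf in enumerate(led_vf, start=1):
    print(f"LED {i}: {vf:.1f} V * {current_a * 1000:.0f} mA = {vf * current_a * 1000:.1f} mW")
print(f"Resistor: {v_resistor:.1f} V * {current_a * 1000:.0f} mA = {v_resistor * current_a * 1000:.1f} mW")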