
Topic: LEDs in series and forward voltages.

big_mark_h

I am currently building an LED array. I have a 12 volt switch-mode power supply which I would like to use to power my LEDs. My LEDs have a forward voltage of between 3.0 and 3.6 volts (according to the datasheet), and require 20mA. If I run my LEDs in series in strings of four, do I need a resistor since the combined forward voltage of four LEDs should be 12-14.4 volts?

I have previously used the "(supply voltage - forward voltage) / LED current" equation to calculate resistor values, but the "(supply voltage - forward voltage)" part seems to suggest that if the supply voltage equals the forward voltage then no resistor is needed. Is this right?

I know that if the supply voltage equals the forward voltage and current is not limited, if the supply voltage increases only slightly then the current drawn by the LED increases very rapidly until the LED fails. But what happens if we go the other way? If the supply voltage is less than the forward voltage what happens to the LED and its current consumption?

Will I be OK using my 12 volt power supply and not having resistors to limit the current (since the usual equation for working out resistor values would suggest one is not needed), or should I get a higher-voltage supply (maybe a 15 volt laptop psu?) and add resistors to each string of four LEDs, or should I keep my 12 volt psu and add low-value resistors (maybe 1 ohm)??

Anachrocomputer

Yes, you will always need a series resistor when working with LEDs. In fact, you'll probably need to make series strings of three and then drop a volt or so across the resistor.

Quote
I have previously used the "(supply voltage - forward voltage) / LED current" equation to calculate resistor values, but the "(supply voltage - forward voltage)" part seems to suggest that if the supply voltage equals the forward voltage then no resistor is needed. Is this right?


No, you will always need to use a resistor.  An LED is just a diode, and connecting a diode across a power supply without a resistor is just like a short circuit.  There are small variations in forward voltage between LEDs that make it impossible to exactly match their voltage with a power supply, so we always need a resistor.

What's the problem with using a resistor in this case?

Grumpy_Mike

Quote
But what happens if we go the other way? If the supply voltage is less than the forward voltage what happens to the LED and its current consumption?


The LED goes out.  :)

big_mark_h

Quote
What's the problem with using a resistor in this case?

No problem at all. I was just curious as to whether I needed one or not. My understanding of the usual LED resistor equation led me to believe that if the forward and supply voltages were the same then no resistor would be necessary.

Since I do need a resistor, given that the datasheet gives the forward voltage as anywhere between 3 volts and 3.6 volts, which value should I use when working out the resistor value? If I use 3 volts, since the supply voltage is 12 volts and the LED current 20mA, I get (12 - (3 LEDs x 3 volts)) / 0.02 = 150 ohms. But if the forward voltage is taken to be 3.6 volts I get (12 - (3 LEDs x 3.6 volts)) / 0.02 = 60 ohms.

Scott S.

#4
Apr 01, 2009, 03:33 pm Last Edit: Apr 01, 2009, 03:37 pm by minime72706 Reason: 1
Just thought I'd chime in about where that LED equation comes from if you are not familiar with electronics.

If the forward voltage of the LED is nominally 3.0V, the rest of your 12.0V supply needs to be dropped across the resistor. If you want 20mA of current flowing through the LED (as well as the resistor, since they are in series), you would use the equation from Ohm's law:

Rled = Vres / Iled where Vres = Vsupply - Vled

Example:
Vsupply = 12V
Vled = 3.0V * #LED
#LED = 3
Iled = 20mA

Rled = (12.0V - 3*3.0V) / 20mA = 150 ohms
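That arithmetic can be sketched in a few lines of Python (purely illustrative; the function name is made up here, and the values are just the ones from this example):

```python
def series_resistor(v_supply, v_led, n_leds, i_led):
    """Series resistance that drops the voltage left over after the LEDs."""
    v_res = v_supply - n_leds * v_led
    if v_res <= 0:
        # No headroom left for a resistor: the string's forward drop
        # meets or exceeds the supply, so this topology can't be current-set.
        raise ValueError("no voltage headroom left for a series resistor")
    return v_res / i_led

# Three 3.0 V LEDs on a 12 V supply at 20 mA:
print(series_resistor(12.0, 3.0, 3, 0.020))  # 150.0 ohms
```

Note that asking it for four 3.0 V LEDs on 12 V raises the error, which is exactly the original poster's situation: there is no voltage left for the resistor to work with.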



Technically, LEDs have an effective forward resistance, because they drop a voltage while passing a current, but you don't want them dissipating more power than they have to, or more than they can handle. Also, this forward resistance is not fixed the way the resistance of the current-limiting resistor is.

In the case of an LED with a forward voltage of 3V at a recommended current of 20mA:
R = E/I = 3V / 20mA = 150ohm

This is not fixed; it varies along the LED's V-I curve. The forward voltage given by the manufacturer is the recommended operating point for reliability and performance.

In my opinion, if you have four 3V LEDs in series across a 12V supply, you only need a minimal resistor. This breaks the usual LED resistor equation, because the impedance of an LED varies over its V-I (voltage-current) curve.

http://www.cq.cx/pics/int-led-vi.png

This shows the impedance to be 360 ohms at a 1.8V forward voltage and 100 ohms at 2.0V. That is a wide variation: although an LED has resistance, it is not fixed like a resistor's. Since none of the electrical characteristics of an LED are fixed (voltage, current, resistance), simple Ohm's law equations like the one you showed break down.

I would put the 4 LEDs in series, test the current with a DMM, and see what resistance you need.
You will not blow up the LEDs this way, though their life may be diminished if the current is too high. I don't see a problem arising, though.



EDIT:
To sum it all up: If you put the LEDs in series and the voltage across each LED is 3V, and the power supply is fixed and regulated, you won't have a problem. fixed voltage == fixed current.
- Scott

Grumpy_Mike

Quote
If you put the LEDs in series and the voltage across each LED is 3V, and the power supply is fixed and regulated, you won't have a problem. fixed voltage == fixed current.


Yes and that fixed current will be too much or too little and in real life nothing is regulated 100% so there is no such thing as fixed voltage and so no such thing as fixed current. And it all changes with temperature.

so in real life:-

[size=14]YOU ALWAYS NEED A RESISTOR OR SOMETHING TO LIMIT THE CURRENT[/size]

Scott S.

#6
Apr 01, 2009, 03:59 pm Last Edit: Apr 01, 2009, 04:05 pm by minime72706 Reason: 1
Lol, that's quite a run-on sentence, there.

A typical 7XXX 3-terminal regulator does a pretty damn good job.
What I was basically trying to say is that the equation he mentioned works most of the time, but breaks down when the total LED drop gets close to the supply voltage. He will have to experiment with the resistance, which will be pretty low.

You don't even need 20mA of current through the LED; it can be lower, or even somewhat higher. I still say you don't HAVE to use a resistor, but since you don't need the full 20mA either, a small one won't hurt the performance much.

Not having a resistor is only dangerous if the supply voltage might change. With three terminal regulators like the 7812 (+12V), the chances of deregulation are low. They build a decent amount of protection into those things. However, a human "oops" could burn out the LEDs since the dynamic resistance of the LED decreases fairly rapidly with forward voltage.

Assuming the V-I curve of his specific LEDs says that at 3V the current is 20mA, the resistance of the LED is 150 ohms at that point.

12V / (4*150ohms) = 20mA as well, obviously.
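That "LED as a resistor" view, as a quick sketch (treating each LED as a fixed 150 ohm resistor at its operating point, which, as discussed in this thread, only holds at that single point on the curve):

```python
# Static V/I "resistance" of one LED at the 3 V / 20 mA operating point.
# This value is only valid at that point; the dynamic resistance
# elsewhere on the V-I curve is different.
r_led_equiv = 3.0 / 0.020          # 150 ohms at the operating point
i = 12.0 / (4 * r_led_equiv)       # four such "resistors" across 12 V
print(round(i * 1000))             # 20 (mA)
```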

Again, I agree, a small resistor won't hurt. Without any other information, I'd try around 50 ohms.
- Scott

Grumpy_Mike

Scott,
I don't think you have quite grasped the concept of resistors and non-linear devices. Operating an LED on the knee of its characteristic and using that as your current limit is at best hairy and will produce a non-repeatable design.

Quote
Not having a resistor is only dangerous if the supply voltage might change.


Simply NO: there are other things that change. The knee characteristic of an LED varies from device to device, manufacturer to manufacturer, LED technology to LED technology, and it varies with temperature.

So you will end up with a circuit that will only work with the components you played about with, at the temperature you played with them. So it could work fine at 20C, but then it gets hot, you are operating at 35C, and bang, you burn out your LED. Look up thermal runaway.

Quote
Again, I agree, a small resistor won't hurt.

The smaller the resistor, the less effective it is at curbing the variations in the LED characteristics and the less useful it is at regulating the current. When the forward voltage drop of the LEDs approaches that of the power supply, it is time to think of other current-limiting measures, like a constant-current drive; it is not a time for abandoning all forms of current control. This is why high-power LEDs are not usefully controlled with a series resistor.

Sorry to stamp on you so hard, but it is important that beginners don't go on to think that current-limiting components in LED systems are optional. It is also important that you don't think that either.

Scott S.

#8
Apr 01, 2009, 04:23 pm Last Edit: Apr 01, 2009, 04:31 pm by minime72706 Reason: 1
Yeah, those are some things I wasn't thinking about.

The "iffyness" of this specific situation is why I said he'd have to do some testing to get the right current. I had not considered a constant-current source for this application, however. Seems like overkill, though not a bad idea.

A large resistor will hurt the performance of the LEDs (if that's even much of a concern to him, I don't know).

Basically, my posts were supposed to be hypothetical in nature, to show him why his formula doesn't work, and especially to show that LEDs do not follow Ohm's law (though nonlinear, they are approximately linear; linear does not have to mean V = I * R, it could mean V = k(I*R) where k is some constant). I can see how my posts would mislead someone now.


Thanks for livening up my day, my current job is boring.

I'm really frickin' tired this morning.
- Scott

big_mark_h

Many thanks for all your replies. However, a question still remains: when working out the resistor to limit the current, do I use the upper or lower limit of the forward voltage in the equation? The datasheet gives the forward voltage as anywhere between 3 volts and 3.6 volts. If I use 3 volts, since the supply voltage is 12 volts and the LED current 20mA, I get (12 - (3 LEDs x 3 volts)) / 0.02 = 150 ohms. But if the forward voltage is taken to be 3.6 volts I get (12 - (3 LEDs x 3.6 volts)) / 0.02 = 60 ohms.

Scott S.

#10
Apr 01, 2009, 04:36 pm Last Edit: Apr 01, 2009, 04:42 pm by minime72706 Reason: 1
Use 150 ohms, it's a standard value. It's safer to use the smaller number for forward voltage because if you assume 3.6V and your LEDs happen to all be 3.0V at 20mA then the following might happen:

60ohm resistor chosen, assuming 3.6V across each LED.

Iled = Vres / Rres
Vres = 3.0V = 12V - 3*3.0V
Rres = 60ohms

Iled = 3.0V / 60ohms = 50mA

You're screwed in this "worst case". In reality it can be even worse, because of what was talked about before.
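That worst-case arithmetic, sketched in Python (values straight from this example, purely as an illustration):

```python
v_supply = 12.0
n_led = 3
i_target = 0.020  # 20 mA design current

# Choose the resistor assuming the HIGH end of Vf (3.6 V per LED)...
r_chosen = (v_supply - n_led * 3.6) / i_target
# ...then see what actually flows if the LEDs sit at the LOW end (3.0 V).
i_actual = (v_supply - n_led * 3.0) / r_chosen

print(round(r_chosen), round(i_actual * 1000))  # 60 ohms, 50 mA
```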

Ignoring all this crap Mike and I have been throwing at each other: if you're using 3 LEDs, you're pretty safe with 150 ohms. It becomes difficult with 4 LEDs, and most likely you won't be able to get 20mA; it would have to be less.
- Scott

big_mark_h

Actually I'm not that bothered if each LED gets less than 20mA. I already have a 12 volt power supply so I'd like to use it if possible, but I've already built a test board with sets of four LEDs in series, so it seems I have three options.

First off, I could keep the power supply I've already got, but re-design/re-build the boards to include resistors.

The second option is to get a different power supply (a 15 volt laptop psu?) and add some resistors to my boards.

The third option would be to keep both the boards and power supply, and find some way of limiting the current.

I'd like to go with either the second or, preferably, the third option. It's easy enough to add a few resistors to my boards, though I'd rather not have to redesign them from scratch, which is what the first option would entail.

If I wanted to keep my power supply, and keep my LEDs in sets of four what is the maximum current I could reasonably expect to give them? I'm guessing that running them at less than their maximum current will give them an easier/longer life.


Scott S.

Keep it hovering around 20mA. If you have a potentiometer, use it to adjust the series resistance until the current is right, then pick the closest resistor value you can find.
- Scott

Grumpy_Mike

Quote
I'm guessing that running them at less than their maximum current will give them an easier/longer life.


Yes, I normally run LEDs at 10mA, as the extra 10mA seems only to contribute to glare rather than to actual brightness.
Any of those three options would do, although I suspect that options one and two are the simpler ones.

As to your original question about which limit to go for: when designing something, design for the halfway point, but check that it stays within limits at either extreme.
So calculate your resistor value at 3.3V and check that when using that resistor the current is not exceeded or too low at 3.0V and 3.6V.
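That design-then-check procedure, as a small sketch (Python, illustrative only; this assumes three LEDs per string, as in the earlier worked examples):

```python
v_supply = 12.0
n_led = 3
i_target = 0.020  # 20 mA design current

# Design at the midpoint forward voltage (3.3 V)...
r = (v_supply - n_led * 3.3) / i_target        # ~105 ohms
# ...then check the current at both datasheet extremes.
i_at_low_vf = (v_supply - n_led * 3.0) / r     # if every LED is 3.0 V
i_at_high_vf = (v_supply - n_led * 3.6) / r    # if every LED is 3.6 V

print(round(r), round(i_at_low_vf * 1000), round(i_at_high_vf * 1000))
# -> 105 ohms, roughly 29 mA and 11 mA: check both against the LED's limits
```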

Oracle

LEDs are current-driven devices, not voltage-driven.  This seems to confuse a lot of people.

In *theory*, if you connected 3.4 volts to an LED with a Vf of 3.3V, you'd fry the LED.  The only reason it works at all is that in *practice* there are internal resistances in everything, and your power supply can't provide unlimited current to maintain 3.4 volts.  But relying on imperfections in your components to make everything work is beyond poor engineering.
