Forward voltage vs Current

Surely heat is directly related to current, so if you're putting the same current through each it doesn't matter?

The graph just shows how much current would flow through the LED if it were directly connected to that voltage; if you're using current-limiting resistors, the graph doesn't make much sense.

Actually I found this page which I think answers my question:

"For a standard LED of 5mm diameter, Figure 1 shows the forward voltage (VF) vs. forward current (IF). Note that the voltage drop across an LED increases with forward current."

So I think what the graph is showing is how the forward voltage changes as I put more current through the LED.

With that in mind, I think that means if I reduce the current I'm trying to put through the LED, the forward voltage will drop. So if these LEDs can have a max forward voltage of 5.3v at 50mA, then if I reduce that to, say, 40mA, I will have less of a chance of getting a batch of LEDs that simply won't work at 5v.

Plus I've been told I should stay at 80% of absolute maximum ratings anyway, so 40mA would not be unreasonable.

Surely heat is directly related to current, so if you're putting the same current through each it doesn't matter?

I'm relatively new to this whole circuit design thing, but I'm pretty sure that current isn't the only thing that matters. What matters, I believe, is the power, which, as you can see here, is calculated as voltage x current:
http://www.the12volt.com/ohm/ohmslawcalculators.asp
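For instance, here's a quick sketch of that math in C, using two hypothetical LEDs at the same 50mA but with different forward voltages (the 4.3v and 4.7v figures are made up for illustration):

#include <stdio.h>

int main(void) {
    /* hypothetical values, just to illustrate P = V * I */
    double i = 0.050;  /* 50mA through both LEDs */
    double vf_a = 4.3; /* forward voltage of one LED */
    double vf_b = 4.7; /* forward voltage of the other */

    printf("P(4.3v LED) = %.3f W\n", vf_a * i); /* 0.215 W */
    printf("P(4.7v LED) = %.3f W\n", vf_b * i); /* 0.235 W */
    return 0;
}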

To use a water analogy, let's say you have two pipes going to a pool, where the diameter of the pipe is the current and the pressure the water is under is the voltage. If both pipes are 3" in diameter but one has twice the pressure, then you can see there's a lot more energy there and it will fill your pool twice as fast.

So to go back to the LED example, the LED with the higher forward voltage is using up more of the voltage inside it to generate light and heat, so it should get a little hotter than the one with the lower forward voltage.

I'm relatively new to this whole circuit design thing, but I'm pretty sure that current isn't the only thing that matters.

As far as LEDs are concerned, current is the only thing that matters. The LED 'forward voltage' is not something that you apply; it is a result of the LED characteristics and the current. As long as the current is within the LED's ratings, the forward voltage and the resultant power dissipation will also be within the LED's ratings. You can't use an Ohm's law calculator for an LED because an LED is not a resistor (it is non-linear) and Ohm's law does not apply. That's why you need a graph to get the relationship between current and forward voltage.
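To illustrate what 'non-linear' means here, below is a minimal sketch using an idealized Shockley diode model with invented parameters (not any real LED's datasheet values). For a resistor, V/I would be a constant; for the modeled diode it keeps changing with current, which is exactly why you need the graph:

#include <math.h>
#include <stdio.h>

int main(void) {
    /* Idealized Shockley model: I = Is * (exp(V/(n*Vt)) - 1),
       inverted for V: V = n*Vt*ln(I/Is + 1).
       Is and n are made-up illustrative values, not from any datasheet. */
    double Is = 1e-12;   /* saturation current, amps */
    double n  = 2.0;     /* ideality factor */
    double Vt = 0.02585; /* thermal voltage at room temperature, volts */

    for (double i_ma = 10.0; i_ma <= 50.0; i_ma += 10.0) {
        double i = i_ma / 1000.0;
        double v = n * Vt * log(i / Is + 1.0);
        /* for a resistor V/I would stay fixed; here it keeps dropping */
        printf("%2.0f mA -> Vf = %.2f V, V/I = %.0f ohms\n", i_ma, v, v / i);
    }
    return 0;
}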

"For a standard LED of 5mm diameter, Figure 1 shows the forward voltage (VF) vs. forward current (IF). Note that the voltage drop across an LED increases with forward current."

So I think what the graph is showing is how the forward voltage changes as I put more current through the LED.

With that in mind, I think that means if I reduce the current I'm trying to put through the LED, the forward voltage will drop. So if these LEDs can have a max forward voltage of 5.3v at 50mA, then if I reduce that to, say, 40mA, I will have less of a chance of getting a batch of LEDs that simply won't work at 5v.

You interpreted this part correctly.

To use a water analogy, let's say you have two pipes going to a pool, where the diameter of the pipe is the current and the pressure the water is under is the voltage. If both pipes are 3" in diameter but one has twice the pressure, then you can see there's a lot more energy there and it will fill your pool twice as fast.

So to go back to the LED example, the LED with the higher forward voltage is using up more of the voltage inside it to generate light and heat, so it should get a little hotter than the one with the lower forward voltage.

AARRGGHH.

Don

I don't understand how the LED can lower the voltage without actually using up power. Power = voltage x current, and energy cannot be created or destroyed, so where does the voltage go once it hits the LED if not into generating light and heat?

Also, if you put two LEDs in series, you can get away with using a smaller resistor and generating less waste heat, because there's less power that the resistor needs to dissipate. But if one LED uses 30mA, and two LEDs in series also use 30mA, how are they generating twice the light and twice the heat, if not by converting the voltage part of the equation to light and heat?

The LED just says: "You can't make me!"

And as it is not alone in the circuit, the dumb resistor gets to do the dirty job and gets a little hotter :wink:

so where does the voltage go once it hits the LED if not into generating light and heat?

It never gets there. If the LED 'decides' to take 2.7V at a given current, the remainder is dropped 'elsewhere' (series resistor, driver chip...).
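To put rough numbers on that, here's a hedged sketch comparing the one-LED and two-LED circuits, assuming a hypothetical 2.4v drop per LED, a 5v supply, and a resistor sized for 30mA in each case. The total power is the same; the second circuit just converts more of it in the LEDs and wastes less in the resistor:

#include <stdio.h>

int main(void) {
    double v_supply = 5.0, i = 0.030, vf = 2.4; /* hypothetical LED and supply */

    /* one LED: the resistor has to drop the other 2.6 V */
    double p_led1 = vf * i;                    /* 0.072 W in the LED */
    double p_res1 = (v_supply - vf) * i;       /* 0.078 W wasted in the resistor */

    /* two LEDs in series: the resistor only drops 0.2 V */
    double p_led2 = 2.0 * vf * i;              /* 0.144 W in the LEDs */
    double p_res2 = (v_supply - 2.0 * vf) * i; /* 0.006 W in the resistor */

    /* same supply and same current, so the totals match (0.15 W each) */
    printf("1 LED:  P(led)=%.3f W  P(res)=%.3f W\n", p_led1, p_res1);
    printf("2 LEDs: P(leds)=%.3f W  P(res)=%.3f W\n", p_led2, p_res2);
    return 0;
}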

where does the voltage go

The voltage doesn't 'go' anywhere and it doesn't 'hit' anything. You apply voltage to a series circuit and current flows. The amount of current is determined by the components in the circuit. When everything settles out there is a certain amount of current flowing and a certain amount of voltage appearing across the various components. Did you ever wonder why it takes so many years to earn an engineering degree?

Don

Let's go back to some earlier posts:

But I would still like to understand what that graph is trying to tell me.

The graph is telling you the relationship between current and voltage for the LED. You need a graph because the device is non-linear and Ohm's law does not apply.

If the graph were showing me how much current I could put through the LED if it had different forward voltages, then wouldn't the current be inversely proportional to the forward voltage, rather than proportional to it?

You've got the independent and dependent variables reversed. The graph is telling you how much forward voltage you will get for different forward currents. You said this yourself here:

So I think what the graph is showing is how the forward voltage changes as I put more current through the LED.

I don't understand how the LED can lower the voltage without actually using up power.

The LED does not lower the voltage. The voltage is determined by the current flow and the current flow is determined by the forward voltage drop of the LED and the resistance. If you see a chicken/egg situation here you are correct. That's where the time is used up in getting the engineering degree.

Basically you decide what current you want and you look at the graph to find out the forward voltage at that current. You subtract that voltage from the supply voltage to determine the voltage across the resistor. You apply Ohm's law to determine the required resistance. You choose a resistor near that value, put it in the circuit, and measure the current. If the current is too high you raise the resistance, and if it is too low you lower the resistance. Remember you are dealing with more than one approximation. The graph is for a 'typical' LED; yours may be different. The marked resistance on a resistor is a 'nominal' value; yours may be different.
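Here is that recipe as a minimal sketch; the 25mA target and the 3.75v read off the graph are placeholders for whatever your particular graph shows:

#include <stdio.h>

int main(void) {
    /* hypothetical numbers: you picked 25mA, and the graph says
       the typical forward voltage at 25mA is about 3.75 V */
    double v_supply = 5.0;
    double i_target = 0.025;
    double vf_graph = 3.75;

    double v_resistor = v_supply - vf_graph; /* what the resistor must drop */
    double r = v_resistor / i_target;        /* Ohm's law, on the resistor only */

    printf("resistor drops %.2f V, so R = %.0f ohms\n", v_resistor, r); /* 50 */
    return 0;
}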

Don


But if one LED "decides" to take 4.3v, and the other "decides" to take 4.7v, are you telling me the one that takes 4.7v at 50mA isn't gonna get hotter than the one which takes 4.3v at 50mA?

Also, where does this leave me regarding the issue with some of the LEDs potentially having a forward voltage of 5.3v? If I lower the current I'm trying to put through it, will I also lower that forward voltage?

This must be the first time I've heard the term forward voltage, and I just don't get the point? Is this some sort of special LED that requires so much more... something... than a normal LED does?
It must be my narrow capacity in English...

The only thing I see here is that the voltage potential is divided into two pieces by the resistor, which also limits the max current. As was said, the voltage doesn't disappear and nobody's eating it; the current just turns to heat at some point. Kirchhoff's circuit laws - Wikipedia

Is there any other point? Again, I would like to learn if there's something I am missing.
Thanks!

Cheers,
Kari

Forward voltage is the voltage drop across an LED. In other words, if you put an LED in a circuit with a 5v source, and its forward voltage is 2.4v, then you will read 5v - 2.4v, i.e. 2.6v, at the anode of the LED. (I think!)

You would then do your Ohm's law calculation on the remaining voltage, and the amount of current you want to put through the LED, and you get:
R = E/I
R = 2.6v / 0.020a (20mA)
R = 130 ohms

And you would then find the next largest standard resistor size which is closest to that.
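If you want to automate that last step, here's a sketch that rounds up to the next standard value, assuming the common E12 (10%/5%) series; other series such as E24 exist:

#include <stdio.h>

int main(void) {
    /* E12 base values for common 10% / 5% resistors */
    static const double e12[] = {1.0, 1.2, 1.5, 1.8, 2.2, 2.7,
                                 3.3, 3.9, 4.7, 5.6, 6.8, 8.2};
    double r_needed = 2.6 / 0.020; /* 130 ohms, from the example above */

    /* scan decades upward until we hit the first standard value >= r_needed */
    for (double decade = 1.0; decade <= 1e6; decade *= 10.0) {
        for (int k = 0; k < 12; k++) {
            double r = e12[k] * decade;
            if (r >= r_needed) {
                printf("Use %.0f ohms\n", r); /* prints 150 ohms */
                return 0;
            }
        }
    }
    return 0;
}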

This is why you can put only a few LEDs in series. In the above example, you could put two LEDs in series before the voltage you have left is too low to light another. Add a third LED, and no current will flow in the circuit and all the LEDs will remain dark.
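A quick sketch of that limit, using the same hypothetical 2.4v forward voltage and 5v supply (in practice you'd want more than the 0.2v this leaves across the resistor to regulate the current):

#include <stdio.h>

int main(void) {
    double v_supply = 5.0;
    double vf = 2.4; /* same hypothetical forward voltage as above */

    int n = (int)(v_supply / vf);      /* how many whole LED drops fit: 2 */
    double v_left = v_supply - n * vf; /* 0.2 V left over for the resistor */

    printf("%d LEDs in series, %.1f V left for the resistor\n", n, v_left);
    return 0;
}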

And that's why I'm asking about this particular LED. What makes this LED special is that depending on the batch of LEDs I get, the forward voltage for a single LED can range from 4.3v to 5.2v. But my supply is only 5v. So if I get a bunch of LEDs with a forward voltage of 5.2v, they won't work.

At least that's what I assume. What I was asking is: if the forward voltage and the current you put through the LED are related, could I lower the current from the 50mA used in the datasheet and then always be certain these LEDs will work in a 5v circuit?

scswift:
Forward voltage is the voltage drop across an LED.

So, it is just a voltage drop? Why wouldn't they just use that term?
Waste of time...

Cheers,
Kari

So, it is just a voltage drop? Why wouldn't they just use that term?

Why don't we just use electron flow instead of conventional flow? Because engineers are silly that way, apparently. Forward Voltage is the term used in every datasheet I've seen. It confused the hell out of me for the longest time too. If it had been called Voltage Drop it would have made more sense. And I would have figured out how circuits work a lot faster if I wasn't being confused by which way the electricity was flowing. :slight_smile:

Here's the important part first:

What I was asking is: if the forward voltage and the current you put through the LED are related, could I lower the current from the 50mA used in the datasheet and then always be certain these LEDs will work in a 5v circuit?

Your problem is right here. The 50mA value is the 'Absolute Maximum' rating. This is a value that you never want to exceed, not the value that you design for. Typically you operate at somewhere around half the absolute maximum rating. At 25mA your LED should have a (nominal) forward voltage of around 3.75 volts so your resistor should be around 50 ohms.

Here's the nit-picking part:

Forward voltage is the voltage drop across an LED. In other words, if you put an LED in a circuit with a 5v source, and its forward voltage is 2.4v, then you will read 5v - 2.4v, i.e. 2.6v, at the anode of the LED. (I think!)

This is ambiguous because we don't know where the resistor is and it is wrong regardless of how you connect the circuit. You really measure voltage between two points, not 'at' a point. When you express the voltage 'at' a certain point then the other point is assumed and the assumption is typically some common point, frequently referred to as 'ground'. So in this case if you have the resistor connected to the + side of the battery and the LED between the resistor and the - side of the battery (ground) then the voltage 'at' the anode is 2.4v. If you have the LED connected to the + side of the battery and the resistor between the LED and the - side of the battery (ground) then the voltage 'at' the anode is 5v (unless you have the LED in backwards).
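A tiny sketch of those two cases, using the same hypothetical 5v supply and 2.4v drop, and computing the anode voltage relative to the - terminal (ground):

#include <stdio.h>

int main(void) {
    double v_supply = 5.0;
    double vf = 2.4; /* hypothetical forward drop, as in the example above */

    /* Case 1: (+) --> resistor --> LED --> (-).
       The anode sits one LED drop above the (-) terminal. */
    double anode_case1 = 0.0 + vf;
    printf("resistor on + side: anode at %.1f V\n", anode_case1); /* 2.4 V */

    /* Case 2: (+) --> LED --> resistor --> (-).
       The anode is tied directly to the + terminal. */
    double anode_case2 = v_supply;
    printf("LED on + side:      anode at %.1f V\n", anode_case2); /* 5.0 V */
    return 0;
}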

You would then do your Ohm's law calculation on the remaining voltage, and the amount of current you want to put through the LED, and you get:

Although "through the LED" is technically correct you really should use "through the resistor" since that's where you are using Ohm's law.

You are concentrating on the voltage aspects because of the fact that your particular LED has a forward voltage drop that is greater than your available voltage when it is operating at high currents. You have correctly surmised that you must settle for a lower current in order to operate such an LED from a 5v supply. Once you have determined an appropriate current (from the graph) then you should concentrate on that current. Don't worry about the power or anything else.
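If you've read a few points off the graph, a sketch like the following can estimate the forward voltage at an in-between current by linear interpolation; the table values here are invented, not from any datasheet:

#include <stdio.h>

int main(void) {
    /* made-up points read off a Vf-vs-If graph: {mA, volts} */
    double pts[][2] = {{10, 3.2}, {20, 3.6}, {30, 3.9}, {40, 4.1}, {50, 4.3}};
    int n = 5;
    double i_ma = 25.0; /* the current we want Vf for */

    for (int k = 0; k < n - 1; k++) {
        if (i_ma >= pts[k][0] && i_ma <= pts[k + 1][0]) {
            /* linear interpolation between the two bracketing points */
            double t = (i_ma - pts[k][0]) / (pts[k + 1][0] - pts[k][0]);
            double vf = pts[k][1] + t * (pts[k + 1][1] - pts[k][1]);
            printf("Vf at %.0f mA is about %.2f V\n", i_ma, vf); /* ~3.75 V */
            break;
        }
    }
    return 0;
}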

So if I get a bunch of LEDs with a forward voltage of 5.2v, they won't work

But they will work if you lower the current.

Don

Forward Voltage is the term used in every datasheet I've seen. It confused the hell out of me for the longest time too. If it had been called Voltage Drop it would have made more sense.

Because there is also an important parameter called the 'Reverse Voltage' rating. It is on the next line of your data sheet, right below the 'Forward Voltage' rating. That is why you must use the term forward voltage for the parameter that you are talking about.

Don

So, it is just a voltage drop? Why wouldn't they just use that term?
Waste of time...

Because, since the LED is not a resistor, the voltage across the LED is different when it is 'forward biased' than when it is 'reverse biased'. This is really where the terms forward and reverse come from. Are you beginning to understand why an engineering education takes so long?

Don

floresta:

So, it is just a voltage drop? Why wouldn't they just use that term?
Waste of time...

Because, since the LED is not a resistor, the voltage across the LED is different when it is 'forward biased' than when it is 'reverse biased'. This is really where the terms forward and reverse come from. Are you beginning to understand why an engineering education takes so long?

Don

Yes, I really do, and I would never be ready to go down that road!!!
XD

What I always concern myself with is the max current at the Vcc I'm using, and for some reason, most of the time nothing fries. Lucky me!?

Cheers,
Kari

Your problem is right here. The 50mA value is the 'Absolute Maximum' rating. This is a value that you never want to exceed, not the value that you design for. Typically you operate at somewhere around half the absolute maximum rating.

I'm aware of the need to run below the Absolute Maximum rating. The reason I'm quoting the 50mA over and over is because that's also the listed test condition for the LED. I'm not sure why the manufacturer did that, other than to pad their stats, if they don't expect the LED to be run at 50mA. Most seem to list current levels around 66% lower than the Absolute Maximum for their test conditions.

At 25mA your LED should have a (nominal) forward voltage of around 3.75 volts so your resistor should be around 50 ohms.
You have correctly surmised that you must settle for a lower current in order to operate such an LED from a 5v supply.
But they will work if you lower the current.

Thank you. I've asked about that several times, but this is the first straight answer I've gotten.

Forward voltage is the voltage drop across an LED. In other words, if you put an LED in a circuit with a 5v source, and its forward voltage is 2.4v, then you will read 5v - 2.4v, i.e. 2.6v, at the anode of the LED. (I think!)

This is ambiguous because we don't know where the resistor is and it is wrong regardless of how you connect the circuit. You really measure voltage between two points, not 'at' a point. When you express the voltage 'at' a certain point then the other point is assumed and the assumption is typically some common point, frequently referred to as 'ground'. So in this case if you have the resistor connected to the + side of the battery and the LED between the resistor and the - side of the battery (ground) then the voltage 'at' the anode is 2.4v. If you have the LED connected to the + side of the battery and the resistor between the LED and the - side of the battery (ground) then the voltage 'at' the anode is 5v (unless you have the LED in backwards).

So, you're saying that if I have:
(-) --> resistor --> -led+ --> (+)

And I touch my probes to (-) and the + side of the LED, I'll get 5v... which makes sense, since the + side of the LED is also connected directly to (+) and the potential difference between (-) and (+) with nothing else between would be 5v.

But if I have:
(-) --> -led+ --> resistor --> (+)

And I touch my probes to (-) and to the + side of the LED, I'll get 2.4v... which I can only assume is because of the resistor being there. But I'm not sure I understand how that works.

Lastly, to take it a step further, if I have this setup:
(-) --> -led1+ --> -led2+ --> resistor --> (+)

And I touched a probe to (-) and to led1's + terminal, what would I read?
And what would I read if I touched a probe to (-) and led2's + terminal?

Also, why am I not reading 2.6v at those points if my LED's forward voltage is 2.4v? I thought you could supply two LEDs with a 2.4v forward voltage off a 5v supply and have 0.2v left over that the resistor needs to dissipate?

Why don't you connect it up and probe it? Much easier to learn that way.
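In the meantime, here's a paper version of what the probes should show, as a sketch assuming both LEDs drop a nominal 2.4v and the circuit is actually conducting. Walking from (-) toward (+), each component adds its drop (the 2.6v figure came from the single-LED example, which is why it doesn't appear here):

#include <stdio.h>

int main(void) {
    double vf = 2.4; /* nominal forward drop per LED (hypothetical) */
    double v_supply = 5.0;

    /* (-) --> -led1+ --> -led2+ --> resistor --> (+)
       Walk from (-) toward (+), accumulating drops. */
    double node = 0.0;
    node += vf; /* led1's + terminal */
    printf("led1 + terminal: %.1f V above (-)\n", node);  /* 2.4 V */
    node += vf; /* led2's + terminal */
    printf("led2 + terminal: %.1f V above (-)\n", node);  /* 4.8 V */
    /* the resistor takes up whatever is left */
    printf("across resistor: %.1f V\n", v_supply - node); /* 0.2 V */
    return 0;
}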