Say I have 6 LEDs in series, and each LED has a voltage drop of 2 V. If I supply the LED string with exactly 12 V, do I still need a current limiting resistor? Because 12 V - 12 V = 0 V, and 0 V / 0.02 A = 0 Ω.
That's 2 V at 20 mA. What is it at more than 20 mA? Add a resistor to make sure the current does not exceed that.
The problem with that is that Vf is not a fixed value; it varies from LED to LED and changes with temperature. So the string will show variable brightness, and may even turn off or draw excessive current from the voltage source. The current must be managed externally, either with a series resistor or by driving the string from a constant current source.
CrossRoads:
That's 2 V at 20 mA. What is it at more than 20 mA? Add a resistor to make sure the current does not exceed that.
Theoretically though, to calculate the resistor required to ensure 20 mA you use (Vs - Vf) / I. So if Vs - Vf is zero, then why do I need to put in a resistor?
If I do put in a resistor, the LEDs are not going to get 2 V each. They will get less voltage, will they not? Then they might not even come on?
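For what it's worth, here is the arithmetic from the question written out as a small throwaway C++ program (the 12 V supply, six 2 V LEDs and 20 mA are just the numbers from the original post):

```cpp
#include <cstdio>

int main() {
    const double Vs      = 12.0;      // supply voltage
    const double VfTotal = 6 * 2.0;   // six LEDs at a nominal 2 V each
    const double I       = 0.020;     // target current, 20 mA

    // Standard series-resistor formula: R = (Vs - Vf) / I
    const double R = (Vs - VfTotal) / I;
    printf("Headroom = %.2f V, R = %.1f ohm\n", Vs - VfTotal, R);
    // With zero headroom the formula gives 0 ohm, i.e. there is
    // nothing left to limit the current - which is exactly the
    // problem the other replies describe.
    return 0;
}
```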
retrolefty:
The problem with that is that Vf is not a fixed value; it varies from LED to LED and changes with temperature. So the string will show variable brightness, and may even turn off or draw excessive current from the voltage source. The current must be managed externally, either with a series resistor or by driving the string from a constant current source.
Ok, so Vs - Vf must never equal zero then? Should your power supply always supply more voltage than the total voltage drop of the LEDs? Is it correct if I say that?
calvingloster:
retrolefty:
The problem with that is that Vf is not a fixed value; it varies from LED to LED and changes with temperature. So the string will show variable brightness, and may even turn off or draw excessive current from the voltage source. The current must be managed externally, either with a series resistor or by driving the string from a constant current source.
Ok, so Vs - Vf must never equal zero then? Should your power supply always supply more voltage than the total voltage drop of the LEDs? Is it correct if I say that?
Yes, correct. A resistor just burns off any excess voltage to maintain a fixed maximum current. A constant current source starts from a higher source voltage and holds the current at a constant value by automatically and continuously adjusting the voltage applied to the LED(s).
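A quick sketch of that "burning off the excess" idea, with made-up numbers (a 15 V supply instead of 12 V, so there is actually something for the resistor to drop):

```cpp
#include <cstdio>

int main() {
    const double Vs      = 15.0;      // supply with some spare voltage (example value)
    const double VfTotal = 6 * 2.0;   // nominal string drop, 12 V
    const double I       = 0.020;     // wanted current, 20 mA

    const double R    = (Vs - VfTotal) / I;   // 3 V / 20 mA = 150 ohm
    const double Pres = (Vs - VfTotal) * I;   // power dissipated in the resistor

    printf("R = %.0f ohm, resistor drops %.1f V and dissipates %.0f mW\n",
           R, Vs - VfTotal, Pres * 1000);
    return 0;
}
```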
calvingloster:
Ok, so Vs - Vf must never equal zero then? Should your power supply always supply more voltage than the total voltage drop of the LEDs? Is it correct if I say that?
Yes.
(Somebody will now invent a special case where it isn't... but in the real world it's 99% true to say that.)
A constant current source automatically supplies exactly the right voltage to maintain the programmed current.
A constant voltage source doesn't, so it can only be used with a resistor to turn its Thevenin equivalent into something a bit more like a current source - sort of.
Since forward voltages vary by perhaps 10% or so at most, you only need, say, 20% spare voltage to let a series resistor regulate the current to within a factor of 1.5 or so, which frankly is good enough to fool the eye.
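As a rough check of that, with assumed numbers (not from the thread): a 12 V nominal string, 20% spare voltage, so a 14.4 V supply and a 120 ohm resistor for 20 mA, with the string's total Vf swept by ±10%:

```cpp
#include <cstdio>

int main() {
    const double VfNominal = 12.0;                      // nominal string drop
    const double Vs        = 1.2 * VfNominal;           // 20% spare voltage -> 14.4 V
    const double R         = (Vs - VfNominal) / 0.020;  // 120 ohm for 20 mA nominal

    // Sweep the string forward voltage +/-10% and see how far the current strays.
    for (double vf = 0.9 * VfNominal; vf <= 1.1 * VfNominal + 1e-9; vf += 0.1 * VfNominal) {
        const double i = (Vs - vf) / R;  // simple model: everything above Vf lands on R
        printf("Vf = %4.1f V -> I = %4.1f mA\n", vf, i * 1000);
    }
    // Prints roughly 30, 20 and 10 mA: not perfectly regulated, but nothing
    // runs away or dies, which is the point made above.
    return 0;
}
```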
To get the best performance from a high power LED (which is expensive), it's worth investing in an accurate constant-current supply (these are usually boost converters with a current output).