Please understand. This is a semiconductor diode, not a resistor!
So you're saying that if I put my power supply on 3.4V constant voltage, and put an LED on it, it will short circuit because the LED won't limit the current?
For me, looking at a circuit from a voltage perspective and from a current perspective are pretty interchangeable, and for certain applications one of the two is often preferred, but that doesn't mean the other suddenly stops working.
racemaniac:
So you're saying that if I put my power supply on 3.4V constant voltage, and put an LED on it, it will short circuit because the LED won't limit the current?
For me, looking at a circuit from a voltage perspective and from a current perspective are pretty interchangeable, and for certain applications one of the two is often preferred, but that doesn't mean the other suddenly stops working.
I've not been paying much attention to this thread, but we're on page three already, so I've got to start wondering if racemaniac is on a wind-up.
KenF:
I've not been paying much attention to this thread, but we're on page three already, so I've got to start wondering if racemaniac is on a wind-up.
My British English doesn't reach far enough to get what that means :p (I assume it's not nice :p).
Well, at least I'm learning something in this discussion (I hope :p).
This is all just hypothetical until you measure the voltage drop across the LED with a 220 ohm resistor and tell us the forward voltage of the LED you are using for the experiment. We can't do the math until we know that. So far that remains unknown, so the math is just hypothetical.
racemaniac:
So you're saying that if i put my power supply on 3.4V constant voltage, and put a LED on it, it will short circuit because the LED won't limit the current?
Precisely.
It's not strictly a "short circuit", but the current will not be limited to a reasonable value.
The "equivalent circuit" of the LED is that it behaves as if it were a series combination of three parts: a battery corresponding to the threshold voltage (somewhere around, but not exactly, 3.4V); a perfect diode (so that you can only push current into the battery, not draw current out of it); and a very small resistance of a few ohms.
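To illustrate why this model makes direct constant-voltage drive so risky, here is a small sketch of that three-part equivalent circuit. The numbers (3.4 V threshold, 2 ohms internal resistance) are illustrative assumptions for the sake of the example, not datasheet values for any particular LED.

```python
# Three-part LED equivalent circuit from the post above:
# an internal "battery" (the threshold voltage), an ideal diode,
# and a small series resistance.

V_THRESHOLD = 3.4   # volts, assumed turn-on voltage of the LED
R_INTERNAL = 2.0    # ohms, assumed small internal resistance

def led_current(v_supply):
    """Current through the LED when driven directly from v_supply."""
    if v_supply <= V_THRESHOLD:
        return 0.0  # the ideal diode blocks below the threshold
    return (v_supply - V_THRESHOLD) / R_INTERNAL

# A few tenths of a volt of supply error swings the current wildly:
for v in (3.3, 3.4, 3.5, 3.6):
    print(f"{v:.1f} V -> {led_current(v) * 1000:.0f} mA")
```

With only a couple of ohms in the model, going from 3.4 V to 3.6 V takes the current from zero to around 100 mA, which is exactly why the LED cannot be trusted to limit its own current.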
I've not been paying much attention to this thread, but we're on page three already, so I've got to start wondering if racemaniac is on a wind-up.
Funny it should be a south-eastern England term, I understood what it meant.
You understood what you thought he meant. He hasn't clarified whether that's what he meant, and his comment above implies he is wondering if racemaniac is winding up his post, which is entirely different from wondering if racemaniac is teasing us. I don't think it is clear what he meant; perhaps Ken will clarify. (It isn't far-fetched that racemaniac is just jerking our chain, but I actually got the impression he was trying to learn. I don't think I'm the only one.)
@Racemaniac,
3.4V is not a common forward voltage for an LED. Generic red 5mm LEDs are usually about 2.2 to 2.4V.
Anyone with a variable bench supply can test the theory that an LED won't burn out if supplied only with its forward voltage. For an LED with a Vf of 2.2V, the current-limiting resistor should be:
Let Vf = 2.2 V @ 20 mA
Let Vcc = 5 V
R = (Vcc - Vf) / I_led = (5 V - 2.2 V) / 0.02 A = 2.8 V / 0.02 A = 140 ohms
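The arithmetic above can be wrapped in a small helper for anyone who wants to try other values. The 5 V supply, 2.2 V forward voltage, and 20 mA target current are the numbers from the post.

```python
# Series resistor for an LED: R = (Vcc - Vf) / I.

def led_resistor(vcc, vf, i_led):
    """Resistance needed to limit an LED to i_led amps."""
    return (vcc - vf) / i_led

r = led_resistor(5.0, 2.2, 0.020)
print(f"{r:.0f} ohms")  # 140 ohms, matching the arithmetic above
```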
[quote author=Nick Gammon link=msg=2052442 date=1421792790]
As for what KenF really meant, perhaps we'll let him speak for himself, but possibly the UK version is not totally unlikely.
[/quote]
Yep, I think he's jerking our chain. I find it hard to believe that he can get through this thread and still have such a deluded conception of how this stuff works.
Just in case I'm wrong, here's a little picture I knocked up in Paint. It shows the typical behaviour of a resistor across a range of voltages, compared with the behaviour of a typical diode (which includes the run-of-the-mill LED).
Maybe this will make the point. If you don't give the diode enough voltage, it won't conduct much at all. There's a fine line between barely conducting at all and causing something to go pop (due to too much current being conducted). Using the qualities of the LED itself to limit the current it draws is just impractical.
This is the point that everyone has been trying to make for 4 pages. Diodes are NOT linear in their response to applied voltage.
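The non-linearity everyone has been describing can be seen numerically with the Shockley diode equation, I = Is * (exp(V / (n * Vt)) - 1). The saturation current and ideality factor below are illustrative assumptions for a generic diode, not values for any particular LED.

```python
import math

I_S = 1e-12    # saturation current in amps (assumed)
N = 2.0        # ideality factor (assumed)
V_T = 0.02585  # thermal voltage at room temperature, volts

def diode_current(v):
    """Shockley diode equation: exponential, nothing like a resistor."""
    return I_S * (math.exp(v / (N * V_T)) - 1.0)

# Each extra 0.3 V multiplies the current enormously, which is the
# steep curve in the sketch -- not the straight line of a resistor:
for v in (1.2, 1.5, 1.8):
    print(f"{v:.1f} V -> {diode_current(v):.3e} A")
```

A resistor's current would merely scale linearly with voltage over the same range; here each step up in voltage multiplies the current by orders of magnitude, which is why there is no safe "exact voltage" to dial in.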
That current curve is so steep after the turn-on voltage that I think it would be very difficult, if not impossible, to know EXACTLY what voltage would illuminate the LED sufficiently without burning it up in the absence of a current-limiting resistor.
That being said, and also noting that the OP never acknowledged the responses from everyone, all in all I think this thread actually turned out to be very educational for other readers. To say it should be a "sticky" might be a bit of a stretch, but it addresses a question that has probably puzzled more than one newbie, and as such I think the comments in this thread would help noobs begin to understand semiconductors. So it wasn't a waste of time; I think others can benefit from it.
Now an LED is a device that ideally has no current flowing through it (it looks like an infinite resistance) when the voltage across it is below its "turn-on voltage". When the voltage across it reaches this point, it suddenly has no resistance and looks like a short circuit. So how do we deal with that using Ohm's law?
Well, we can't. So instead we say that if we have a resistor and an LED in series, the same current flows through both devices. The voltage across the resistor will be proportional to the current through it, AND the current through the LED is the same as the current through the resistor. We know the LED will have its turn-on voltage across it, so we can express Ohm's law for the resistor as:

I = (Vcc - Vf) / R
I would vote for a "sticky". It's a bit tedious overall so its exact value for a "n00b" could be questioned, but it's a ripping yarn, the topic is perfect and the appropriate explanations are there!