 # Resistor + LED...

A single resistor decreases the current but not the voltage, doesn't it?
=> So when I connect a 9V battery to my LED (which drops 3.5V) through a resistor, I only get the current decreased but not the voltage?

Here comes my main question:
-When does an LED burn/melt/stop emitting?
A) When the voltage is too high
OR
B) When the current is too high

And the last question: Is the voltage or current crucial for the LED?

Thanks to everyone who viewed this topic, and many thanks to those who comment here.

First you need a resistor to put a limit on the current for the LED; it needs 20mA to work. But first you need to reduce that voltage. The cheapest way is with a resistor divider (google it, it's easy, just 2 or 3 operations and you will have your resistor values). After you have 3.5V with the voltage divider, calculate the resistor for the current with Ohm's law: 3.5V at 20mA is R = 3.5V / 20mA. Usually that doesn't give a commercial value; you can use the next lower commercial resistor. Connect it in series and your LED will work fine.
If you put more voltage or more current through it, you will burn your LED.

Current is like water flow.
Voltage is like pressure drop.
Resistance is like a small pipe.

The current is the same everywhere in a loop. The pressure drop distributes over the components in the loop.

In your LED example, the resistor will limit the current, and this current flows through both the LED and the resistor. A voltage will also drop across the resistor. The remaining voltage drops across the LED, so the LED actually sees less voltage.

"The remaining voltage drops across the LED, so the LED actually sees less voltage."

That's thinking of it backwards. The supply minus the LED voltage is what gets dropped by the resistor once the current flow is enough to turn on the LED.
(Vsupply - Vled)/Resistor = resulting current
OR
(Vsupply - Vled)/current desired = resistor value needed
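The two formulas above can be sketched in a few lines of Python. The 9V supply and 3.5V LED are the example values from the original question; real forward voltages vary by LED color and part.

```python
# Example values from the thread: a 9 V battery and an LED with a
# 3.5 V forward voltage (typical for white/blue, but check the datasheet).
V_SUPPLY = 9.0   # volts
V_LED = 3.5      # volts dropped across the LED while conducting

def led_current(resistor_ohms):
    """Current for a known resistor: (Vsupply - Vled) / R."""
    return (V_SUPPLY - V_LED) / resistor_ohms

def resistor_for(current_amps):
    """Resistor needed for a desired current: (Vsupply - Vled) / I."""
    return (V_SUPPLY - V_LED) / current_amps

print(led_current(275))      # 0.02 A, i.e. 20 mA
print(resistor_for(0.020))   # ~275 ohms
```

Note that both functions are the same equation rearranged: the resistor always drops whatever the LED does not.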

If there is a transistor in the loop as well, that will also drop some voltage.
The equation then becomes this for an NPN transistor, where Vce can be looked up:
(Vsupply - Vled - Vtransistor)/current desired = resistor value needed

Or this for a MOSFET which has an Rds when turned on:
(Vsupply - Vled)/(Resistor + Rds) = resulting current
or
(Vsupply - Vled)/current desired - Rds = resistor value needed
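The transistor variants can be sketched the same way. The Vce(sat) and Rds(on) values below are illustrative assumptions, not from any particular part; look them up in the datasheet of the transistor you actually use.

```python
# Same 9 V / 3.5 V example as before; 0.2 V and 0.1 ohm are made-up
# but plausible datasheet values for a small NPN and a logic-level MOSFET.
V_SUPPLY = 9.0
V_LED = 3.5

def resistor_with_npn(current_amps, vce_sat=0.2):
    """NPN in the loop: (Vsupply - Vled - Vce) / I."""
    return (V_SUPPLY - V_LED - vce_sat) / current_amps

def resistor_with_mosfet(current_amps, rds_on=0.1):
    """MOSFET in the loop: (Vsupply - Vled) / I minus its on-resistance."""
    return (V_SUPPLY - V_LED) / current_amps - rds_on

print(resistor_with_npn(0.020))     # ~265 ohms
print(resistor_with_mosfet(0.020))  # ~274.9 ohms
```

For small LED currents the MOSFET's Rds(on) barely matters, which is why it is often ignored in practice.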

“And the last question: Is the voltage or current crucial for the LED?”
Current. The voltage across the LED is what it is, the rest of the voltage is dropped by the resistor.
Too much supply voltage and too low of a current limit resistor will let too much current flow, burning out the LED.

A single resistor decreases the current but not the voltage, doesn't it?

Let me stop you right there. No.

You are conflating two ideas. A single resistor with no current flowing will not decrease the voltage, but as there is no current, it will not decrease the current either.

A resistor with current flowing through it will develop a voltage across it according to Ohm's law.

-When does an LED burn/melt/stop emitting?
A) When the voltage is too high
OR
B) When the current is too high

Yes, both. They are like two sides of the same coin; you cannot separate them.

israldo69:
it needs 20mA to work
.... after you have 3.5V with the voltage divider,
... you can use the next lower commercial resistor

A number of errors in there:

LEDs do not "need" 20mA to work - 20mA is about the maximum they can accept. I regularly limit the current to below 10mA on modern LEDs: even 5mA can be quite bright.

You do not need a divider: just a series resistor.

You should use the next highest value resistor, not the lowest.
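The "next highest value" rule can be sketched like this, rounding up within the standard E12 (±10%) series so the resulting current can only come out below the target, never above it. The 275 ohm figure is the ideal value from the 5.5V / 20mA example earlier in the thread.

```python
import math

# Standard E12 (10%) resistor series mantissas; multiply by a power of
# ten to get real values (e.g. 330, 3300, 33000 ohms).
E12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]

def next_e12_up(target_ohms):
    """Smallest E12 value >= target, so the current stays at or below plan."""
    exp = math.floor(math.log10(target_ohms))
    candidates = [b * 10 ** e for e in (exp - 1, exp, exp + 1) for b in E12]
    return min(v for v in candidates if v >= target_ohms)

print(next_e12_up(275))  # 330 ohms: current drops slightly below 20 mA
```

Rounding down instead (as suggested earlier) would push the current above the design target, which is the wrong direction for a maximum-rated part.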

ardy_guy:
LEDs do not "need" 20mA to work - 20mA is about the maximum they can accept. I regularly limit the current to below 10mA on modern LEDs: even 5mA can be quite bright.

I recently put a 10k resistor in series with the blue power LED of our media-center PC, so it wouldn't illuminate the whole room. It is still visible. Even if 10V dropped across that resistor (driven by 12V, which I doubt; I guess the LED is driven with 5V, and a forward voltage of 2V, which is not enough for blue), that would be only 1mA.

Ok, thanks to everybody, but I have another question:
Does a resistor ALWAYS set an upper bound on the current? Like, let's say, no matter whether you have 10A or 0.100A at a wire, the resistor connected to it with resistance X ohms will always reduce it to at most 0.020A, is this true?

The only difference would be the power dissipated by the resistor (now I understand why there are resistors with different power ratings like 1/8, 1/4, 1/2 W and so on), which can be calculated with the formula Watts = Volts * Amperes.

JMD1:
Like, let's say, no matter whether you have 10A or 0.100A at a wire, the resistor connected to it with resistance X ohms will always reduce it to at most 0.020A, is this true?

You don't have a certain number of amps "at a wire"... you have volts across it. The current is a result of the volts and resistance. The actual form of Ohm's Law, regardless of which way you rearrange it algebraically, is that the voltage is proportional to the current; the constant of proportionality is the resistance.

"Like, let's say, no matter whether you have 10A or 0.100A at a wire, the resistor connected to it with resistance X ohms will always reduce it to at most 0.020A, is this true?"

No. It's an equation, V = IR; if you know two variables you can solve for the third.
So for example, you have a power source, an LED, and a known resistor. Then (Vsource - Vled)/resistor = current flow.
Or if you want to set a current limit, then (Vsource - Vled)/desired current = resistor needed.
This works for simple, low-power LEDs (like small SMD, 3mm, or 5mm LEDs). For high-power LEDs, a constant-current supply is needed: the LEDs heat up as current flows, their resistance drops, allowing more current to flow and creating more heat, and if not limited by an active controller they will burn up.

The only difference would be the power dissipated by the resistor

No.
The power dissipated in a resistor is the current through it times the voltage across it.

Power = I * V

if you rearrange the formula you can get
Power = I² * R

or
Power = V² / R

But it is all the same formula.
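A quick numeric check that the three forms really are the same formula, using the thread's running example of 5.5V across a 275 ohm resistor:

```python
# 5.5 V is the supply minus the LED forward voltage (9 V - 3.5 V);
# 275 ohms is the resistor sized earlier for 20 mA.
V = 5.5             # volts across the resistor
R = 275.0           # ohms
I = V / R           # Ohm's law gives the current, 0.02 A

p_vi = V * I        # P = I * V
p_i2r = I ** 2 * R  # P = I^2 * R
p_v2r = V ** 2 / R  # P = V^2 / R

print(p_vi, p_i2r, p_v2r)  # all ~0.11 W
```

At about 0.11 W dissipated, a common 1/4 W resistor is comfortable, while a 1/8 W part would be running near its rating.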


How can I place a resistor so it decreases only the current but not the voltage?

An LED has a fixed voltage drop when it's conducting. The remaining voltage is across the resistor.

JMD1:
How can I place a resistor so it decreases only the current but not the voltage?

You can't

V = IR, which will always hold.

--
Mark