Like Don said, an LED's current is not linear with voltage, so you need to resize the resistor. Just think of your LED as a part that needs a certain voltage drop and current to work properly: 2.4 V at 50 mA. You can imagine it as a resistor: Rled = 2.4 V / 50 mA = 48 Ω. R1 will then be (5 V − 2.4 V) / 50 mA = 2.6 V / 50 mA = 52 Ω.
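For anyone following along, here's the arithmetic from that post as a quick sketch (the 5 V supply, 2.4 V forward drop, and 50 mA target are the values from this thread):

```python
# Sizing the series resistor for an LED rated 2.4 V at 50 mA on a 5 V supply.
supply_v = 5.0    # supply voltage, volts
led_vf = 2.4      # LED forward voltage, volts
led_i = 0.050     # desired LED current, amps

# Treat the LED as if it were a resistor (just for intuition):
r_led_equiv = led_vf / led_i              # 2.4 / 0.05 = 48 ohms

# The series resistor drops the rest of the supply at the target current:
r1 = (supply_v - led_vf) / led_i          # 2.6 / 0.05 = 52 ohms

print(round(r_led_equiv), round(r1))
```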
Maybe I misunderstand the whole problem.
In this case the voltmeter is connected across the LED and the LED is forward biased. Therefore it is reading the forward voltage drop of the diode which happens to be 2.4 volts.
You would know if you think "across" instead of "at". If you put the voltmeter across the resistor, you get the voltage drop of the resistor. If you put the voltmeter across the LED, you get the forward voltage of the LED (assuming it is connected properly). If you put the voltmeter across the supply, you get the supply voltage. Those are your only three choices in this circuit.
Isn't what flows through the meter everything but what flows through the component between the probes?
If I were to replace the 5v source with a 9v source, I would not expect to read the same 2.4v.
Now we are running in place. Are you playing with us, or what is going on? If you want 50 mA to go through the LED, of course you change the resistor for 9 volts, right?
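To spell that out: with the same LED (2.4 V at 50 mA) and a 9 V supply, the same formula from earlier in the thread gives a larger resistor. A minimal sketch:

```python
# Resizing the series resistor for a 9 V supply, same LED as before.
supply_v = 9.0
led_vf = 2.4      # LED forward voltage, volts
led_i = 0.050     # desired LED current, amps

r1 = (supply_v - led_vf) / led_i   # (9 - 2.4) / 0.05 = 132 ohms

print(round(r1))
```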
1) If I use the same resistor with a higher voltage, I will put more current through the LED.
2) As I put more current through the LED, the LED's forward voltage increases.
3) If I put my meter probes across the LED, I will get the LED's forward voltage.
4) Therefore if I conducted the above experiment, the value I read with the meter would increase, just as I would expect it to if I were reading supply voltage minus forward voltage, but it would not increase nearly so much as I would expect in that case, because what I'm actually reading is just forward voltage, and that will have increased only slightly.
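The steps above can be checked numerically with an idealized Shockley-diode model. The parameters here (saturation current and n·Vt) are made up, chosen only so the forward voltage lands near 2.4 V at 50 mA; a real LED datasheet would give different numbers. The point is that going from 5 V to 9 V with the same resistor roughly doubles the current but moves the forward voltage only a little:

```python
import math

# Assumed (illustrative) diode parameters, not from any datasheet:
IS = 1.9e-12    # saturation current, amps (chosen so Vf ~ 2.4 V at 50 mA)
N_VT = 0.1      # emission coefficient times thermal voltage, volts
R = 52.0        # series resistor from the original 5 V design

def solve(vs, r, iters=200):
    """Fixed-point solve of i = (vs - vf(i)) / r for the series circuit."""
    i = 0.01  # initial guess, amps
    for _ in range(iters):
        vf = N_VT * math.log(i / IS + 1.0)   # Shockley: Vf grows with log(I)
        i = (vs - vf) / r                    # Ohm's law across the resistor
    return i, N_VT * math.log(i / IS + 1.0)

i5, vf5 = solve(5.0, R)   # original supply: ~50 mA, Vf ~ 2.40 V
i9, vf9 = solve(9.0, R)   # higher supply: current jumps, Vf barely moves
```

With these assumed parameters the current rises from roughly 50 mA to well over 100 mA, while the forward voltage climbs by under a tenth of a volt, which is exactly why the meter reading across the LED changes only slightly.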