Your problem is right here. The 50mA value is the 'Absolute Maximum' rating. This is a value that you never want to exceed, not the value that you design for. Typically you operate at somewhere around half the absolute maximum rating.
I'm aware of the need to run below the Absolute Maximum rating. The reason I'm quoting the 50mA over and over is because that's also the listed test condition for the LED. I'm not sure why the manufacturer did that, other than to pad their stats, if they don't expect the LED to be run at 50mA. Most seem to list test-condition currents around 66% lower than the Absolute Maximum.
At 25mA your LED should have a (nominal) forward voltage of around 3.75 volts so your resistor should be around 50 ohms.
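That resistor value is just Ohm's law applied to whatever voltage is left over after the LED's forward drop. A minimal sketch, using the 5v supply, 3.75v nominal forward voltage, and 25mA target from above:

```python
def series_resistor(v_supply, v_forward, i_target):
    """Ohm's law on the voltage the resistor must drop: R = (Vs - Vf) / I."""
    return (v_supply - v_forward) / i_target

# 5 V supply, 3.75 V forward drop, 25 mA target -> roughly 50 ohms
print(series_resistor(5.0, 3.75, 0.025))
```

In practice you'd round to the nearest standard value (e.g. 51 ohms) and accept a slightly lower current.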
You have correctly surmised that you must settle for a lower current in order to operate such an LED from a 5v supply.
But they will work if you lower the current.
Thank you. I've asked about that several times, but this is the first straight answer I've gotten.
Forward voltage is the voltage drop across an LED. In other words, if you put an LED in a circuit with a 5v source, and its forward voltage is 2.4v, then you will read 5v-2.4v, ie 2.6v at the anode of the LED. (I think!)
This is ambiguous because we don't know where the resistor is, and it is wrong regardless of how you connect the circuit. You really measure voltage between two points, not 'at' a point. When you express the voltage 'at' a certain point, the other point is assumed, and the assumption is typically some common point, frequently referred to as 'ground'.

So in this case, if you have the resistor connected to the + side of the battery and the LED between the resistor and the - side of the battery (ground), then the voltage 'at' the anode is 2.4v. If you have the LED connected to the + side of the battery and the resistor between the LED and the - side of the battery (ground), then the voltage 'at' the anode is 5v (unless you have the LED in backwards).
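Both cases can be checked by walking the chain up from the - terminal (ground) and adding each element's drop. A quick sketch of that arithmetic, assuming a 5v supply and an LED that drops its nominal 2.4v (so the resistor takes up the remaining 2.6v):

```python
def nodes_from_ground(drops):
    """Voltage at the top of each series element, walking up from the
    battery's - terminal (ground). drops[i] is the voltage across the
    i-th element, ground end first."""
    v, nodes = 0.0, []
    for d in drops:
        v += d
        nodes.append(v)
    return nodes

V_SUPPLY, V_LED = 5.0, 2.4
v_res = V_SUPPLY - V_LED  # the resistor drops the leftover ~2.6 V

# (-) -> resistor -> LED -> (+): the anode is the last node, at the full 5 V
print(nodes_from_ground([v_res, V_LED]))

# (-) -> LED -> resistor -> (+): the anode is the first node, at 2.4 V
print(nodes_from_ground([V_LED, v_res]))
```

Either way the elements drop the same voltages; only the node you happen to probe changes.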
So, you're saying that if I have:
(-) --> resistor --> -led+ --> (+)
And I touch my probes to (-) and the + side of the led, I'll get 5v... which makes sense, since the + side of the led is also connected directly to (+), and the potential difference between (-) and (+) with nothing else between would be 5v.
But if I have:
(-) --> -led+ --> resistor --> (+)
And I touch my probes to (-) and to the + side of the led, I'll get 2.4v... which I can only assume is because of the resistor being there. But I'm not sure I understand how that works.
Lastly, to take it a step further, if I have this setup:
(-) --> -led1+ --> -led2+ --> resistor --> (+)
And I touched a probe to (-) and to led1's + terminal, what would I read?
And what would I read if I touched a probe to (-) and led2's + terminal?
Also, why am I not reading 2.6v at those points if my LED's forward voltage is 2.4v? I thought you could supply two LEDs with a 2.4v forward voltage off a 5v supply and have 0.2v left over that the resistor needs to dissipate?
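Taking the two-LED chain above at face value (each LED dropping a nominal 2.4v from a 5v supply, which is an assumption since the real drop varies with current), the node voltages add up one drop at a time from ground. A quick sketch of that arithmetic:

```python
V_SUPPLY, V_LED = 5.0, 2.4

# (-) -> -led1+ -> -led2+ -> resistor -> (+)
# Each node voltage (measured against the - terminal) is the sum of
# the drops between it and ground.
v_led1_anode = V_LED                  # 2.4 V: one LED-drop above ground
v_led2_anode = V_LED + V_LED          # 4.8 V: two LED-drops above ground
v_resistor = V_SUPPLY - v_led2_anode  # ~0.2 V left for the resistor

print(v_led1_anode, v_led2_anode, v_resistor)
```

So probing led1's + terminal against (-) reads one forward drop, probing led2's + terminal reads two forward drops, and the resistor absorbs whatever the supply has left over.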