> ...let's say the resistor has a resistance of 50 Ohms (I have no idea what the typical LED resistance is)

The resistance of an LED is

**non-constant and nonlinear. At low voltages, the resistance is relatively high, and at high voltages, it's very low.** (A diode or LED basically "turns off" when the voltage across it is low, and it "turns on" when the voltage across it is high enough.)

Ohm's Law is a physical law. It's ALWAYS TRUE* and

**you CAN calculate the resistance under the particular conditions** if you know the EXACT voltage/current characteristics. If the LED is rated at 20mA at 2V, that's 2V / 0.020A = 100 Ohms.
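As a quick sanity check, that arithmetic can be sketched in a few lines of Python (the 2V / 20mA rating is the example figure from above; the function name is just for illustration):

```python
# Ohm's Law at ONE specific operating point: R = V / I.
# This gives the LED's "effective" resistance only at that point;
# at other voltages the ratio will be completely different.

def effective_resistance(voltage_v, current_a):
    return voltage_v / current_a

r_led = effective_resistance(2.0, 0.020)
print(r_led)  # ~100 Ohms, valid at the rated operating point only
```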

But, those voltage/current characteristics vary from part-to-part and with temperature. So if you apply 2V, you

*might* get a lot more than 20mA (the resistance might be lower than 100 Ohms) and the LED might burn up, or you might get a lot less than 20mA, and the LED will be too dim. That's why we use something else (typically a resistor) to control the

*current*, rather than applying a controlled voltage.
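A minimal sketch of that resistor-based approach, assuming an illustrative 5V supply and the 2V / 20mA LED from above (both hypothetical example values):

```python
# The series resistor drops the difference between the supply voltage and
# the LED's forward voltage, so the RESISTOR (not the LED) sets the current:
#     R = (Vsupply - Vforward) / Itarget

def series_resistor(v_supply, v_forward, i_target):
    return (v_supply - v_forward) / i_target

r = series_resistor(5.0, 2.0, 0.020)
print(r)  # ~150 Ohms; in practice you'd pick the nearest standard value
```

Even if the LED's forward voltage drifts a little from part to part or with temperature, the resistor dominates the total, so the current stays close to the target.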

* In AC circuits with inductors and/or capacitors, there can be phase differences between the current and the voltage. So, if you measure the voltage and current it can

*seem* like Ohm's Law isn't true. But, if you measure the voltage and current at any instant in time, you'll find that the law holds true.
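The footnote's point can be sketched numerically. In a series RC circuit (all values below are illustrative assumptions), the ratio of source voltage to current changes from instant to instant because of the phase shift, so the circuit as a whole *seems* to defy Ohm's Law, while the resistor's own voltage is i(t)·R at every single instant:

```python
import math

# Illustrative series RC circuit driven by a 60 Hz sine source.
R = 100.0       # resistor, Ohms
C = 10e-6       # capacitor, Farads
f = 60.0        # source frequency, Hz
V_peak = 10.0   # source amplitude, Volts

omega = 2 * math.pi * f
Xc = 1 / (omega * C)        # capacitive reactance
Z = math.hypot(R, Xc)       # impedance magnitude of the series pair
phi = math.atan2(Xc, R)     # current LEADS the source voltage by phi

src_over_i = []
for k in range(1, 5):
    t = k * 1e-3                                  # sample a few instants
    v_src = V_peak * math.sin(omega * t)          # source voltage
    i = (V_peak / Z) * math.sin(omega * t + phi)  # circuit current
    src_over_i.append(v_src / i)   # NOT constant -- the phase shift at work
    v_r = i * R                    # resistor voltage: Ohm's Law, instant by instant

print(src_over_i)  # the "apparent resistance" of the whole circuit varies wildly
```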