Many LEDs have a max continuous current rating of 20mA.

If the source voltage is 5V, and Vf of the LED is 2.2V, then the series resistor value is:

(Vs - Vf)/current = resistor

(5V - 2.2V)/.02A = 140 ohm.

The power dissipated in the resistor is P = IV. Since V = IR, substitute in: P = I*I*R, so .02A * .02A * 140 ohm = .056W, or 56mW.

Alternatively, Vr = Vs - Vf = 5V - 2.2V = 2.8V, and .02A * 2.8V = 56mW.
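If you want to script this, here's the same calculation in Python (a quick sketch; the variable names are my own):

```python
# Series resistor for an LED: R = (Vs - Vf) / I
Vs = 5.0   # supply voltage (V)
Vf = 2.2   # LED forward voltage drop (V)
I = 0.020  # target LED current (A)

R = (Vs - Vf) / I       # 140 ohms
P = I * I * R           # power in the resistor: 0.056 W (56 mW)
P_alt = (Vs - Vf) * I   # same result via P = V*I
```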

Say you only had 220 ohm resistors available; how much current would flow?

(Vs - Vf)/resistor = current

(5V - 2.2V)/220 ohm = .0127A, or 12.7mA

And .0127A * .0127A * 220 ohm ≈ 35.6mW
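The 220 ohm case works the same way in code, just solved for current instead of resistance (again a sketch, my own variable names):

```python
# Current through an LED with a fixed series resistor: I = (Vs - Vf) / R
Vs = 5.0   # supply voltage (V)
Vf = 2.2   # LED forward voltage drop (V)
R = 220.0  # resistor on hand (ohms)

I = (Vs - Vf) / R  # ~0.0127 A (12.7 mA)
P = I * I * R      # ~0.0356 W (35.6 mW) dissipated in the resistor
```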

Does that help?