I have a lot to learn. Here is your total beginner question of the day: is the current what the LED is rated for, or what flows out of the battery? I don't know where to get the numbers for the equation W = V * C.
Many LEDs have a max continuous current rating of 20 mA. If the source voltage is 5 V and the Vf of the LED is 2.2 V, then the resistor to set that up is:

(Vs - Vf) / current = resistor
(5 V - 2.2 V) / 0.02 A = 140 ohm

The power dissipated in the resistor is P = IV. Since V = IR, substitute in: P = I * I * R, so 0.02 A * 0.02 A * 140 ohm = 0.056 W, or 56 mW. Alternately, Vr = Vs - Vf = 5 V - 2.2 V = 2.8 V, and 0.02 A * 2.8 V = 56 mW.

Say you only had 220 ohm resistors available; how much current would flow?

(Vs - Vf) / resistor = current
(5 V - 2.2 V) / 220 ohm = 0.0127 A, or 12.7 mA

And 0.0127 A * 2.8 V ≈ 35.6 mW.

Does that help?
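If it helps to see that arithmetic as code, here is a minimal Python sketch of the two calculations above. The 5 V supply, 2.2 V forward drop, and 20 mA target are just the example numbers from this post, not universal values; check your LED's datasheet.

    # Resistor sizing for an LED, using the formulas above.
    # Example values only: 5 V supply, 2.2 V forward drop, 20 mA target.

    def led_resistor(vs, vf, i_target):
        """Resistor (ohms) and its dissipation (watts) for a target LED current."""
        r = (vs - vf) / i_target   # R = (Vs - Vf) / I
        p = i_target ** 2 * r      # P = I^2 * R, same as I * (Vs - Vf)
        return r, p

    def current_through(vs, vf, r):
        """Current (amps) and resistor dissipation (watts) for a given resistor."""
        i = (vs - vf) / r          # I = (Vs - Vf) / R
        return i, i ** 2 * r

    r, p = led_resistor(5.0, 2.2, 0.020)
    print(f"target 20 mA: R = {r:.0f} ohm, P = {p * 1000:.0f} mW")       # 140 ohm, 56 mW

    i, p = current_through(5.0, 2.2, 220)
    print(f"with 220 ohm: I = {i * 1000:.1f} mA, P = {p * 1000:.1f} mW") # 12.7 mA, 35.6 mW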
For normal 20 mA LEDs, get the lowest wattage, and hence the cheapest, you can find; that is normally quarter or eighth watt. The numbers to use are the current through the resistor, which in this case is the same as through the LED, and the voltage across the resistor, which is the supply voltage minus the voltage across the LED.
Well now, the formula for power is V²/R. Considering resistors placed directly across 5 V, V² is 25, so a 100 ohm resistor would dissipate ¼ W, and a 220 ohm resistor about 114 mW, roughly a ninth of a watt. So a quarter-watt rated 220 ohm resistor is safe in any circuit powered at 5 V; and in the LED circuit above, where the resistor only sees 2.8 V, dissipation is 2.8² / 220 ≈ 36 mW, well within even a tenth-watt rating.
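And a quick Python check of that worst-case V²/R reasoning, assuming the resistor could somehow see the full 5 V across it:

    # Worst case: the full 5 V supply across the resistor, P = V^2 / R.
    SUPPLY = 5.0  # volts
    for r in (100, 220):
        p = SUPPLY ** 2 / r
        print(f"{r} ohm across {SUPPLY:.0f} V dissipates {p * 1000:.0f} mW")
    # 100 ohm -> 250 mW (a quarter watt)
    # 220 ohm -> 114 mW (over a tenth watt, just under an eighth watt)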