5 LilyPad LEDs in series from a 3 V source

I have 5 LilyPad LEDs.

They are rated at a forward voltage of 2 V, so I guess this is the voltage required by the LED to light. I_F = 20 mA, and they're rated at 78 mW. I'm using a voltage drop of 1.8 V, and I'm getting that from a 3 V source it would require a 60 Ω resistor. I'm using this web calculator.

If I use 5 V I would need 160 Ω. Now the thing is, I want to put about 5 of these in series. That means the resistance in the circuit would be greater due to the increased number of LEDs, so I should be able to get away with using no resistor at some point, right?
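The numbers from the calculator can be reproduced with the standard current-limiting resistor formula (a quick sketch, using the 1.8 V drop and 20 mA quoted above):

```python
def series_resistor(v_source, v_led, i_led):
    """Current-limiting resistor: R = (Vs - Vled) / Iled."""
    return (v_source - v_led) / i_led

print(round(series_resistor(3.0, 1.8, 0.020)))  # 60 ohms from a 3 V source
print(round(series_resistor(5.0, 1.8, 0.020)))  # 160 ohms from a 5 V source
```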

So if:

R = (Vs - Vled) / Iled

this means that R_total would be (Vs - V_total leds) / I_total leds. But this would mean I'm doing the same thing to the numerator as to the denominator, so the result would be the same:

5 V - (3 V * 5) = 5 V - 15 V = -10 V, all divided by I_total leds = 20 mA * 5 = 100 mA. But that -10 V obviously seems wrong. Please help me understand what I'm doing wrong.

I'm guessing I would need more voltage, because if the forward voltage is 2 V for each LED and I have 5, that means I need at least 10 V?

Would you accept my saying, "This is not a good idea" without going into a long explanation as to why that's so?

LEDs have suicidal tendencies. Allow them to draw more current and they will keep drawing more and more until they pop. It's called thermal runaway. You have to limit the current, either with a resistor or with a constant-current circuit.

The formula you used above is Ohm's Law. When Ohm formulated that law there were no semiconductors; he thought it applied to all conductors, but now we know better. So why do we use the law to calculate a resistance for LEDs? We don't. When you use Ohm's Law like that, you are applying it to the series resistor, not the LED itself. Resistors obey the law, and because the resistor is in series with the LED, it limits the current for both.

When things are connected in series, the same current runs through all of them. So it's 20 mA, not 20 mA * 5.

And yes, you would need a voltage at least great enough to allow 2 V per LED. Voltages in series do add up, so that's 10 V minimum. And because it's a really bad idea not to use a series resistor, you'd probably want at least 12 V so you can drop 2 V across the resistor to set the current.
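Putting that together for the 5-LED string (a sketch; the 12 V supply is a hypothetical value chosen to leave headroom above the 10 V minimum):

```python
n_leds = 5
v_forward = 2.0   # volts dropped per LED
i_led = 0.020     # the same 20 mA flows through every element in series
v_supply = 12.0   # hypothetical supply with headroom above the 10 V minimum

v_string = n_leds * v_forward       # 10 V across the LED string
r = (v_supply - v_string) / i_led   # resistor drops the remaining 2 V
print(round(r))  # 100 ohms
```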

Best practice is probably to run them in parallel with a resistor for each LED, but a series connection with a single resistor can work for small, low-current LEDs.