I have previously used the "(supply voltage - forward voltage) / LED current" equation to calculate resistor values, but the "(supply voltage - forward voltage)" part seems to suggest that if the supply voltage equals the forward voltage, then no resistor is needed. Is this right?
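To make the arithmetic concrete, here is a minimal sketch of that formula, using illustrative values (a 5 V supply, a 2 V forward drop, 15 mA target current — none of these come from the question itself):

```python
def led_resistor(v_supply, v_forward, i_led):
    """Series resistor value from R = (Vs - Vf) / I."""
    if v_supply <= v_forward:
        # No headroom left to drop across a resistor,
        # so the formula gives zero (or a negative value).
        return 0.0
    return (v_supply - v_forward) / i_led

r = led_resistor(5.0, 2.0, 0.015)
print(f"{r:.0f} ohms")  # (5.0 - 2.0) / 0.015 = 200 ohms
```

Note what the guard clause reflects: when the supply equals the forward voltage, the formula leaves no voltage for the resistor to drop, which is exactly the "no resistor needed" reading of the equation.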
But what happens if we go the other way? If the supply voltage is less than the forward voltage, what happens to the LED and its current consumption?
What's the problem with using a resistor in this case?
If you put the LEDs in series, the voltage across each LED is 3 V, and the power supply is fixed and regulated, you won't have a problem: fixed voltage == fixed current.
Not having a resistor is only dangerous if the supply voltage might change.
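The reason supply-voltage changes matter so much is that an LED's current depends exponentially on the voltage across it. Here is a rough sketch using the ideal Shockley diode equation with assumed parameters (the saturation current and ideality factor are made-up illustrative numbers, not datasheet values for any real LED):

```python
import math

IS = 2e-28   # saturation current (A) -- assumed, purely illustrative
N = 2.0      # ideality factor -- assumed
VT = 0.025   # thermal voltage at room temperature (V)

def led_current(v):
    """Ideal Shockley diode current: I = Is * (exp(V / (n*Vt)) - 1)."""
    return IS * (math.exp(v / (N * VT)) - 1)

# With these parameters, a +/- 0.1 V swing around a 3 V operating
# point changes the current by roughly a factor of exp(2) ~ 7.4x.
for v in (2.9, 3.0, 3.1):
    print(f"{v} V -> {led_current(v) * 1000:.1f} mA")
```

This also answers the "supply below forward voltage" case: the exponential collapses quickly, so a little below Vf the LED still conducts, just at a current too small to light it visibly. It is why a small series resistor is cheap insurance: it absorbs supply variation that would otherwise translate directly into large current swings.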
Again, I agree, a small resistor won't hurt.
I'm guessing that running them below their maximum rated current will give them an easier, longer life.