LED Resistor Question

I am planning to power an LED from two AA batteries, which should put out 2.8-3.2V depending on how charged they are. My LED has a forward voltage of 3.0-3.4V and a rated current of 80mA. I did the math to figure out what resistor I need and got a very weird answer.
The equation for finding the proper resistance is (Source Voltage - LED Voltage) / LED Current = Resistor Ohms
(3.2 - 3.4) / 0.08 = -2.5 Ohms
I obviously can't get a negative resistor, so should I just not use one?
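That arithmetic, sketched out as a quick sanity check (just the formula above, nothing LED-specific):

```python
def resistor_ohms(v_source, v_led, i_led):
    """(Source Voltage - LED Voltage) / LED Current, per the formula above."""
    return (v_source - v_led) / i_led

# Worst case from the question: lowest-ish battery voltage, highest Vf
print(resistor_ohms(3.2, 3.4, 0.08))  # about -2.5 -- hence the question
```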

The problem is that if your battery voltage is 2.8V and your LED forward voltage is 3.4V, you "can't get there from here" -- where is the extra 0.6V going to come from?

In practice, the LED forward voltage is specified at a certain "bright" current (usually 20mA). So while it may well be 3.4V at 20mA, there will still be current flowing at 2.8V, just at much lower brightness, even with no resistor. How much current flows depends on the LED, and on the particular LED you have in your hand (which might be different from the one that was next to it in the bag).
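One way to see how steeply the current falls off below the rated forward voltage is the Shockley diode model. This is only a rough sketch: the saturation behavior is folded into a reference point (80mA at 3.4V, from the question), and the n*Vt slope factor is an assumed illustrative value, not something from a datasheet.

```python
import math

N_VT = 0.10  # assumed n * Vt in volts -- illustrative, varies per LED

def diode_current(v, i_ref=0.08, v_ref=3.4, n_vt=N_VT):
    """Exponential diode model, scaled so i_ref flows at v_ref."""
    return i_ref * math.exp((v - v_ref) / n_vt)

print(diode_current(3.4))  # 0.08 A at the rated point
print(diode_current(2.8))  # far less: scaled by e**(-0.6/0.1), under 1 mA
```

The exact numbers depend heavily on the LED, but the shape of the curve is why the LED still lights (dimly) even when the battery voltage is below the rated Vf.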

In these types of situations I just put in a nominal small-value resistor (like 10 ohms) to handle the worst case of 3.2V battery voltage and 3.0V forward voltage. It has minimal effect at the other extreme (where the battery voltage is less than the forward voltage).
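That worst case works out as follows (a simple linear sketch that treats the LED as a fixed voltage drop and ignores the exponential curve, so it understates the current a bit near the knee):

```python
def led_current(v_batt, v_f, r):
    """Approximate current with a fixed-Vf LED model and series resistor r."""
    return max(v_batt - v_f, 0.0) / r

# Worst case: fresh batteries (3.2 V) meeting a low-Vf part (3.0 V)
print(led_current(3.2, 3.0, 10))  # about 0.02 A -- well under the 80 mA rating
# Other extreme: battery below the forward voltage, resistor barely matters
print(led_current(2.8, 3.4, 10))  # 0.0 in this simple model
```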

--
The Rugged Audio Shield: Line In, Mic In, Headphone Out, microSD socket, potentiometer, play/record WAV files