You're both right. You could use a resistor if you are willing to sacrifice efficiency and full brightness -- otherwise, use an active LED driver, as suggested by CrossRoads. But the intermittent duty cycle actually makes using a resistor more feasible.
The trick to getting a resistor to "work" is to select a resistance value that behaves enough like a current regulator. An ideal constant-current source/sink is, essentially, an infinite resistance [R = ΔV/ΔI; if ΔI = 0, then R = ∞]. But a large[ish] resistance will suffice -- if you can find the correct value.
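To make that concrete, here is a quick sketch (with made-up example numbers, not values from this thread) of why a bigger series resistance behaves more like a current regulator: since I = (Vsupply - VF) / R, the current's sensitivity to a drift in VF is 1/R, so a larger R with a correspondingly higher supply voltage gives a much smaller current swing when the LED heats up.

```python
# Example only: compare LED current drift for a small vs. large series
# resistor at the same nominal current. The supply voltages, resistor
# values, and forward-voltage drift below are hypothetical.

def led_current(v_supply, v_f, r):
    """Series-resistor LED current in amps: I = (Vsupply - VF) / R."""
    return (v_supply - v_f) / r

v_f_cold, v_f_hot = 3.2, 3.0   # assumed forward-voltage drift as the LED warms

for v_supply, r in [(3.6, 1.0), (6.0, 7.0)]:
    i_cold = led_current(v_supply, v_f_cold, r)
    i_hot = led_current(v_supply, v_f_hot, r)
    drift_pct = (i_hot - i_cold) / i_cold * 100
    print(f"Vsupply={v_supply}V, R={r} ohm: "
          f"{i_cold * 1000:.0f}mA -> {i_hot * 1000:.0f}mA ({drift_pct:+.1f}%)")
```

Both cases start at the same nominal 400mA, but the 7-ohm resistor on a 6V supply holds the current within a few percent, while the 1-ohm resistor on 3.6V lets it jump by half. That's the whole game with picking a resistor here.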
You might be able to do the needed analysis using a datasheet -- if one is available -- and it has the needed data. Otherwise, if you are willing to risk the destruction of one or more of these LEDs, try the following experiment:
1. Equipment needed:
a. Adjustable power supply [PS] with current limiting, capable of supplying enough current [something like 600mA, at least]
b. Current meter that can measure up to around 600mA
c. One of the 3W LEDs
d. Test leads
2. Set the PS to limit current to 500mA, then set its voltage to 1.5V
3. Set the current meter to read up to 500mA
4. With the PS turned off, connect the current meter in series with the positive PS lead, then connect the free current meter lead to the positive side of the LED
5. Connect the negative PS lead to the negative side of the LED
6. Turn on the PS and slowly adjust the voltage up, watching the current meter and the voltage level
7. As soon as the LED starts to go into thermal runaway (i.e. when the current starts going up on its own), note the PS voltage VF and the current reading IF
8. If you didn't catch one or both of the readings at the moment the LED started into thermal runaway, back the PS voltage off to around 1.5V, turn it off, and let the LED cool down -- then, when it's cool, start over at step 6
Use these readings in the following formulas:
R = (VBATT - VF) / (0.8 * IF)
P = (VBATT - VF) * 0.8 * IF * 2
Where:
- VBATT is the maximum possible battery voltage (which will have to be much higher than 3V -- try two lithiums in series for 6V)
- VF is the noted forward voltage on the LED
- IF is the noted forward current
- R is the value of the resistor to put in series with the LED (adjust up to the next higher standard resistor value, then recalculate the power P with that value, etc.)
- P is the power rating of that resistor (adjust this value up to the next available standard resistor power)
- *2 is per a common rule of thumb that suggests doubling the calculated power value. Then select the next available standard power rating -- so, if you calculate 3.4W, use a 5W resistor.
Give that a try. If the LED goes up in smoke, adjust the 0.8 factor lower (say, 0.7) and try again ('cuz ted was wrong).
And then, another crazy idea: if that lithium battery has just the right internal resistance (and this assumes your battery can truly supply the 400-500mA needed), it might be able to limit the current to the LED on its own. To test this, you will need to try connecting the LED directly across the battery (make sure it's fully charged, first). Do this on a non-flammable surface, in a well-ventilated area, and stand well away from the thing -- power it from a distance, have a fire extinguisher at the ready, and if you're a kid, get parental supervision! There's even a chance that thermal runaway won't be a problem: as the impedance of the LED goes down, the rising current draw will pull the battery voltage down, and with luck, that will regulate the current. It's a crap shoot, but who knows. Make sure to film the experiment -- if it explodes, you can put it on YouTube!