Oh yes, the guy I bought them from said they were 0.5 watt. And yes, this is very unstable; I don't know why. They are also very fussy about what current they need; they heat up like crazy even at 3 V sometimes.
Most LEDs are not at all fussy about what current they need, but as long as you keep thinking about supplying 'voltage' to an LED you are doomed to failure. You do NOT apply some specific voltage to an LED to get it to work. The 'forward voltage' specified (or measured) for an LED is the result of somehow getting the correct current to flow through the diode. We typically use a fixed voltage supply and a current-limiting resistor to do this, but the constant-current supply mentioned by Mike is a better choice if you have one.
So to deal with an LED you start with the forward current that you need, typically about half its maximum rated value (let's use 20 mA). Next you estimate what the forward voltage drop will be with that current flowing through the LED. If you have a datasheet for the LED you may be able to get a fairly accurate value; otherwise you take a guess based on your experience or the experience of others, as in reply #2 (I usually use 1.7 V for a red LED). Next you pick a supply voltage, which must be higher than the voltage you just estimated (I'll use 5 V).

Now you can use Ohm's law to determine the required resistance. The voltage across the resistor will be the difference between the supply voltage you decided to use and the voltage you guessed would be across the LED (5 V - 1.7 V = 3.3 V). The current through the resistor will be the same as the current through the LED (20 mA). Ohm's law for the resistor says that R = V/I (R = 3.3/0.020 = 165 ohms). You then pick the closest-value resistor that you happen to have and stick that in your circuit. Most likely the current won't be exactly what you desired and the LED voltage won't be what you guessed, but you won't see any smoke either, and you will see light from the LED.
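If it helps, here is the same arithmetic as a small Python sketch. The numbers are just the example values from above (5 V supply, 1.7 V guessed forward drop, 20 mA target); the function name is mine, not from any library:

```python
def led_resistor(v_supply, v_forward, i_forward):
    """Ideal current-limiting resistance in ohms for an LED.

    v_supply  -- supply voltage in volts
    v_forward -- estimated LED forward voltage drop in volts
    i_forward -- desired LED current in amps
    """
    # Ohm's law applied to the resistor: R = V / I,
    # where V is the voltage left over after the LED's drop.
    return (v_supply - v_forward) / i_forward

# Example from the post: 5 V supply, 1.7 V red LED, 20 mA target
r = led_resistor(5.0, 1.7, 0.020)
print(f"ideal resistance: {r:.0f} ohms")  # about 165 ohms

# With the closest common value you have on hand (say 180 ohms),
# the actual current shifts a little but stays safely below target:
i_actual = (5.0 - 1.7) / 180
print(f"current with 180 ohms: {i_actual * 1000:.1f} mA")
```

As the post says, the real current will drift from the target once you substitute the nearest resistor you actually own, and that's fine.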
Don