LED resistor calculator help

I recently got some high-powered LEDs, but the problem is there's no datasheet whatsoever, so I tested them out. I found that at 3V they take about 45mA and don't require a resistor (LED voltage = power source voltage). Now I want to run 5 of them at 6V. What resistor value would I need? I tried figuring this out with a potentiometer, but I burned it out.

What current they run at depends on how "high powered" your LEDs are. Are they 1W, 3W, 5W, etc.?

I'll assume they are 1W, just as an example.

From past experience, these LEDs operate at 3.7V @ 350mA.

That means they will have to be wired in parallel if you want them running off 6V.

Try playing with an online calculator like this...
(http://led.linear1.org/led.wiz)

(BTW, some other values for power LEDs:
3W: 3.7V @ 700mA
5W: 3.7V @ 1000-1200mA)
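
If you want to see the arithmetic a wizard like that performs, here's a rough Python sketch. The 1W figures (3.7V @ 350mA) and the 6V supply are assumptions carried over from above, not datasheet values:

```python
# Rough sketch of the series-resistor arithmetic an LED wizard performs.
# Assumed values (no datasheet): 1W LED, 3.7V forward drop, 350mA, 6V supply.
supply_v = 6.0      # supply voltage
led_vf   = 3.7      # assumed forward voltage of one 1W LED
led_i    = 0.350    # assumed forward current, in amps

# Wired in parallel, each LED gets its own resistor; the resistor
# drops whatever voltage the LED doesn't.
r = (supply_v - led_vf) / led_i     # Ohm's law: R = V / I
p = (supply_v - led_vf) * led_i     # power the resistor must dissipate

print(f"Resistor: {r:.1f} ohms, dissipating {p:.2f} W per LED")
# -> Resistor: 6.6 ohms, dissipating 0.81 W per LED
```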

Enjoy!

Regards,
Fudge.

I'd check what color they emit as well.

Red usually has a voltage drop of 1.6-2.2 volts; green and blue 3.0-3.6 volts, and white also 3.0-3.6 volts.
3.7 volts will work for a lot of LEDs, but isn't a standard value.
LEDs with an aluminum plate for heat exchange are usually 1 watt or more. Without one, they're often less than 1 watt.

Do you have any pictures of the LEDs?

so I tested them out. I found that at 3V they take about 45mA and don't require a resistor (LED voltage = power source voltage).

You always need some form of current limiting no matter what the voltage. What you found was a very unstable condition.
http://www.thebox.myzen.co.uk/Tutorial/LEDs.html

With high-power LEDs you can't use a resistor; it is not a stable solution. You need a constant-current driver, so don't even try to calculate a resistor value.

Oh yes, the guy I bought them from said they were 0.5 watt. And yes, this is very unstable, I dunno why. Also, they're very fussy about what current they need; they heat up like crazy even at 3V sometimes.

this is very unstable, I dunno why

Because the forward voltage is temperature-dependent, and it also varies with age.

they heat up like crazy even at 3V sometimes.

Yes they do; heat is the product of voltage drop and current. A 0.5W LED is going to need a heat sink of some sort. Many LEDs have these built in, but I see yours do not.
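
As a back-of-the-envelope check of that point, here's a quick sketch using the 3V / 45mA figures measured earlier in the thread (rough measurements, not datasheet values):

```python
# Heat is the product of voltage drop and current: P = V * I.
# 3.0V at 45mA are the OP's rough measurements, not datasheet values.
v_drop = 3.0       # volts across the LED
i_fwd  = 0.045     # amps through it

p = v_drop * i_fwd
print(f"Dissipated power: {p:.3f} W")
# -> 0.135 W, already over a quarter of the claimed 0.5W rating,
#    most of it turning into heat in a tiny package.
```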

You need a constant current supply.

Oh yes, the guy I bought them from said they were 0.5 watt. And yes, this is very unstable, I dunno why. Also, they're very fussy about what current they need; they heat up like crazy even at 3V sometimes.

Most LEDs are not at all fussy about what current they need, but as long as you keep thinking about supplying 'voltage' to an LED you are doomed to failure. You do NOT apply some specific voltage to an LED to get it to work. The 'forward voltage' value specified (or measured) for an LED is the result of somehow getting the correct current to flow through the diode. We typically use a fixed voltage supply and a current-limiting resistor to do this, but the constant-current supply mentioned by Mike is a better choice if you have one.

So to deal with an LED you start with the forward current that you need, typically about half its maximum rated value (let's use 20mA). Next you estimate what the forward voltage drop will be with that current flowing through the LED. If you have a datasheet for the LED you may be able to get a fairly accurate value; otherwise you take a guess based upon your experience or the experience of others, as in reply #2 (I usually use 1.7V for a red LED). Next you pick a supply voltage, which must be higher than the voltage you just estimated (I'll use 5V).

Now you can use Ohm's law to determine the required resistance. The voltage across the resistor will be the difference between the supply voltage you decided to use and the voltage that you guessed would be across the LED (5V - 1.7V = 3.3V). The current through the resistor will be the same as the current through the LED (20mA). Ohm's law for the resistor says that R = V/I (R = 3.3/0.020 = 165 ohms). You then pick the closest value resistor that you happen to have and stick that in your circuit. Most likely the current won't be exactly what you desired and the LED voltage won't be what you guessed, but you won't see any smoke either, and you will see light from the LED.
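
Here is that procedure written out as a small Python sketch. The 20mA, 1.7V, and 5V figures are the worked-example values from the paragraph above, not measurements:

```python
# Series-resistor calculation, following the steps above.
i_fwd    = 0.020    # chosen forward current: about half the max rating (20mA)
v_fwd    = 1.7      # guessed forward drop for a red LED
v_supply = 5.0      # chosen supply voltage; must be higher than v_fwd

v_resistor = v_supply - v_fwd    # the resistor drops what the LED doesn't
r = v_resistor / i_fwd           # Ohm's law: R = V / I

print(f"Calculated value: {r:.0f} ohms")    # -> 165 ohms
# Then pick the closest value you have on hand (e.g. a standard 150 or
# 180 ohm part); the current shifts a little, but there is no smoke.
```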

Don

I tried figuring this out with a potentiometer, but I burned it out.

This will happen whenever you use a potentiometer as a rheostat and don't use a fixed current limiting resistor as well. Your potentiometer had a relatively fixed voltage across it (the supply voltage less the forward voltage of the LED). Now look at Ohm's law for your rheostat (I = V/R). You had a fixed voltage divided by a variable resistance. As the resistance went down the current went up. Each time the resistance was halved the current doubled. What was the current just before the resistance got to zero? Answer: a lot (theoretically almost infinity). In your case the potentiometer burned out before the LED but it could have gone the other way or you may have hit the jackpot and destroyed both of them.
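
To see the runaway numerically, here's a sketch of the rheostat current as the resistance is halved over and over (the 5V supply and 1.7V drop are the example values from earlier, and the 1k starting point is arbitrary):

```python
# Current through a rheostat with a roughly fixed voltage across it:
# I = V / R, so every halving of R doubles I.
v_across = 5.0 - 1.7    # assumed supply voltage minus assumed LED drop
r = 1000.0              # arbitrary starting resistance: 1k

while r >= 1.0:
    i = v_across / r
    print(f"R = {r:7.1f} ohm  ->  I = {i * 1000:7.1f} mA")
    r /= 2              # turn the knob: halve the resistance

# As R approaches zero, I grows without bound -- something burns first.
```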

Don

These LEDs didn't cost you too much, did they? Get new ones that come with specifications. That's what I'd do.

If you trust the guy you bought them from about the 0.5W rating, you could use a variable power supply, a few fixed resistors, and a voltmeter to find out what voltage and current the LED works at for 0.5W. With a series resistor, say 300 ohm, increase the power supply voltage and measure the voltage across the resistor until you get V_led*I_led ~= 0.5W, and stop there. I made my students do this on regular diodes, but they could use a function generator and a two-channel oscilloscope. I wouldn't waste that much time on cheap LEDs.
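
Here's a sketch of the bookkeeping for one step of that ramp. The 300-ohm resistor is from the description above; the 5.0V / 1.8V readings are hypothetical, for illustration only:

```python
# One step of the ramp: read the voltage across the known series
# resistor, derive the LED current and power, stop before 0.5W.
R_SERIES = 300.0    # known series resistor, ohms (from the post above)
P_LIMIT  = 0.5      # claimed maximum LED power, watts

def led_operating_point(v_supply, v_resistor):
    """Return (I_led, V_led, P_led) from a supply setting and a meter reading."""
    i_led = v_resistor / R_SERIES     # same current through resistor and LED
    v_led = v_supply - v_resistor     # the rest of the supply is across the LED
    return i_led, v_led, v_led * i_led

# Hypothetical reading: supply at 5.0V, 1.8V measured across the resistor.
i, v, p = led_operating_point(5.0, 1.8)
print(f"I = {i * 1000:.1f} mA, V_led = {v:.2f} V, P = {p:.3f} W")
# -> I = 6.0 mA, V_led = 3.20 V, P = 0.019 W -- keep ramping, and stop
#    well before P reaches P_LIMIT.
```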

The 0.5W power rating is a maximum allowable value. You wouldn't want to operate at that value.

Don

floresta:
The 0.5W power rating is a maximum allowable value. You wouldn't want to operate at that value.

Don

Yes, yes, my point was also "only ~= 0.5W and stop" (and that's if the OP trusts the 0.5W figure; maybe while ramping the voltage the LED pops at a lower value, and then good luck trying to be a QC/testing engineer) :slight_smile:

You really don't have to 'test' anything. You just keep the current below its maximum rated value and the power will automatically be less than the maximum rated value. The only reason to be at all concerned with the forward voltage is (1) to make sure your supply is greater than this value, and (2) to have a ballpark value from which to start calculating the series resistor.
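
A quick numerical version of that point (the 3.2V forward voltage is an assumption, used only to back out a current from the nominal 0.5W rating):

```python
# If the rating is roughly P_max = V_f * I_max, then running at any
# current below I_max keeps P = V_f * I below P_max automatically.
v_f   = 3.2            # assumed forward voltage for a white LED
i_max = 0.5 / v_f      # current implied by the 0.5W rating (~156mA)

i_run = 0.5 * i_max    # run at half the maximum current
p_run = v_f * i_run
print(f"I_max ~= {i_max * 1000:.0f} mA; at {i_run * 1000:.0f} mA "
      f"the LED dissipates {p_run:.2f} W")
# -> I_max ~= 156 mA; at 78 mA the LED dissipates 0.25 W, half the rating.
```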

Don

The OP doesn't have either the current rating or the forward voltage; the 0.5W figure is all the OP knows. So to estimate both the forward voltage and the current rating from the 0.5W, I suggest ramping the voltage.

and (2) to have a ballpark value from which to start calculating the series resistor.

It's a high-power LED; you DON'T use a resistor with these.

Got them working, with a resistor, and they don't heat up. The problem was that the two LEDs I thought were the same type were actually different, even though they looked the same: one type required 145mA and the other 30-40mA. So I set up a pot and got the 30-40mA type working perfectly. The voltage drop I measured was 3.10V and the current was 35mA, with a resistor of around 81 ohm.
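
For what it's worth, those numbers are self-consistent. A quick check (assuming the 6V supply from the original question):

```python
# Sanity check of the final figures; the 6V supply is assumed from
# the original question, the rest are the measured values above.
v_supply = 6.0      # assumed supply voltage
v_led    = 3.10     # measured LED forward drop
i_led    = 0.035    # measured current, 35mA

r = (v_supply - v_led) / i_led    # resistor implied by Ohm's law
p = v_led * i_led                 # power the LED dissipates

print(f"Implied resistor: {r:.0f} ohms")    # -> ~83 ohms, close to the ~81 used
print(f"LED power: {p:.2f} W")              # -> ~0.11 W, well under 0.5W
```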

There's nowt so stupid as those wot won't learn.