
Topic: LED different voltage resistor value

lyron

Jun 28, 2013, 02:14 pm (last edited Jun 28, 2013, 02:25 pm by lyron)
Okay, so here's the deal:

I've got two unknown LEDs connected in series to a 12 V source via a 47 Ohm resistor.
Now I'd like to power them from 5 V; how would I calculate the proper resistor value?

Edit: found the type of LED: http://www.soselectronic.com/a_info/resource/pdf/king/l-934f3bt.pdf


strykeroz

Hi,

Firstly, I think you'll find that resistor value is 470R?

Using the 1.6 V forward voltage from the datasheet, and plugging 12 V, 1.6 V, 20 mA and 2 LEDs into an online calculator like this one, you'll find 470 Ohms is the recommended resistor value. Change that supply voltage to 5 V and you'll get 100 Ohms as the new current-limiting resistor.
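The arithmetic behind those numbers is just Ohm's law applied to the supply voltage left over after the LED forward drops. A quick sketch of that calculation, using the same values Geoff plugged in above (1.6 V per LED, 20 mA target):

```python
def series_resistor(v_supply, v_forward, n_leds, i_target):
    """Ohm's law on the voltage remaining after the LED forward drops (ohms)."""
    return (v_supply - n_leds * v_forward) / i_target

# Two 1.6 V LEDs in series, aiming for 20 mA:
print(round(series_resistor(12.0, 1.6, 2, 0.020)))  # 440 -> nearest standard value 470
print(round(series_resistor(5.0, 1.6, 2, 0.020)))   # 90  -> nearest standard value 100
```

The calculator rounds up to the next standard (E12) resistor value, which is why 440 becomes 470.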

Cheers ! Geoff
"There is no problem so bad you can't make it worse" - retired astronaut Chris Hadfield

michael_x

I don't believe in 47 Ohm @ (12 V - 2*1.3 V). This would give about 200 mA, far beyond the max 50 mA.

Either the datasheet does not fit, or it's not 12V, or it's not 47 Ohms.

Easiest solution: use a 1k pot, start at 1k, and see when you have sufficient brightness.
If the LEDs really start shining at about 1.2 V and 15 mA is a proper current (according to your datasheet),
you can easily run them in series with a 200 .. 300 Ohm resistor at 5 V.
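That range can be sanity-checked by inverting the formula: for a given resistor, what current does the string draw? A sketch using michael_x's figures above (1.2 V per LED, 5 V supply):

```python
def led_current(v_supply, v_forward, n_leds, r):
    """Current through the series LED string for a given resistor (amps)."""
    return (v_supply - n_leds * v_forward) / r

# 200 and 300 ohms at 5 V with two 1.2 V LEDs:
for r in (200, 300):
    print(r, "ohms ->", round(led_current(5.0, 1.2, 2, r) * 1000), "mA")
# 200 ohms -> 13 mA, 300 ohms -> 9 mA: comfortably inside the 15 mA target
```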

lyron


I don't believe in 47 Ohm @ (12 V - 2*1.3 V). This would give about 200 mA, far beyond the max 50 mA.


Me neither, but it's really what the kit designer used: http://www.velleman.eu/downloads/0/minikits/manuals/manual_mk162.pdf

Maybe they chose it because, since it's an IR remote, the IR LEDs are only on for a very short while.

sonnyyu

Why not use a constant-current LED driver IC in place of the resistor?

Simple 90V, 25mA, Temperature Compensated, Constant Current, LED Driver IC - CL25

Power supply  5 V to 90 V.

CL25, Mouser, $0.71

MarkT


I don't believe in 47 Ohm @ (12 V - 2*1.3 V). This would give about 200 mA, far beyond the max 50 mA.



Datasheet says continuous current max 50 mA, peak current max 1.2 A...
[ I will NOT respond to personal messages, I WILL delete them, use the forum please ]

nickgammon

From: http://led.linear1.org/led.wiz

Assuming you use 20 mA current and 1.2V forward voltage:

Code:

Solution 0: 2 x 1 array uses 2 LEDs exactly
    +----|>|----|>|---/\/\/----+  R = 150 ohms

The wizard says: In solution 0:
  each 150 ohm resistor dissipates 60 mW
  the wizard thinks ¼W resistors are fine for your application
  together, all resistors dissipate 60 mW
  together, the diodes dissipate 48 mW             
  total power dissipated by the array is 108 mW     
  the array draws current of 20 mA from the source.
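The wizard's dissipation figures can be checked by hand: I²R for the resistor, and forward voltage times current for each diode. A sketch reproducing them with the values above (20 mA, 1.2 V per LED, 150 Ω):

```python
i = 0.020    # 20 mA string current
r = 150.0    # series resistor, ohms
vf = 1.2     # forward voltage per LED
n = 2        # two LEDs in series

p_resistor = i ** 2 * r    # resistor dissipation: I^2 * R
p_diodes = n * vf * i      # each diode drops vf at current i
p_total = p_resistor + p_diodes

print(round(p_resistor * 1000), "mW in the resistor")  # 60 mW
print(round(p_diodes * 1000), "mW in the diodes")      # 48 mW
print(round(p_total * 1000), "mW total")               # 108 mW
```

60 mW in a quarter-watt (250 mW) resistor leaves plenty of margin, which is why the wizard calls 1/4 W parts fine.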


Please post technical questions on the forum, not by personal message. Thanks!

More info: http://www.gammon.com.au/electronics

db2db


Why not use a constant-current LED driver IC in place of the resistor?

Simple 90V, 25mA, Temperature Compensated, Constant Current, LED Driver IC - CL25

Power supply  5 V to 90 V.

CL25, Mouser, $0.71


Curious: why is this better than using a resistor?

sonnyyu

Jun 29, 2013, 03:56 am (last edited Jun 29, 2013, 04:04 am by sonnyyu)
It works with a power supply from 5 V to 90 V, so you never need to change the resistor when the supply voltage changes. Plus it's temperature compensated: the current stays constant as the temperature changes.

High-quality SSRs (solid state relays) use an LED driver IC instead of a resistor so they can accept a variable input voltage of 5 V to 25 V to drive the optoisolator; with a fixed resistor, a value that's right at 5 V passes far too much current at 25 V, and a value that's right at 25 V passes too little at 5 V.

If you use an SSR with an Arduino, an SSR with an LED driver IC is the way to go, since a resistor-based SSR has its current set for something like 15 V.
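sonnyyu's point can be made concrete by sizing a resistor for one supply voltage and seeing what it passes at the extremes. The 1.2 V opto-LED drop and 10 mA design current here are hypothetical illustration values; only the 5 V / 15 V / 25 V supply range comes from the post above:

```python
vf = 1.2          # assumed optoisolator LED forward drop (illustrative)
i_design = 0.010  # assumed 10 mA design current (illustrative)
v_design = 15.0   # resistor sized for a 15 V supply

r = (v_design - vf) / i_design  # fixed resistor chosen for 15 V
for v in (5.0, 15.0, 25.0):
    i_ma = (v - vf) / r * 1000
    print(f"{v:>4} V supply -> {i_ma:.1f} mA")
# 5 V gives under 3 mA (may not switch reliably); 25 V gives over 17 mA
```

A constant-current driver IC holds the LED at its set current across that whole input range instead.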

pwillard

I would place a bet on the outcome that the GPIO pin is actually driven with PWM rather than continuous current if the resistor is only 47 Ohms. An LED can be driven at much higher current when you don't leave it turned on full-time.
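A rough check of that theory: with PWM, the average current scales with the duty cycle, so short pulses can stay inside the 50 mA continuous rating while each pulse stays far below the 1.2 A peak rating. The 12 V, 1.3 V and 47 Ω figures are from the thread; the 10% duty cycle is a guess for illustration:

```python
v_supply = 12.0
vf = 1.3   # per-LED drop, from michael_x's estimate above
r = 47.0

i_peak = (v_supply - 2 * vf) / r  # current while the pin is high, ~0.2 A
duty = 0.10                       # hypothetical 10% PWM duty cycle
i_avg = i_peak * duty

print(round(i_peak * 1000), "mA peak")   # 200 mA: under the 1.2 A pulse rating
print(round(i_avg * 1000), "mA average") # 20 mA: under the 50 mA continuous rating
```

Whether the kit's actual duty cycle is that low would have to come from the MK162 schematic or firmware.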
