I connected 3 LEDs in series to a 12 V car battery. The LEDs are rated 3.5 V, 4 V max. The problem is that when the car is running the alternator puts out 14.5 V, so I am worried about burning out the LEDs. Should I install a resistor in the circuit? What value should it be?
You would certainly need a resistor. You can work out the value from the LEDs' current rating, assuming you want to drop 4 volts across the resistor: if the LEDs want 40 mA, you need roughly 100 ohms.
I would not be inclined to do this myself, though; I'm not sure how minor differences between the LEDs would affect the result. There are "12 volt" LEDs meant for automotive use that don't need any coddling - could you try one of those instead?
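As a minimal sketch of that calculation, assuming the 14.5 V supply and three 3.5 V LEDs from the question and the 40 mA figure used above (check your LEDs' datasheet for the real numbers):

```python
# Resistor sizing sketch for the series string (figures assumed from this thread).
supply_v = 14.5        # alternator voltage while the engine is running
led_forward_v = 3.5    # assumed forward drop per LED
led_count = 3          # three LEDs in series
led_current_a = 0.040  # assumed 40 mA; verify against the datasheet

drop_v = supply_v - led_count * led_forward_v   # 14.5 - 10.5 = 4.0 V across the resistor
resistor_ohms = drop_v / led_current_a          # Ohm's law: 4.0 / 0.040 = 100 ohms
print(f"Drop {drop_v:.1f} V across roughly {resistor_ohms:.0f} ohms")
```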
You would use Ohm's law to figure out the resistor you need. Like the other answer stated, it would be roughly a 100 ohm resistor, but here is why...
You say you need 3.5 V for the LEDs, and your supply voltage will be 14.5 V, so by subtracting the 3.5 V from the 14.5 V we need a voltage drop of 11 V. So now we know we need to drop 11 V. Find out what current the LEDs draw - let's just say they are 40 mA. You have 3 LEDs, so you have 120 mA (40 mA x 3). So now we know the voltage drop we need and the current we will have.
To find the resistance value we need, we take the voltage (in V) and divide it by the current (in A). So we convert the mA to amps by moving the decimal place: 120 mA = 0.120 A. Now we get Resistance = 11 V / 0.120 A, which means Resistance = 91.66 ohms.
I hope that makes sense. So if you had 14.5 volts and a single 3.5 V, 40 mA LED, it would look like this: voltage drop = 14.5 - 3.5 = 11 V, then we do the rest of the math: R = 11 / 0.04, R = 275 ohms.
If you had 14.5 volts and two 3.5 V, 40 mA LEDs, it would look like this: R = 11 / 0.08, R = 137.5 ohms.
If you had 12 volts and three 3.5 V, 40 mA LEDs, it would look like this: voltage drop = 12 - 3.5 = 8.5 V, current = 3 x 40 mA = 120 mA = 0.120 A, do the math: R = 8.5 / 0.120, R = 70.83 ohms.
Anyway, you get the idea. I hope this helped some.
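Here is a sketch of that single-LED case in Python, using the 14.5 V supply and the assumed 3.5 V / 40 mA figures from the example above:

```python
# Single-LED resistor calculation (sketch; figures assumed from the example above).
supply_v = 14.5     # supply voltage
forward_v = 3.5     # assumed LED forward voltage
current_a = 0.040   # assumed 40 mA; check the datasheet

drop_v = supply_v - forward_v        # 14.5 - 3.5 = 11 V across the resistor
resistor_ohms = drop_v / current_a   # 11 / 0.040 = 275 ohms
print(resistor_ohms)                 # 275.0
```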
@windhamrules
Alas, the OP originally specified 3 LEDs (at 3.5 V) in SERIES.
So R = E/I = (14.5 - 3 x 3.5) / 0.020 = (14.5 - 10.5) / 0.020 = 4 / 0.020 = 200 ohms.
Sorry, but 40 mA is just a bit much for the taste of many LEDs, and you'll probably get just as much brightness and life out of the LEDs if they're driven at 10 mA; in that case use a 400 ohm resistor in series with the LEDs.
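A sketch of that series-string calculation, assuming the 3 x 3.5 V string and the 20 mA and 10 mA currents suggested above:

```python
# Resistor for LEDs in series: the same current flows through every LED,
# and the forward voltages add up. Figures assumed from this thread.
supply_v = 14.5
led_forward_v = 3.5
led_count = 3

def series_resistor(current_a):
    """R = (supply - n * Vf) / I for a series LED string."""
    drop_v = supply_v - led_count * led_forward_v   # 14.5 - 10.5 = 4 V
    return drop_v / current_a

print(series_resistor(0.020))   # 200.0 ohms at 20 mA
print(series_resistor(0.010))   # 400.0 ohms at 10 mA
```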
You mean limit the voltage, not the power - and no, you do not have to limit it.
Make sure the LEDs really are rated for a 3.5 V forward voltage drop, because that sounds high to me. The resistor limits the current going through the LEDs (the same current goes through each of them if they are in series), and this current must be below the specification of the LED you use; 20 mA is normal, but as has been stated, 10 mA is usually fine. When calculating the resistor value, always take the lower voltage readings from the specification.
An LED is what is known as a non-linear device; that is, the relationship between current and voltage is not a straight line if plotted on a graph. That means you can't just give it a voltage and let it take care of itself - you have to control the current. A simple way of controlling the current is with a series resistor, but make your calculations based on the maximum input voltage it will experience.
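As a sketch of that worst-case sizing, assuming some stand-in datasheet numbers (the 3.0 V minimum forward voltage and 20 mA rating below are illustrative, not from this thread):

```python
# Worst-case resistor sizing sketch: highest supply voltage, lowest forward voltage.
# All datasheet figures here are assumptions; substitute your LED's real values.
supply_max_v = 14.5          # maximum supply voltage (engine running)
vf_min_per_led = 3.0         # assumed minimum forward voltage from the datasheet
led_count = 3
target_current_a = 0.010     # conservative 10 mA target, as suggested above
rated_max_current_a = 0.020  # assumed LED maximum rating

drop_v = supply_max_v - led_count * vf_min_per_led   # 14.5 - 9.0 = 5.5 V
resistor_ohms = drop_v / target_current_a            # 5.5 / 0.010 = 550 ohms

# Even in the worst case, the current stays below the LED's rating.
worst_case_current_a = drop_v / resistor_ohms
assert worst_case_current_a <= rated_max_current_a
print(f"Use about {resistor_ohms:.0f} ohms; worst-case current {worst_case_current_a * 1000:.0f} mA")
```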