Can someone please confirm whether my thinking is correct here?
I have a small battery-powered 3-LED light which I want to convert to run from my 12V power supply.
The 3 batteries (AAA size) are 1.5V each in series, so 4.5V total.
The insides of the light contain an 11 ohm resistor and then the 3 LEDs (quite bright, 5mm diameter, 8mm height) in parallel.
I placed some card between the first battery's positive terminal and the contact to break the circuit, then measured with an ammeter: the light draws 40mA from the batteries.
So my thinking is that with a 12VDC supply, I should add a series resistor sized for the 4.5V / 40mA load, and power the light via the (now removed) battery terminals.
The page http://led.linear1.org/1led.wiz?VS=12;VF=4.5;ID=40 says I need a 220 ohm, 1W resistor.
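To double-check the wizard's arithmetic, here is a quick Python sketch. The 12V / 4.5V / 40mA figures are my measurements from above; the variable names are just mine:

```python
# Series resistor sizing for running the 4.5V / 40mA LED cluster from 12V.
V_supply = 12.0   # supply voltage (V)
V_led = 4.5       # voltage the LED cluster ran at on batteries (V)
I_led = 0.040     # measured current (A)

R_exact = (V_supply - V_led) / I_led      # ideal series resistance
print(R_exact)                            # 187.5 ohms

R_std = 220.0                             # next standard value up
I_actual = (V_supply - V_led) / R_std     # current with a 220 ohm resistor
P_resistor = I_actual ** 2 * R_std        # power dissipated in the resistor
print(round(I_actual * 1000, 1))          # ~34.1 mA
print(round(P_resistor, 3))               # ~0.256 W
```

So the resistor itself only burns about a quarter of a watt; I assume the wizard's 1W recommendation includes a safety margin.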
Since the best I have available is 220 ohm 0.5W, I think I have 2 options:
Option 1: As far as I understand, two equal resistors in parallel halve the resistance, while resistors in series add. So I split the Vcc line into two parallel branches, each containing two 220 ohm (0.5W) resistors in series. Each branch is then 440 ohm, the two branches in parallel give the required 220 ohm overall, and the heat is shared across four resistors instead of one.
So:

        /--- 220 ---- 220 ---\
Vcc ---+                      +--- LEDs
        \--- 220 ---- 220 ---/
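The same check in Python for this 2x2 network (a sketch, assuming all four resistors are the same nominal 220 ohm):

```python
# Equivalent resistance and per-resistor power for the 2x2 network of
# 220 ohm, 0.5 W resistors sketched above.
R = 220.0
branch = R + R                                   # two in series per branch: 440 ohms
R_total = (branch * branch) / (branch + branch)  # two branches in parallel
print(R_total)                                   # 220.0 ohms, same as a single 220

V_drop = 12.0 - 4.5               # voltage across the resistor network
I_total = V_drop / R_total        # total current, ~34 mA
P_total = V_drop * I_total        # total heat in the network
P_each = P_total / 4              # shared equally by four identical resistors
print(round(P_total, 3))          # ~0.256 W total
print(round(P_each, 3))           # ~0.064 W per resistor, well under 0.5 W
```

If that is right, each resistor runs at a fraction of its rating.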
Option 2: I have a few L7808 linear voltage regulators and small heatsinks in the hobby box, so I could drop the supply from 12V to 8V and then use a 100 ohm (0.25W) resistor in series with the LEDs.
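The numbers for option 2, as I understand them (a sketch; I am assuming the L7808 simply drops the difference between 12V and 8V as heat at the load current):

```python
# Option 2: regulate 12 V down to 8 V with an L7808, then a 100 ohm resistor.
V_in = 12.0
V_reg = 8.0                       # L7808 output voltage
V_led = 4.5                       # LED cluster voltage
R = 100.0

I = (V_reg - V_led) / R           # LED current with the 100 ohm resistor
P_resistor = I ** 2 * R           # heat in the resistor
P_regulator = (V_in - V_reg) * I  # heat the L7808 itself must shed
print(round(I * 1000, 1))         # ~35 mA
print(round(P_resistor, 3))       # ~0.12 W, within a 0.25 W part
print(round(P_regulator, 3))      # ~0.14 W dropped in the regulator
```

So the resistor stays within its rating, but the regulator adds its own heat to get rid of.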
Since I have all the components for both options, option 1 looks the better and simpler choice to me. Or am I missing something?