I need to convert a strip of 16 leds from 31 V to 12 V. The 16 leds are not all in one series; they are split into two strings, each with 8 leds and one resistor. Each string gets 31 V. When lit, the voltage drop across one led appears to be 1.75 V, which makes 14 V across all 8 leds. The remaining 17 V drops across the resistor.
But what is actually happening here? When I connect the power, is there a moment when the 31 V is divided across all 8 leds? That would be some 3.9 V each, and that will light up the leds. The current through a led is some 27 mA, and dropping 17 V at 27 mA needs a 630 ohm resistor, which is what I measured. So the leds need the 3.9 V to light up, but once the current starts flowing, each led only "senses" the 1.75 V. Am I on the right track here?
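The arithmetic above can be checked with Ohm's law; here is a quick sketch using the figures from my measurements (1.75 V per led, 27 mA, 31 V supply):

```python
# Series resistor check for the existing 31 V string:
# 8 LEDs at ~1.75 V each, target current ~27 mA.
supply_v = 31.0
led_drop_v = 1.75
n_leds = 8
current_a = 0.027

string_v = n_leds * led_drop_v         # 14.0 V across all 8 LEDs
resistor_v = supply_v - string_v       # 17.0 V left over for the resistor
resistor_ohm = resistor_v / current_a  # Ohm's law: R = V / I, about 630 ohm

print(string_v, resistor_v, round(resistor_ohm))  # → 14.0 17.0 630
```

The ~630 ohm result matches what I measured on the strip.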
I tried bypassing the resistor and connected the 8 leds directly to a 12 V source (a lead-acid battery). No light. That would have been only 1.5 V per led. I tried with only 7 leds, which would be about 1.71 V each. Still no light. Without a resistor, I didn't dare to test only 6 leds, which would have given 2 V each.
People always point out the importance of a current-limiting resistor when using leds. But on some page I read that if the source voltage equals the forward voltage of the led, no resistor is needed. Could it be that the leds always need the resistor? With the 630 ohm resistor, my 8 leds light up at 31 V (3.9 V per led), then keep shining beautifully at 1.75 V.
If I'm right, I might have to divide my 16 leds into four groups of four leds plus one resistor each, to be able to use my 12 V source. Four leds need 7 V, so the resistor must drop 5 V, which a 190 ohm resistor would do at a current of 27 mA.
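The proposed 12 V arrangement works out to roughly the same numbers; a sketch of that calculation (again assuming 1.75 V per led and 27 mA):

```python
# Proposed 12 V arrangement: four groups of 4 LEDs, each with its own resistor.
supply_v = 12.0
led_drop_v = 1.75
n_leds = 4
current_a = 0.027

string_v = n_leds * led_drop_v         # 7.0 V across the 4 LEDs
resistor_v = supply_v - string_v       # 5.0 V for the resistor
resistor_ohm = resistor_v / current_a  # about 185 ohm, close to the 190 ohm I mentioned

print(string_v, resistor_v, round(resistor_ohm))  # → 7.0 5.0 185
```

The exact value comes out near 185 ohm; the closest resistor I have on hand rounds to about 190 ohm.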
So there's no Arduino involved - yet. I just want to get this led thing straight.