LED 5V Strip Actual Voltage Required

I have a 5V LED strip with the controller that comes with it.
I want to control it with an Arduino.

I am using MOSFETs etc.

The 5V supply will power a rail that powers the Arduino and the MOSFETs.

However, when measuring voltages with the included controller set to bright white, the R, G and B voltages were 3.2, 3.6 and 3.7V respectively.

With a single colour at full brightness, the voltage was the same for each channel.

So should one avoid setting the output voltage above those values, and use a different level for each of R, G and B?

The strip itself has 5V written on it and works at 5V; however, I worry about shortening its lifespan.

The strip is like this one here: TV Led Strip

Thanks,

Dave

The voltage doesn't directly affect the brightness of the LEDs. It's the current that dictates it. Each LED in the strip will have its own current control, via resistors or a driver chip, designed to work with the 5V input.

Perhaps this post will be useful to you - https://www.makeuseof.com/tag/connect-led-light-strips-arduino/

Thanks. It's a good article, although nothing new to me there. I was just concerned that, just as applying 5V to a 2.3V LED is not good, this might be the same, and that there was a "hidden" element in the controller, along with the usual disclaimers about how these things are to be used...

I guess I'll shove 5v at it, and see how it goes :wink:

[quote]I was just concerned that, just as applying 5V to a 2.3V LED is not good,[/quote]

There are resistors built into the LED strip, one for each LED. If they are RGB LEDs, each LED has 3 resistors.

If you apply 5V directly across the LED you'll kill it!

When you apply 5V, the voltage gets divided, with about 3V across the resistor and about 2V across the LED. The non-linearity of the LED (its resistance changes with voltage) makes the voltage across the LED "magically fall into place". The resistor determines the amount of current through the LED, so it's bright enough but doesn't burn up.

If you apply 12V, you'll still get about 2V across the LED, with the remaining 10V across the resistor... But you'll get too much current and you might fry the LED. (Higher voltage requires higher resistance.)

If you know the voltage across the resistor and the resistance you can calculate current with [u]Ohm's Law[/u]. (In a series circuit the voltage is divided but the same current flows through the resistor and LED.)
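
To make that concrete, here's a minimal Arduino-style sketch of the arithmetic (these are assumed example values for illustration, not figures from Dave's strip):

[code]
// Ohm's Law: I = V / R
// Assumed example values, not measured from the strip:
const float V_RESISTOR = 3.0;   // volts measured across the series resistor
const float R_SERIES   = 150.0; // ohms

void setup() {
  Serial.begin(9600);
  // In a series circuit the same current flows through the
  // resistor and the LED, so this is also the LED current:
  float current = V_RESISTOR / R_SERIES; // 3.0 / 150 = 0.02 A
  Serial.print("LED current: ");
  Serial.print(current * 1000.0);
  Serial.println(" mA"); // prints 20.00 mA
}

void loop() {}
[/code]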

BTW - "Resistance" is the resistance to current-flow... More resistance = less current.

Although Ohm's Law is also true for the LED (it's a law of nature), we can't use it directly on the LED because the non-linearity means we don't know the resistance. Once we know the current and the voltage across the LED we can calculate the LED's resistance, but the calculation is only correct under those conditions... If we change the series resistor or the applied voltage, the LED's resistance will change.
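
For example, sticking with the same assumed numbers as above: 2V across the LED and 20mA through it gives R = V / I = 2.0 / 0.02 = 100 ohms, but that figure is only valid at that operating point; swap the series resistor or change the supply voltage and the LED's effective resistance changes with it.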

Thanks.

Instead of putting my meter across the pins at the end of the LED strip, I put it directly on the output of the little module that controls them. It seems to output about 4.3V.

Even though I get the current vs voltage stuff, it was more about a strip labelled 5V only being given about 3V by its included controller, so I wondered if it was advertising a 5V USB connector with built-in shizzle that actually output a lower voltage. It's only slightly lower from where I now measured, so I'm thinking the rating on the strip is for the strip, and I should be ok.

Thanks all. Now to figure out how to PWM-control 9 MOSFETs from one Arduino...
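
For what it's worth, here's a minimal sketch of that last step, assuming a board with at least 9 hardware PWM pins (a Uno only has 6, so something like a Mega would be needed for 9 hardware channels; the pin numbers below are placeholders):

[code]
// Minimal sketch: PWM on 9 MOSFET gates (e.g. 3 RGB strips x 3 channels).
// Pin numbers are placeholders; pick PWM-capable pins on your board.
const byte GATE_PINS[9] = {2, 3, 4, 5, 6, 7, 8, 9, 10};

void setup() {
  for (byte i = 0; i < 9; i++) {
    pinMode(GATE_PINS[i], OUTPUT);
  }
}

void loop() {
  // Example: fade all 9 channels up together.
  for (int duty = 0; duty <= 255; duty++) {
    for (byte i = 0; i < 9; i++) {
      analogWrite(GATE_PINS[i], duty); // 0 = off, 255 = fully on
    }
    delay(10);
  }
}
[/code]

Since the gates will be driven from 5V Arduino outputs, logic-level MOSFETs are the ones to use so they switch fully on.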