
Topic: Lighting 8-12V RGB LED with Arduino


This is the LED I have: http://www.alibaba.com/product-gs/712130845/10_Watt_RGB_LED_high_power.html
These are the LED input voltages:
RED -> 6-8V
GREEN -> 12V
BLUE -> 12V
I noticed that these RGB LEDs have a common anode (positive) and three separate cathodes (negative), so how can I connect two of the elements to 12V and the other to 6-8V if they share the same anode?
Do I need about 6V at the cathode of the RED element so that it sees 12V - 6V = 6V?



Apr 28, 2013, 05:55 am Last Edit: Apr 28, 2013, 05:57 am by codlink
You can use a simple voltage divider made up of resistors.

You use this to get the values: http://www.raltron.com/cust/tools/voltage_divider.asp



No, that's not a good idea. For starters, you're underestimating the need for high-wattage resistors. You're also heading down a path that suggests you'll try to power the LEDs with no current regulation; believing that you can power (for example) the green and blue segments directly from 12V will burn them out. Finally, shoving that much power through a resistor is quite wasteful.

I'd suggest the OP do some more searching for examples of how high-power LEDs are driven. Here's a good place to start: http://www.thebox.myzen.co.uk/Tutorial/LEDs.html


Apr 29, 2013, 10:53 pm Last Edit: Apr 29, 2013, 10:57 pm by DVDdoug
No...  A voltage divider won't work!

With a high-power LED (1W or more) the best solution is a special constant-current LED power supply. (You can build one or buy one.)   

If you connect anything that draws significant current to a voltage divider, the resistance of whatever you hook up messes up your calculated voltage. Voltage dividers work fine for low-current "signals," but not as power supplies. LEDs make the situation worse, since the resistance of an LED is not constant.

LEDs are not (properly) powered by a constant voltage. You need to supply a constant (or approximately constant) current, with enough voltage available to turn on the LED.

You can create an approximately constant current source by using a series resistor (the voltage divides much like in a regular voltage divider). Subtract the rated LED voltage from your supply voltage to find the voltage across the resistor, then use Ohm's Law to calculate the required resistance from the required current and the voltage across the resistor. (In a series circuit, the voltage is divided among the series components, but the same current flows through all of them.)

For best results, the supply voltage should be at least twice the LED voltage, so that the voltage across the resistor is at least equal to the voltage across the LED. The more voltage you have across the resistor, the closer you are to a constant current source.

A series resistor works fine for regular low-power LEDs. For high-power LEDs, you need a high-power resistor. Power is calculated as Voltage x Current, so with 12V across the resistor it has to dissipate roughly the same 3W as each LED element.

It's inefficient, the resistors get hot, and this is the main reason for using a proper constant-current switching power supply.
