high-power LED voltage control?

(Sorry for yet another thread about RGB LEDs...)

I have a 10W RGB LED that is essentially 3x separate strings of red, green & blue LEDs (so not common anode or common cathode; each colour has its own anode & cathode). The blue & green are rated at 9-11v whilst the red is rated at 6-7.5v. Current is rated at 350mA which I presume is per colour (this way the total power adds up to roughly 10W).
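To check the "per colour" presumption, a quick back-of-envelope calculation (taking the mid-point of each rated voltage band as the typical forward drop - these are assumptions, not datasheet figures):

```python
# Rough power sanity check for the 10W RGB LED described above.
# Typical forward voltages assumed to be mid-range of the rated bands.
current_a = 0.350            # rated current per string, amps
vf_red = (6.0 + 7.5) / 2     # ~6.75 V assumed for the red string
vf_green = (9.0 + 11.0) / 2  # ~10 V assumed
vf_blue = (9.0 + 11.0) / 2   # ~10 V assumed

total_w = current_a * (vf_red + vf_green + vf_blue)
print(f"Approximate total power: {total_w:.1f} W")  # ~9.4 W, close to the 10W rating
```

So yes, 350mA per string lands near the 10W rating; 350mA shared across all three would give only ~3W.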

I also have 3x 330mA constant current LED drivers that accept 9-35v in.

My question is how do I ensure that a safe voltage is going to the different strings of LEDs? I realise that the voltage will change in order to keep the current constant at 330mA, but if I connect a 12v SLA (at ~12.8v) to the input of the driver I get just under 12.8v coming out which is too much for any colour (especially red).

You need to understand "constant current source". A constant current source will vary the voltage of its output to whatever value is necessary to keep the right current flowing. If you don't attach a load it will try to generate as high a voltage as possible; this does not reflect its behaviour with a proper load.

These constant current drivers are switching step-down regulators - all they need is a voltage in excess of the output voltage (by some margin), and they will efficiently convert the power to the output (93% according to the specs).
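A sketch of what that means for the supply side, assuming the quoted 93% efficiency and a ~10V forward drop for a blue/green string (illustrative numbers, not measured):

```python
# What the driver draws from the supply, if it is a buck converter at 93%.
# Input power = output power / efficiency; input current = input power / Vin.
efficiency = 0.93
i_out = 0.330   # driver's regulated output current, amps
v_led = 10.0    # assumed forward voltage of a blue/green string
v_in = 12.8     # fully charged 12V SLA

p_out = i_out * v_led       # ~3.3 W delivered to the LEDs
p_in = p_out / efficiency   # ~3.5 W drawn from the battery
i_in = p_in / v_in          # input current from the supply
print(f"Input current ≈ {i_in * 1000:.0f} mA")  # ~277 mA
```

Note the input current is *lower* than the output current - a buck converter trades voltage for current, unlike a linear regulator.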

They say for 3 blue/green LEDs a supply of 15-24V will do, for red 12-24V will do. So you may find they won't drive the green/blue properly from just 12V.

Well I tested the blue & green with the driver powered from batteries wired in series totalling just over 9v (so that the unloaded output from the driver was just over 9v & within the safe range of the LEDs) & they lit fine - the green left quite an after-image in my eyes... I didn't actually measure the output voltage with the LED attached, though.

Although the spec sheet of the driver says that 15-24v will do for 3 blue/green LEDs, I would be very apprehensive about applying this to a string of LEDs rated at a maximum of 11v. Is this really safe? Or will the voltage coming out of the driver not actually be anywhere near 15-24v once the LED is connected?

I think you are misunderstanding the LED voltage given. It is not a safe limit; it is the voltage dropped across the LEDs when the specified current is flowing through them.
Your constant-current supply adjusts the voltage so that the correct current flows. It is excess current, not voltage, that will kill your LEDs. Too much voltage will push too much current through a device if nothing limits it, but in your case something does: the constant-current supply.

So is it safe to just power the drivers from 12v & hook them up directly to the LEDs?

No.

cjdavies:
So is it safe to just power the drivers from 12v & hook them up directly to the LEDs?

The current is an exponential function of voltage (and temperature dependent). This means that any attempt to power the LEDs from a pure voltage source is going to produce a current that could vary by many orders of magnitude depending on the temperature, the particular LED, the batch, etc. - i.e. it's not going to work - it could be far too dim one day and melt the next.
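To put a rough number on that exponential sensitivity, here is a sketch using the Shockley diode relation I ∝ exp(V / (n·Vt)); the ideality factor and operating voltage below are illustrative assumptions, not datasheet values:

```python
import math

# Diode current is exponential in voltage: I = Is * exp(V / (n*Vt)).
# So the *ratio* of currents at two voltages is exp(dV / (n*Vt)),
# independent of Is. n is assumed; Vt is the room-temperature value.
n = 2.0      # ideality factor (assumed)
Vt = 0.026   # thermal voltage at room temperature, volts

def current_ratio(v1, v2):
    """Factor by which current multiplies when voltage rises from v1 to v2."""
    return math.exp((v2 - v1) / (n * Vt))

# A 5% rise on a ~3V LED multiplies the current by roughly 18x:
print(f"3.00 V -> 3.15 V: current x{current_ratio(3.00, 3.15):.0f}")
```

That is why a 150mV error in a "fixed" supply voltage - well within battery and component tolerances - is the difference between dim and destroyed.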

A non-pure voltage source (such as a power supply with a limit to the output current) might produce the result you want if the current limit is what you want, but you won't be able to run several different LEDs from it.

Individual current control is needed. Often this is a series resistor - crude current control, but good enough for most LED applications (except perhaps semiconductor lasers). A constant-current source is the ideal; as a bonus, brightness will be unaffected by variations in the supply.
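For completeness, sizing that crude series resistor looks like this - assuming a charged 12v SLA and a ~10v forward drop for a blue/green string (illustrative figures, not measurements):

```python
# Series-resistor current control: size R from the headroom between
# supply voltage and LED forward drop. All figures are assumptions.
v_supply = 12.8   # charged 12V SLA
v_led = 10.0      # assumed forward drop of the blue/green string
i_target = 0.330  # desired current, amps

r = (v_supply - v_led) / i_target    # Ohm's law on the headroom voltage
p_r = (v_supply - v_led) * i_target  # power burned in the resistor
print(f"R ≈ {r:.1f} ohms, dissipating ≈ {p_r:.2f} W")  # ~8.5 ohms, ~0.92 W
```

The "crude" part: with only ~2.8v across the resistor, small shifts in battery voltage or LED forward drop swing the current substantially - exactly the problem the constant-current driver avoids.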