10v high amps

Greetings,

Apologies for using an Arduino forum to post a non-Arduino question... at least, no Arduinos are involved as of yet...

I want to drive a string of high-power LEDs. Each one is 700mA with a 3.25V forward voltage. I figured that a 10V supply driving 3 chips in series, with suitable current limiting, would work; with active cooling, the tiny 0.25V oversupply across the 3 diodes should be acceptable.

According to my calculations (er, ahem, a calculator I found on t'net), a 5W 1-ohm resistor in series with a 10V supply should adequately limit the current; the circuit should draw 2100mA. My small bench PSU is limited to 2A, so that would be a good start before bringing in the big PSU for the final test.

So, I wondered about using a 7810 linear regulator; except they're limited to 1A, and even if I used 1 per diode, that 2v drop is going to cause some warmth... Meanwhile, I found a circuit diagram for a 30A 12V PSU which used a linear regulator to drop rectified 24V to 12V, and used paralleled TIP2955 transistors to pass up to 30A current: http://electronics-diy.com/12v-power-supply.php

So... I copied the circuit, substituting a 7810 for the 7812, and ditched all the capacitors as I was feeding it nice smooth 12V DC (instead of rectified 24VAC).

The voltage regulation works perfectly, but the circuit I built only passes about 1.4 amps (max) and the TIP2955 gets chuffing hot, so I can only conclude it's not being driven to saturation and that the regulator is passing the bulk of the current. The regulator gets warm, but not hot.

So - what have I done wrong? Do I need more volts to drive this circuit properly? Have I picked a bad transistor for a 10v output?

Stop thinking analogue. That wastes too much power.

3-watt LEDs need switching constant-current LED drivers.

Here’s a driver board I made for 1-96 LEDs.
16 channels, 330mA (1-watt) or 660mA (3-watt), 3-6 LEDs in series, with 12-bit dimming and a 2-wire I2C control bus, plus a switching Arduino supply. All on an 8x10cm board.
Leo…
[Attached image: Led Driver x16.jpg]

These small boards drive three 600mA LEDs in series from a 12volt supply. With a bit of hacking (remove the bridge rectifier and solder a wire to the chip) you might be able to PWM them. Leo..

Thanks.... but I'm actually interested in why the circuit doesn't work properly. I must be missing something, and in the interests of learning, I'd like to know what it is (that I'm missing)...

I'm guessing that the transistor (I started with just one) isn't being driven to saturation, as it gets hot within a few seconds. Meanwhile, the total current draw is still less than the 7810 will give on its own.

If the transistors are regulating the voltage, of course they are not in saturation.

You don't run LEDs from a regulated voltage; you run LEDs from a regulated/controlled CURRENT. Usually it's a switching constant-current supply, because these are very efficient. For example, a 700mA constant-current supply could power one or more 700mA LEDs, and the voltage "falls into place". (There is a voltage & power limit for the supply.) A linear (non-switching) current source is just as inefficient as a linear voltage supply, and you'll get heat.
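A rough sketch of that point in Python, using the 3.25V / 700mA figures from earlier in the thread (the 12V linear-source comparison at the end is just an assumed example, not a recommendation):

# Illustration of "the voltage falls into place" with a constant-current driver,
# and of why doing the same job linearly turns into heat. LED figures from the thread.
VF, I_LED = 3.25, 0.700          # per-LED forward voltage (V), drive current (A)

for n in (1, 2, 3):
    v_string = n * VF            # a CC supply simply settles at whatever the string needs
    print(f"{n} LED(s): driver output settles at ~{v_string:.2f} V, {v_string * I_LED:.1f} W into the LEDs")

# A *linear* current source fed from 12 V would burn the surplus as heat:
V_IN, n = 12.0, 3
p_wasted = (V_IN - n * VF) * I_LED
print(f"Linear source from 12 V driving 3 LEDs: ~{p_wasted:.1f} W wasted in the regulator")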

With small LEDs you can use a voltage higher than the LED's rated voltage, and a resistor in series sets/controls the current, giving an approximately constant-current source. That can be done with high-power LEDs too, but you'd want at least 15V (for at least a 5V drop across your current-limiting resistor), you'd need a power resistor, and it's a poor way to do it.
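For concreteness, here is what that 15V approach works out to with the numbers quoted in this thread (three 3.25V LEDs at 700mA); a rough sizing sketch, not a recommendation:

# Sizing the series resistor for a hypothetical 15 V supply feeding three
# 3.25 V LEDs at 700 mA (values taken from earlier posts in this thread).
V_SUPPLY, VF, N, I_TARGET = 15.0, 3.25, 3, 0.700

v_resistor = V_SUPPLY - N * VF       # ~5.25 V left to drop across the resistor
r = v_resistor / I_TARGET            # ~7.5 ohm
p = v_resistor * I_TARGET            # ~3.7 W of pure heat in the resistor

print(f"Resistor ~{r:.1f} ohm, dissipating ~{p:.1f} W")
# Hence the need for a power resistor, and why it's a poor way to do it.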

...with a 3.25V forward voltage. I figured that a 10V supply driving 3 chips in series, with suitable current limiting, would work; with active cooling, the tiny 0.25V oversupply across the 3 diodes should be acceptable.

According to my calculations (er, ahem, a calculator I found on t'net), a 5W 1-ohm resistor in series with a 10V supply should adequately limit the current.

If the 10V supply is exact, and if the LED voltage is exact, that's 0.25V / 1 Ohm = 250mA. And, the resistor is dissipating 62.5 milliwatts.

Hi Doug,

Please - forget the LEDs for a moment.

Why does the linked circuit (which is not mine) NOT provide more than 1.5 amps @ 10V when it should be good for 5 amps? (Assume 1 transistor, not the 6 parallel ones in the circuit diagram, and a 12V DC supply of up to 10 amps* replacing the transformer.) Maybe I need to parallel up the transistors anyway, even though I only need 5 amps (for now)?

I've never built my own switching PSU, guess I'll just have to shrug and buy one - which I hate doing because I learn nothing that way, except how to part with £ notes - and I'm already expert at that activity.

  • Edit to add - the 10A 12V limit is my bench PSU. I'd like more amps later - e.g. I have an old server power supply here which will deliver over 100A @ 12V. This is why I want to stick to 12V.

The linked supply is how we used to do it 40 years ago when we didn't have switching supplies. It's more a space-heater than a supply. Using 24volt AC to make 12volt/30A is just crazy. The guys who designed/posted this must be stuck in a loop.

As you have already been told, LEDs don't like a directly connected constant-voltage supply. If you do use a constant-voltage supply, the current has to be limited with a resistor. That's how they do it on LED strips: three LEDs in series (~10volt), a resistor (~2volt across it), and a 12volt supply.

Read post#3 again. Experiment with these small boards and your 12volt/100A supply. Or use a three-LED and 3.3ohm/2watt resistor series string on your 12volt supply. Leo..
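A quick check of those numbers (assuming the 3.25V forward voltage quoted at the top of the thread):

# Three 3.25 V LEDs in series plus a 3.3 ohm resistor on a 12 V supply.
V_SUPPLY, VF, N, R = 12.0, 3.25, 3, 3.3

i = (V_SUPPLY - N * VF) / R          # (12 - 9.75) / 3.3
p_resistor = i ** 2 * R              # heat dissipated in the resistor

print(f"String current       ~{i * 1000:.0f} mA")   # ~680 mA, close to the 700 mA target
print(f"Resistor dissipation ~{p_resistor:.2f} W")  # ~1.5 W, hence the 2-watt rating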

Thanks Leo, I’ll take a closer look at switching PSUs. They’re a long way out of my electronics ability to design/make just now; my main curiosity was why the “simple” 30A DC PSU circuit wasn’t working properly. I don’t have a dog in that fight though, so I’m happy to ditch it and go with 12V + resistor (for now), and a proper switching PSU at a later date when I understand them.

I’ve also got some LED COBs I want to try out, but they need 36v… I’ve ordered a couple of DC-DC constant volt/constant current adjustable converters for those. Still, I really want to design my own one day… or at least adapt an open source design to my own needs.

More time studying the Art of Electronics required, methinks.

Thanks for all the replies again, folks.

Just some eBay links.

12volt/20Amp for US$14.34, shipped to your door. If you have to buy that in parts in the UK, you would spend 10x that price. http://www.ebay.com/itm/Universal-12V-24-360W-2-5-10-20-30A-Switching-Power-Supply-Driver-for-LED-Strip-/231667699743?var=&hash=item35f078901f:m:mhU16LOJEyhyChwdfXTojWQ

~36volt/20watt constant-current boost converter (12volt in). ~US$2 http://www.ebay.com/itm/10w-20w-30w-50w-Constant-Current-LED-Driver-Supply-DC9-24V-High-Power-Light-Chip-/141706709084?var=&hash=item20fe60c85c:m:mEHDBX5PfPVdAfyyBtNOcgg

Hey AdeV,

I know it's a long time since this thread was active, but reading it I got interested in knowing whether you ever discovered the bug and, if not, how many 7810s are in your circuit.

AdeV: So - what have I done wrong? Do I need more volts to drive this circuit properly? Have I picked a bad transistor for a 10v output?

That circuit requires a 24V input to work, all the resistor values are specific to that, and it's woefully inefficient even for a linear circuit, using devices nearly half a century out of date... Basically, what you have there is a 24V heater.
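For what it's worth, if the linked supply follows the classic 78xx "current-boost" arrangement (a PNP pass transistor whose base-emitter junction sits across a small resistor in the regulator's input lead), the arithmetic looks roughly like the sketch below; the topology and the sense-resistor value here are assumptions, not read off the actual schematic:

# Rough numbers for the classic 78xx current-boost topology; the 2.2 ohm
# sense resistor is a hypothetical value, not taken from the linked circuit.
VBE_ON  = 0.65    # volts needed across the sense resistor before the PNP conducts
R_SENSE = 2.2     # ohms, resistor between the raw supply and the 7810 input (assumed)

# The 7810 carries everything until its own input current drops ~0.65 V
# across R_SENSE; only above that does the TIP2955 start to take over.
crossover = VBE_ON / R_SENSE
print(f"Pass transistor starts conducting above ~{crossover * 1000:.0f} mA")

# Whatever the pass transistor does carry is dropped linearly across it:
v_in, v_out, i_pass = 12.0, 10.0, 1.4     # figures quoted earlier in the thread
print(f"Pass transistor dissipation ~{(v_in - v_out) * i_pass:.1f} W")   # ~2.8 W: hot without a good heatsink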

10V is not large enough to provide stable current limiting for LEDs that require almost 10V; the current through the series resistor could fluctuate markedly with temperature or different batches of LEDs.
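To put numbers on that: with only ~0.25V of headroom, a 50mV shift in forward voltage per LED moves the current enormously (supply, resistor and nominal Vf figures from the original post):

# Sensitivity of the current to small forward-voltage changes with a 10 V
# supply, three LEDs in series and a 1 ohm limiting resistor.
V_SUPPLY, R, N = 10.0, 1.0, 3

for vf in (3.20, 3.25, 3.30):        # per-LED forward voltage, +/-50 mV around nominal
    i = (V_SUPPLY - N * vf) / R
    print(f"Vf = {vf:.2f} V  ->  string current ~{i * 1000:.0f} mA")
# 3.20 V -> 400 mA, 3.25 V -> 250 mA, 3.30 V -> 100 mA: a 4:1 spread from a ~3% Vf change.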

Either use a standard 12V supply with a large current-limiting resistor, or a constant-current LED power supply; both of these are switch-mode these days...

AdeV: Why does the linked circuit (which is not mine) NOT provide more than 1.5 amps @ 10v when it should be good for 5 amps?

If you put 10V across three 3.25V LEDs and a resistor, all in series, the drop across the resistor will be 0.25V.

So it should pass 250mA.

1.6A means that you have a 1.6V drop across the resistor (you simply cannot drive that much current through it otherwise).

Or, assuming the 10V is correct (and the LEDs really do drop 3.25V each), your resistor is R = V/I = 0.25/1.6 ≈ 0.156 ohm.

Or, the drop across the LEDs is less than 3.25V each.
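The same back-calculation spelled out, assuming the 10V supply and the 1-ohm resistor figures from earlier in the thread are accurate:

# Working backwards from the figures in this post.
V_SUPPLY, R = 10.0, 1.0
VF_NOMINAL, N = 3.25, 3

i_expected = (V_SUPPLY - N * VF_NOMINAL) / R
print(f"Expected current at Vf = 3.25 V: {i_expected * 1000:.0f} mA")    # 250 mA

i_measured = 1.6                               # amps, as quoted above
r_implied = (V_SUPPLY - N * VF_NOMINAL) / i_measured
print(f"Implied resistance if Vf is exact: {r_implied:.3f} ohm")         # ~0.156 ohm

v_led_total = V_SUPPLY - i_measured * R        # what the LEDs must drop if R really is 1 ohm
print(f"Implied per-LED Vf if R is exact: {v_led_total / N:.2f} V")      # (10 - 1.6) / 3 = 2.8 V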