Excess voltage with voltage regulator

I'm repurposing my 7.4v (2S), 11.1v (3S) and 14.8v (4S) LiPo batteries from a previous project to light my wife's winter scene houses.

My specific question is: when the voltage regulator reduces the battery voltages above to the required 5 volts, what happens to the excess voltage? Is it simply discarded? Is it held in reserve so as to extend the battery's useful run time? Can I use the excess voltage to power an additional LED?

The voltage regulator I'm about to use is HERE:

The LED bulbs I'm using are HERE.

Thank you in advance for any advice.

Mike

Those are switching regulators, so they turn on and off very rapidly; smoothing by inductors and capacitors is required for a stable, low-voltage output. Some of the power is dissipated as heat, so you must not exceed the maximum output current rating of the regulator, or it will overheat and shut down.

With that type of voltage regulator ("switching" or "switch-mode"), the energy is converted with high efficiency. The voltage is lowered, and less current is drawn from the battery than comes out of the regulator.

With a linear voltage regulator, energy is wasted and the regulator heats up. (A switching regulator heats up too, but not nearly as much.)
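As a rough back-of-the-envelope sketch of that trade-off (the 90% efficiency figure is a hypothetical assumption; real converters vary with load):

```python
# Sketch: input current drawn from the battery by a buck (step-down)
# switching regulator. Output power ~= input power * efficiency, so the
# battery-side current is lower than the load-side current.
def buck_input_current(v_in, v_out, i_out, efficiency=0.90):
    """Battery current = output power / (input voltage * efficiency)."""
    return (v_out * i_out) / (v_in * efficiency)

# Feeding a 5 V, 1 A load from a 14.8 V 4S pack draws well under 1 A:
print(round(buck_input_current(14.8, 5.0, 1.0), 3))  # 0.375 A
```

A linear regulator doing the same job would pass the full 1 A straight from the battery and burn the difference, (14.8 − 5) V × 1 A ≈ 9.8 W, as heat.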

LEDs are "current operated". They are non-linear: when the voltage goes up a little, the resistance drops a lot and the current goes up a lot. And when the voltage drops, the current drops a lot and the LED is dim.

High-power LEDs (1 W and up) are normally driven from a special "constant current" power supply. It puts out the proper current and the voltage "falls into place" ... pretty much the opposite of how everything else works.

Regular little LEDs use a current-limiting resistor. About half of the energy is wasted in the resistor, but it's no big deal since it's a tiny amount of energy and it's super simple.

You can use a resistor with higher-power LEDs too, but make sure to calculate the power dissipated by the resistor, and buy a resistor rated for about twice that wattage. (It's best to allow some safety margin.)

The power (wattage) dissipated by the resistor is calculated as the voltage across the resistor × the current through it. The current through the resistor and the LED is the same, and the supply voltage is divided between them: if you apply 12V with 5V across the LED, there will be 7V across the resistor, etc.
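A quick sketch of that arithmetic (the 12 V supply, 5 V LED drop, and 350 mA figures are just illustrative):

```python
# Size a current-limiting resistor and the power it must dissipate.
def led_resistor(v_supply, v_led, i_led):
    """Return (resistance in ohms, resistor power in watts)."""
    v_resistor = v_supply - v_led    # the voltage left over for the resistor
    resistance = v_resistor / i_led  # Ohm's law: R = V / I
    power = v_resistor * i_led       # P = V * I, heat the resistor must shed
    return resistance, power

r, p = led_resistor(12.0, 5.0, 0.350)
print(round(r), round(p, 2))  # 20 ohms, 2.45 W -> pick a ~5 W resistor
```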

Most high-power LEDs also need a heatsink. One watt is a lot of heat concentrated inside an LED chip.

That is a lot of light; I am assuming the items are smaller than about 20 cm. I did the same thing many years ago and used simple 5 mm LEDs; they worked great. I also powered them from a mains supply, as she has over 20 units. Test with one and be sure it is what she wants. We used street lights etc. repurposed from model train sets.

Boy... you guys are pretty bright. (That's what I am trying to accomplish with the LEDs.) Using the breadboard pictured, I set up a prototype with the 4-cell battery (14.8v). I ran it for about 10 minutes, monitoring the heat from the voltage regulator. It ran cool, and there was no perceptible change in the light output. This appears to work great for my purpose. Is there anything dangerous about this configuration that I should be aware of?

Again, I thank you for your response.

Mike

Not necessarily dangerous, but breadboards are intended for temporary experiments with low-power logic circuits and cannot support large currents, such as those drawn by motors, servos, and high-powered lights. The tracks will burn if the current exceeds a few hundred milliamperes.

Did you understand that there is no excess voltage?

Yes, but let me ask in another way...

If, in a make-believe situation, I use a 4S (14.8v) battery to power a 5v LED and the battery dies after 5 hours, would it also take 5 hours to die if I powered two 5v LEDs?

So what I'm asking is, can I take advantage of the excess voltage by using it for another LED at the same time?

What is the amp-hour rating of the battery? What is the wattage of the LED(s)?

No, it would die in 2.5 hours. If you use 5 LEDs it would die in 1 hour. Make sense?
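The arithmetic behind that answer: the battery stores a fixed amount of energy (watt-hours), and each identical LED adds the same load. A sketch with made-up numbers (a 10 Wh pack and 2 W per LED are hypothetical, chosen so one LED lasts 5 hours):

```python
# Runtime = stored energy / power drawn (losses ignored for simplicity).
def runtime_hours(battery_wh, load_watts):
    return battery_wh / load_watts

BATTERY_WH = 10.0  # hypothetical pack capacity in watt-hours
LED_WATTS = 2.0    # hypothetical draw of one LED (regulator included)

print(runtime_hours(BATTERY_WH, 1 * LED_WATTS))  # 5.0 hours, one LED
print(runtime_hours(BATTERY_WH, 2 * LED_WATTS))  # 2.5 hours, two LEDs
print(runtime_hours(BATTERY_WH, 5 * LED_WATTS))  # 1.0 hour, five LEDs
```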

"No, it would die in 2.5 hours. If you use 5 LEDs it would die in 1 hour. Make sense?"

Yes. That does make sense. Thank you Jim P.

Excuse me for being ignorant on this topic, but using the breadboard test as pictured in my earlier post, the LED worked perfectly. However, once I added the Arduino Nano, I couldn't get it to work; it was intermittent. With a 4S (14.8v) battery connected, the output of the voltage regulator was 1.8v rather than the expected 5v. Is that because I am not using a resistor?

How are you integrating a Nano to this?
Schematic please

What was the purpose of the Nano?
Were you somehow trying to turn the LED off and on?

If the LEDs are wired in series, you can run both for 5 hours. Whether your LEDs are happy with such an arrangement depends on the exact LED modules. In general, if there are no driver circuits on the LED boards, you can put them in series and drive them from a single driver.

I think my problem is solved. First, I had a bad solder joint. Once I fixed that, the output from the voltage regulator was 3.84v, which made the 5v LED pretty dim. I then changed the output setting on the regulator from 5v to 9v. The result is an output of a little over 6 volts. The LED has been running great for about 6 hours now. I'm testing to see what the run time will be with a 14.8v 4S battery.

The question came up earlier: what do I need the Arduino Nano for? I want to light up my wife's Christmas scene with these LEDs, and I'm turning them on and off using an old TV remote because there are about 14 of them. Eventually, I'll create groups of 3 or 4 that can be turned on and off at the same time using the remote and the IR receivers.

My problem is solved and thank all that have tolerated my lack of knowledge. This forum has been a great resource.

You need to consider how many watt-hours of electricity are in your battery and how much you will use.

Volts × Current = Watts

If you're pulling 1 amp of current from your 12v battery, you are using 12 watts of power.

If you use your switching regulator (ignoring efficiency losses, as switching regulators are efficient) to drop the voltage down to 6v and pull 1 amp of current at 6v, you are using 6 watts of power.

When a battery has a rating of 2000mAh (which is 2 amp-hours), this means you can draw 2 amps of current at the battery's voltage (12v) for 1 hour before it goes flat.

So a 2000mAh 12v battery will have a capacity of 2 amps × 12v, which is 24 watt-hours (it can run pulling 1 watt for 24 hours).

A 2000mAh 6v battery will have a capacity of 2 amps × 6v, which is 12 watt-hours.

So someone saying they have a 20Ah battery is meaningless unless you know the voltage, which is why if you have a home battery setup you would specify the capacity in watt-hours.
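The capacity figures above, as a sketch:

```python
# Energy capacity: amp-hours alone is meaningless without the voltage.
def watt_hours(amp_hours, volts):
    return amp_hours * volts

print(watt_hours(2.0, 12.0))  # 24.0 Wh -> can supply 1 W for 24 hours
print(watt_hours(2.0, 6.0))   # 12.0 Wh -> half the energy at half the voltage
```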

Decreasing voltage is quite efficient when using a switching regulator. Increasing voltage is less efficient, which is why, when you need to increase the voltage, you're generally better off wiring batteries in series (the voltages of the batteries add together).

The catch is, if you use a lower voltage and you have a device (a motor or a kettle) that uses a high number of watts, you need more current to supply it. When you pull more current, you need thicker wires.

Consider a 1500-watt kettle. To achieve 1500 watts at 240v, you would only need 6.25 amps of current (1500 / 240). If you had a 1500-watt kettle that runs off 12v, it would need to pull 125 amps of current, and that will require some very thick wire.
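The kettle numbers, worked through:

```python
# Same power at a lower voltage means proportionally more current,
# which is what drives the wire-thickness requirement.
def current_for_power(watts, volts):
    return watts / volts

print(current_for_power(1500, 240))  # 6.25 A  -> ordinary mains flex
print(current_for_power(1500, 12))   # 125.0 A -> very thick cable
```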

Hope that makes some sense.

I'm glad you got it to work.
Have a nice day!
