
Topic: routing power from solar source to consumers over a garden



I am in the middle of designing a system to light my garden at night based on some factors. It has a 100W solar panel which charges a 12V 22Ah battery; power is then distributed across the garden to 20 individual light locations, each with an Atmel MCU, a 3.6/4.8V rechargeable battery and some Cree LEDs.

Each LED enclosure uses 3.6V or 4.8V (still debating). Should I:

- route 12V at the battery current and use an LDO at each light location to get it down to 3.6/4.8V, or
- drop the voltage to 3.6/4.8V at the source?

Am I overthinking it, or does it really make any difference in terms of power efficiency? I know that conductor diameter is a factor depending on the amperes I want to route; any other considerations?



I presume the low-voltage supply is just to run the microcontroller?  If you can find some cheap DC-DC converters you
could probably arrange them to trickle-charge the rechargeable (NiMH I guess - use 4 cells for 5.3V or 3 for 4V; either
should work with a 16MHz crystal).

Definitely avoid multiple cable runs; cabling gets costly for long runs, and at low voltage the voltage drop becomes an
issue (less so for low currents, of course).

Work out how much resistance in total you can accept between battery and the load, and budget your cable accordingly.
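As a rough sketch of that cable budget calculation (the 20 m run, 1 A load and 0.5 V acceptable drop are made-up numbers for illustration, not from this thread):

```python
import math

# Hypothetical figures: 20 m one-way run, 1 A load, 0.5 V acceptable drop.
RHO_CU = 1.68e-8      # resistivity of copper, ohm*m
run_m = 20.0          # one-way cable length
length_m = 2 * run_m  # round trip: supply and return conductor
i_load = 1.0          # amps drawn by the load
v_drop_max = 0.5      # acceptable total voltage drop

r_max = v_drop_max / i_load              # total loop resistance budget, ohms
area_m2 = RHO_CU * length_m / r_max      # minimum copper cross-section, R = rho*L/A
area_mm2 = area_m2 * 1e6
diameter_mm = 2 * math.sqrt(area_mm2 / math.pi)

print(f"budget {r_max:.2f} ohm -> >= {area_mm2:.2f} mm^2 (~{diameter_mm:.2f} mm dia)")
```

With those assumed numbers you land around 1.3 mm^2 of copper; halve the acceptable drop or double the current and the required cross-section doubles.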

You absolutely must check the battery voltage and prevent over-discharge; lead-acid batteries are simply wrecked by
over-discharge (11V minimum is a good guide), and even one brief over-discharge will significantly affect capacity and cell
balance.  It's best to sense battery voltage at the battery if you have long cable runs, so you aren't confounded by
IR voltage losses.  So a combined solar battery charger and monitor is probably a wise investment at that end.
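A minimal sketch of that low-voltage disconnect logic, using the 11 V cutoff guide from above; the 12.5 V reconnect threshold is an assumed figure, added only to give hysteresis so the load doesn't chatter around a single threshold:

```python
# 11.0 V cutoff per the guide above; 12.5 V reconnect is a hypothetical value.
CUTOFF_V = 11.0
RECONNECT_V = 12.5

def load_allowed(battery_v, currently_on):
    """Decide whether the load may draw from the battery.

    Hysteresis: once disconnected, stay off until the battery has
    recovered well above the cutoff, so it doesn't oscillate.
    """
    if currently_on:
        return battery_v > CUTOFF_V
    return battery_v >= RECONNECT_V
```

For example, a battery sagging to 10.9 V disconnects the load, and it stays disconnected at 11.5 V; only recovering past 12.5 V reconnects it.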
[ I won't respond to messages, use the forum please ]


Thanks for the reply.

I have a solar panel delivering about 100W into a solar charge controller, which charges a 22Ah lead-acid battery. I do not control this process; it's all industrial grade and already built, I just assembled it.

From the LOAD connector of the solar charge controller I will power my 20 light sources. For some reason, not necessarily a good one, I also have 3.6V made of 3 x 1.2V 2Ah NiMH cells at each light location. They should charge from the main solar power source and, when required, light a Cree high-power LED. The LED is kept at a minimal 3.0V 20mA idle current, which changes to 3V 200mA if the motion detector senses anything in the surroundings. This creates some ambient light at idle, with a floodlight on detection of an event.
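A quick sanity check of the idle energy budget against the battery, using the figures above; the 12 hours of darkness is an assumption, and charge-controller and conversion losses are ignored:

```python
# LED figures from the post; 12 h of darkness is an assumed duty cycle.
n_lights = 20
v_led = 3.0        # volts at the LED
i_idle = 0.020     # 20 mA ambient/idle current
hours_dark = 12.0  # assumption, varies with season and latitude

p_idle_total = n_lights * v_led * i_idle        # watts for all lights idling
idle_wh_per_night = p_idle_total * hours_dark   # Wh consumed per night at idle

battery_wh = 12.0 * 22.0                        # nominal 12 V 22 Ah pack
print(f"{idle_wh_per_night:.1f} Wh/night idle vs {battery_wh:.0f} Wh battery")
```

So the idle load alone is a small fraction of the nominal battery capacity per night; the 200 mA floodlight events add 10x the idle draw for however long they run.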

The question was whether I should run 12V or 5V from the solar charger around the garden. Not sure why, but it sounds like it would not matter in the end?


All else being equal, distributing higher voltages means distributing lower currents, which is preferable.  You'll also have more leeway with voltage drops.  Downside is, of course, having to do your conversion at each node.
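A quick worked comparison of the two distribution voltages; the 1 W per-node load and 0.5 ohm cable loop resistance are assumed figures for illustration:

```python
# Hypothetical node load and cable resistance, chosen only to show the scaling.
p_node = 1.0   # watts drawn by one light node
r_loop = 0.5   # ohms of round-trip cable resistance to that node

def line_loss(v_dist):
    """Current drawn and I^2*R power wasted in the cable at a given voltage."""
    i = p_node / v_dist
    return i, i * i * r_loop

i12, loss12 = line_loss(12.0)
i5, loss5 = line_loss(5.0)
print(f"12V: {i12*1000:.0f} mA, {loss12*1000:.1f} mW lost in cable")
print(f" 5V: {i5*1000:.0f} mA, {loss5*1000:.1f} mW lost in cable")
```

For the same delivered power, cable loss scales with the square of the current, so distributing at 12V instead of 5V cuts the cable loss by a factor of (12/5)^2, about 5.8x.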

Since the LEDs are actively current limited, is the drive circuit capable of handling 12V inputs?  What about your local cell chargers?  Regulating to 5V at each spot for motion detection seems relatively inconsequential with all the other wizardry going on in each of those nodes.  It seems all the requisite electronics should or will do their own DC-to-DC conversion anyway, so you're not adding much complexity.


Why do you have "local" batteries in your lights? If this is about a self-sufficient system, you want as few losses as possible, and charging and discharging batteries loses you energy, as it is not a 100% efficient process. Distribute your power at 12V; this will minimize resistive loss in the cable. Then use switch-mode step-down regulators (NOT LDO regulators, as they just waste (12V - 4.8V) x current = power lost in watts = paying for a much bigger panel, like 3 times bigger). There is a cheap 1A regulator in a DIP8 package that costs pence and has most of it built in; you only need the inductor, capacitor and diode.
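The LDO-versus-buck loss argument above can be put into numbers using the thread's 12V input and 4.8V output; the 200 mA load and 85% buck efficiency are assumptions, not datasheet values:

```python
# 12 V in and 4.8 V out from the thread; load current and buck efficiency assumed.
v_in, v_out, i_out = 12.0, 4.8, 0.2
p_load = v_out * i_out                      # power actually delivered, watts

# Linear/LDO regulator: the entire voltage difference is burned as heat.
ldo_loss = (v_in - v_out) * i_out           # watts dissipated in the regulator
ldo_eff = p_load / (p_load + ldo_loss)      # works out to v_out / v_in

# Switch-mode buck: assume a typical 85% conversion efficiency.
buck_eff = 0.85
buck_loss = p_load / buck_eff - p_load      # watts lost in conversion

print(f"LDO: {ldo_eff:.0%} efficient, {ldo_loss:.2f} W wasted")
print(f"Buck: {buck_eff:.0%} efficient, {buck_loss:.2f} W wasted")
```

With those assumptions the LDO is only 40% efficient (4.8/12), wasting more power than it delivers, which is the "pay for a much bigger panel" point made above.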
