
Topic: routing power from solar source to consumers over a garden

cio74

Hello,

I am in the middle of designing a system to light my garden at night based on certain triggers. It has a 100W solar panel which charges a 12V 22Ah battery; power is then distributed across the garden to 20 individual light locations, each with an Atmel MCU, a 3.6V/4.8V rechargeable battery and some Cree LEDs.

Each LED enclosure runs at 3.6V or 4.8V (still debating). Should I:

- route 12V around the garden and at each light location use an LDO to drop it to 3.6/4.8V, or
- drop the voltage to 3.6/4.8V at the source?

Am I overthinking it, or does it really make any difference in terms of power efficiency? I know that conductor diameter is a factor depending on the amperes I want to route; are there any other considerations?

Thanks.

MarkT

I presume the low-voltage supply is just to run the microcontroller?  If you can find some cheap DC-DC converters you could probably arrange them to trickle-charge the rechargeable battery (NiMH I guess - use 4 cells for 5.3V or 3 for 4V; either should work with a 16MHz crystal).

Definitely avoid multiple cable runs; cabling gets costly for long runs, and at low voltage the voltage drop becomes an issue (less so for low currents, of course).

Work out how much resistance in total you can accept between battery and the load, and budget your cable accordingly.
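MarkT's budgeting step can be sketched in a few lines. This is not from the thread: the allowed voltage drop, load current and per-metre AWG resistances below are illustrative assumptions (resistances are standard values for solid copper).

```python
# Sketch: budget the total cable resistance from an acceptable voltage
# drop, then pick the thinnest wire gauge that fits the budget.
# AWG resistances in ohms per metre (solid copper, approximate).
AWG_OHMS_PER_M = {18: 0.0210, 20: 0.0333, 22: 0.0530, 24: 0.0842}

def max_loop_resistance(allowed_drop_v, load_current_a):
    """Total out-and-back cable resistance that keeps the drop acceptable."""
    return allowed_drop_v / load_current_a

def smallest_gauge(run_length_m, allowed_drop_v, load_current_a):
    """Thinnest wire (highest AWG number) whose loop resistance fits."""
    budget = max_loop_resistance(allowed_drop_v, load_current_a)
    loop_m = 2 * run_length_m  # current flows out and back
    for awg in sorted(AWG_OHMS_PER_M, reverse=True):  # thinnest first
        if AWG_OHMS_PER_M[awg] * loop_m <= budget:
            return awg
    return None  # needs thicker wire than the table covers

# e.g. a 15 m run, 0.5 V allowed drop, 0.5 A load fits 20 AWG;
# the same run at 2 A needs something thicker than 18 AWG.
```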

You absolutely must check the battery voltage and prevent over-discharge; lead-acid batteries are simply wrecked by over-discharge (11V minimum is a good guide), and even one brief over-discharge will significantly affect capacity and cell balance.  It's best to sense the battery voltage at the battery if you have long cable runs, so you aren't confounded by IR voltage losses.  A combined solar battery charger and monitor is probably a wise investment at that end.
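The guard logic described above can be sketched like this. The 11.0V cutoff follows the guide figure in the post; the 12.5V re-enable threshold and the hysteresis are my assumptions, added so the load doesn't chatter on and off around the cutoff.

```python
# Sketch of an over-discharge guard for a 12 V lead-acid battery:
# sense the battery voltage (at the battery, not the far end of the
# cable) and disconnect the load below a threshold, with hysteresis.
CUTOFF_V = 11.0   # disconnect below this (from the thread)
RESTORE_V = 12.5  # reconnect only once recharged this far (assumption)

def load_allowed(battery_v, currently_on):
    """Return True if the load may draw power right now."""
    if currently_on:
        return battery_v > CUTOFF_V   # stay on until the cutoff
    return battery_v >= RESTORE_V     # once off, wait for a real recharge
```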
[ I won't respond to messages, use the forum please ]

cio74

Thanks for the reply.

I have a solar panel delivering about 100W into a solar charge controller, which charges a 22Ah lead-acid battery. I do not control this process; it's all industrial grade and already built, I just assembled it.

From the LOAD connector of the solar charge controller I will power my 20 light sources. For some reason, not necessarily a good one, I also have 3.6V at each light location, made from 3 x 1.2V 2Ah NiMH cells. They should charge from the main solar power source and, when required, light a CREE high-power LED. The LED is kept at a minimal 3.0V 20mA idle current, which changes to 3V 200mA if the motion detector senses anything in the surroundings. This gives some ambient light at idle, with a floodlight when the detector triggers.

The question was whether I should run 12V or 5V from the solar charger around the garden. Not sure why, but it sounds like it would not matter in the end?


SirNickity

All else being equal, distributing higher voltages means distributing lower currents, which is preferable.  You'll also have more leeway with voltage drops.  Downside is, of course, having to do your conversion at each node.
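The higher-voltage, lower-current trade-off above is easy to put in numbers. The 20W delivered load and the 0.5 ohm cable loop resistance below are illustrative assumptions, not values from the thread.

```python
# Rough check: for the same delivered power over the same cable,
# a higher bus voltage means less current and much less I^2*R loss.
def cable_loss_w(delivered_w, bus_v, loop_ohms):
    """Power wasted heating the cable for a given delivered load."""
    i = delivered_w / bus_v          # bus current
    return i * i * loop_ohms         # I^2 * R loss in the cable

loss_5v = cable_loss_w(20, 5, 0.5)    # 4 A  -> 8 W lost in the cable
loss_12v = cable_loss_w(20, 12, 0.5)  # ~1.67 A -> ~1.4 W lost
```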

Since the LEDs are actively current limited, is the drive circuit capable of handling 12V inputs?  What about your local cell chargers?  Regulating to 5V at each spot for motion detection seems relatively inconsequential with all the other wizardry going on in each of those nodes.  It seems all the requisite electronics should or will do their own DC-to-DC conversion anyway, so you're not adding much complexity.

sparkylabs

Why are you having "local" batteries in your lights? If this is about a self-sufficient system you want as few losses as possible, and charging and discharging batteries loses you energy, as it is not a 100% efficient process. Distribute your power at 12V to minimize current loss, then use switch-mode step-down regulators (NOT LDO regulators, as they just waste (12V - 4.8V) x current as heat in watts, meaning you'd pay for a much bigger panel, maybe three times bigger). There is a cheap 1A regulator in a DIP8 package that costs pence and has most of it built in; you only need the inductor, capacitor and diode.
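The LDO-versus-buck argument above can be checked with arithmetic. The 12V in / 4.8V out values come from the thread; the 200mA load and the 85% buck efficiency are my assumptions.

```python
# A linear (LDO) regulator passes the load current straight through,
# so it burns (V_in - V_out) * I as heat; a buck converter only loses
# its conversion inefficiency.
def ldo_input_power(v_in, v_out, i_out):
    """Input power drawn by an LDO delivering i_out at v_out."""
    return v_in * i_out  # same current flows in as out

def buck_input_power(v_out, i_out, efficiency=0.85):
    """Input power drawn by a buck converter at the given efficiency."""
    return (v_out * i_out) / efficiency

# Delivering 200 mA at 4.8 V from a 12 V bus:
#   LDO draws 12 * 0.2 = 2.4 W to deliver 0.96 W (40% efficient)
#   buck draws about 0.96 / 0.85 = 1.13 W
```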

majenko

What you want are "BEC" devices - Battery Elimination Circuits.  They are basically a switch mode power supply on a little board.

I use these ones: http://www.hobbyking.com/hobbyking/store/uh_viewitem.asp?idproduct=22494

They are ideal for running MCU systems at pretty high efficiency from a 12V battery source.
Get 10% off all 4D Systems TFT screens this month: use discount code MAJENKO10

Chagrin

For 5V DC over 24-gauge copper (like the wires in Ethernet cables), for every 10 feet of distance you lose 0.01V at 20mA or 0.1V at 200mA. Hardly a concern.

http://www.calculator.net/voltage-drop-calculator.html
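The figures above can be reproduced with the standard resistance of 24 AWG copper (about 25.67 ohms per 1000 ft) and the fact that the current flows out and back, doubling the conductor length.

```python
# Voltage drop over a two-conductor run of 24 AWG copper.
OHMS_PER_FT_24AWG = 25.67 / 1000  # standard value for solid copper

def drop_v(distance_ft, current_a, ohms_per_ft=OHMS_PER_FT_24AWG):
    """Round-trip voltage drop: out-and-back doubles the wire length."""
    return 2 * distance_ft * ohms_per_ft * current_a

# 10 ft at 20 mA -> about 0.01 V; 10 ft at 200 mA -> about 0.1 V
```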



SirNickity

When the lights pull 200mA each, and you have a garden where you're paralleling lights for a total distance longer than 10 ft, it very well could be a concern.  (Granted, in this case I would advise using better than 24awg cable.)

majenko


When the lights pull 200mA each, and you have a garden where you're paralleling lights for a total distance longer than 10 ft, it very well could be a concern.  (Granted, in this case I would advise using better than 24awg cable.)

Which is exactly why the power grid doesn't transmit all its power at 110V...  The losses are just too great and the cables would have to be impractically thick.

20 lights at 200mA each, that's 4A.  For 4A you would need 15AWG minimum thickness.  If you can transfer at over double the voltage (i.e., at 12V) you can use a lower current.  Assuming 85% typical efficiency for a switching BEC, you would only need to transmit 1.96A - so you can use 18AWG.  Of course, that's assuming that all lights are on at once, and you use the same cable throughout.  You could use gradually tapering cables as you go further down the garden away from the battery, but that would mean buying lots of different sizes of cable.
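The arithmetic above can be sketched as a small calculation. The 5V load-side voltage is my assumption (so that "delivered power" is the same in both cases); the 20 lights, 200mA and 85% BEC efficiency come from the thread.

```python
# Bus current needed to deliver the same load power at different
# distribution voltages, through converters of a given efficiency.
def bus_current_a(n_lights, i_light_a, load_v, bus_v, bec_eff=1.0):
    """Current drawn from the bus for n lights, given converter efficiency."""
    delivered_w = n_lights * i_light_a * load_v
    return delivered_w / (bus_v * bec_eff)

# 20 lights x 200 mA on a 5 V bus: 4.0 A.
# Same load on a 12 V bus through 85% efficient BECs: about 1.96 A.
```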

To reduce the number of BEC circuits you need you could group the lights into power domains, with all the lights within a certain radius running off one local BEC.  You could have, say, 5 BECs, with 4 lights running off each with the BEC located as close to the center of the lights as possible.

conradin

Just 2 thoughts.
Are your LEDs providing the wavelengths necessary to act as grow lights?  My understanding of sunlight, LEDs and the growth process is that 400-700nm is where it's at for optimal growth.  Going above or below those limits is wasting energy.

My other thought, also regarding efficiency of growth and quality of light, is to ask: what if you PWM your LED output?
PWM can greatly improve your efficiency, but for these purposes I don't know how it would affect the quality of light pertaining to plant growth.  I mean, plants don't have retinas; they don't see, but they do respond biochemically to energy levels, which are affected if you use PWM.  Using PWM for the LED output may still offer some charming efficiency results in your application.

sparkylabs


but they do respond biochemically to energy levels, which are affected if you use PWM.  Using PWM for the LED output may still offer some charming efficiency results in your application.


Which means reducing the PWM duty cycle will reduce energy levels. I didn't realize this was supposed to make plants grow. If you have 500+W/sqm in your back pocket, you're laughing.

SirNickity

I don't think it's for the plants.  Why would grow lights need motion detectors?

majenko


I don't think it's for the plants.  Why would grow lights need motion detectors?

So the lights would only come on if the plants grew? ;)

Henry_Best


I don't think it's for the plants.  Why would grow lights need motion detectors?


Triffids?
