The idea is to put the generator at its optimal power point to maximize energy harvesting. This is usually done by driving a DC/DC converter with PWM; by varying the PWM duty cycle, the current and voltage drawn from the generator can be adjusted.
So far I understand the concept.
Let's say we use a 12 V battery. How can the DC/DC converter put the generator at the right operating point and output the right battery charging voltage at the same time?
For example, the generator produces 30 V free-running at 3 m/s, and the optimal voltage for that wind speed is 15 V. In my understanding I would adjust the PWM of the buck converter to reach 15 V at its input, but that would lead to some output voltage that is probably not the desired battery charging voltage.
Most designs use only one DC/DC converter. From my understanding, another module would be needed after the first converter to ensure the right charging voltage for the battery.
jacko91:
For example, the generator produces 30 V free-running at 3 m/s, and the optimal voltage for that wind speed is 15 V. In my understanding I would adjust the PWM of the buck converter to reach 15 V at its input.
I wonder if there is a basic misunderstanding in that - but I'm not sure what you mean by "input" - input into what?
The optimal state for a generator is a combination of voltage and current (not either one on its own) - the state at which volts * amps gives you the greatest number. With a wind generator (as with solar panels on a cloudy day) the optimum point can be expected to change frequently and rapidly. Also, the voltage is largely determined by the speed of the generator, so at 30 V it will be running faster than it would at 15 V. But to slow it down it needs a heavier load - i.e. a greater current must be drawn from it.
Control electronics can be used to boost the output voltage (from the electronics) so as to increase the power being fed into a load. This is probably easiest to comprehend if you imagine the power being fed into a resistive heater. By Ohm's law a higher voltage means more current and a hotter heater. By increasing the power that is fed into the load there is a corresponding increase in the power demanded from the generator - the power going into the electronics. If the electronics draw more current from the generator, that slows it down and the generator voltage falls. The trick is to figure out the optimum point.
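Just to make "figure out the optimum point" concrete, here is a rough sketch of the classic perturb-and-observe approach in Arduino-style C++. The pins, the divider and shunt scaling, and driving the converter straight from analogWrite() are all assumptions for illustration, not a tested charger design:

```cpp
// Minimal perturb-and-observe MPPT sketch (illustration only).
// Assumed wiring: generator voltage on A0 via a divider, generator
// current on A1 via a shunt amplifier, buck converter PWM on pin 9.
const int PIN_V   = A0;
const int PIN_I   = A1;
const int PIN_PWM = 9;

const float V_PER_COUNT = 30.0 / 1023.0;   // assumed divider scaling
const float I_PER_COUNT = 5.0  / 1023.0;   // assumed shunt scaling

int duty = 128;          // start near 50 % duty cycle
int step = 4;            // perturbation size
float lastPower = 0.0;

void setup() {
  pinMode(PIN_PWM, OUTPUT);
  analogWrite(PIN_PWM, duty);
}

void loop() {
  float v = analogRead(PIN_V) * V_PER_COUNT;   // generator volts
  float i = analogRead(PIN_I) * I_PER_COUNT;   // generator amps
  float power = v * i;                         // the number we want to maximise

  // If the last perturbation reduced the power, reverse direction.
  // Otherwise keep going the same way. The duty cycle slowly climbs
  // to the maximum power point and follows it as the wind changes.
  if (power < lastPower) {
    step = -step;
  }
  duty = constrain(duty + step, 10, 245);
  analogWrite(PIN_PWM, duty);

  lastPower = power;
  delay(100);   // let the converter and the generator settle
}
```

The point is that the code never aims for a particular voltage; it just keeps nudging the duty cycle in whichever direction gives more volts * amps.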
Things are considerably more complicated if the power is being fed into a battery because there is a limit to the capacity of the battery and the control electronics must be designed so it does not overcharge the battery - which is a combination of not charging it too quickly and stopping the charge when the battery is full. In other words, when the battery is full all the wind energy must go to waste, or be diverted into a different use. Of course if the output of the generator is small in comparison to the battery capacity this is less of an issue.
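And purely to illustrate the "stop charging or divert the energy" part, a guard like the following could sit alongside the tracking loop. The 14.4 V threshold is only a ballpark figure for a 12 V lead-acid bank, and the pins and scaling are again assumptions:

```cpp
// Illustrative battery guard with a dump load (assumed values,
// not a complete charge controller).
const int PIN_VBATT = A2;       // battery volts via a divider (assumed)
const int PIN_DUMP  = 8;        // MOSFET/relay driving a dump load (assumed)

const float VBATT_PER_COUNT = 20.0 / 1023.0;  // assumed divider scaling
const float VBATT_MAX = 14.4;   // rough absorption limit for 12 V lead-acid

void setup() {
  pinMode(PIN_DUMP, OUTPUT);
}

void loop() {
  float vBatt = analogRead(PIN_VBATT) * VBATT_PER_COUNT;
  // When the battery is full, divert the turbine into the dump load so
  // it stays loaded (and slowed) instead of pushing more charge in.
  digitalWrite(PIN_DUMP, vBatt >= VBATT_MAX ? HIGH : LOW);
  delay(500);
}
```

A real controller would add hysteresis and a proper charge profile; this only shows where the "waste it or divert it" decision fits.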
Robin2:
I wonder if there is a basic misunderstanding in that - but I'm not sure what you mean by "input" - input into what?
I mean the input of the DC/DC converter, so basically the rectified DC voltage from the generator.
I think what I was missing is that the battery essentially sets the output voltage of the converter.
So the boost or buck converter is used to adjust the current flowing through it, and by adjusting that current it also adjusts the voltage at the generator.
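To put some rough numbers on that (assuming an ideal, lossless buck converter in continuous conduction, which is a simplification): with the output clamped near the battery voltage, the duty cycle decides what voltage the converter presents to the generator, because Vout is roughly D * Vin. The figures below are made up for illustration:

```cpp
// Illustrative numbers only: ideal buck converter, output clamped by
// the battery, so the duty cycle sets the voltage the generator sees.
void setup() {
  Serial.begin(9600);

  float vBatt = 13.0;           // output held near the battery voltage
  float duty  = 13.0 / 15.0;    // ~0.87 duty cycle...
  float vGen  = vBatt / duty;   // ...pulls the input to about 15 V (Vin = Vout / D)
  float iBatt = 2.3;            // assumed charge current into the battery
  float iGen  = iBatt * duty;   // ideal buck: input current = D * output current, ~2 A

  Serial.print("Generator sees ~");
  Serial.print(vGen);
  Serial.print(" V and supplies ~");
  Serial.print(iGen);
  Serial.println(" A");
}

void loop() {}
```

In reality the generator's own voltage/current curve at that wind speed decides where it actually settles; the converter just decides how hard it pulls.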
I just completed a solar power project. The solar cell is rated at 12 V, and during the day it puts out anywhere from 0 V to 11 V; my need is to get the output to 7 V. In my configuration the buck/boost converter starts working when the cell is producing 4 V: from 4 V to 7 V the output is boosted, and above 7 V the DC-DC converter bucks it down.
I would figure that when the blades turn slowly the output is low and needs to be boosted to operational values, and when the blades are turning fast the output needs to be bucked.
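In case it helps, the mode selection I described could look roughly like this in Arduino-style C++. The 4 V and 7 V thresholds are the ones from my setup; the pin and the scaling factor are just placeholders:

```cpp
// Illustrative mode selection for a buck/boost stage feeding a 7 V rail.
// Thresholds as described above; pin and scaling are placeholders.
const int PIN_VPANEL = A0;
const float V_PER_COUNT = 12.0 / 1023.0;   // assumed divider scaling

enum Mode { IDLE, BOOST, BUCK };

Mode pickMode(float vPanel) {
  if (vPanel < 4.0) return IDLE;    // too little input to bother with
  if (vPanel < 7.0) return BOOST;   // step 4-7 V up to 7 V
  return BUCK;                      // step anything above 7 V down to 7 V
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  float vPanel = analogRead(PIN_VPANEL) * V_PER_COUNT;
  Mode m = pickMode(vPanel);
  Serial.println(m == IDLE ? "idle" : (m == BOOST ? "boost" : "buck"));
  delay(1000);
}
```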
jacko91:
I think what I was missing is that the battery essentially sets the output voltage of the converter.
I think it would be more accurate to say that the battery imposes a limit on the output voltage from the converter.
Another way to think of it is that the power taken from the generator (gen volts * gen amps) must be the same as the power going into the battery (batt volts * batt amps) plus the power loss in the electronics.
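With made-up numbers for illustration: if the generator is held at, say, 15 V and 2 A (30 W) and the converter is about 90 % efficient, a 13 V battery ends up taking a bit over 2 A:

```cpp
// Power balance with made-up numbers: generator power in, minus the
// converter loss, equals battery power out.
void setup() {
  Serial.begin(9600);

  float genPower   = 15.0 * 2.0;   // 15 V * 2 A = 30 W taken from the generator
  float efficiency = 0.90;         // assumed converter efficiency
  float battVolts  = 13.0;
  float battAmps   = genPower * efficiency / battVolts;   // roughly 2.08 A

  Serial.print("Battery charge current: ");
  Serial.print(battAmps);
  Serial.println(" A");
}

void loop() {}
```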
I have some solar panels that help to charge some lead acid batteries and reduce the amount of diesel fuel needed for charging. At the time I bought the panels (2011) I figured that I would not get enough value from an MPPT charger to justify its cost because of the limited capacity of the batteries. I can't take the risk of doing no engine-charging in case there is no sunshine tomorrow. Even without an MPPT charger I had estimated the solar panels would pay for themselves within 3 years and they easily achieved that.
Another thing to keep in mind (if you are using lead-acid batteries) is that they suffer serious damage if they are left discharged for any length of time. They really need a very full charge (8 hours or more of charging without being discharged) about once a week. It is not unusual to have a windless week.
Robin2:
I think it would be more accurate to say that the battery imposes a limit on the output voltage from the converter.
Another way to think of it is that the power taken from the generator (gen volts * gen amps) must be the same as the power going into the battery (batt volts * batt amps) plus the power loss in the electronics.
...R
Thanks, good explanations.
I totally get it now. Time to move on to the implementation.