How to make a variable-current (ampere) power supply?

I would like to make a power supply that offers both variable voltage output and variable current (ampere) output.

I can make the variable voltage part with an LM317 or LM338.

But how do I make the variable current circuit?

Hi Joy!

You can add a shunt resistor to the circuit and measure the voltage across it. That voltage is proportional to the current through the shunt and can be used as feedback to the voltage supply.
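For illustration, a minimal Arduino sketch of that idea (assuming a 0.1 ohm low-side shunt whose voltage is amplified by a hypothetical gain-20 op-amp stage into A0; adjust the constants to your own circuit):

```cpp
// Minimal sketch: read current via a low-side shunt resistor.
// Assumptions (adjust to your hardware): 0.1 ohm shunt, op-amp
// gain of 20 into analog pin A0, 5.0 V ADC reference.

const float SHUNT_OHMS = 0.1;   // shunt resistance
const float AMP_GAIN   = 20.0;  // op-amp gain on the shunt voltage
const float VREF       = 5.0;   // ADC reference voltage

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(A0);                         // 0..1023
  float vShunt = (raw * VREF / 1023.0) / AMP_GAIN;  // volts across the shunt
  float amps   = vShunt / SHUNT_OHMS;               // Ohm's law: I = V / R
  Serial.print("Current: ");
  Serial.print(amps, 3);
  Serial.println(" A");
  delay(200);
}
```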

Do you know what I mean?

It's difficult for me to find the right words; English is not my native language.

lg oxyl

Add-on: you can also use a Hall-effect sensor, either with an I2C interface or with a linear voltage output. The advantages of this are a lower insertion impedance (important for higher currents, and less loss) and the option of galvanic isolation between the control circuit and the load circuit.
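A sketch of the linear-output variant (assuming an ACS712-05B module, which outputs 2.5 V at zero current and 185 mV per ampere, and which galvanically isolates the sensed path; an I2C part would be read over Wire instead):

```cpp
// Sketch for a Hall-effect current sensor with linear voltage output.
// Example assumes an ACS712-05B (185 mV/A, output centered at VCC/2);
// substitute your sensor's sensitivity and zero-current offset.

const float VREF        = 5.0;    // ADC reference voltage
const float ZERO_AMP_V  = 2.5;    // sensor output at 0 A (VCC / 2)
const float SENSITIVITY = 0.185;  // volts per ampere

void setup() {
  Serial.begin(9600);
}

void loop() {
  float vOut = analogRead(A0) * VREF / 1023.0;   // sensor output voltage
  float amps = (vOut - ZERO_AMP_V) / SENSITIVITY;
  Serial.println(amps, 3);
  delay(200);
}
```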

The LM317 can be used as both a variable voltage and a variable current (ampere) regulator.

See, for example:

http://www.techlib.com/electronics/regulators.html

The LM317 is also often used as a constant-current source, e.g. to drive LEDs.
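For reference, in that configuration the LM317 regulates its 1.25 V reference between the OUT and ADJ pins across a single program resistor R, so I_OUT = 1.25 V / R; for example, R = 62 ohms gives roughly 20 mA, a typical LED current.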

First of all, let's figure out what you are really after, so you will know what to call it in the future.

Any adjustable-voltage power supply can deliver a variable amount of current into a fixed load resistance: just raise the voltage to raise the current and lower the voltage to lower the current. That's basic Ohm's law.

Now, some bench-type DC power supplies incorporate additional modes of operation besides plain voltage output control. One such mode is an adjustable 'current LIMIT control'. Here one can still vary the DC output voltage into a fixed load resistance, with the current following Ohm's law as before; however, if the current reaches the adjustable set-point of the limit control, the voltage is not allowed to increase any further, but stays clamped at the value that triggered the limit. And of course, decreasing the voltage control still allows the current to decrease.

Another adjustable current mode of operation is called 'constant current' mode. Here a desired amount of current is set, and the power supply automatically changes its output voltage, up or down, to keep the current at the desired value through the load resistance. Of course, the supply can only maintain the constant current up to its maximum voltage limit; it could not force a constant current into an open circuit no matter how high a voltage it could attain. Again, Ohm's law rules the roost.
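As a quick worked example: with the supply set for a constant 1 A into a 5 ohm load, the output sits at 5 V (V = I x R); if the load rises to 8 ohms, the supply raises its output to 8 V to hold 1 A, and it keeps tracking like that until it runs out of voltage headroom.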

So to state it a different way, let's assume you have a fixed 10 ohm 'load' resistor attached to the power supply's output. An adjustable DC power supply with both a variable voltage output and a variable constant-current mode still could NOT force a constant 2 amp current flow while also supplying a constant 10 volts output at the same time (2 A through 10 ohms requires 20 V, while 10 V across 10 ohms yields only 1 A). Dr. Ohm says that is against his law, and he will put you into electron jail if you attempt it.

So, which modes of operation do you want for your DC power supply?

Adjustable regulated voltage output?
Adjustable maximum current limiting?
Adjustable constant current output?

And keep in mind that the first and third modes cannot be active at the same time.

Lefty

Hello retrolefty!

Nice write-up! Let me just add a comment...

You're right that the first and third modes cannot be active at the same time, but it is possible to provide both modes in one circuit, so you can change the operating mode whenever you want. Maybe that's not important for Joy's project, but if Joy wants to build a power supply for testing or similar, it will be very helpful to have all modes available.

lg oxyl

but it is possible to provide both modes in one circuit, so you can change the operating mode whenever you want.

Yes, I know that. The used industrial lab DC supply I bought on eBay years ago has both variable voltage and adjustable current limiting, but not the adjustable constant-current mode of operation. I'm sure there are some that add adjustable constant current as well; they are just not quite as common.

OP:
Make no mistake, designing and building a supply with all those adjustable modes is not a trivial effort. Once you have decided on the modes you wish to have, you then need to decide on the maximum voltage and current you wish to have available from the supply, as the specific values will have a big effect on which devices you can use in the design. An adjustable DC power supply that provides 0-40 VDC at up to 10 amps has to deal with how to 'waste' 390 watts of heat if it is set to provide a constant 1 volt output into a 0.1 ohm load (that's 10 amps out, and a linear pass element dropping the remaining 39 volts dissipates (40 V - 1 V) x 10 A = 390 W).

Lefty

retrolefty:
The used industrial lab DC supply I bought on eBay years ago has both variable voltage and adjustable current limiting, but not the adjustable constant-current mode of operation.

... then what is the difference between setting a current limit and turning the voltage to max, versus an adjustable constant-current source?

then what is the difference between setting a current limit and turning the voltage to max, versus an adjustable constant-current source?

Well, in the mode you set up, if the load resistance increased, the current would drop below the current limit, whereas in a true constant-current mode the supply would automatically adjust its output voltage to maintain a constant current into the new load resistance.

Current limiting = allow any current to flow to the load, from 0 up to the set current limit.

Constant current = adjust the output voltage up or down to maintain a constant current value into the load.
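To make the two definitions concrete, here is a toy Arduino-style simulation (plain Ohm's-law arithmetic on an ideal supply, not a real regulator design) that prints how each mode responds as the load resistance rises:

```cpp
// Toy model of an ideal supply with a voltage setting, a current
// setting, and a 30 V maximum output. Plain Ohm's-law arithmetic.

// Current-limit mode: output the set voltage unless that would push
// the current past the limit, in which case clamp the voltage down.
float currentLimitOutput(float vSet, float iLimit, float rLoad) {
  float vClamp = iLimit * rLoad;        // voltage at which the limit trips
  return (vSet < vClamp) ? vSet : vClamp;
}

// Constant-current mode: pick whatever voltage forces the set current,
// capped only by the supply's maximum voltage.
float constantCurrentOutput(float iSet, float vMax, float rLoad) {
  float v = iSet * rLoad;               // V = I * R
  return (v < vMax) ? v : vMax;
}

void setup() {
  Serial.begin(9600);
  // Load rises from 5 to 20 ohms; settings: 10 V set, 1 A, 30 V max.
  // Limit mode lets the current fall below 1 A once R exceeds 10 ohms;
  // constant-current mode keeps raising the voltage to hold 1 A.
  for (float r = 5; r <= 20; r += 5) {
    float vCL = currentLimitOutput(10.0, 1.0, r);
    float vCC = constantCurrentOutput(1.0, 30.0, r);
    Serial.print("R=");                Serial.print(r);
    Serial.print(" ohm  limit-mode: "); Serial.print(vCL / r);
    Serial.print(" A  CC-mode: ");      Serial.print(vCC / r);
    Serial.println(" A");
  }
}

void loop() {}
```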

Lefty

retrolefty:
Well, in the mode you set up, if the load resistance increased, the current would drop below the current limit, whereas in a true constant-current mode the supply would automatically adjust its output voltage to maintain a constant current into the new load resistance.

As is the case with a current limit, up to the max supply voltage.

The point is, I don't really think there is a difference, because a constant-current source can also only maintain its current up to the supply's voltage limit.

To make the thread complete, let's talk about stepped regulation.

I will pick up the scenario drawn above, with 1 ohm / 1 A on a 0-40 V supply.

But first, there is a mistake in your calculation: you 'only' waste 39 W, not 390 W. You can decrease this by using a stepped voltage supply...

BenF:

retrolefty:
Well, in the mode you set up, if the load resistance increased, the current would drop below the current limit, whereas in a true constant-current mode the supply would automatically adjust its output voltage to maintain a constant current into the new load resistance.

As is the case with a current limit, up to the max supply voltage.

The point is, I don't really think there is a difference, because a constant-current source can also only maintain its current up to the supply's voltage limit.

I agree in your set-up, but that is a corner-case situation: it only applies because the power supply is already at its maximum possible output value. Current limiting and constant current are different modes of operation.

Again, constant-current mode will attempt to maintain a specific amount of current flow by changing the output voltage up or down, but of course only up to its maximum possible voltage. Current limiting does not prevent less current from flowing to the load if the load increases its resistance, correct?

Lefty

oxyl:
To make the thread complete, let's talk about stepped regulation.

I will pick up the scenario drawn above, with 1 ohm / 1 A on a 0-40 V supply.

But first, there is a mistake in your calculation: you 'only' waste 39 W, not 390 W. You can decrease this by using a stepped voltage supply...

Didn't I have a decimal point before the resistance value? Just checked; yes, I did. The supply is set to deliver 1 VDC at 10 amps into a 0.1 ohm load. That's 10 watts in the load, so the power supply is dissipating what?

You can decrease this by using a stepped voltage supply...

Yes, I know; I've seen it done by switching transformer secondary winding taps. It is not a trivial task if it has to happen automatically, or even a safe one if done manually while the load is still attached and powered on.
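As a rough illustration of the gain (assuming, say, secondary taps giving 10/20/30/40 V rails): for the 1 V / 10 A case, the regulator could run from the 10 V tap and dissipate (10 V - 1 V) x 10 A = 90 W instead of 390 W.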

Lefty

retrolefty:
I agree in your set-up, but that is a corner-case situation: it only applies because the power supply is already at its maximum possible output value. Current limiting and constant current are different modes of operation.

No, it really is the same thing.

If you change the load, the current will stay at the set limit, but the voltage is allowed to change between 0 V and max V for your supply (subject only to the power limits of the supply itself). I think you're confusing this with the max voltage setting; that is just an additional capability not found in a constant-current-only regulator. The current limit setting, however, is in itself a constant-current regulator. There is no such thing as an absolute constant-current source into any load, so 'current limiting' is a more appropriate name for what it really is and how it works. 'Current limiting' is typically the term used when the max voltage is adjustable, whereas 'constant current' is the term used when the max voltage is fixed.

I think you're in for a treat if you look at your lab supply with new eyes. Through the adjustable current limiting you have an excellent adjustable constant-current source for charging your NiCd batteries, powering an LED (irrespective of its forward voltage), or just about any other task requiring constant current.

I think I was not able to make all of you understand my requirement properly...

I would actually like to make a power supply unit with two selectors on the front:
with one you will be able to select the voltage output,
and with the other you can select the current (ampere) output.

For example, set the voltage to anything from 2 - 30 volts (let's say you selected 12 volts),
then select your current output (say, you may set it anywhere from 0.002 to 10 amps).

I hope I am clearer now...

There is no such thing as an absolute constant-current source into any load, so 'current limiting' is a more appropriate name for what it really is and how it works.

We will just have to leave it as agreeing to disagree.

Lefty

For example, set the voltage to anything from 2 - 30 volts (let's say you selected 12 volts),
then select your current output (say, you may set it anywhere from 0.002 to 10 amps).

OK, you set it for a fixed 12 VDC and, say, 1 amp constant current; now what will you be powering with those settings? The load resistance has to have a say in this!

Never mind, I have failed at making my point before and don't think I can do better. :wink:

Lefty

Joy:
I think I was not able to make all of you understand my requirement properly...

I would actually like to make a power supply unit with two selectors on the front:
with one you will be able to select the voltage output,
and with the other you can select the current (ampere) output.

For example, set the voltage to anything from 2 - 30 volts (let's say you selected 12 volts),
then select your current output (say, you may set it anywhere from 0.002 to 10 amps).

I hope I am clearer now...

Is your goal to learn how to design an adjustable power supply, or is it to build a power supply for other things? If it's the latter, it will probably be cheaper in the long run to just buy the supply you are after (depending on your final specs, a 30 VDC / 10 amp power supply is not going to be easy to build, though you may find it cheaper than buying one - maybe).

cr0sh:

Joy:
I think I was not able to make all of you understand my requirement properly...

I would actually like to make a power supply unit with two selectors on the front:
with one you will be able to select the voltage output,
and with the other you can select the current (ampere) output.

For example, set the voltage to anything from 2 - 30 volts (let's say you selected 12 volts),
then select your current output (say, you may set it anywhere from 0.002 to 10 amps).

I hope I am clearer now...

Is your goal to learn how to design an adjustable power supply, or is it to build a power supply for other things? If it's the latter, it will probably be cheaper in the long run to just buy the supply you are after (depending on your final specs, a 30 VDC / 10 amp power supply is not going to be easy to build, though you may find it cheaper than buying one - maybe).

Truthfully, I was always clear about how the voltage output from a source can be increased or decreased by voltage regulators like the LM317 or LM338...

But a few days back I saw a charger in a battery shop which had a variable resistor (pot) on the front of the charger, by which the user can set its current output.
Since then I have not been able to figure out how it is controlled... And how can I measure the current output of the charger? (Mind you, I don't want to measure the amps consumed by any load; I want to measure the current output of the charger itself, as set by the pot.)

Joy:
But a few days back I saw a charger in a battery shop which had a variable resistor (pot) on the front of the charger, by which the user can set its current output.
Since then I have not been able to figure out how it is controlled... And how can I measure the current output of the charger? (Mind you, I don't want to measure the amps consumed by any load; I want to measure the current output of the charger itself, as set by the pot.)

The amp output from the charger and the amps consumed by the load will always be equal. This is fundamental physics, as explained by Kirchhoff, and there is nothing we can do about it other than accept it as a fact and learn to use it to our advantage.

At the fundamental level, a power supply can only control its output voltage (not current). From Ohm's law, however, we know there is a simple relationship between load resistance and current. So if we add feedback where we measure the current through the load, we can continuously adjust the voltage in such a way that the current remains constant: if the current is too low, we increase the output voltage, and if the current is too high, we reduce it. This is done in a regulation loop until the output/load current matches our set target.

The control you saw on the charger is used to set an upper current limit. When we charge a battery (say, SLA), the internal battery resistance is initially very low, and so current limiting may be required (partly to protect the charger and partly to protect the battery). Say we charge at 14.4 V, but with a 2 amp current limit. The charger will then initially supply 2 A, but at a lower voltage. The actual voltage (as controlled by the charger) is determined from the current feedback in accordance with Ohm's law. As the internal battery resistance increases, the voltage will increase so that the current output stays at 2 A. When the charger reaches 14.4 V, the current will start to drop off as the internal resistance increases further and the battery approaches full charge.
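A bare-bones version of that regulation loop, sketched for an Arduino driving a hypothetical pass element from PWM pin 9 (readVolts() and readAmps() are placeholders for whatever feedback you build; the 4:1 divider and 0.5 V-per-amp sense gain below are assumptions, not a finished design):

```cpp
// Bare-bones CC/CV regulation loop: raise the output while both
// voltage and current are below their targets, lower it when either
// is exceeded. readVolts()/readAmps() stand in for your feedback
// circuits (e.g. a resistive divider and a shunt amplifier).

const float V_TARGET = 14.4;  // charge voltage (CV phase)
const float I_LIMIT  = 2.0;   // charge current limit (CC phase)
const int   PWM_PIN  = 9;

int drive = 0;  // 0..255 PWM duty driving the pass element

float readVolts() { return analogRead(A0) * 5.0 / 1023.0 * 4.0; } // assumes 4:1 divider
float readAmps()  { return analogRead(A1) * 5.0 / 1023.0 / 0.5; } // assumes 0.5 V per amp

void setup() {
  pinMode(PWM_PIN, OUTPUT);
}

void loop() {
  // Nudge the output one step per pass; the loop rate sets how fast
  // the supply responds to load changes.
  if (readVolts() < V_TARGET && readAmps() < I_LIMIT) {
    if (drive < 255) drive++;   // below both targets: push the output up
  } else {
    if (drive > 0) drive--;     // at/over either target: back off
  }
  analogWrite(PWM_PIN, drive);
  delay(2);
}
```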

Here's a link to a design for a linear lab supply, with a fairly good explanation of how it works and which components are used. An SLA battery charger works according to the same basic principle: control the output voltage in response to current and voltage feedback.

http://tuxgraphics.org/electronics/200707/bench-power-supply-unit.shtml

I'm working on a similar project based on this design:

http://powersupplycircuit.blogspot.com/2009/07/mini-bench-power-supply-circuit.html

Since this is a totally analog solution, controlled by two simple voltage dividers, I'm working on an upgrade to control and monitor it with an Arduino.
Modifying the electronics to supply more current does not seem that complicated; more voltage might be.
It will feature constant-current, constant-voltage, and current-limiting modes.

To get nice resolution I got a TDA1543, which proved easy to use; it's a dual 16-bit DAC, and its two outputs are useful for voltage and current. The next step will be using a better ADC to get more accurate readings; I've bought some ADS1211 24-bit ADCs for that purpose.

Then I plan to make a voltage and a current meter to show the actual readings. The settings will be controlled with rotary encoders. So far I've just tested and simulated the parts, and it's a lot of fun!

And so far, in terms of the local market, it's still much cheaper than buying something ready-made.