SMPS Working

Hey There,

I have a 5V 3A SMPS which says that it can take input voltages between 100V and 240V AC and frequencies between 50Hz and 60Hz.

How do these work?
How can they accept such a large range of input voltages and still provide a constant output voltage and current?

I know the basic working of SMPS technology, but all the articles I read assume a fixed DC input voltage used to produce a desired voltage below that input. How can such a wide AC range produce a fixed DC input voltage for the SMPS IC?

Thanks in advance!

Hello,

Do you know the concept of PWM?

Typically, SMPS controllers operate with this type of switching: the higher the input voltage, the narrower the pulse width needed to keep the output constant.
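The relation between input voltage and pulse width can be sketched with the ideal buck-converter equation Vout = D × Vin (a simplification; real controllers also compensate for losses and run a feedback loop rather than computing D directly):

```python
# Illustrative only: ideal buck converter in continuous conduction mode,
# where the average output voltage is Vout = D * Vin (D = duty cycle).
def duty_cycle(v_in, v_out):
    """Duty cycle an ideal buck controller would settle on."""
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step down")
    return v_out / v_in

# The higher the input voltage, the narrower the pulse for the same output:
for v_in in (150, 250, 340):
    print(f"Vin={v_in} V -> duty cycle {duty_cycle(v_in, 5):.1%}")
```

Running this shows the duty cycle shrinking as the input rises, which is exactly how the controller rides out a wide input range.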

You can also search for PFC, which deals with power factor correction.

One of the most common ICs used in SMPS designs is the UC3842:

http://noel.feld.cvut.cz/hw/philips/acrobat/5060.pdf

For the PFC, we have the UC3854 as an example:

Hi SagarDev.

There are two things you can learn from the example on page 7 of the first link in the previous post.

First, the AC is rectified.
So the 50 - 60 Hz rating isn't a hard requirement; it's simply what was used when testing your power supply.
It would no doubt be happy to run on around 125 volts DC.

Second: feedback tells the supply it's doing what it is supposed to do.
Take a look at transformer T1.
It doesn't have two coils, but three.
The third coil (lower left side) provides the feedback the SMPS needs to do its thing.
Pin 2 of that controller is named VFB, for Voltage FeedBack.
It needs to see a fixed voltage, which these schematics derive from the 16-volt output through a voltage divider.
As long as the divided 16 volts is present, the controller knows it is working properly.

Of course, this is valid for this particular controller, but other controllers work in a very similar way.
A voltage divider may not be the most accurate approach, but it is a cheap and easy solution.
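As a sketch of the divider arithmetic: the UC3842's VFB pin compares against a 2.5 V internal reference (per its datasheet), so the divider only has to scale the 16 V output down to 2.5 V. The resistor values below are hypothetical, chosen just to show the ratio:

```python
# Sketch of the feedback-divider idea: the controller compares the divided
# output voltage against its internal reference (2.5 V for the UC3842).
# Resistor values are hypothetical, chosen only to illustrate the ratio.
V_REF = 2.5   # volts, UC3842 internal reference at the VFB pin

def divided_voltage(v_out, r_top, r_bottom):
    """Voltage the VFB pin sees from a resistor divider on the output."""
    return v_out * r_bottom / (r_top + r_bottom)

# Pick r_bottom, then solve r_top so 16 V at the output gives 2.5 V at VFB:
r_bottom = 10_000                       # 10 kΩ, arbitrary choice
r_top = r_bottom * (16 / V_REF - 1)     # works out to 54 kΩ
print(f"R_top = {r_top / 1000:.0f} kΩ")
print(f"VFB at 16 V out: {divided_voltage(16, r_top, r_bottom):.2f} V")
```

If the output sags, VFB falls below 2.5 V and the controller widens the pulses; if it rises, the pulses narrow. That closed loop is the whole regulation story.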

SagarDev:
constant output voltage and current

Hello,

Complementing the comments above, we need to make it clear that the phrase "constant output voltage and current" is not accurate.

The most common power supply type regulates only the voltage; the 3 A rating is the maximum current it can supply. The current itself is not fixed at 3 A: it can reach 3 A if the load demands it, but generally the load does not operate at maximum consumption, which also avoids shortening the life of the power supply's components.

When a power supply is built for constant current, the voltage will not be constant; it will vary according to the load characteristic.

The reason the current and the voltage cannot both be held constant at the same time is the load: voltage, current, and load resistance are tied together by Ohm's law.
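A quick illustration of that point, with hypothetical load resistances:

```python
# Why a supply cannot hold both voltage and current constant: the load's
# resistance fixes their ratio (Ohm's law, I = V / R).
def load_current(v_supply, r_load):
    return v_supply / r_load

# A 5 V / 3 A supply: "3 A" is only the maximum it can deliver.
for r in (10.0, 5.0, 2.0):          # three different loads, in ohms
    print(f"R = {r} Ω -> I = {load_current(5.0, r):.2f} A")
# Only a load of about 1.67 Ω would actually draw the full 3 A at 5 V.
```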

You can access these tutorials to know a little more:

  1. DC Circuit Theory
  2. Ohm's Law and Power <---------
  3. Electrical Units of Measure
  4. Kirchhoff's Circuit Law
  5. Mesh Current Analysis
  6. Nodal Voltage Analysis
  7. Thevenin's Theorem
  8. Norton's Theorem
  9. Maximum Power Transfer
  10. Star Delta Transformation
  11. Voltage Sources
  12. Current Sources
  13. Kirchhoff's Current Law
  14. Kirchhoff's Voltage Law

https://www.electronics-tutorials.ws/dccircuits/dcp_1.html

There is plenty of information on the subject on the internet; you can search for videos, images, and texts about this and other related topics:
https://www.google.com/search?q=how+to+smps+work

SagarDev:
I know the basic working of the SMPS Technology. But all the articles I read suggest a fixed DC input voltage to produce a desired voltage below this input.

Then all those articles are wrong, or you've misunderstood. For instance, in DC-DC SMPS it's pretty common to have units that accept an 18 V to 72 V input range (nominally 48 V).

However, there are converters listed as 12 V in, 5 V out, for instance, but if you read the datasheet, you'll find the actual input voltage range is pretty wide; they have just been optimized for a common use case, and the manufacturers want people looking for a 12 V to 5 V converter to actually find their product...

MarkT:
Then all those articles are wrong, or you've misunderstood. For instance, in DC-DC SMPS it's pretty common to have units that accept an 18 V to 72 V input range (nominally 48 V).

However, there are converters listed as 12 V in, 5 V out, for instance, but if you read the datasheet, you'll find the actual input voltage range is pretty wide; they have just been optimized for a common use case, and the manufacturers want people looking for a 12 V to 5 V converter to actually find their product...

Hello,

Yes, it is easier to repeat the basics, so most internet postings are copies of some article that someone once took the time to develop.

The fact is that there are circuits that can lower the voltage (buck converters), and there are also circuits that can raise it (boost converters).

The most common circuit used to raise the voltage is based on an inductor: it is charged magnetically while the switch is on, and when the switch turns off, the collapsing magnetic field produces a voltage spike.
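For the switched version of this idea, the ideal boost-converter relation Vout = Vin / (1 - D) shows how a higher duty cycle yields a higher output (an idealized sketch; real converters have losses and practical duty-cycle limits):

```python
# Ideal boost converter sketch: energy stored in the inductor while the
# switch is on is released at a higher voltage when the switch opens.
# In continuous conduction mode: Vout = Vin / (1 - D).
def boost_output(v_in, duty):
    if not 0 <= duty < 1:
        raise ValueError("duty cycle must be in [0, 1)")
    return v_in / (1 - duty)

print(boost_output(12, 0.5))    # 12 V in at 50% duty -> 24.0
print(boost_output(12, 0.75))   # 12 V in at 75% duty -> 48.0
```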

Just look at the high-voltage coil of a car, which is something quite simple, perhaps even magical, for those who do not know how it works.

Image source: Faraday's Law and Auto Ignition

The PFC circuit, for example, typically receives the variable input voltage (90 ~ 240 V AC) converted to DC in the rectifier (Vdc = Vac * 1.41, i.e. Vac * √2), filtered with capacitors to ride through moments of low voltage, and then, by switching an inductor, boosts it to about 400 V DC. That 400 V DC is then used in the main conversion stage, say 400 V DC to 12 V DC, or any other value, such as 400 V DC to 5000 V DC (5 kV), etc.
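Rough numbers for that front end (idealized: ignoring ripple and diode drops):

```python
import math

# The rectifier/filter charges the bulk capacitors to roughly the AC peak
# (Vac * sqrt(2) ≈ Vac * 1.41); the PFC boost stage then raises that to a
# bus sitting above the highest possible peak, typically about 400 V DC.
def rectified_peak(v_ac_rms):
    return v_ac_rms * math.sqrt(2)

for v_ac in (90, 120, 230, 240):
    print(f"{v_ac} V AC -> about {rectified_peak(v_ac):.0f} V DC peak")
# Even at 240 V AC the peak (~339 V) stays below the ~400 V boost bus,
# so one boost stage covers the whole universal input range.
```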

An interesting detail: those computer power supplies with a 110 V / 220 V selector switch (I believe only a few countries still have this model on the market) actually reconfigure the rectifier circuit as a voltage doubler when the switch is set to 110 V mode.
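The arithmetic behind the doubler trick (idealized peaks, ignoring diode drops) shows why the rest of the supply never notices which position the switch is in:

```python
import math

# In 230 V mode the bridge rectifier charges the bus to one AC peak.
# In 115 V mode the doubler stacks two half-cycle peaks, landing at
# roughly the same bus voltage either way.
def bridge_rectified(v_rms):
    return v_rms * math.sqrt(2)          # full bridge: one peak

def doubler(v_rms):
    return 2 * v_rms * math.sqrt(2)      # doubler: two stacked peaks

print(f"115 V doubled:   {doubler(115):.0f} V DC")           # ~325 V
print(f"230 V rectified: {bridge_rectified(230):.0f} V DC")  # ~325 V
```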

Selector Switch Detail (115V/230V):