I want to power an NRF24L01+PA+LNA from a 5V source. As far as I know I have two options: use a buck converter to step the voltage down to 3.3V, or use a voltage regulator (I would use the AMS1117).
Which is the better solution? I've heard that buck converters are inefficient at low currents, and voltage regulators become very inefficient at higher currents. Which approach is optimal?
ningaman151:
I've heard that buck converters are inefficient at low currents, and voltage regulators become very inefficient at higher currents. Which approach is optimal?
That's correct, and so, as I am sure you realise, the answer depends on the project. There will be some projects where the voltage regulator will be optimal, others where the buck converter will be optimal, and some where either will be equally optimal.
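To put rough numbers on that trade-off, here is a quick back-of-the-envelope sketch (plain C++, just for the arithmetic). The figures are assumptions, not datasheet values: a 5V input, a 3.3V output, a peak TX current of roughly 115mA for a PA+LNA module, and about 85% buck converter efficiency at that load. Check your own module's and converter's datasheets before trusting any of them.

```cpp
// Rough efficiency comparison: linear regulator vs. buck converter.
// All figures below are assumptions for illustration; check the datasheets.
#include <cstdio>

int main() {
    const double vin     = 5.0;   // supply voltage (V)
    const double vout    = 3.3;   // regulated output (V)
    const double iload   = 0.115; // assumed peak TX current of a PA+LNA module (A)
    const double buckEff = 0.85;  // assumed buck converter efficiency at this load

    // A linear regulator passes the load current and burns the excess voltage as heat.
    double linEff  = vout / vin;           // roughly 66 %
    double linLoss = (vin - vout) * iload; // power dissipated in the regulator (W)

    // A buck converter's loss depends on its efficiency at this operating point.
    double outPower = vout * iload;
    double buckLoss = outPower / buckEff - outPower;

    printf("Linear: efficiency %.0f %%, dissipation %.0f mW\n", linEff * 100, linLoss * 1000);
    printf("Buck:   efficiency %.0f %%, dissipation %.0f mW\n", buckEff * 100, buckLoss * 1000);
    return 0;
}
```

At these currents either part is well within its ratings; the linear regulator just wastes roughly three times as much power here (about 0.2W vs 0.07W), which shows up as heat and, on a battery, as lost run time.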
For a project where the circuit sits in a low-power/sleep state for long periods and wakes occasionally to take sensor readings and quickly transmit them before returning to sleep, a voltage regulator will probably be more efficient, provided you choose the right regulator.
For battery powered circuits with a very long life between recharges, e.g. weeks or months, I use MCP1700, MCP1702 or similar. These perform much better than AMS1117 in this situation because they have a very low voltage drop and a very low quiescent current.
However, the MCP1700 has a much lower maximum input voltage and can supply far less current than the AMS1117.
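To see how much the quiescent current matters in a sleep-mostly design, here is a minimal sketch. The regulator figures are typical datasheet values as I remember them (AMS1117 ground-pin current around 5mA, MCP1700 around 1.6uA) and the 2000mAh battery is an arbitrary assumption; verify both against the actual datasheets.

```cpp
// How the regulator's quiescent current alone drains a battery while the load sleeps.
// Values are assumed/typical figures for illustration; verify against the datasheets.
#include <cstdio>

int main() {
    const double batteryCapacity_mAh = 2000.0; // assumed battery capacity
    const double iq_ams1117_mA = 5.0;          // AMS1117 quiescent (ground-pin) current, approx. typical
    const double iq_mcp1700_mA = 0.0016;       // MCP1700 quiescent current, approx. 1.6 uA typical

    // Ignore the sleeping load itself: how long before the regulator alone empties the battery?
    printf("AMS1117: about %.0f days\n", batteryCapacity_mAh / iq_ams1117_mA / 24.0);
    printf("MCP1700: about %.0f days\n", batteryCapacity_mAh / iq_mcp1700_mA / 24.0);
    return 0;
}
```

With those numbers the AMS1117 by itself would flatten the battery in a couple of weeks, while with the MCP1700 the battery's own self-discharge would give out long before the regulator did.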
PaulRB:
I use MCP1700, MCP1702 or similar. These perform much better than AMS1117 in this situation because they have a very low voltage drop and a very low quiescent current.
Isn't the voltage drop constant no matter what voltage regulator you use? (5V - 3.3V = 1.7V)
Yes, sorry. What I meant to say is the minimum voltage drop (the dropout voltage), i.e. the smallest input-output difference at which the regulator can still hold its output in regulation.
Suppose your circuit is designed to run at 3.3V but in practice will run ok down to 3.0V, and you want to power it with a li-ion battery. Li-ion batteries have a "nominal" voltage of 3.7V but are 4.2V when fully charged and drop to around 3.0V when empty.
If you used an AMS1117, which has a minimum voltage drop of around 1.0V (even more if a high current is drawn), then with a fully charged li-ion it would only output 3.2V, which is just enough. As soon as the battery voltage falls below 4.0V, the output won't be enough, and at that point the battery still has most of its charge.
If you used an MCP1700, which has a minimum drop of 0.2V, then when the li-ion is fully charged it would drop 0.9V so that the output was 3.3V. As the battery voltage drops, it would continue to output 3.3V until the battery voltage falls to around 3.5V. After that, the output would begin to drop below 3.3V, but the circuit would still run until the battery voltage falls to 3.2V. By that point, most of the battery's charge has been used.
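The same arithmetic, written out as a tiny sketch so you can plug in your own numbers. The 1.0V and 0.2V dropout figures are the rough ones used above, and the 3.0V "still runs" threshold is an assumption about your circuit; real dropout rises with load current, so check the curves in each datasheet.

```cpp
// Lowest battery voltage each regulator can work from, given its dropout voltage.
// Dropout figures are rough and rise with load current; see the datasheets.
#include <cstdio>

void report(const char* name, double dropout) {
    const double vout         = 3.3; // target output voltage (V)
    const double vcircuit_min = 3.0; // assumed lowest voltage the circuit still runs at (V)
    printf("%s: in regulation down to %.1fV, circuit keeps running down to %.1fV\n",
           name, vout + dropout, vcircuit_min + dropout);
}

int main() {
    report("AMS1117", 1.0); // ~1.0V dropout, more at high current
    report("MCP1700", 0.2); // ~0.2V dropout
    return 0;
}
```

Against a li-ion's 4.2V-to-3.0V range, the AMS1117 needs 4.3V to stay in regulation and 4.0V just to keep the circuit alive, while the MCP1700 regulates down to 3.5V and keeps the circuit running down to 3.2V, which is the point of the example above.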
The low dropout is a definite advantage over the LM1117. Another big advantage is that the MCP1700 requires only two 0.1uF ceramic caps, whereas the LM1117 requires a 10uF tantalum plus one or two 0.1uF ceramic caps.
I think that my next 3.3V regulator order will be for some MCP1700.
The data sheet specifies the 10uF tantalum cap on the output. You should always obtain and read the data sheet for any components that you are going to use.
A minimum of 10-μF tantalum capacitor is required at the output to improve the transient response and stability.
If you do not use the cap, the transient response and stability cannot be guaranteed. Using any part without following the manufacturer's recommendations is never wise.