LDO Voltage Regulator - Minimum Output Current

I want to replace the LP2985-33DBVR fixed 3.3V 150mA Voltage Regulator in my design with one capable of higher output current, since my design (based on the datasheets) may draw up to 280mA.
I found the REG102NA-3.3 400mA regulator, but I am not sure what its minimum output current is.

Should I use this regulator? Will the regulator work properly if the circuit draws less than 400mA?

REG102NA-3.3 Datasheet

Thanks in advance,
Alex

You seem to be missing what those figures mean. An LDO is a constant-voltage device, not a constant-current one. The 400mA is not a fixed value but an upper limit: the LDO will source as much current as is necessary to maintain 3.3V, up to 400mA. If the circuit only requires 280mA, then 280mA < 400mA and you are within the design specs. With your other LDO, 280mA is almost twice the 150mA maximum it can source, so you get a collapsed output voltage, maximum current, lots of heat, and a roasted LDO regulator.

Hi Alex,

The part has no stated minimum current, meaning you could draw 0 current and it will be stable.

The part seems to be "rated" for 250 mA out; I would think the 280 mA you are planning on will be OK, but (see below).

The 400 mA specification is the current at which the regulator will start to reduce its output voltage to ensure the current can never go above 400 mA. Think of it as the RPM limiter on your car: your engine is not meant to run at that maximum RPM, but if you try, the car's computer will cut back the fuel to ensure the engine stays below it.

Now for the (see below): It is always difficult when one first starts to apply semiconductors; the datasheets are complex and have requirements that are not obvious. In this case, look at the last entry on page 3 of the datasheet.

If you use the SO-8 surface-mount device, the thermal resistance is 150 °C/watt.
So if your input voltage is 5 V, your output voltage is 3.3 V, and your current is 0.280 A,
your power dissipated in watts is Pw = (5 - 3.3) * 0.280 = 0.48 watts.

At 150 °C/watt, the resulting temperature rise of the regulator is 150 * 0.48 = 72 °C. This is a rise above ambient, so if the ambient temperature is 30 °C the regulator internals are at 102 °C. Not bad; in general you don't want to go above 110 to 120 °C.
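In case it helps, here is the same arithmetic as a small Python sketch you can rerun with your own numbers (the 5 V input, 3.3 V output, 0.280 A load, 150 °C/W package figure and 30 °C ambient are just the example values above, not anything from your actual design):

```python
# Rough junction-temperature estimate for a linear regulator.
# All values below are the example numbers from this post; swap in your own.

v_in = 5.0        # regulator input voltage (V)
v_out = 3.3       # regulator output voltage (V)
i_load = 0.280    # load current (A)
theta_ja = 150.0  # junction-to-ambient thermal resistance, SO-8 package (°C/W)
t_ambient = 30.0  # ambient temperature (°C)

p_dissipated = (v_in - v_out) * i_load  # power burned in the regulator (W)
t_rise = theta_ja * p_dissipated        # temperature rise above ambient (°C)
t_junction = t_ambient + t_rise         # estimated junction temperature (°C)

print(f"Dissipation: {p_dissipated:.2f} W")
print(f"Rise above ambient: {t_rise:.0f} °C")
print(f"Junction temperature: {t_junction:.0f} °C")
```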

Now, just like with the current, the regulator will limit its internal temperature to 160 °C. Again, the chip is not meant to run at or near this temperature, but it will protect itself if it starts to approach the destruction temperature.

Hope I helped.

Good luck.

Thank you very much for your replies!

@JohnRob that was a very helpful pointer and certainly something I will take into consideration.

@tinman13kup I had it figured out exactly like that, but when you dive into datasheets, you often question what you do and do not know :confused:

JohnRob:
If you use the SO-8 surface-mount device, the thermal resistance is 150 °C/watt.
So if your input voltage is 5 V, your output voltage is 3.3 V, and your current is 0.280 A,
your power dissipated in watts is Pw = (5 - 3.3) * 0.280 = 0.48 watts.

I will be using the SOT-23 package, which means 200 °C/W, so I will have to decrease the regulator input voltage from 5 V to 4 V by adding a resistor from the +5 V line to the regulator Vin. With the input current being roughly equal to the output current, the resistor needed would be:
Rnominal = 1 V / 0.28 A ≈ 4 Ω, 1/2 W
Rmax = 1 V / 0.4 A ≈ 3 Ω, 1/2 W

So, choosing a 4 Ω 1/2 W resistor would mean a maximum Pw = 0.7 V * 0.4 A = 0.28 W of power dissipation in the regulator and thus a 200 °C/W * 0.28 W = 56 °C junction temperature rise above ambient, against an operating maximum of 125 °C. Meaning the regulator would operate safely.
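To double-check myself, here is a rough Python sketch of the same numbers (the 4 Ω resistor, the two currents and the 200 °C/W figure are just my own assumptions for this design):

```python
# Rough check of the series-resistor idea: drop part of the 5 V supply in a
# resistor so the SOT-23 regulator dissipates less. Example values only.

v_supply = 5.0    # supply rail (V)
v_out = 3.3       # regulator output (V)
r_series = 4.0    # chosen series resistor (ohms)
theta_ja = 200.0  # SOT-23 junction-to-ambient thermal resistance (°C/W)

for i_load in (0.28, 0.40):                    # nominal and worst-case current (A)
    v_resistor = i_load * r_series             # drop across the series resistor (V)
    v_reg_in = v_supply - v_resistor           # voltage left at the regulator input (V)
    p_resistor = v_resistor * i_load           # heat in the resistor (W)
    p_regulator = (v_reg_in - v_out) * i_load  # heat in the regulator (W)
    t_rise = theta_ja * p_regulator            # regulator rise above ambient (°C)
    print(f"{i_load:.2f} A: reg input {v_reg_in:.2f} V, resistor {p_resistor:.2f} W, "
          f"regulator {p_regulator:.2f} W, rise {t_rise:.0f} °C")
```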

Are my calculations correct? Have I missed anything?

You could also use a diode in place of the resistor. Not knowing your physical layout, you could use a diode with a TO-220 case, which is able to dissipate plenty of power.

If you are laying out your own board, you could also add more copper area around the regulator to dissipate more heat (i.e. a lower °C/watt).

Good luck