Series resistor effect on a voltage regulator

If a resistor is connected between the power supply and a linear voltage regulator, how is the circuit affected? What is the math to figure such a thing out?

  • How is the voltage regulator's output affected?
  • How much voltage will the resistor drop, and how much will the regulator drop?

My guess was that the regulator acted like a diode and dropped a constant voltage, but this was not the result of my tests.

Here is an example circuit.

  • The resistor in question is R1
  • Please ignore the diode and capacitor.
  • This was not the circuit I tested on; mine was much simpler.

Thank you!

The voltage drop across the series resistor depends on the current through it.
A 7805 regulator needs at least 2 V across it to properly maintain its output voltage.
That leaves 12 V - 5 V output - 2 V regulator headroom = 5 V max across the resistor before things go haywire.
That gives a maximum possible current draw, with a 100 ohm series resistor, of 5/100 = 0.05 A = 50 mA.
A regulator without the resistor dissipates 12 V - 5 V output = 7 V, times an estimated 0.23 A = 1.61 W.
That should not be a problem with a small clip-on heatsink.
A better option is a 5 V buck converter: lower drain from your 12 V battery, and little or no heat.
Leo…
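The arithmetic in the post above can be sketched in a few lines. This is a minimal check of the numbers already given in this thread (12 V supply, 5 V output, 2 V dropout, 100 ohm resistor, 0.23 A estimated load); none of the values are new.

```python
# Reworking the numbers from the post above.
V_SUPPLY = 12.0   # battery voltage (V)
V_OUT = 5.0       # 7805 output (V)
V_DROPOUT = 2.0   # minimum headroom the 7805 needs (V)
R_SERIES = 100.0  # series resistor (ohms)
I_LOAD = 0.23     # estimated load current (A)

# Maximum voltage the resistor may drop before the regulator loses headroom.
v_resistor_max = V_SUPPLY - V_OUT - V_DROPOUT   # 5 V

# Maximum current draw with the 100 ohm resistor in place.
i_max = v_resistor_max / R_SERIES               # 0.05 A = 50 mA

# Regulator dissipation with no series resistor, at the estimated load.
p_regulator = (V_SUPPLY - V_OUT) * I_LOAD       # 7 V * 0.23 A = 1.61 W

print(i_max, p_regulator)
```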

Hi,
Is this related to the information you were given here?

Voltage drop across R1 = I × R1 = 0.23 × 100 = 23 V; not going to work.

Tom.. :slight_smile:

CptMichles700:
My guess was that the regulator acted like a diode and dropped a constant voltage but this was not the result of my tests.

Linear regulators are specialized negative-feedback amplifiers that endeavour to maintain the output voltage
constant in the face of load changes.

To work they need a certain amount of "voltage headroom", which is typically 2 or 3V, less for "LDO" (low-dropout) regulators.

They can only source current, not sink it (for positive voltage regulators), and most have a fixed set point
output voltage (some are variable).

A series resistor on the input can be used to spread heat dissipation between the regulator and the resistor,
but the resistor limits the current you can draw before the regulator output drops as explained above.
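The heat-spreading idea above can be sketched numerically. This reuses the thread's 12 V / 5 V / 0.23 A figures; the 10 ohm resistor value is an assumption chosen for illustration, not a recommendation.

```python
# How a series resistor splits dissipation with the regulator.
V_SUPPLY = 12.0
V_OUT = 5.0
I_LOAD = 0.23        # regulator input current taken as approx. the load current
R_SERIES = 10.0      # assumed resistor value for illustration

v_reg_in = V_SUPPLY - I_LOAD * R_SERIES    # voltage left at the regulator input
p_resistor = I_LOAD**2 * R_SERIES          # heat in the resistor (~0.53 W)
p_regulator = (v_reg_in - V_OUT) * I_LOAD  # heat in the regulator (~1.08 W)

# Total dissipation is unchanged; only its location moves.
p_total = p_resistor + p_regulator         # equals (12 - 5) * 0.23 = 1.61 W
print(p_resistor, p_regulator, p_total)
```

Note that `v_reg_in` here is 9.7 V, still above the 7 V (5 V + 2 V dropout) minimum, so regulation holds.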

Typically it's easier, and considerably cheaper, to bolt a regulator onto a heatsink than to bolt a power resistor on, so no one uses series resistors like this; it's not very flexible and costs considerably more (power resistors are a lot more expensive than standard voltage regulators).

If you have a lot of voltage to drop, a DC-DC converter before the linear regulator can be used - less heat
dissipation, and so long as there is enough voltage headroom for the linear regulator, you get the benefits
of a linear regulator (low noise and ripple).

I think I figured it out.

TomGeorge yes, I'm referring to that post.

It's a voltage divider

The voltage drop across the resistor is the result of a voltage divider between the resistor and the linear regulator.

If the input voltage is 17 V and the linear voltage regulator is a 5 V regulator, then that leaves 12 V to spread between the resistor and the LVO.

The LVO can be said to have the resistance of its load.

Thus if R1 is 330 Ω and the LVO load is 10 kΩ, the voltage drop across R1 can be found using the voltage divider equation.

R1(vdrop) = 12v x (330/10330)

When I physically tested it, my results were 0.5 V off, but the input voltage was not exactly 17 V, so that might account for it.

How it influences current
???
As I raised the value of R1, the current barely changed at all, as if R1 were not even there.
But if I made R1 very large, then it started to affect the current.

Maybe R1 only starts to seriously affect the current when it drops so much voltage that there is not enough left for the LVO to do its job (the dropout voltage, which I think is 2 V).

Conclusion
I may very well be wrong.
I still don't understand what math I need in order to figure out how a resistor will affect the current.

I'll keep experimenting or make the final circuit and change the resistor if the thing gets destroyed.
(Electronics are cheap)

CptMichles700:
Conclusion
I may very well be wrong.
I still don't understand what math I need in order to figure out how a resistor will affect the current.

I'll keep experimenting or make the final circuit and change the resistor if the thing gets destroyed.
(Electronics are cheap)

Your typical voltage regulator will be a circuit or electronic system. So if you have a circuit diagram of the whole system, including the series resistor, the external voltage source, and the regulator's load, you have a chance of determining a few things, like the regulator's input voltage and input current.

While some 'black box' systems have a nice, easy input/output relation to solve for what you want, that's not always the case. With some basic assumptions, though, such as output current roughly equal to input current (and/or taking power efficiency into consideration), you could estimate a few things, like the regulator's input voltage and the power dissipated in the series resistor. As long as you state your assumptions, you can always check your predictions by making measurements on a real circuit, or in a circuit simulator.

And the average voltage regulator needs its input terminal voltage at least 2 V higher than its output terminal. So if your regulator is meant to output 5 V, the input terminal needs to be at least 7 V. This means that if you have a series resistor (resistance R) on the input side, the higher-voltage side of the resistor must be at least 7 V plus I×R, where I is the current through the resistor, approximately equal to the regulator's output current (under the basic assumption that input current roughly equals output current).

In simple mathematical terms :

Assume supply voltage is Vs
Assume system current is I
Assume regulator output is Vo
Assume minimum regulator input voltage is Vo+2

Resistor input voltage = Vs
Resistor output voltage = Vo+2
Therefore voltage dropped across resistor is Vs-(Vo+2)

But voltage dropped across resistor is IR which is also Vs-(Vo+2)

So maximum value permitted for R is [Vs-(Vo+2)]/I
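The derivation above translates directly into a small helper. This is just the thread's formula restated in code; the function name and the 2 V default dropout are illustrative assumptions.

```python
def max_series_resistance(vs, vo, i, dropout=2.0):
    """Largest series resistor (ohms) that still leaves the regulator
    its minimum input voltage (vo + dropout), per the derivation above.
    vs: supply voltage, vo: regulator output voltage, i: system current."""
    return (vs - (vo + dropout)) / i

# Example with the thread's 12 V supply, 5 V output, 0.23 A load:
r_max = max_series_resistance(12.0, 5.0, 0.23)
print(r_max)  # roughly 21.7 ohms
```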

Hi,

My guess was that the regulator acted like a diode and dropped a constant voltage but this was not the result of my tests.

No it doesn't; it acts like a regulator: it regulates its output to keep it at 5 V as the input voltage varies. The LM78xx series needs the input voltage to be at least 2 V higher than the designed output voltage.

Have you read the spec sheet for an LM7805?

CptMichles700:
The voltage drop of the resistor is result of a voltage divider between the resistor and the linear regulator.

If the input voltage is 17v and the Linear voltage regulator is a 5v regulator, then that leaves 12 to spread between the resistor and the LVO.

The LVO can be said to have the resistance of its load.

then that leaves 12 V to spread between the resistor and the LVO.

NO, that leaves 12 V to spread between the resistor and the 2 V minimum-dropout linear regulator.

You must use 12 V - 2 V = 10 V as the maximum voltage across the resistor.

The LVO can be said to have the resistance of its load.

No, because the voltage across it and the current through it will vary to keep the linear regulator's output at 5 V.
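A quick sketch of this point: because the regulator holds its output at 5 V, the load current (and hence the input current) stays roughly constant, so the resistor drop is simply I × R, not a fixed divider ratio. The 17 V supply is from this thread; the 10 mA load and the resistor values are assumptions for illustration.

```python
# Resistor drop scales with R at (roughly) constant current, until the
# drop eats the regulator's headroom and regulation fails.
V_SUPPLY = 17.0
V_OUT = 5.0
V_DROPOUT = 2.0
I_LOAD = 0.01  # assumed constant regulated load current (A)

results = []
for r in (100.0, 470.0, 2200.0):
    v_drop = I_LOAD * r                          # drop grows linearly with R
    v_reg_in = V_SUPPLY - v_drop                 # what the regulator sees
    regulating = v_reg_in >= V_OUT + V_DROPOUT   # True only with headroom left
    results.append((r, v_drop, regulating))

# Small R barely matters; very large R breaks regulation, matching the
# experiment described earlier in the thread.
print(results)
```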

Tom… :slight_smile:

LM7805 (1).pdf (351 KB)