I have a circuit that starts at 3.5 V and progressively increases to 8 V.
I'd like it to start at 0.5 V and increase to 5 V, so essentially I'd like to wash 3 V off the circuit.
How can I do this?
I tried a voltage divider but I found that R1 got incredibly hot and burnt after a few seconds.
I looked into buck converters, but they keep the voltage stable. I need an increase in line with the input.
Sounds like your resistors were too small and were dissipating too much power. Try using resistors that are proportionally bigger or use resistors that have a higher wattage rating.
Designing a buck converter seems like the logical solution.
"I looked into buck converters but they keep voltage stable." Huh? A buck converter is adjustable. All you would need is to provide input voltage sensing, output voltage sensing, and a brain in the middle to adjust the output to match Y value when the input is X value.
Or, without a brain to run the circuit, maybe you can find or think up a smart op amp circuit which will do what you want.
You really should include more information to get the best course of action. What circuit is producing the voltage? What is that voltage used for? Is the output going to a high-impedance input, or do you need high current? Are you trying to limit the output to under 5 VDC, or are you just trying to obtain a reading?
ty_ger07:
Designing a buck converter seems like the logical solution.
"I looked into buck converters but they keep voltage stable." Huh? A buck converter is adjustable. All you would need is to provide input voltage sensing, output voltage sensing, and a brain in the middle to adjust the output to match Y value when the input is X value.
Or, without a brain to run the circuit, maybe you can find or think up a smart op amp circuit which will do what you want.
jackrae:
That's called an Opamp (Operational amplifier)
What are you trying to say? I don't understand your point. Yes, I mentioned that. Or you can make your own logic using something simple like an ATtiny85 to monitor the voltage levels and adjust the PWM of the buck converter. In the past I have used an ATtiny85 to do exactly this without an op-amp; it also had push buttons to adjust the voltage up or down, and it held the desired output voltage relatively stable.
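To make that concrete, here is a minimal sketch of that ATtiny85 approach. It assumes an Arduino-style core (e.g. ATTinyCore), 2:1 resistor dividers scaling both the sensed input and the buck output into the ADC range, and a PWM pin driving the converter's control input. The pin assignments, divider ratio, offset and thresholds are illustrative assumptions, not a tested design.

```cpp
// Illustrative ATtiny85 sketch (Arduino/ATTinyCore style): make a buck converter's
// output track "input minus 3 V" by nudging the PWM that drives its control input.
// Pin numbers, divider ratio and the 3 V offset are assumptions, not a tested design.

const uint8_t PWM_PIN   = 0;    // PB0, PWM output to the buck converter's control input
const uint8_t SENSE_IN  = A1;   // PB2, the 3.5-8 V signal through a 2:1 divider
const uint8_t SENSE_OUT = A3;   // PB3, the buck output through a 2:1 divider

const float VCC      = 5.0;     // ADC reference (the ATtiny's supply)
const float DIVIDER  = 2.0;     // both sense dividers halve the voltage
const float OFFSET_V = 3.0;     // we want output = input - 3 V

uint8_t duty = 0;               // current PWM duty (0-255)

float readVolts(uint8_t pin) {
  // Convert a 10-bit ADC reading back to the real voltage before the divider.
  return analogRead(pin) * (VCC / 1023.0) * DIVIDER;
}

void setup() {
  pinMode(PWM_PIN, OUTPUT);
  analogWrite(PWM_PIN, duty);
}

void loop() {
  float target = readVolts(SENSE_IN) - OFFSET_V;  // e.g. 8.0 V in -> 5.0 V target
  if (target < 0) target = 0;                     // clamp for inputs below 3 V

  float actual = readVolts(SENSE_OUT);

  // Simple incremental correction: nudge the duty cycle one step per pass
  // until the output settles around the target.
  if (actual < target - 0.05 && duty < 255) duty++;
  else if (actual > target + 0.05 && duty > 0) duty--;

  analogWrite(PWM_PIN, duty);
  delay(2);                                       // crude loop pacing
}
```

Because the loop compares the measured output against (measured input - 3 V) on every pass, it keeps tracking the offset as the input ramps from 3.5 V up to 8 V.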
I have a circuit that starts at 3.5 V and progressively increases to 8 V.
I'd like it to start at 0.5 V and increase to 5 V, so essentially I'd like to wash 3 V off the circuit.
How can I do this?
I tried a voltage divider but I found that R1 got incredibly hot and burnt after a few seconds.
I looked into buck converters, but they keep the voltage stable. I need an increase in line with the input.
Thanks
You may need a voltage subtraction circuit using an op-amp perhaps - or then again this could be an XY problem thing - why do you want this, and what is this circuit for? Is it a one-off ramp or a sawtooth generator? Is it a linear ramp?
Power_Broker:
What? Of course a resistor divider does.
Perhaps he means for a circuit with a relatively large and variable load. For a high impedance circuit, yes, a resistor divider works well. For a circuit with an electric motor, lights, speakers, or any relatively large amount of load variance, a resistor divider behaves very poorly compared to other options.
IF (that's a capital if) the common reference point of the 3.5 to 8 volts is NOT (and that's a capital not) the same as the 0.5 to 5 volts reference point of the measuring device then a resistor chain (potential divider) could be used, providing the measuring device is of suitably high impedance.
Similarly, if accuracy wasn't important and he was willing to accept a fair degree of non-linearity at low signal values, then a 3 V zener diode would suffice as a means of chopping off 3 volts from his varying DC signal.
But the fact that the OP simply said "I have a circuit that starts at 3.5 V and progressively increases to 8 V. I'd like it to start at 0.5 V and increase to 5 V, so essentially I'd like to wash 3 V off the circuit" would suggest that he wants a common reference point. If that is the case then a potential divider will not meet his needs.
He would require an amplifier with a fixed offset.
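To put numbers on that: the span is the same at both ends, (5 - 0.5) V out against (8 - 3.5) V in, i.e. a gain of exactly 1, so all the amplifier has to do is subtract a fixed 3 V. One way (just as an example) is a unity-gain difference amplifier with one input tied to a stable 3 V reference.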
His suggestion that R1 burned up might not be that he set too low a value, it might be that he endeavoured to common up two separate ground references via the resistor, which actually had a substantial potential difference between them.
Until subscribers actually provide sufficient information within their enquiries, there will inevitably be frustrated and sometimes unprofessional responses.
jackrae:
But the fact that the OP simply said "I have a circuit that starts at 3.5 V and progressively increases to 8 V. I'd like it to start at 0.5 V and increase to 5 V, so essentially I'd like to wash 3 V off the circuit" would suggest that he wants a common reference point. If that is the case then a potential divider will not meet his needs.
That is only if the output of the system is supplying power to a load. If he's just feeding the output to the ADC, then it doesn't really matter since the ADC is a high impedance input.
Actually, even if he wants to use a voltage divider to provide power to a load (this is most likely NOT the case, but OP needs to specify), he can still use a voltage divider as long as he uses a buffer between the divider and the load.
The enclosed circuit provides a fairly accurate 3 V drop, as required. To get it bang on, make R1 or R2 adjustable and calibrate against an accurate voltmeter.
Note the ~4.3 mA constant-current drive to the TL431 drop section.
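Without seeing the exact schematic values, but assuming the usual TL431 arrangement where the drop is programmed by two resistors, the cathode-to-anode voltage is Vka = Vref x (1 + R1/R2) with Vref of about 2.495 V, so a 3 V drop needs R1/R2 of roughly 0.2. The ~4.3 mA bias sits comfortably above the roughly 1 mA minimum cathode current the TL431 needs to regulate.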
regards
I'm sorry to have caused such frustration.
I am very new to this kind of circuit building and as such have little knowledge of the depth of question I need to submit.
I will draw up my circuit and post up to assist. Thanks
This is more like what you might want:
Just remember to size the resistors correctly, connect the grounds properly, and make sure there are no shorts with the soldering, etc.
In factory operation the Transmission Control Unit (TCU) controls a solenoid via a variable voltage from 0.5 to 5 V to modulate the pressure the solenoid applies to the lock-up clutch. The solenoid has 5 ohms of resistance and I have measured a maximum current draw of 600 mA.
My project thus far has been to manually switch, via a relay, a 5 V input into the solenoid circuit to activate it when desired. I used an LM2596 to drop the 12 VDC from my vehicle supply to 5 VDC to activate the solenoid. I wired it into the circuit in parallel.
When manually activated, the solenoid is switched at 5 VDC and controls the lock-up clutch well. The problem I have found is that the TCU now activates the solenoid at a higher voltage than it previously did: when factory operation of the solenoid occurs, it now starts at 3.5 VDC and modulates up to 8 VDC. This causes harsh engagement of the lock-up and an undesired "thud".
I assume this has something to do with the fact that I've wired the LM2596 supply in parallel.
I fitted a diode rated at around 40 V to the output of the LM2596, and although this helped a little, the issue is still occurring.
Rather than continue to stab in the dark I knew the wealth of knowledge on this forum could assist.
I know there must be a solution to this that is relatively simple. I will post the crude drawing of the system as it is set up shortly.
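One rough figure worth keeping in mind from those measurements: if whatever drops the 3 V sits in series with the solenoid, it has to dissipate about 3 V x 0.6 A, roughly 1.8 W, at full current. That would explain why a small divider resistor burnt, and it means the dropping element needs to be rated for a couple of watts or be a switching or feedback-based solution rather than a bare resistor.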