Need help with differential amplifier.

Hello,

I have a sensor reading that ranges from 0.6V to 3.3V, and I want to subtract the 0.6V baseline before the signal reaches the ADC. I believe a differential amplifier configuration would be suitable for this, but I am not getting the results I expect. I suspect my differential amplifier circuit is set up wrong, or that I am using the amplifier incorrectly.

I am using this set-up with a TS922 op-amp (http://www.st.com/internet/com/TECHNICAL_RESOURCES/TECHNICAL_LITERATURE/DATASHEET/CD00001188.pdf) and all resistors equal to 63K Ohms. Vcc+ = 3.3V, Vcc- = GND, V2 = the 0.6V - 3.3V signal, V1 = 0.6V (derived from 3.3V through a voltage divider).

I get an output close to 0V, and it does not respond at all to changes in the input signal.

I would greatly appreciate any and all help. Thank you for your time :).

Okay,

So first off, saying your resistors all equal 63K ohms doesn't tell us much. Could you please give the specific values of your R1, R2, and Rf?

Secondly, I'm confused about what you are trying to get out of this. You are feeding 0.6V into V1? And then what is going into V2?

Sorry if it wasn't clear,

All the resistors shown in the schematic are equal to 63K Ohms, to make the gain = 1. V2 is a signal that varies between 0.60V and 3.3V. V1 is a constant 0.60V. I hoped to use the differential amplifier to get Vout = G*(V2 - V1), where G = 1.
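For completeness, this is the behaviour I expect from that configuration. It's just a rough sanity-check sketch in Python (ideal op-amp, all four resistors equal so the gain is 1, output limited to the single 3.3V supply), not measured data:

    # Expected behaviour of the four-resistor difference amplifier with all
    # resistors equal (gain = 1), ideal op-amp, single 3.3 V supply.
    VCC = 3.3
    V1 = 0.60                          # reference fed to the inverting side

    def vout(v2, v1=V1, vcc=VCC):
        ideal = v2 - v1                # Vout = (Rf/R1) * (V2 - V1), Rf/R1 = 1
        return min(max(ideal, 0.0), vcc)   # output can't leave the supply rails

    for v2 in (0.60, 1.0, 2.0, 3.3):
        print(f"V2 = {v2:.2f} V  ->  Vout = {vout(v2):.2f} V")
    # expected: 0.00, 0.40, 1.40, 2.70 V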

However, to be clear: my main goal is to subtract 0.60V from a signal that varies between 0.60V and 3.3V. I want to do this because I don't want to spend ADC codes on voltages I know I will never see, i.e. 0V - 0.60V. That way all of the ADC's resolution would be used on the 0.60V - 3.3V range instead.
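To put rough numbers on that (assuming, just for illustration, a 10-bit ADC with a 3.3V reference; my actual ADC parameters may differ):

    BITS = 10                      # assumed resolution, substitute yours
    VREF = 3.3                     # assumed ADC reference voltage
    CODES = 2 ** BITS              # 1024 codes for 10 bits

    dead_zone = 0.60
    wasted = dead_zone / VREF * CODES
    print(f"Codes spent on 0 - 0.6 V: {wasted:.0f} of {CODES}")   # roughly 186

    # After subtracting 0.6 V the signal spans 0 - 2.7 V. To use the full
    # ADC range I would also need a gain of VREF / (VREF - dead_zone):
    print(f"Gain for full-scale use: {VREF / (VREF - dead_zone):.2f}")  # ~1.22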

If there are other ways of achieving this without a differential amplifier, I am all ears :)

Thank you

Set V1 to 0.6V and feed the input signal into V2 - you've got your signs mixed up. With the inputs swapped the amplifier tries to output 0.6V - V2, which is at or below 0V over your whole input range, so on a single supply the output just sits at the bottom rail. That matches the "close to 0V with no sensitivity" behaviour you are seeing.
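A quick numeric sketch of the difference (ideal unity-gain difference amp with the output clamped to a single 3.3V supply; this is just a model, not the TS922 specifically):

    VCC = 3.3
    VREF_IN = 0.60                              # the 0.6 V reference

    def clamp(v):                               # output stays inside the rails
        return min(max(v, 0.0), VCC)

    for v2 in (0.60, 1.5, 3.3):
        correct = clamp(v2 - VREF_IN)           # signal on the V2 (non-inverting) side
        swapped = clamp(VREF_IN - v2)           # signal on the V1 (inverting) side
        print(f"V2 = {v2:.2f} V: correct wiring -> {correct:.2f} V, "
              f"swapped -> {swapped:.2f} V")
    # the swapped wiring prints 0.00 V for every input: no sensitivity at all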