I have a voltage signal that ranges from about 0.6 V to 3.3 V, and I want to subtract the 0.6 V baseline before it reaches the ADC. I believe a differential-amplifier configuration would suit my purposes, but I am not getting the results I expect. I suspect my differential-amplifier circuit is set up wrong, or that I am using the amplifier incorrectly.
All the resistors shown in the schematic are 63 kΩ, to set the gain to 1. V2 is a signal that varies between 0.60 V and 3.3 V; V1 is a fixed 0.60 V. I hoped the differential amplifier would give Vout = G·(V2 − V1), where G = 1.
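For reference, here is what the classic four-resistor difference amplifier should produce under ideal assumptions (perfectly matched resistors, ideal op-amp, output able to reach 0 V). This is only a numerical sketch of the expected transfer function, not a model of any specific part:

```python
# Ideal difference-amplifier transfer function for the classic
# four-resistor topology: Vout = (Rf/Rin) * (V2 - V1).
# With all four resistors equal (e.g. 63 k), the ratio is 1.

def diff_amp_out(v2, v1, r_ratio=1.0):
    """Return the ideal output of a difference amplifier."""
    return r_ratio * (v2 - v1)

baseline = 0.60  # V1, the fixed 0.60 V reference
for v2 in (0.60, 1.0, 2.0, 3.3):
    print(f"V2 = {v2:.2f} V  ->  Vout = {diff_amp_out(v2, baseline):.2f} V")
```

If the measured output deviates much from this table, common culprits are resistor mismatch (the gain and common-mode rejection both depend on the ratio match), a single-supply op-amp that cannot swing its output all the way to 0 V, or loading of the V1/V2 sources by the 63 k input resistors.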
However, to be clear:
My main goal is to subtract 0.60 V from a signal that varies between 0.60 V and 3.3 V. I want to do this because I don't want to spend ADC codes on voltages I know I will never read (0 V to 0.60 V); all the bits would then be used to resolve the 0.60 V to 3.3 V range instead.
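To put a number on how many codes the dead band costs, here is a quick calculation, assuming a 12-bit ADC with a 3.3 V reference (the bit depth and reference are my assumptions; adjust them for your converter):

```python
# Count the ADC codes consumed by the unused 0-0.60 V band,
# assuming a 12-bit converter referenced to 3.3 V.

n_bits = 12
vref = 3.3
dead_band = 0.60

codes = 2 ** n_bits                       # total codes (4096)
wasted = round(dead_band / vref * codes)  # codes below 0.60 V
usable = codes - wasted

print(f"wasted codes: {wasted} of {codes} ({wasted / codes:.1%})")
print(f"usable codes: {usable}")
```

Note that subtracting the baseline alone only shifts the signal to 0 V to 2.7 V; to actually use every code you would also need a gain of 3.3/2.7 ≈ 1.22 to stretch the shifted signal to full scale, which the difference amplifier can provide by changing the resistor ratio.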
If there are other ways of achieving this without a differential amplifier, I am all ears.