Analog-to-digital conversion for a bipolar voltage

Hi, please excuse my English.

I have a DC LVDT position sensor with these specifications:

Model: SDVH20-50B
Power: 15-28 V DC (15 V recommended)
Output: 0-10 V
Range: 0-50 mm

Problem 1.
I excite the LVDT and measure the output with a multimeter. At the zero position I read -0.8 V (with a 12 V supply) or -0.9 V (with a 24 V supply); from there the output rises (almost) linearly with position. It seems that if I shifted the output by +0.9 V, everything would be right. Is the LVDT broken or damaged? Is there a solution?

Problem 2.
For the ADC I use an Arduino Uno pin, which accepts 0-5 V. I use two 4.7k resistors as a voltage divider. Is this the best way to step down the 10 V output?
The Arduino Uno's 10-bit ADC is good enough for my project.

sdvh20.pdf (575 KB)

Post the exact model number of the sensor you are using (SDVH20-50B is not described in the data sheet you posted), together with a wiring diagram showing how you are testing it.

According to the data sheet, it is impossible for the sensor to output a negative voltage, so you are probably doing something wrong.

Note: it is not a good idea to use a 12V power supply when the manufacturer recommends 15V minimum.

Post the exact model number of the sensor you are using (SDVH20-50B is not described in the data sheet you posted), together with a wiring diagram showing how you are testing it.

Exact information on the label of the LVDT:

Model: SDVH20-50B
Power: 15-28 V DC
Output: 0-10 V
Range: 0-50 mm

The wiring I use to measure the output with a multimeter:

Wiring diagram.JPG

You will need the data sheet for the sensor you have, in order to make sense of the output.

However, if the label states 0-10V output, and you measure a negative output voltage, it is probably safe to assume that the device is broken.

The LVDT is probably alright, since it responds as the position changes!

To avoid the generation of negative voltage, one has to tie together all the zero-volt points (digital GND, analog GND, signal GND, power-supply GND, and chassis GND); in practice this is not always possible, and there is always some ground loop. Under these circumstances, the sensor should be calibrated against two known points to accommodate its offset and gain. No sensor is ideal; all real sensors have gain and offset errors. You may follow these steps to calibrate your sensor.

1. Apply 15V as the excitation voltage.
2. Use a 4.7k + 2.2k voltage divider; the voltage across the 2.2k resistor is fed to the ADC.
3. Keep the position at 10 mm; measure the voltage across the 2.2k resistor and record it as VD1. You have the point A(d1, VD1) = A(10, VD1).

4. Keep the position at 40 mm; measure the voltage across the 2.2k resistor and record it as VD2. You have the point B(d2, VD2) = B(40, VD2).

5. Take the unknown point C(d, VD).

6. Find the equation for d in terms of VD, which has the form:
d = k*VD + c; //k and c are known constants

7. Use 5V (the DEFAULT reference) as VREF for the ADC of the UNO. Simple algebraic manipulation will give you:

d = k*(5.0/1023)*analogRead(A1) + c; //analogRead() returns 0-1023, so VD = (5.0/1023)*analogRead(A1)

8. Declare d as a float and use floating-point constants (write 5.0/1023, not 5/1023), so that integer division does not truncate the result to zero.

GolamMostafa:
To avoid the generation of negative voltage, one has to tie together all the zero-volt points (digital GND, analog GND, signal GND, power-supply GND, and chassis GND); in practice this is not always possible, and there is always some ground loop. Under these circumstances, the sensor should be calibrated against two known points to accommodate its offset and gain. No sensor is ideal; all real sensors have gain and offset errors. You may follow these steps to calibrate your sensor.

1. Apply 15V as the excitation voltage.
2. Use a 4.7k + 2.2k voltage divider; the voltage across the 2.2k resistor is fed to the ADC.
3. Keep the position at 10 mm; measure the voltage across the 2.2k resistor and record it as VD1. You have the point A(d1, VD1) = A(10, VD1).

Thanks for your explanation.
I tried that, but it doesn't work. Here is my wiring:

(link & image)
http://tinyurl.com/y834djz2
voltage divider.jpg

The voltage divider only decreases the range; it does nothing about the negative voltage.

How about an amplifier?

Here is the circuit that I use (in simulation):

http://tinyurl.com/ya4e3s53

amplifier.JPG

It gives 1.25 V (for the -0.9 V LVDT output) up to 5 V (for the 10 V LVDT output), so I will be able to calibrate 1.25-5 V to 0-50 mm. I am just wondering: is this circuit safe for the LVDT and the ADC? Does it work in practice?