Voltage drop through cable

Hi,

I’m using a TMP36 on a breadboard, which gives me a pretty accurate reading when compared with a mechanical thermometer.

However, when connected to a water heater through about 3 meters of wire, I noticed a 0.2 V drop.

I compensated for the difference by tweaking the formula in the sketch a bit but, even though I’m cheating, it’s still not giving me the correct values. Is there a way around this?
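For reference, the usual TMP36 conversion (10 mV/°C with a 500 mV offset, per the datasheet) looks something like this – a minimal sketch, assuming the sensor output is on A0 and the default 5 V analog reference:

// Minimal TMP36 read, assuming the sensor output is wired to A0
// and the board uses the default 5 V analog reference.
const int sensorPin = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(sensorPin);      // 0..1023 from the 10-bit ADC
  float volts = raw * 5.0 / 1024.0;     // counts -> volts
  float tempC = (volts - 0.5) * 100.0;  // TMP36: 500 mV offset, 10 mV per degC
  Serial.println(tempC);
  delay(1000);
}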

TIA

It doesn’t make sense to drop 0.2 V across a 3 m wire unless the current demand is high. The TMP36 should be working in the microamp range.

I know, but because the temperature reading was off once I connected the sensor to the cable, I found what was causing the problem: voltage drop. I’m using telephone cable for its low impedance, but no go. If I bypass the cable, the reading is fine. I’m thinking of doubling up the line conductors to halve the resistance.

ebolisa:
I found what was causing the problem: voltage drop.

Voltage drop in the cable or inside the sensor?

ebolisa:
I’m using telephone cable for low impedance but no go. If I bypass the cable, the reading is fine. I’m thinking to double up on the line cable to half the resistance.

What is the total resistance of 3 m of telephone cable – 0.2 ohms? 0.5 ohms? You would need hundreds of mA to drop 0.2 V if the drop actually is in the cable.
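To put numbers on it: by Ohm’s law, I = V / R, so dropping 0.2 V across 0.5 ohms would take 0.2 / 0.5 = 400 mA – thousands of times the TMP36’s quiescent current.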

Thanks for getting back to me.

The voltage drop I’m experiencing is the difference between the sensor’s output and the Arduino analog input pin, so the drop is in the cable. I’m not near the device right now, so I’ll check back with more info later on. Thank you.

It's not likely to be an IR voltage drop at all (telephone cable doesn't have 4k ohms of resistance!). It could be RF interference or a lack of isolation at the sensor (there must be no connection at the sensor to anything except the sensor's wires).

You do have an RF-shunting capacitor on the analog input?

You aren't by chance supplying power to the heater through the same cable you are using for the sensor, are you? If you are using a common ground for the heater and the sensor, for example, it will cause strange things. The sensor's power/ground/signal should be separate from any power going to something that is pulling significant current.

My bad. The voltage drop is 0.01 V and the resistance is 0.5 ohms from end to end of the cable, implying a draw of 20 mA. Anyway, 0.01 V still affects the reading.

Mark, the sensor is connected directly to the cable. No capacitors or resistors are used, and the cable is not shielded. I do have approx. 0.060 V RMS of ripple at the input of the board.

Gpsmikey, the sensor is attached to the hot water pipe. No electrical connections.

ebolisa:
Anyway, 0.01 V still affects the reading.

It only affects the reading by 2 or 3 counts; there are 4.88 mV per step and you are only losing 10 mV.
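For the arithmetic: with a 5 V reference, one step of the 10-bit ADC is 5 V / 1024 ≈ 4.88 mV, so a 10 mV error moves the result by about 10 / 4.88 ≈ 2 counts – roughly 0.2 °C at the TMP36's 10 mV/°C scale.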

ebolisa:
My bad. The voltage drop is 0.01 V and the resistance is 0.5 ohms from end to end of the cable, implying a draw of 20 mA. Anyway, 0.01 V still affects the reading.

If a TMP36 is pulling 20 mA, something is very wrong; it should be drawing less than 50 µA (ignoring load).

ebolisa:
Mark, the sensor is connected directly to the cable. No capacitors or resistors are used, and the cable is not shielded. I do have approx. 0.060 V RMS of ripple at the input of the board.

Then you are not using it properly: the 0.1 µF decoupling capacitor is not optional and must be sited at the sensor. You should always use shielded cable for such a high-impedance sensor.

ebolisa:
My bad. The voltage drop is 0.01 V and the resistance is 0.5 ohms from end to end of the cable, implying a draw of 20 mA. Anyway, 0.01 V still affects the reading.

Mark, the sensor is connected directly to the cable. No capacitors or resistors are used, and the cable is not shielded. I do have approx. 0.060 V RMS of ripple at the input of the board.

Add the bypass capacitor at the sensor end, as recommended in the datasheet. Since the ripple is larger than the error, the ripple is most likely contributing to the error.
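On top of the hardware fix, a common software-side mitigation (a complement to the capacitor, not a substitute) is to average a burst of samples so the ripple tends to cancel – a minimal sketch, assuming the sensor output is on A0:

// Average several ADC samples to smooth ripple on the analog input.
// Pin and sample count are assumptions; adjust for your wiring.
float readTempC() {
  const int samples = 16;
  long total = 0;
  for (int i = 0; i < samples; i++) {
    total += analogRead(A0);
    delay(2);                          // spread samples over the ripple period
  }
  float volts = (total / (float)samples) * 5.0 / 1024.0;
  return (volts - 0.5) * 100.0;        // standard TMP36 conversion
}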

ebolisa:
Anyway, 0.01 V still affects the reading.

No measurement is perfect. Your voltage & resistance measurements are not perfect either. How much accuracy/resolution (in degrees) do you need?

If you're not getting enough accuracy, you can easily make a calibration adjustment in software.

The TMP36 is accurate enough for many applications (±2 °C), but most thermometers and temperature-measurement systems need to be calibrated (against a known-accuracy calibrated thermometer). Typically, you "zero" the measurement with an offset (addition or subtraction) and you adjust the slope (multiplication by a factor less than or greater than 1.0).

(Calibration won't increase resolution or reduce noise/drift.)
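As a sketch of what that calibration looks like in code (the slope and offset values here are placeholders you would determine against a reference thermometer):

// Two-point calibration: compare the sensor against a trusted thermometer
// at two temperatures (raw1 -> true1, raw2 -> true2), then:
//   slope  = (true2 - true1) / (raw2 - raw1)
//   offset = true1 - slope * raw1
const float CAL_SLOPE  = 1.00;  // placeholder value
const float CAL_OFFSET = 0.0;   // placeholder value

float calibratedTempC(float rawTempC) {
  return rawTempC * CAL_SLOPE + CAL_OFFSET;
}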

Solve the 20 mA thing first - if the sensor is taking 20 mA it is probably toast; it is certainly self-heating and giving wrong values in that situation (the reason it only takes a tiny supply current is specifically to reduce self-heating to a very low level).