My bad. The cable resistance is 0.5 ohms end to end and the sensor draws 20 mA, which works out to a voltage drop of 0.01 V. Anyway, 0.01 V still affects the reading.
Mark, the sensor is connected directly to the cable. No capacitors or resistors are used, and the cable is not shielded. I do measure approximately 0.060 V RMS of ripple at the input of the card.
Add the bypass capacitor at the sensor end, as recommended in the datasheet. Since the ripple (60 mV) is larger than the cable-drop error (10 mV), the ripple is most likely the dominant contributor to the error.
"Anyway, 0.01 V still affects the reading."
No measurement is perfect. Your voltage & resistance measurements are not perfect either.
How much accuracy/resolution (in degrees) do you need? If you're not getting enough accuracy, you can easily
make a calibration adjustment in software. The TMP36 is accurate enough for many applications (+/- 2 degrees C), but
most thermometers & temperature measurement systems need to be calibrated (against a known-accuracy calibrated thermometer). Typically, you "zero" the measurement with an offset (addition or subtraction) and you adjust the slope (multiplication by a factor less than or greater than 1.0).
(Calibration won't increase resolution or reduce noise/drift.)