I hope someone can verify that I'm doing this correctly.
I have a thermistor connected in a voltage divider with a 470 Ω resistor so I can measure temperatures from 32 °F to 350 °F using the ADC. I've worked out the R-T curve, etc. However, I'm now trying to figure out whether self-heating will be negligible or whether it is something I have to account for in my code.
With a multimeter I measured 10.2 mA through the thermistor (in the above configuration) at about 85 °F, which comes out to roughly 0.0107 W dissipated in it. The dissipation constant for this thermistor is 2 mW/°C (still air).
The way I calculated it, self-heating accounts for something less than a tenth of a degree C.
Am I right? Is it negligible? How should I go about calculating this for the next thermistor I choose?
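For the "next thermistor" part of my question, here is the sanity-check I've been using, in case anyone wants to point out a mistake. It applies the standard formulas ΔT = P/δ and, for a divider, worst-case thermistor power P_max = V²/(4·R_series) (which occurs when the thermistor resistance equals the series resistance). The 5 V supply is just an example value, not necessarily what I'm running:

```python
# Worst-case self-heating estimate for a thermistor in a voltage divider.
# Assumptions (illustrative, not from my actual circuit): 5 V supply,
# 470 ohm series resistor, dissipation constant 2 mW/degC in still air.

def max_thermistor_power(v_supply, r_series):
    """Power in the thermistor is P = V^2 * Rt / (Rt + Rs)^2, which
    peaks when Rt == Rs, giving P_max = V^2 / (4 * Rs)."""
    return v_supply ** 2 / (4.0 * r_series)

def self_heating_rise(power_w, dissipation_w_per_degc):
    """Temperature rise above ambient: dT = P / delta."""
    return power_w / dissipation_w_per_degc

p_max = max_thermistor_power(5.0, 470.0)   # watts, worst case
dt = self_heating_rise(p_max, 0.002)       # degrees C above ambient
print(f"worst-case power: {p_max * 1000:.1f} mW, rise: {dt:.2f} degC")
```

Checking the worst case this way seems safer than relying on a single operating point, since the thermistor's resistance (and hence its dissipation) changes across the temperature range.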
Thermistors are not typically precision devices. The specs for the devices I've used have typically been +/- 2 °C or +/- 3%, whichever is larger. Given that, I think your self-heating component should be fine.
You can improve the results by increasing the thermal mass (heatsinking) of the thermistor; however, this will increase the response time of the device when measuring the ambient temperature. It is a tradeoff between responsiveness and long-term accuracy.
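The tradeoff above can be put in rough numbers with a first-order thermal model: the response time constant is τ = C/δ, where C is the sensor's heat capacity and δ its dissipation constant. A quick sketch (the heat-capacity and dissipation values are illustrative guesses, not datasheet figures):

```python
# First-order estimate of how added thermal mass slows the sensor.
# tau = C / delta: adding a heatsink raises C a lot but typically
# improves delta only modestly, so response time grows.

def thermal_time_constant(heat_capacity_j_per_degc, dissipation_w_per_degc):
    """Time constant of a first-order thermal model (seconds)."""
    return heat_capacity_j_per_degc / dissipation_w_per_degc

bare = thermal_time_constant(0.005, 0.002)       # small bead, still air
heatsunk = thermal_time_constant(0.5, 0.004)     # hypothetical heatsinked part
print(f"bare: {bare:.1f} s, heatsinked: {heatsunk:.1f} s")
```

The larger δ of the heatsinked part also reduces the self-heating rise, which is the accuracy side of the tradeoff.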
Personally, I would try it as is and then monitor the output for any long-term drift.