I am using a TMP36 sensor connected to a NodeMCU. It works well at room temperature, but when the ambient temperature is 40°F I get a reading of 54°F, and at 32°F I got a reading of around 44°F.
The TMP36 accuracy is ±2°C, so about ±4°F. I have measured the voltage I am reading with my meter and it matches the reading I am getting in my code.
The NodeMCU is soldered to a small breadboard-type circuit board, and the TMP36 sensor is soldered to the same board about an inch away from the ESP8266. There is no housing.
I suspect that heat is being conducted through the wires to the TMP36 sensor, so I plan on unsoldering it and moving it onto longer wires.
My questions are for people who have actually used a TMP36 where temps are near 32°F (0°C):
Has anyone experienced elevated temperatures around freezing with TMP36?
Has anyone found that heat from the processor is elevating temperatures?
What was your solution?
The A/D of an ESP8266 is not very good. There could be an A/D offset.
Remove the TMP36, and ground A0.
Write a sketch that displays the A/D value with A0 grounded.
Load the temp sketch again, and subtract that offset value from the raw value, before converting to temp.
It would be much easier to use a digital DS18B20 sensor.
And yes, temp sensors are sensitive to heat radiation and conduction.
You could sleep the ESP to reduce its own heat.
Try adding a simple delay(1000); at the end of loop.
Leo..
My voltage reading is within 2mV of what I measure with my multimeter, so I am confident I am accurately reading what the TMP outputs.
yes, thanks, I will look into using those instead, much better accuracy, and digital too.
I isolated the TMP36 and placed it in a big heatsink well away from the ESP8266. Still about 15°F too high, so I suspect the TMP36 is simply too inaccurate at lower temperatures.
The ESP8266 is busy doing many other tasks, so sleeping or doing nothing for 1000ms is not an option. I already delay 4ms after doing the analog read so that the WiFi stack gets a chance to use the A/D converter.
Does the data sheet support that view? I suggest you broaden your audience beyond the small handful of people that have that hardware.
I have seen the effect of processors on temperature sensors generally, as I've monitored the temps on many RTC based projects I've built. I can tell you the problem is real.
well, my reading of the spec sheet suggests it should work:
Calibrated directly in °C
10 mV/°C scale factor (20 mV/°C on TMP37)
±2°C accuracy over temperature (typ)
±0.5°C linearity (typ)
Stable with large capacitive loads
Specified −40°C to +125°C, operation to +150°C
The TMP35/TMP36/TMP37 do not require any external
calibration to provide typical accuracies of ±1°C at +25°C
and ±2°C over the −40°C to +125°C temperature range
though I don't know how much variance the linearity can introduce.
If you convert the A/D value to "volts", then you're using the wrong code (unless you're making a voltmeter).
You should convert directly from the A/D value to temp.
test how much offset the crappy A/D of YOUR NodeMCU has (mine had 8).
get an A/D reading, and subtract that offset.
multiply that result by a pre-calculated factor to get temp (I calculate 0.3125)
subtract 50.0 (because the TMP36 has a 50 degree C offset).
convert to F if that's what you are used to.
Untested.
Leo..
const int offset = 8;   // pre-tested
float factor = 0.3125;  // calibrate temp by changing the last digit(s)

void setup() {
  Serial.begin(9600);
}

void loop() {
  float tempC = ((analogRead(A0) - offset) * factor) - 50.0; // TMP36 has 50C offset
  float tempF = tempC * 1.8 + 32.0; // C to F
  // Serial.println(tempC, 1); // one decimal place
  Serial.println(tempF, 1);
  delay(1000); // dirty delay
}
Edit: The specs of an analogue sensor become meaningless the moment you connect it to a digital device. Then the linearity and accuracy of the processor's A/D become the dominating factor.
You don't have that problem if the A/D is integrated inside the sensor (sensor with digital output).
Right, to get accurate results you need end to end calibration. I once used ice water and boiling water as references.
I hope it's not too editorial, but I've noticed many people have unrealistic expectations of electronic temperature measurements. For example, complaining that they change too slowly, when the sensor itself cannot change temperature that fast...
Not true. Your code also reads mV and then converts that to a temperature, when it multiplies (reading - offset) by your factor. Your factor of 0.3125 is just 3200/1024 (3.125 mV per count) divided by the TMP36's 10 mV/°C scale, which is the same conversion I am using.
Whether you add a calibration offset to the voltage read, or to the derived temperature makes no difference, if you only care about temperatures.
I'll say it again. I am getting accurate mV readings that match the externally measured voltage to within 2mV. The reason I stated that at the outset was to head off suggestions that there is something wrong with reading the voltage output of the TMP36. If the data matches the actual physical voltage measured externally, then logically that code is working.
My devices allow the user to use the A0 pin to produce any combination of mV, °C and °F sensor readings.
I keep the mV reading and if the user wants the temperature I calculate °C temp. If they want °F I convert °C to °F as well. The same code reads any voltage the user wishes to measure, no matter what it is connected to.
I also verified my reference voltage: 3315mV. If I used 3315/1024 my factor would be 3.2373, which would give me even higher readings, as it is higher than the nominal 3200mV used to derive the factor that produces mV.
I get similar results on two separate NodeMCUs using two different TMP36 components.
That was actually on a diode, it was a custom circuit. The phase change points of H2O are known to be very stable, only affected by altitude in the case of boiling.
Where did you obtain the TMP36? Is it a verified genuine part?
A major question does remain - how are you measuring the ambient temperature for comparison? What is your "gold standard"?
It's futile to try to limit responses here. Better to just ignore the replies that you don't find helpful.
You're walking a fine line with the "experienced" plea. We're mostly assuming nothing and just dealing with the information that is posted.
I think the issue of reference temperature hasn't been explicated enough... perhaps you haven't yet shown us everything but so far I only see a "hardware store" wall thermometer.
If you don't post your code, then we can only guess what your experience level is.
Most of the frequent posters here have 40+ years under their belt.
I started building valve radios in my pre-teens about 60 years ago.
Remember that 0C (32F) with a TMP36 is not some sort of a pivot point for calibration.
The TMP36 has a start point of -50°C (its zero A/D point).
Leo..
I compared the outside temperature against a FLIR DM93 meter with a temperature probe, an alcohol thermometer, and two online sources of current conditions. I could also tell the temperature was not 60°F while trying to keep my fingers warm.
Only beginners use them, because they come with starter kits.
I have used analogue sensors about 40 years ago.
Prefer the much easier to use digital sensors today.
Leo..