DS18B20 self heating or what?

The self heating of the DS18B20 is a much-discussed topic, GIYF. Yesterday I thought I had a bad case of self heating, so I tried to explore it a little bit.

I wired up four DS18B20 on a breadboard. Two share one bus: one of those is powered directly, the other runs in parasite mode. The parasite one got an additional cosy foam casing to keep its heat in. The third one has its own bus and is powered directly from an Arduino pin; the idea is to turn off its power completely during idle phases. The fourth one is like the third, except for half a meter of cable and a fan blowing mildly at the TO-92 package.

I took the "DS18x20_Temperature" example and modified it so that #1 and #2 are polled constantly while #3 and #4 are only polled once a minute. After 15 minutes I got the following results (almost stable on every read, within about ±0.2 °C or so):

#1 parasite+foam F3 Temperature = 20.69 Celsius
#2 self powered 62 Temperature = 20.87 Celsius
#3 power on measure D4 Temperature = 20.62 Celsius
#4 power on measure + fan 24 Temperature = 19.56 Celsius

So what is the conclusion? I'm not sure.

If I do some math, I arrive at a thermal resistance for a TO-92 package of about 160 to 210 °C/W, depending on whom you ask. The maximum current consumption of a DS18B20 is about 5 mA at 5 V. That gives 25 mW, or a deltaT of 5.25 °C. Taking only the typical "active" current of 1 mA gives about 1 °C.

The "1 °C" looks so promising, and yet it feels so wrong.

So, again, any conclusions?

ds18b20selfheatingtest.zip (1.74 KB)

An interesting experiment!

According to the data sheet, the DS18B20 has an absolute accuracy of +/- 0.5 degrees C, which means that on average, each individual DS18B20 will disagree with the others, somewhere within that range (probably a Gaussian distribution, so 67% of the time). You see approximately a "two sigma" variation of one from the average of the other three, which is not really statistically significant. It seems that from your data, the only conclusion you can draw is that if self-heating is operative, your experiment can't detect the effect in a statistically significant way.

Furthermore, if the distribution is a Gaussian, 33% of the time you can expect an outlier device > 0.5 C away from the others, so the low reading could be a fluke for that reason. Try swapping the devices to see if that changes anything.

I had the same suspicion.

By adding two more DS18B20 in parallel next to fan sensor #4 (so we have a #4b and #4c) I see almost identical values for those DS18B20:

#2 62 Temperature = 20.75 Celsius,
#3 D4 Temperature = 20.56 Celsius,
#4 98 Temperature = 19.81 Celsius,
#4b 24 Temperature = 19.69 Celsius,
#4c 3C Temperature = 19.62 Celsius,

The difference is significant. So: no, sorry, it is not the accuracy error.

Statistics can be tricky!

If you accept the manufacturer's claims for device-to-device variation as a Gaussian distribution with one sigma = 0.5 °C, then the expected variation when comparing any two such devices is sqrt(0.5^2 + 0.5^2), or about 0.7 °C. A 1 °C difference is therefore roughly a 1.4 sigma variation, so, statistically speaking, you can have somewhere between 68% and 95% confidence that you are observing a genuine effect. Some people might be convinced, others not.

Data sheet reading can be difficult!

I would be with you if I could see the values vary within the range of ±0.5 °C. But they don't; they are more or less stable. I currently get

#2 62 Temperature = 20.69 Celsius,
#3 D4 Temperature = 20.31 Celsius,
#4 98 Temperature = 19.56 Celsius,
#4b 24 Temperature = 19.56 Celsius,
#4c 3C Temperature = 19.56 Celsius,

for a really long time, varying only by occasional jumps of ±0.1 °C at most.

I would interpret the data sheet value of 0.5 °C as the theoretical maximum constant offset between the real temperature and what the sensor reports. On page 19 Maxim writes about a "Thermometer Error"; in figure 17 they give a mean error of 0.2 °C. It is not very clear.

There is nothing in the data sheet that says anything about the error between two consecutive measurements; at least I didn't find anything. As far as I can see in reality, consecutive measurements are quite stable (and don't jump around by ±0.5 °C).

And yes, I swapped sensors. Same game, same results.

Very interesting experiment! +1 +bookmarked

Can you post a series?
I am curious about the starting temperature and the curve of how it stabilizes.
IMHO it is the delta of the temperature that is most interesting.

E.g. the temperature of the 4 sensors every minute over a span of 15 minutes would be great
(preferably in XLS format, of course :wink:)

data1 is just one measurement every second (or so). As far as I interpret it, it shows the room temperature sinking and that consecutive measurements are almost constant. Sensor noise (is that the correct term?) is less than 0.1 °C.

data2 is from holding my soldering iron to the TO-92 for some seconds, removing it, and then measuring every second (or so).

It all doesn't say anything about the real room temperature, only what the sensors think. But the difference between them seems to be constant within ±0.1 °C at maximum.

ds18b20.xls (71 KB)

Data sheet reading can be difficult!
I would be with you if I could see the values vary within the range of +-0.5. But they don't

Agreed that reading a data sheet can be difficult. My reading of the data sheet is one that I know to be common and here is how I understand the specification: if you buy a large number of DS18B20 devices, have them all at the same temperature and read out the individual temperatures, the readings would be distributed approximately as a Gaussian about some average, with a standard deviation of 0.5 degrees C.

What the manufacturer means by the 0.5 C figure is not entirely clear, but the data sheet does show the attached curve presumably showing the average performance for a large number of devices as a function of temperature. Note that on average, the devices tend to read 0.2 degrees low at 20 C, with a 3 sigma spread of about 0.5 C (for 99% confidence), but over the 0 - 70 degree range, the total spread is about 1 degree C (3 sigma).

DS18B20.png

So we agree that a single DS18B20 has a (more or less) constant offset from the real temperature. So if I have a row of DS18B20 I see a different temperature on each sensor, but one that is (more or less) constant for each sensor. (The latter is what my data from the last posting shows.)

Right?

This does not explain why there is a difference of about 1 °C between "with fan" and "no fan", regardless of which sensor I use. If I swap sensors I get (more or less) the same results.

Thanks for doing that "field test"!
Interesting sensor. Does the datasheet say whether the sensor is compensated for self heating?
Which reading is the correct temperature?
The next test is ice from your freezer. Make slush. I hope they get near 0.0 °C.

Your experiment does seem to show a small effect of fan cooling. What is difficult is to assign a level of confidence to "self heating" (which could certainly explain the observation) or to decide whether the effect is worth worrying about.

If self heating is important, then it was probably taken into account when the device was designed and calibrated by the manufacturer. Can one do anything about self heating? Is the temperature measurement more accurate if you use the parasitic power scheme versus constantly powered? At this point you don't know because you need another, independent method to measure the ambient temperature accurately.

One possible conclusion is that if you want more accurate temperature measurements, you probably need a different type of thermometer.

jremington, could you please explain the following:

  • Reading the same sensor again and again gives (almost) the same results. There is no variation anywhere near ±0.5 °C. See my data: it is at most about ±0.1 °C for a given sensor.

  • Comparing two sensors gives an (almost) constant difference between them when both are exposed to the same environment. Again, this difference is as stable as described in the previous observation.

Why should two given sensors suddenly behave differently when one is exposed to a different environment (a blowing fan...)?

I'm totally with you if we are talking about predicting the behavior of unknown sensors. But here I have existing sensors whose behavior I can measure and compare...

I think that if you want calibrated temperature readings, you need to use an offset value for each individual sensor. I think that the 'standard' sensor is not calibrated for an offset, so it will read consistently for itself within 0.5 degrees as per spec, but the reading is not guaranteed to be within a certain deviation from a standard temperature. So sensors from different manufacturing batches would maybe all read the same, but not the same as a different batch.

There is no variation anywhere near +-0.5 °C.

This is not what the manufacturer is claiming!

The manufacturer claims that if you buy a DS18B20 sensor and use it to measure a temperature over a range of -10 to 85C, then the measured temperature will be within 0.5 degrees of the "true" temperature (more or less guaranteed).

The manufacturer also publishes the curve that I showed in the last post, which suggests that most devices, at one given temperature, will agree with each other much better than the overall range of +/- 0.5 C. However, the variation from the "true" temperature depends on the temperature.

As far as I can tell, the data sheet says absolutely nothing about the minute-to-minute variation of readings from one device, when kept under constant conditions. You are already showing by your measurements that this "noise" is much smaller than +/- 0.5 degrees C. That is certainly what I see with mine.

jremington:
This is not what the manufacturer is claiming!

The manufacturer claims that if you buy a DS18B20 sensor and use it to measure a temperature over a range of -10 to 85C, then the measured temperature will be within 0.5 degrees of the "true" temperature (more or less guaranteed).

At last, a proper summary..........

The link below shows readings at ten second intervals for 24 hours for three sensors with 12 bit resolution, two dangle side by side over my desk, the other is in the open drawer. Also shown is the difference between the first two.

http://homepages.ihug.com.au/~npyner/Arduino/20140129.CSV

This should be sufficient for mere mortals. If the gods want to get more serious, they probably use some other sort of sensor.

Some nasty spikes in the column reporting the difference between sensors 1 & 2. See attached.

OK, it looks like two things have happened to turn this into a bit of a comedy of errors.

First, I'm afraid it looks like my flippancy has muddied the water. Second, I think you have deleted a message that came through just as I was going out the door for a surf.

What I posted was no test, no experiment, just some DS18B20s going about their job in the normal manner and producing exactly what was expected of them. For the most part, they are just reading ambient temperature. The reason why sensor #3 is still in the drawer is that I'm not interested at the moment. Any variations from the bleeding obvious are just me stirring up the graph on the Android, which isn't often.

Despite my intervention, there should be enough data to demonstrate that there is nothing to get worried about. I submit that this thread is nonsense - an exercise in studiously looking for a problem that isn't there, instigated by a failure to read the data sheet.

I don't know what the hell's going on with that graph. It initially appears to be made from the first twenty readings, but it bears no relationship whatsoever to the data in the file. The readings in the file hardly vary and the difference column is equally unexciting. The values on the X scale are not even in order.

I agree that we have beaten this non-topic to death. Wish I were surfing!

edcasati:
I think that if you want calibrated temperature readings, you need to use an offset value for each individual sensor. I think that the 'standard' sensor is not calibrated for an offset, so it will read consistently for itself within 0.5 degrees as per spec, but the reading is not guaranteed to be within a certain deviation from a standard temperature. So sensors from different manufacturing batches would maybe all read the same, but not the same as a different batch.

That is why I always talked about comparing two or more sensors with each other. That way the absolute error doesn't matter.

Maybe better late than never

Just been doing some tests with the DS18B20 using the Maxim 1-Wire viewer software.
Bare device, 5 V supply, 1-second polling: self-heating up to 2 °C and rising.
Stuck an old Stanley knife blade on as a heat sink, and with 30-second polling the self-heating was less than +0.5 °C.
Obvious, really. The heat sink slugs it a bit. It's someone's uncertainty principle about measurements changing the thing being measured: an un-powered DS18B20 is very accurate; powered up, not so good. Strap it to a copper hot water pipe with thermal paste and there's no problem. But any device, PT100 etc., will self-heat.