I don't think there is any real way of overcoming this issue, but it's worth throwing it out there.
A friend of mine installs BMS systems and he asked if there was any way to overcome the fact that the system thermistors (whether monitoring air or water) eventually degrade and drift in accuracy.
He asked if putting two thermistors in the sensor and averaging them out would give better results, but I can't really see that making much difference. Surely they are going to 'age' at roughly the same pace, and therefore the returned value will still drift.
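For what it's worth, a quick simulation (all the numbers are made up for illustration) shows why averaging a pair only beats down each sensor's independent noise, not an ageing drift the two parts share:

```python
import random

random.seed(0)

true_temp = 20.0      # degrees C, the value we wish we were reading
shared_drift = 0.8    # ageing drift common to both parts (same batch, same stress)
indep_noise = 0.2     # independent random error per sensor, 1-sigma

def reading():
    # each sensor sees the full shared drift plus its own random error
    return true_temp + shared_drift + random.gauss(0, indep_noise)

samples = [(reading(), reading()) for _ in range(10_000)]
avg_err = sum((a + b) / 2 - true_temp for a, b in samples) / len(samples)

print(f"mean error of the averaged pair: {avg_err:.2f} C")
# the averaged pair still carries the full shared drift; only the
# independent noise shrinks (by about 1/sqrt(2))
```

So averaging only helps if the two parts age independently, e.g. different manufacturers or technologies.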
My only suggestion was to measure the temperature in some other way that doesn't degrade, e.g. infra-red (but that isn't really practical for water monitoring).
Welcome to the mechanical world: you must properly protect the sensor element. Water eventually gets past heat shrink and will reach the sensor. It may take years; it took 5 years in my fish tank. Many sensors do not have a wear-out mechanism but will change because of external forces.
@baffled2023 ...
Tell your friend to use a silicone potting compound (usually white, hardens when dry for better heat transfer) to encapsulate the thermistor for a waterproof setup. Standard in ebike batteries.
In most "serious" applications, equipment is calibrated/verified periodically. I work in electronic manufacturing and my multimeters & oscilloscopes are sent out for calibration annually. I've worked other places where everything was calibrated every 6 months.
Same here. I've written code for medical devices that used thermistors to control temperatures to 0.1C accuracy and I have never heard of one drifting even after years in the field.
This has not been my experience with several NTC thermistors. I have a thermometer that is close to 15 years old and the accuracy is still pretty much the same. I will admit I am looking at the temperature over a limited range.
Now if the thermistor was not properly protected from the elements you may get some drift.
Well these are commercial temp sensors in air handling units, water tanks etc.
Such as:
The standard sensor is a 10K3A (which I believe means 10 kΩ at 25°C, with the "3A" denoting the resistance-vs-temperature curve type).
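A quick beta-model sanity check bears out where the 10 k point sits. The beta value below is an assumption (roughly right for a Type III curve; the actual datasheet should be consulted):

```python
import math

R25 = 10_000.0   # ohms at 25 C (the "10K" in 10K3A)
BETA = 3976.0    # assumed beta in kelvin; check the sensor datasheet

def thermistor_r(temp_c: float) -> float:
    """Beta-model NTC resistance at temp_c."""
    t = temp_c + 273.15
    t25 = 25.0 + 273.15
    return R25 * math.exp(BETA * (1.0 / t - 1.0 / t25))

print(round(thermistor_r(25.0)))   # 10000 -> the 10k point is at 25 C
print(round(thermistor_r(0.0)))    # roughly 34 k at 0 C
```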
I have had a Google and, indeed, drift does appear to be a known attribute of these sensors over time.
From what I understand, the control system is fairly dependent on accurate temp readings to control these systems.
BUT.... I can't personally believe that a drift of say 1°C is going to noticeably impact the control of the cooling or heating of a building.
I would possibly say there are other factors at work, such as bad cabling, which could push the readings well outside their expected bounds.
I suggested logging the readings over a period of time to eliminate this possibility.
I think maybe part of this is that he is a skimper... as in, I expect he buys the lowest-quality sensors he can find.
But.... I don't think RTDs will be compatible with the wiring in these systems
Don't RTDs need 4-core? Most of his sensors are on simple Belden twin cable.
I will have to look into that. I think you can wire RTDs with 2, 3 or 4 cores, but in 2-wire mode they tend to suffer accuracy issues due to cable resistance.
At least... that is what I just read.
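The 2-wire problem is easy to put numbers on: both conductors sit in series with the element, so the lead resistance reads as extra temperature. The cable length and ohms-per-metre below are made-up but plausible figures, not measurements from any real install:

```python
# Rough 2-wire RTD lead-resistance error on a long twin-cable run.
CABLE_LEN_M = 50.0    # one-way run length, assumed
OHMS_PER_M = 0.033    # roughly 20 AWG copper, per conductor
PT100_SENS = 0.385    # ohms per degree C near 0 C
PT1000_SENS = 3.85    # ten times less sensitive to lead resistance

lead_r = 2 * CABLE_LEN_M * OHMS_PER_M   # both conductors are in the loop

print(f"lead resistance: {lead_r:.2f} ohm")
print(f"PT100 2-wire error:  {lead_r / PT100_SENS:.1f} C")
print(f"PT1000 2-wire error: {lead_r / PT1000_SENS:.2f} C")
```

Which is why a PT1000 (or a 3/4-wire hookup) is the usual answer when only twin cable is available.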
First: You / he needs to identify what the system requires. In my opinion "as accurate as possible" means the person has no clue what is really needed.
Then set up an error budget. Maybe contact a few mfg tech support folks.
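An error budget can be as simple as listing every contribution and summing it two ways. Every figure here is a hypothetical placeholder, not from any datasheet:

```python
import math

# Hypothetical error budget for one temperature channel (illustrative values):
contributions_c = {
    "sensor interchangeability": 0.2,
    "sensor ageing/drift":       0.3,
    "bridge resistor tolerance": 0.4,
    "ADC + reference":           0.1,
    "self-heating":              0.1,
}

# worst case: everything errs the same way; RSS: independent errors
worst_case = sum(contributions_c.values())
rss = math.sqrt(sum(v * v for v in contributions_c.values()))

print(f"worst case:   {worst_case:.2f} C")
print(f"RSS (likely): {rss:.2f} C")
```

If the RSS total is already bigger than the spec, no amount of sensor shopping fixes it on its own.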
BTW, I've never seen a platinum RTD used in an air handler.
That explains why I've never encountered it. The temperature controllers that I've built have generally been below 40C and always used high-quality sensors. According to that table, it would be drifting less than 0.02C over 8 years!
Yes, for a quality sensor (which I'm sure you use). In general I've found the largest factor in long-term drift or failures is thermal cycling. Internal stresses in a component are difficult to design out.
I've also found it funny when someone asks for "as accurate as possible": I respond with some outlandishly expensive device and always get the response... "well, we don't need it that accurate".
Yea. Getting reasonable responses from this guy is difficult.
I asked him what he considers to be bad drift and he reckoned anything over a degree out. I suggested that maybe something else was at play here.
I just had a look at the controller he designed for these sensors. All the resistors, caps, etc. in the decoding circuit are 10% tolerance, which probably doesn't help.
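A 10% resistor by itself can account for the drift he's complaining about. Here's a rough sketch for a generic 10k NTC divider (supply voltage, beta and topology are all assumptions, since I don't have his schematic):

```python
# Apparent temperature error from a 10% divider resistor alone.
VCC = 3.3                 # assumed supply
R_FIX_NOMINAL = 10_000.0  # what the firmware assumes
R_FIX_ACTUAL = 11_000.0   # the +10% tolerance case
R_T = 10_000.0            # thermistor genuinely at 25 C
BETA = 3976.0             # assumed NTC beta

v_out = VCC * R_T / (R_T + R_FIX_ACTUAL)
# the controller decodes using the nominal fixed resistor:
r_inferred = R_FIX_NOMINAL * v_out / (VCC - v_out)

# NTC sensitivity near 25 C: |dR/dT| = R * B / T^2, about 447 ohm/C
sens = R_T * BETA / 298.15**2
temp_error = (R_T - r_inferred) / sens   # reads about 2 C high

print(f"inferred resistance: {r_inferred:.0f} ohm")
print(f"apparent temperature error: {temp_error:.1f} C")
```

So 1% (or 0.1%) parts in the sensing path would be a cheap first fix before blaming the thermistors.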
I have managed to get hold of the controller PCB.
But... I also spoke to the electrician who installs these systems and he says the wiring is appalling.
Low voltage mixed with mains, no screened cables etc
Where did he find those resistors?
In a cathode ray tube television that had its end of life 30 years ago?
In another recent thread it was mentioned that these resistors are not stable over time. Are any of them used as a reference resistor?