I'm making a hot plate with an Arduino Uno WiFi Rev 2. I'm using a standard heater cartridge to heat an aluminum plate that will have a pot of water on it. I would like to position an infrared sensor above the pot in order to turn the heat off when the water reaches a certain temperature, so I'm looking for a sensor that can measure fairly accurately from up to 8 inches away. The pot will also be narrow, so the field of view should be narrow enough to only pick up the temperature of what's inside.
If anyone has any experience with these types of sensors and could point me in the right direction, I would greatly appreciate it. So far I haven't had much luck trying to decipher the datasheets.
If you think this idea is unnecessarily complicated and you know an easier way to do it, I would appreciate hearing that as well.
Thanks
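For context on what a non-contact reading would look like in code: one common hobbyist IR thermometer breakout is the Melexis MLX90614, which comes in several field-of-view variants, so the datasheet's FOV spec is the thing to check against the pot diameter and the 8-inch distance. Below is a minimal sketch, assuming a recent Adafruit_MLX90614 library, a relay or SSR driving the heater on pin 7, and a 95 °C cutoff; the pin and threshold are illustrative, not values from this thread.

```cpp
#include <Wire.h>
#include <Adafruit_MLX90614.h>

// Assumed wiring: MLX90614 on I2C (SDA/SCL), heater relay/SSR on pin 7.
const uint8_t HEATER_PIN = 7;
const float CUTOFF_C = 95.0;          // example threshold, not from the thread

Adafruit_MLX90614 mlx = Adafruit_MLX90614();

void setup() {
  Serial.begin(9600);
  pinMode(HEATER_PIN, OUTPUT);
  digitalWrite(HEATER_PIN, LOW);      // heater off until we have a reading
  if (!mlx.begin()) {                 // starts I2C and checks the sensor is present
    Serial.println("MLX90614 not found");
    while (true) {}                   // halt rather than heat blindly
  }
}

void loop() {
  float objC = mlx.readObjectTempC(); // temperature of whatever fills the FOV
  Serial.print("Object temp: ");
  Serial.println(objC);
  digitalWrite(HEATER_PIN, objC < CUTOFF_C ? HIGH : LOW);  // simple on/off control
  delay(1000);
}
```

The same on/off structure applies whatever sensor is eventually chosen; only the read call changes.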
Not likely to be very accurate, as you will be measuring the temperature of the steam, which is much hotter than the boiling water under it. If you want to measure the actual temperature of the water itself, you need a sensor IN the water.
Paul
Don't bother asking how the steam gets to be hotter than the water it is derived from, but I understand that using an IR sensor over liquids is not a good idea, irrespective of the temperature. If you want to use IR, simply aim it at the pot, not the water. You may want to do this because it is so convenient, even if it is a bit rough and ready. If you really want something that is popular, accurate, reliable, and well supported around here, use the DS18B20 as suggested above, which leaves you in no doubt about what you are measuring.
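For reference, a minimal sketch of the DS18B20 approach using the widely used OneWire and DallasTemperature libraries; the data pin, the external 4.7 kΩ pull-up, the heater pin, and the cutoff value are assumptions for illustration only.

```cpp
#include <OneWire.h>
#include <DallasTemperature.h>

// Assumed wiring: DS18B20 data on pin 2 with a 4.7k pull-up to 5V,
// heater relay/SSR on pin 7.
const uint8_t ONE_WIRE_PIN = 2;
const uint8_t HEATER_PIN = 7;
const float CUTOFF_C = 95.0;               // example threshold

OneWire oneWire(ONE_WIRE_PIN);
DallasTemperature sensors(&oneWire);

void setup() {
  Serial.begin(9600);
  pinMode(HEATER_PIN, OUTPUT);
  digitalWrite(HEATER_PIN, LOW);
  sensors.begin();
}

void loop() {
  sensors.requestTemperatures();           // trigger a conversion on the bus
  float tC = sensors.getTempCByIndex(0);   // first (only) probe on the bus
  if (tC == DEVICE_DISCONNECTED_C) {       // -127 means probe missing/unreadable
    digitalWrite(HEATER_PIN, LOW);         // fail safe: heater off
  } else {
    digitalWrite(HEATER_PIN, tC < CUTOFF_C ? HIGH : LOW);
  }
  Serial.println(tC);
  delay(1000);
}
```

A waterproof DS18B20 probe sitting in the water gives a direct reading, which is the point of the suggestion above.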
Paul_KD7HB:
Not likely to be very accurate, as you will be measuring the temperature of the steam, which is much hotter than the boiling water under it. If you want to measure the actual temperature of the water itself, you need a sensor IN the water.
Paul
Wrong, I'm afraid.
The condensing water vapour in the air above the water will never be hotter than the water itself, as evaporation extracts heat and the vapour then mixes with the cooler air. Above a strong boil the surface gases are mainly fresh steam and likely to just read 100 °C before rapidly cooling as they mix with the air.
However, water vapour / steam is likely to be pretty transparent to heat radiation anyway. I had a quick experiment with a kettle and a heat camera: the "steam" rising above the water is definitely much cooler than the water, and fairly transparent, so the basic temperature reading does seem to come from the water's surface, as you'd expect.
However, cheap pyrometers (heat-radiation thermometers) are not particularly accurate.
MarkT:
The condensing water vapour in the air above the water will never be hotter than the water itself, as evaporation extracts heat and the vapour then mixes with the cooler air.
However, cheap pyrometers (heat-radiation thermometers) are not particularly accurate.
I agree with both statements, despite Mark's misspelling of "vapor."
Cheap IR thermometers also seem to be quite sensitive to the emissivity ("blackbody color") of the surface being measured. I suspect that the OP is looking for something that doesn't involve a probe (which would definitely be the most accurate) and that would allow use of any type of pot which fits on the plate. Doesn't seem like an easy problem to solve.
S.
MarkT:
The condensing water vapour in the air above the water will never be hotter than the water itself, as evaporation extracts heat and the vapour then mixes with the cooler air. Above a strong boil the surface gases are mainly fresh steam and likely to just read 100 °C before rapidly cooling as they mix with the air.
However, water vapour / steam is likely to be pretty transparent to heat radiation anyway. I had a quick experiment with a kettle and a heat camera: the "steam" rising above the water is definitely much cooler than the water, and fairly transparent, so the basic temperature reading does seem to come from the water's surface, as you'd expect.
However, cheap pyrometers (heat-radiation thermometers) are not particularly accurate.
Gravity ensures the vapor escaping is hotter than the vapor dissolved in the liquid water.
Paul
Paul_KD7HB:
Gravity ensures the vapor escaping is hotter than the vapor dissolved in the liquid water.
Paul
Far be it from me to disagree with a person of such karma, but this is contrary to both training and experience in chemistry (with which I am intimately familiar, having worked my entire career as a chemist). Control of fractional distillation is always done by measuring the temperature at the vapor head, and it is always lower than the temperature of the liquid (though usually not by much). If the apparatus is open to the air (as in the case here), the difference will be greater due to the cooling that takes place ... and you're no longer even looking only at vapor. The visible steam is not vapor; it's a condensate aerosol. If it were hotter than the boiling liquid, it would not condense and it would not be visible.
In any case, the differences are probably going to be far less than measurement error, particularly if an IR thermometer is used...
S.
But you were not measuring with an IR thermometer. Your temperature readings were only possible when the condensate heated the measuring device, so it lost heat by transfer. The OP wants to measure by radiation only.
Paul_KD7HB:
But you were not measuring with an IR thermometer. Your temperature readings were only possible when the condensate heated the measuring device, so it lost heat by transfer. The OP wants to measure by radiation only.
Paul
The vapor heats the thermometer bulb, in the process being cooled, condensing, and dripping off. It is instantly replaced by fresh vapor. An equilibrium condition is very quickly attained which reflects the true temperature of the vapor as it passes into the condenser. This concept is absolutely critical to processes as diverse as petroleum refining and booze-making.
As far as measuring a vapor using an IR thermometer, I can't speak to that. IR thermos work on emitted IR radiation according to Planck's Law and I'm not sure how to apply that to a molecular vapor.
S.
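For reference, the Planck's Law mentioned above gives the spectral radiance of a blackbody at temperature $T$:

$$ B_\lambda(\lambda, T) = \frac{2hc^2}{\lambda^5}\,\frac{1}{e^{hc/(\lambda k_B T)} - 1} $$

A pyrometer integrates that radiance over its sensing band and divides by an assumed emissivity to back out $T$, which is a reasonable model for an opaque surface but, as noted, much harder to apply to a thin, partly transparent column of vapor.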
srturner:
The vapor heats the thermometer bulb, in the process being cooled, condensing, and dripping off. It is instantly replaced by fresh vapor. An equilibrium condition is very quickly attained which reflects the true temperature of the vapor as it passes into the condenser. This concept is absolutely critical to processes as diverse as petroleum refining and booze-making.
As far as measuring a vapor using an IR thermometer, I can't speak to that. IR thermos work on emitted IR radiation according to Planck's Law and I'm not sure how to apply that to a molecular vapor.
S.
The heated water molecule will emit a photon in the IR frequency range. Each photon emitted will cool the molecule because energy is removed. The frequency (energy) of the photon is lower each time because there is less energy left. The more high-energy photons seen by the sensor, the higher the temperature it will register. The OP wants the temperature of the boiling water, so the photons right at the liquid surface are the ones needed. They are the first ones to escape the liquid surface. They have had the least time to emit photons, so they are the "hottest".
Paul
What an interesting discussion - although maybe not very relevant to the OP.
While you might expect the vapour temperature to be higher - because only the molecules with enough energy can escape - the excess energy is needed to break the inter-molecular bonds. So I reckon that, at least for water in an open vessel, the temperatures just above and below the interface will be in equilibrium.
I would try to see if an AMG8833 IR Thermal Camera would fit the bill.
I use them to detect when I am in a room, even if I am sitting still (a keep-the-light-on thingy), because there is a heat source in the room that is above xx degrees.
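If anyone wants to experiment with that suggestion, here is a minimal sketch using the Adafruit_AMG88xx library that reads the 8x8 grid and reports the hottest pixel; treating the hottest pixel as "the pot" is just one illustrative approach, not something established in this thread.

```cpp
#include <Wire.h>
#include <Adafruit_AMG88xx.h>

Adafruit_AMG88xx amg;
float pixels[AMG88xx_PIXEL_ARRAY_SIZE];    // 8x8 = 64 temperatures in deg C

void setup() {
  Serial.begin(9600);
  if (!amg.begin()) {                      // default I2C address 0x69
    Serial.println("AMG88xx not found");
    while (true) {}
  }
}

void loop() {
  amg.readPixels(pixels);                  // fills all 64 pixel temperatures
  float hottest = pixels[0];
  for (uint8_t i = 1; i < AMG88xx_PIXEL_ARRAY_SIZE; i++) {
    if (pixels[i] > hottest) hottest = pixels[i];
  }
  Serial.print("Hottest pixel: ");
  Serial.println(hottest);                 // crude proxy for the pot/water surface
  delay(1000);
}
```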
Idahowalker:
I would try to see if an AMG8833 IR Thermal Camera would fit the bill.
The AMG8833's max temperature is 80 deg C with an accuracy of +/- 2.5 deg C, so it is not suitable for measuring the temperature of boiling water (per the OP's subject line).
The AMG8834's max temperature is 100 deg C, so it could work, but its accuracy is even worse: +/- 3.0 deg C. The killer may be the humidity spec: not to exceed about 85% RH under any circumstances, and the RH limit goes down as the temperature goes up. I suppose the device could be enclosed in a case to protect it, but condensation on the lens/window may cause problems.
...and furthermore, unless the water in the pot has been distilled or deionized, its boiling temperature will increase over time. That is due to boiling-point elevation caused by the concentration of minerals or other solutes as the water boils off (see: "colligative properties").
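For scale, boiling-point elevation follows the usual colligative relation

$$ \Delta T_b = i\,K_b\,m, \qquad K_b(\text{water}) \approx 0.512\ ^{\circ}\mathrm{C\,kg\,mol^{-1}}, $$

where $m$ is the molality of dissolved solutes and $i$ is the van 't Hoff factor. Assuming ordinary tap water (a few hundred mg/L of dissolved solids), that starts out at only a few thousandths of a degree and stays well below the sensor accuracies quoted above unless the pot boils nearly dry.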
I stand by my previous statements but will say nothing more on the subject. It is way OT for the purposes of the OP.
S.