I’ve been playing with the internal temperature sensor on various processors. The first test was very similar to analogRead… Start a conversion; wait for it to finish; print the result; wait 60 seconds; do it again. The values fluctuate ±1…
286 286 286 284 285
The values for various processors and ambient temperatures were different but the behaviour was always the same: ±1 around some value.
I suspected the culprit was a lack of de-coupling. The datasheet recommends using “ADC noise reduction” sleep mode to increase the accuracy, so I decided to test that before adding capacitors. The “noise reduction” values are very stable but consistently higher!
I then decided to add de-coupling. The “polling” values are now also very stable, but the “noise reduction” values are still consistently higher…

Polling / analogRead: 292 292 292 292 292
Noise reduction: 296 296 296 296 296
So, I have two questions…
Has anyone here tried the “noise reduction” sleep mode?
What the ef? Why would the “noise reduction” values be higher?
Another mystery: it appears that the "noise reduction" values are always even. The test has been running for about five hours (one value a minute) and there has not been a single odd "noise reduction" value.
This is only a philosophical guess but here goes.
I'd have thought that an ADC must always have ±1 bit of resolution about a centre point. There is no such thing as a single absolute or exact value in life, only an approximate measure of a variable, and the mathematics of determining analogue values within an ADC must always have some degree of digital "dither". No two things are ever identical, only so near alike that neither the eye nor measurement can detect a difference. If your ADC were only a 4-bit device there would still be ±1 bit of resolution (there can be nothing less than 1 bit), but the hysteresis would be so much larger.
With respect to filtering: if the measurement device's input is of high impedance, then the capacitors are effectively charging to the peak values seen, however short those peaks may be, giving a higher figure than the unfiltered input would show (which is perhaps closer to an average of the noisy signal).
Surely a simpler way of "stabilising" the reading would be to take an average of several readings and use the averaged value for display.
Having worked at a company that makes test equipment, I can say this kind of error is normal. Usually the bottom two OR MORE bits on an ADC are garbage. The datasheet for the ATmega even says so. IIRC it also says that the temperature sensor is even worse!