"Oversampling" and "Decimation", AVR internal temperature sensor

Has anyone tried using "decimation" to increase the resolution of the internal temperature sensor?
I took a shot at it and it has been running true to form compared to the thermometer I have set next to the Nano.

The way I did this, the conversion is done outside the function, allowing one to apply appropriate offsets and scaling factors according to the particular board and/or temperature scale. The function call specifies the number of samples desired, so it can be used for a quick read or for a long set to stabilize the results. This is no barn-burner, but it appears to work.

Test code:

unsigned long Time;
float save;

void setup()
  { Serial.begin(9600); }

void loop()
  { word raw;
    float temp;
    raw = avrRawTemp(1024);
    temp = ((((float(raw)) / 4) - 331.5) * 1.8) + 32;
    // conversion:
    // divide by 4 to scale from 1/4 degree to 1 degree
    // subtract 273 for celsius
    // subtract 58.5 (offset varies by board) for external temperature
    // multiply by 1.8, add 32 for degrees fahrenheit
    // --- accuracy can be increased by using two point calibration ---
    Serial.print(raw);
    Serial.print(", ");
    Serial.print(temp);
    Serial.print(", ");
    Serial.print(Time);                    // benchmark time
    if (raw < save) Serial.print(" <<<");  // temperature decrease
    if (raw > save) Serial.print(" >>>");  // temperature increase
    Serial.println();
    save = raw;                            // remember this reading for comparison
  }

Function Code:

word avrRawTemp(word samples)
  { /* samples: the number of samples to average                   */
    /* this number will be rounded down to a power of 2!           */
    /* return: degrees Kelvin * 4 (1/4 degree units), 0 to 4095    */
    /* each sample has 16 ADC reads for the 12 bit virtual ADC     */
    /* REF: Atmel document number AVR121.pdf                       */
    /* on 16 MHz ATmega328 512 samples requires just under 1 second*/
    /* 16 samples (16*16=256 reads) gives fairly consistent results*/
    /*    on a steady-state system in under 40K microseconds       */
    unsigned long RawSum=0;                         // used to sum samples for averaging
    word RawTemp=0;                                 // used to accumulate 10 bit ADC readings
    word test=0;                                    // used to count samples
    byte exp=0;                                     // samples = 2 to the exp power, used as shift operand
    byte k=0;                                       // counter for ADC reads
    unsigned long Start=micros();                   // used for benchmark timing
    // select internal 1.1V reference, right-adjust ADC result, ADC channel = internal temp sensor
    ADMUX = 0xC8;
    delay(10);                                      // wait a bit for the analog reference to stabilize
    while (samples>1) { samples /= 2; exp++; }      // find the binary exponent (floor of log2)
    samples = 1 << exp;                             // round samples down to that power of two
    while (test++ < samples)                        // oversampling loop (for averaging)
          { for (k=0; k<16; k++)                    // virtual ADC loop, 16 consecutive readings
              { ADCSRA |= _BV(ADSC);                // start the conversion
                while (bit_is_set(ADCSRA, ADSC));   // ADSC is cleared when the conversion finishes
                RawTemp += (ADCL | (ADCH << 8));    // accumulate the reading (low byte first)
              }
            RawSum += (RawTemp >> 2);               // accumulate virtual 12 bit ADC value
            RawTemp = 0;                            // zero ADC accumulator for the next sequence
          }
    Time = micros() - Start;                        // record benchmark time
    return (RawSum >> exp);                         // average by shifting bit position, LSBs lost
  }

I searched for examples showing the use of decimation with the temperature sensor and did not find any.
Perhaps someone will find this of use. :slight_smile:


The real problem with the internal temperature sensor is that no one has come up with a reasonable application using it. Its value is mostly going to reflect the total current being consumed by the chip, which in turn is mostly influenced by the circuitry wired to the output pins of the device. So short of some kind of software alarm saying "hey, the chip is running a whole lot hotter than normal, do something about it", I haven't seen anyone use it for anything useful. But then again maybe someone will in time.


I was looking for a thermometer on Amazon that would record high and low temps. I thought that it would be convenient to have one that would report the values to my server. Most of the reviews that I read on Amazon were less than glowing. Then I spotted an ad for a Chinese Arduino clone for $9. I thought to myself: "A couple of sensors, some wire, a microprocessor and a bit of code ... how difficult could this be?" ... that was a bit over two months ago. Now I have a USB device that reports the temperature back to my server ... and 4 additional Arduinos, two LCD displays, numerous resistors, capacitors, sensors, a VOM meter, a soldering iron ....

lewtwo, I'm curious about the accuracy of your measurements using the internal sensor. Do they agree with, say, a digital meat thermometer touching the chip (or an IR thermometer pointed at it) over a variety of temperatures?

I am interested in that myself. I do not currently have either of those types of instruments. I have a Taylor pocket dial thermometer, but it is graduated in 2 degree increments and one needs a microscope to read it (it also needs to be calibrated). I have ordered two old-fashioned 305mm laboratory thermometers to resolve the issue (one partial immersion, one full immersion). I plan to do a two point calibration on the Nano when they come in: one in an ice bath, the other in warm water. The Nano will be enclosed in an antistatic bag and a plastic baggie for those. I will also calibrate the Taylor at the same time.

My current offset was taken at 90 degrees Fahrenheit using a 5-1/2 inch spirit thermometer that I believe to be fairly accurate. The Nano is currently reading 81.05, the spirit thermometer is reading about 81.5 degrees, and the Taylor is reading about 81 degrees (both read with a 5X loupe). Putting my fingertip on the top of the Nano causes an almost immediate change in the readings.

I am wondering if a small finned heat sink (1/4 x 1/4 x 3/8) could be glued to the top of the ATmega328 chip on the Nano to decrease the variance between the internal and external temperatures.

One issue you have to think about is the signal noise amplitude compared to the quantisation steps in the ADC. If a signal is clean then the ADC will categorise it into steps in such a way that you can't find any more information about it than the LSB step size.

If there is the right amount of noise (independent from quantisation) in the signal then taking suitable averages of readings will extract more information (the ADC outputs a sampled normal distribution rather than a fixed value - this allows averaging techniques to infer sub-LSB information).

However you have to be careful - how accurate is the ADC anyway? The quantisation error might dwarf any gains, the noise added will still be present (although attenuated) on the output, etc.

However it is a valid technique in precision measurement to add an accurately controlled noise signal into the system before sampling. It also has other effects on the spectrum of the quantisation noise which can be useful I believe.

However it is a valid technique in precision measurement to add an accurately controlled noise signal into the system before sampling.

When I read the Atmel paper on the subject I thought: "As we are dealing with an internal sensor and an internal voltage reference it is [u]extremely difficult[/u] to add any other circuitry to provide the required noise … unless that noise just happens to exist au naturel." So I wrote a sketch to take a thousand samples in the normal manner (10 bit) without any averaging. I was surprised to find that there already appeared to be adequate noise in the signal and that it appeared to be evenly distributed. The paper also discussed some precise timing requirements. Much of the material was admittedly way over my head.

However you have to be careful - how accurate is the ADC anyway?

In theory the accuracy sucks. In some places Atmel quotes ±10 degrees Celsius, but the "reported" accuracy can be improved by calibration. In addition, the device is calibrated in degrees Kelvin reported over the range of the internal 10 bit ADC: thus we have a range of 0-1023 degrees Kelvin graduated in units of a single degree. One degree Kelvin/Celsius is the equivalent of nearly 2 degrees Fahrenheit. If the "decimation" technique works then the range would still be 0-1023 degrees Kelvin, but the graduated units would be 1/4 degree Kelvin, or a little less than 1/2 degree Fahrenheit. Those reporting increments fall into the range of being useful.

But all this is theory. None of it means anything unless the practical application proves to be valid. As all it would cost me to try was a bit of code I decided to give it a try. I have had it hooked up for well over 36 hours thus far. What I can tell from observations:

1) Small changes in the environment temperature produce a corresponding small change in the reporting log.
2) Within the small range it has been operating in (~78-92 degrees Fahrenheit), the reported temperatures track well against readings taken on the two thermometers sitting next to the Nano.

I expect that a two point calibration (at 32 and 120 degrees Fahrenheit) will produce a more accurate scaling factor. I will apply that scaling factor across the range with an observed offset at 70 degrees Fahrenheit. Thus I expect the real-world accuracy to be within ±1/2 degree Fahrenheit, decreasing somewhat as the actual temperature moves away from 70 degrees.

Variations in current usage (i.e. device load) affect the temperature reported by the internal sensor. In my target application I plan to add four more external temperature sensors. The only thing that the device will be doing is reporting the reading on those sensors at a regular interval (perhaps every 5 minutes) thus I expect the load to be very consistent.

What can I say: "It appears to work". That is all I require for my target application. Now I need to write an application on the PC side to stuff the data into a database. :) :) :) :)

UPDATE: As I was setting up to do a multipoint calibration I decided to use one of the Nano 168 boards that I have, because it has no pins soldered in place. I pulled one out and loaded the sketch. To my surprise it returned a very high number for every reading.

I modified the sketch to get back to the normal 10 bit ADC. All the ADC readings come back as 1023.

The original Nano I used was a SainSmart with an ATmega328P MPU, 5V @ 16 MHz. The ones I was attempting to use were Chinese Iduino Nano 168s with an ATmega168 MPU, 5V @ 16 MHz. Same results with three separate units.

Then I loaded the original sketch on a SainSmart Uno Ver 2 with an ATmega328P MPU, 5V @ 16 MHz. It operated as expected; I just needed to adjust the offset.

I compared the specs on the 168 vs the 328 and at first glance see no reason why I am getting maxed readings. I guess I am going to have to dig a bit deeper into the hardware ... maybe it is something to do with the timing factors.

retrolefty: The real problem with the internal temperature sensor is that no one has come up with a reasonable application using it. .... But then again maybe someone will in time.


check - http://forum.arduino.cc/index.php/topic,38091.0.html - definitely not perfect but certainly useful in many cases ;)