Using the AVR internal temperature sensor

The data sheet for the ATmega 48/88/168/328 states that there is an internal temperature sensor which I would like to use. I have done as the data sheet says, and I do get a fairly sensible temperature reading, which changes if I put the Arduino in question in the fridge or in a warmer place.

My problem is that I only get one single value, the first after reset, and it never changes no matter how many times I sample it. I have not done ADC by massaging registers before, and I am not fully aware of what the rest of the Arduino bootloader and standard header code does, so to be on the safe side I turn off interrupts while sampling, save the original values of ADMUX and ADCSRA, and restore them before turning interrupts back on and returning.

Can anyone tell me why every reading gives the same value? What am I doing wrong?

#include <avr/io.h>
#include <NewSoftSerial.h>

int chipTempOffset = 303;
float chipTempCoeff = 1.0;
byte i = 0;

NewSoftSerial lcd(255, 2);

void setup() {
  lcd.begin(9600);
  lcd.print("?f?B80");    // Clear display, set backlight
  delay(100);
}

void loop() {
  int rawTemp;
  char tick[4] = {'|', '/', '-', '*'};
  
  rawTemp = chipTempRaw();
  lcd.print("?aRaw: ");
  lcd.print(rawTemp, DEC);
  lcd.print("?x00?y1Temp: ");
  lcd.print(chipTemp(rawTemp), DEC);
  lcd.print(" C");
  
  lcd.print("?x15?y0");
  lcd.print(tick[(i++ & 0x03)]);
  
  delay(200);
}

int chipTemp(int raw) {
  int trueTemp = int(float(raw - chipTempOffset) / chipTempCoeff);
  return(trueTemp);
}

int chipTempRaw() {
  static uint8_t saveADMUX, saveADCSRA;
  int result;
  
  cli();
  saveADMUX = ADMUX;
  saveADCSRA = ADCSRA;
  
  ADMUX = _BV(REFS1) | _BV(REFS0) |     // Internal 1.1V Voltage Reference with external capacitor at AREF pin
    _BV(MUX3);                          // ADC8, on-chip temperature sensor
  
  ADCSRA = (saveADCSRA &
    (byte(~(_BV(ADATE) |_BV(ADIE))))) | // Clear AD auto trigger enable and interrupt enable
    _BV(ADEN) | _BV(ADSC);              // Set AD enable and AD start conversion
  
  while((ADCSRA & _BV(ADSC)) && !(ADCSRA & _BV(ADIF)));            // While conversion not finished
  
  result = (ADCH << 8) | ADCL;
  ADCSRA = saveADCSRA;
  ADMUX = saveADMUX;
  sei();
  return(result);
}

I use a serial LCD adapter from Modern Devices, and I use NewSoftSerial to talk to it. The offset and coefficient are for calibration purposes.

You need to read ADCL before you read ADCH. The ATmega uses a register lock mechanism: when you read ADCL, both registers are locked until ADCH is read.

Your function "chipTempRaw" will leave the registers locked, so the next sample requested will always be discarded. This is why you only get one meaningful value, from your first sample after reset.

Thank you, BenF!

I reread the datasheet chapter on the ADC. The way I interpret it, the ADC will not be able to write any new results until ADCH is read, and if ADCH is read first, a subsequent read of ADCL will return an unspecified value that may or may not be the expected low bits of the conversion.

I changed my code accordingly but still got static results, or rather one normal reading followed by a static, much higher reading. I did some experimentation, disabling interrupts or not, and found that the problem was that I had been too cautious about restoring the registers.

When I do not try to restore ADMUX and ADCSRA when I am done, it all works as planned! :slight_smile:

I also tried just overwriting ADCSRA, as I do with ADMUX, but this did not work out. Modifying the relevant bits without touching the rest does the job when it comes to ADCSRA.

According to the datasheet, both ADSC and ADIF will change when the conversion is finished, and the way I interpret it, ADIF will be reset by hardware if, and only if, ADC interrupts are enabled. Trying to reset this bit did me no good, so I will stick to watching ADSC to know when the conversion is done.

My first 1-2 readings after each reset are much too low, and I am not sure why. When I use this in the future I will disregard the first two readings, probably by calling chipTempRaw() twice from setup(). Another nice thing would be to use a small ring buffer to collect several readings and use an average.
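The small ring buffer mentioned above could be sketched as follows. This is a host-testable illustration of the idea, not code from the thread; on the Arduino, `chipTempRaw()` would supply the samples, and the names (`RingAverage`, `ringAdd`, etc.) are my own:

```cpp
#include <stddef.h>

// Minimal ring buffer holding the last N raw readings and their sum,
// so the moving average is available in O(1) per sample.
const size_t N = 8;

struct RingAverage {
  int buf[N];
  size_t count;  // number of valid samples (saturates at N)
  size_t next;   // index where the next sample is written
  long sum;      // running sum of the samples currently in buf
};

void ringInit(RingAverage &r) {
  r.count = 0;
  r.next = 0;
  r.sum = 0;
}

// Add a sample, dropping the oldest one once the buffer is full.
void ringAdd(RingAverage &r, int sample) {
  if (r.count == N) {
    r.sum -= r.buf[r.next];  // forget the value being overwritten
  } else {
    r.count++;
  }
  r.buf[r.next] = sample;
  r.sum += sample;
  r.next = (r.next + 1) % N;
}

int ringAverage(const RingAverage &r) {
  return r.count ? (int)(r.sum / (long)r.count) : 0;
}
```

In `loop()` one would then call something like `ringAdd(r, chipTempRaw())` each pass and display `ringAverage(r)`, which makes a single noisy sample unable to move the result much.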

New code:

#include <avr/io.h>
#include <NewSoftSerial.h>
NewSoftSerial lcd(255, 2);

void setup() {
  lcd.begin(9600);
  lcd.print("?f?B80");    // Clear display, set backlight
  delay(50);
}

void loop() {
  int rawTemp;
  lcd.print("?aRaw: ");       lcd.print(rawTemp = chipTempRaw(), DEC);
  lcd.print("?x00?y1Temp: "); lcd.print(chipTemp(rawTemp), DEC); lcd.print(" C  ");
  delay(500);
}

int chipTemp(int raw) {
  const int chipTempOffset = 342;
  const float chipTempCoeff = 1.0;
  return(int(float(raw - chipTempOffset) / chipTempCoeff));
}

int chipTempRaw(void) {
  while((ADCSRA & _BV(ADSC)));                   // Wait for any ongoing conversion to complete
  ADMUX = _BV(REFS1) | _BV(REFS0) | _BV(MUX3);   // Set internal 1.1V reference, temperature reading
  ADCSRA &= ~(_BV(ADATE) |_BV(ADIE));            // Clear auto trigger and interrupt enable
  ADCSRA |= _BV(ADEN) | _BV(ADSC);               // Enable AD and start conversion
  while((ADCSRA & _BV(ADSC)));                   // Wait until conversion is finished
  return(ADCL | (ADCH << 8));
}

I'm just curious if you noticed that the internal temp sensor has like a +/- 10 degrees difference from actual temperature?

Still fairly useful, but for like $2 you can just get an LM335 or a couple bucks more for the one-wire temperature sensors, both of which have only about a +/- 1 degrees offset.

And for an added bonus, you won't have to use code that's way over my head! ;D

But I'm curious how accurate it is in the "real world" application, if you have something to test against you should let us know:D

When I do not try to restore ADMUX and ADCSRA when I am done, it all works as planned!

It takes some time for a new analog reference voltage to stabilize, and in your initial version you would see the full negative effect of this, as the reference changed between 1.1V and 5V right before and after every conversion.

The ADC hardware is also highly susceptible to electrical noise during conversion, and any single sample could be significantly off. As you suggest, averaging is a good way to improve accuracy and consistency.

If in your code you move the configuration of ADMUX to chipTemp, all you need in chipTempRaw would be as follows:

int chipTempRaw(void) {
  ADCSRA |= _BV(ADSC);               // Start conversion
  while((ADCSRA & _BV(ADSC)));    // Wait until conversion is finished
  return(ADCL | (ADCH << 8));
}

Rather than using a buffer to sample multiple values, you can use floats and calculate the running average. The outline of a temperature reading function could be something like this:

float chipTemp() {
  float tmp,avg;

  ADMUX = _BV(REFS1) | _BV(REFS0) | _BV(MUX3);   // Set internal 1.1V reference, temperature reading
  delay(10);  // wait for analog reference to stabilize
  chipTempRaw(); // discard first sample

  avg = chipTempRaw(); // use next sample as initial average
  // average 1000 samples in total
  for (int i = 2; i <= 1000; i++) {
    tmp = chipTempRaw();                // get next sample
    avg = avg + (tmp - avg) / (float)i; // calculate running average
  }
  return avg; // return averaged temperature reading
}
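The incremental update `avg = avg + (tmp - avg) / i` used above computes the same value as the plain arithmetic mean, without storing any samples. A quick host-side check of that identity (no AVR hardware involved; the function name is my own):

```cpp
// Running average via the incremental update avg += (x - avg) / i.
// After n samples this equals the arithmetic mean of all n samples.
float runningAverage(const float *samples, int n) {
  float avg = samples[0];            // first sample is the initial average
  for (int i = 2; i <= n; i++) {
    avg = avg + (samples[i - 1] - avg) / (float)i;
  }
  return avg;
}
```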

This should give you consistent readings with sub-degree precision. Absolute accuracy however would be subject to a one-time calibration.

On a 16 MHz Arduino the averaging above would still finish in less than 200 ms. A one-wire temperature sensor would typically need 750 ms for a single sample of equal precision. I'm not suggesting that one is better or worse, but rather that oversampling/averaging is the way to go if you need high accuracy/consistency when relying on the ATmega's 10-bit ADC.

CaptainObvious: I have read about this temperature measurement in several datasheets (ATtiny 45, 84, 861 and ATmega 328p), and even though the subject is only briefly mentioned in all of them, it is quite clear that the accuracy is +/- 10 degrees centigrade after a single temperature calibration at room temperature (25 degrees C). The resolution is approximately 1 degree C, and using more than one calibration temperature point will improve accuracy.

Sorry about the code being less than obvious (unless you have plunged into the ADC chapter of the datasheet), but for me this method has advantages. The sensor is always there, no extra cost, no wires, just a few lines of code. By adding this code and calibrating each individual unit, any Arduino I use (with a mega xx8 series MCU) can find the ambient temperature.

My readings are stable and at room temperature correlate nicely with a quality digital thermometer standing next to the Arduino Pro Mini 328 5V 16 MHz I am currently using for this experiment. Right now the room temperature is 23.5 degrees C, and the Arduino keeps shifting between 23 and 24 (only offset adjusted from raw values 365-366).

The accuracy you mention is in the interpretation of the reading (after one single calibration, etc.), not in the actual reading, and with more calibration points combined with multi-sample averaging I think I can get 0.1 degrees C precision, which is much more than I really need.

Summary: It's free, it can be accurate, and it's great fun working at this level. :sunglasses:

If in your code you move configuration of ADMUX to chipTemp ...

One reason that I am concerned with making all configurations at each reading, and restoring them again, is that I intend to reuse the code. I will probably make it into a library, and if it seems to work fine for me I will post it for general use. That is why I make no assumptions about the initial value of any register, and try to restore them when I am done. But now I understand more about how this works, so to be safe I might check the initial value of e.g. ADMUX, change it only if necessary, and in that case either wait for the reading to stabilize or return a value/flag telling the caller to wait a while and try again.

In your modified example you are not waiting for any ongoing conversion to complete. Is this safe? I also notice that you do not clear the ADC interrupt enable or auto trigger enable. Why? (As I wrote earlier, I am rather new to this low-level ADC coding, so I just want to understand how to code it safely and correctly.)

I do not really need sub-degree precision, and according to an application note (see below), even with correct calibration the true precision may not be better than 1-2 degrees. By using a ring buffer and a moving average I can "flatten the curve", as it would take more than one sample significantly different from the others to alter the result. (CaptainObvious: When I need precision temperature readings I will indeed use an external circuit, but in this project a free thermometer is a nice bonus.)

In the application where I will first be using this technique I want to monitor true changes in temperature in intervals of five or fifteen minutes. The project is controlling a heating fan inside a car (in the winter). The fan uses a lot of energy, translating into money, and I do not want it to run more than necessary. It will be controlled by a relay, and I want to be able to start the fan earlier the colder it is outside (i.e. inside the unheated car), and also be able to turn it off when it is warm enough inside the car. If nobody comes to the car, disconnecting the mains wire to the heater assembly, I want my project to act as a crude thermostat, running the heater in periods of maybe five or fifteen minutes to keep the inside temperature above a specified limit.

During testing I plan to put an Xbee module in the box with the rest of my circuit, and let it send basic information about temperature readings and when the relay is turned on or off back to my main computer. (This project is for a good friend and neighbour of mine, and I hope to do field testing and final adjustments in the "production environment" of his car.)

The difference between -24.6 and -26.3 is completely irrelevant to me in this application. In fact I could probably do just fine with a precision/resolution of 5 degrees, as long as I know that it is a "true" reading, not a single sample that is off due to noise or anything else. Simple pseudo code might look like this:

On power up check initial temperature Ti
Set initial delay to two hours minus 20 minutes for each ten degrees that Ti is below zero
Sleep for initial delay
Turn on relay
Sleep one hour
Turn off relay
Repeat forever
    Check average temperature during fifteen minutes, one sample every 15 seconds
    If avg temp above zero, turn off relay, else turn on relay
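The initial-delay rule in the pseudo code above (two hours, minus 20 minutes for each ten degrees below zero) can be written down directly. A hedged sketch of just that computation, with a function name of my own choosing, interpreting "each ten degrees" as each full ten degrees:

```cpp
// Initial delay in minutes: 2 hours, minus 20 minutes for each full
// ten degrees C that the start temperature Ti is below zero.
// Clamped at zero so extreme cold cannot produce a negative delay.
int initialDelayMinutes(int tempC) {
  int delayMin = 120;                  // two hours by default
  if (tempC < 0) {
    delayMin -= 20 * ((-tempC) / 10);  // integer division: full tens only
  }
  return delayMin < 0 ? 0 : delayMin;
}
```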

The initial delay is to let the lower powered engine heater have enough time to get the engine up to a good working temperature before starting it. (We live in Stockholm, Sweden, and for those of you living in warmer climates I could tell a lot about the process of standing in the pitch black early morning in -25C/-13F with an unheated car, scraping a thick layer of ice from all windows and side mirrors while the engine is running and the ventilation is at maximum power and heating to try to reduce the ice on the inside of the windows...)

I have not studied them in detail yet, but I will be looking into the following excellent application notes too:

I looked into the topic of oversampling and averaging earlier when experimenting with an accelerometer, and found good information on Wikipedia and on Wikibooks. The latter also has some good books on C programming, by the way.

So far in my tests I have not done anything specific to reduce the ADC noise, I wanted to get the temperature readings working at all first.

I am not sure if it is true, but I did get the impression that the readings might change with Vcc. To get a good reading in the fridge, for a second calibration point, I hooked the Arduino (display and all) up to a 3 AA battery box. The LCD had trouble updating at about 8 degrees C (of course) but I did get a good and stable reading. After keeping the circuit in room temperature for an hour or two I got significantly lower readings (on battery power) than I did earlier, but when switching to USB power it jumped right back to where I started.

BenF (or anyone else), would you know where I can find a reference to what registers might be modified by other Arduino code, and what registers that it is imperative that I restore after changing them?

An "empty" sketch will include a one-time init routine supplied by the Arduino team that configures your timers (timer0 for timekeeping and timers 1 and 2 for PWM) and enables the ADC, and that's pretty much it. As for general-purpose MCU registers, they are all managed by the GCC compiler and generally hands-off unless you embed assembly code. Also, if you do not use any functions related to the init defaults (e.g. millis(), analogWrite(), analogRead()), you're still OK to "mess up" even these related registers as you see fit.

With Arduino you are your own master, and that's part of the beauty (and evil), as you stand and fall by your own efforts only. :slight_smile:

Why not:

void setup() {
  // set up the LCD's number of rows and columns: 
  lcd.begin(16, 2);
  analogReference(INTERNAL);
}

void loop() {
  // set the cursor to column 0, line 1
  // (note: line 1 is the second row, since counting begins with 0):
  lcd.setCursor(0, 1);
  lcd.print(analogRead(8));
}

see also http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1238764842

Now the only thing I'm curious about is obtaining the Tos from the EEPROM, as mentioned in the manual, which can be used as the offset for calibration.

regards T

teeman: I had not seen that thread. It seems to depend on specific versions of the Arduino IDE, while my solution does not. As for the calibration data in the EEPROM, I must admit that I have not a clue. I have not tried to find it.

I made a simple spreadsheet in OpenOffice.org for finding appropriate values for calibration from measured temperatures. Maybe someone will find it useful.

calibration spreadsheet
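For those who don't want to open the spreadsheet: a two-point calibration boils down to fitting a straight line through two (temperature, raw reading) pairs. The sketch below is my own formulation of that arithmetic (not the spreadsheet's actual cells), producing the offset and coefficient used in chipTemp(), i.e. temp = (raw - offset) / coeff:

```cpp
// Two-point calibration: given raw ADC readings r1, r2 taken at two
// known temperatures t1, t2, derive offset and coeff such that
// temperature = (raw - offset) / coeff.
void calibrate(float t1, float r1, float t2, float r2,
               float &offset, float &coeff) {
  coeff = (r2 - r1) / (t2 - t1);  // raw counts per degree C
  offset = r1 - coeff * t1;       // raw reading corresponding to 0 C
}
```

With one reading taken in the fridge and one at room temperature, this gives values in the same ballpark as the offset 336.59 and coefficient 1.17 used in the program below.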

Latest program:

#include <avr/io.h>
#include <NewSoftSerial.h>
NewSoftSerial lcd(255, 2);

void setup() {
  lcd.begin(9600);
  lcd.print("?f?B80");    // Clear display, set backlight
  ADMUX = _BV(REFS1) | _BV(REFS0) | _BV(MUX3);   // Set internal 1.1V reference, temperature reading
  delay(100);
  chipTempRaw(); // discard first sample
}

void loop() {
  float rawTemp;

  rawTemp = chipTempRaw(); // use next sample as initial average
  for (int i=2; i<2000; i++) {
    rawTemp += (chipTempRaw() - rawTemp) / float(i); // calculate running average
  }

  lcd.print("?aRaw: ");       lcd.print(rawTemp);
  lcd.print("?x00?y1Temp: "); lcd.print(chipTemp(rawTemp)); lcd.print(" C  ");
  delay(2000);
}

float chipTemp(float raw) {
  const float chipTempOffset = 336.59;
  const float chipTempCoeff = 1.17;
  return((raw - chipTempOffset) / chipTempCoeff);
}

int chipTempRaw(void) {
//  while((ADCSRA & _BV(ADSC)));                   // Wait for any ongoing conversion to complete
  ADCSRA &= ~(_BV(ADATE) |_BV(ADIE));            // Clear auto trigger and interrupt enable
  ADCSRA |= _BV(ADEN) | _BV(ADSC);               // Enable AD and start conversion
  while((ADCSRA & _BV(ADSC)));                   // Wait until conversion is finished
  return(ADCL | (ADCH << 8));
}

I made two comments in the thread mentioned by user teeman, started by user AussieNell; today's comment is relevant for anyone reading this thread too.

I do not think the concept of patching the common Arduino code is a good solution for implementing this feature since it will only work on some of the supported hardware (e g not on the Mega or on a Sanguino). A library would be a better solution, IMHO. I currently lack the time and motivation to transform my code into a library, but of course others are welcome to give it a try. If you do, try to check the hardware the code is running on, since not all chips used in Arduino platforms will support this, and my code is hardware specific for the ATmega 48p/88p/168p/328p family.