It's always nice to have a clear digital readout of temperature, but for my latest project I'm going all retro and building an analogue temperature indicator around a 0-5V voltmeter.
I'm having a blonde moment and need some clarification on how to work out the calibration formula for the sensor I'm using with my Arduino. Hoping someone can help!
So, I'm going to use a DS18B20 temperature sensor with my Arduino and a 0-5V analogue voltmeter, and I'm trying to work out the calibration factor. Here are my assumptions and calculations that I'd appreciate some help with, please...
The temperature sensor has a range of -55 to +125 °C and can be programmed for anything from 9-bit to 12-bit resolution. I'll use the 12-bit resolution, which gives 4096 steps (0-4095), so on that scale -55 °C = 0 and +125 °C = 4095.
The range of temperatures I want to display on the analogue voltmeter (0-5V) for my location is -5 °C to +35 °C, so I want to calibrate a PWM output on my Arduino to give 0V when the temperature is -5 °C and 5V when it is +35 °C.
The sensor's range is 180 degrees across those 4096 steps, and I want to use a 40-degree slice of it, which is
40/180 * 4096 ≈ 910 steps of the 4096 in total, which fits comfortably within the 10-bit (0-1023) range of the Arduino analogue input.
So, of the sensor's 0-4095 output, I only want to use roughly step 1138 (-5 °C) to step 2048 (+35 °C), that 910-step span, as my input range, then map that to 0-255 on the 0-5V PWM output.
Is this possible? How do I map it please?
It's a Friday evening after all :)