Problem reading sensor values

Hi, I am using an UNO Rev3 and trying to measure temperature with an NTE 7225 sensor. I have a 3.3K resistor as a load between +5V and the sensor output so I can measure the sensor's output voltage. When the UNO is powered over USB, the readings are good. When I switch to the 9V supply, I see a rise of around 0.02 to 0.3 volts. I could maybe add a 0.1uF cap to resolve this, but I'm not sure that will work.

Worse, when I use the same configuration, without any changes, on my Nano with the same USB supply from the laptop, I see weird voltage readings that are way off from what I was expecting. I have no clue what's going on there. Any ideas?

Thanks. rn77

Show us how you wired the setup and post your code (don't forget to use code tags with the #-button).

Attached is the circuit diagram used to read the analog voltage. The VCC is nothing but +5V from the UNO board.

The program simply reads pin A0 and maps the reading against 1023 levels spanning 0 to 5.0 volts.

void setup() {
  Serial.begin(9600);
}

void loop() {
  // read the input on analog pin 0:
  int sensorValue = analogRead(A0);
  // Convert the analog reading (which goes from 0 - 1023) to a voltage (0 - 5V):
  float voltage = sensorValue * (5.0 / 1023.0);
  // The sensor outputs 10 mV per kelvin, so volts * 100 = kelvin:
  float kelvin = voltage * 100.0;
  float celsius = kelvin - 273.15;
  float fahrenheit = (1.8 * celsius) + 32.0;
  Serial.print(celsius);
  Serial.print(" C / ");
  Serial.print(fahrenheit);
  Serial.println(" F");
  delay(1000);  // one reading per second
}

temp.png

Well, keep in mind that any time you are working with the Arduino analog input pins, the default voltage reference is the board's Vcc/AVcc voltage. So if the board is being powered by your PC USB port, that value can be anywhere from 4.75 to 5.25 Vdc and still be within USB specs. And if the board is being powered from the external power connector, the board's voltage will be determined by the actual output of the on-board +5Vdc regulator, which of course will rarely be exactly 5.000 Vdc, but rather some value within the regulator's tolerances. And if you change to another board, its on-board regulator will have its own unique Vcc value.

How you decide to deal with all these various tolerances is up to you. You can even just ignore them and live with the slight accuracy differences, but do try to understand the reason for possible small differences depending on power source and device variations.

So when you use a statement like this:

float voltage = sensorValue * (5.0 / 1023.0); // actually you should be using 1024.0, I believe

understand that you are making an assumption about that 5.0 value, and the actual value can vary for the reasons stated above. And none of this even addresses the actual accuracy and variation in the sensor chip itself, which is a whole other thing. In instrumentation-quality measurements one usually does a final calibration step that makes any needed 'offset' tweaks in the code to compensate for any error seen when the sensor is compared, in real time, to an independent 'reference sensor'.

Not sure what your Nano problem might be, but I would certainly check the wiring carefully again.

Lefty

Consider using the 3.3V output as the ADC voltage reference (see the documentation of the analogReference() function). This will, however, limit the maximum temperature you can measure to about 330 K, or 57C.

@Retrolefty: Thank you for the detailed explanation. I had pretty much assumed that the board would have a regulator that would take care of these issues. As in most programming languages, I thought the count would start from 0, so I used 1023; I will try 1024 and check the results. Given all the voltage-precision issues on the board/USB supply, I would rather use a digital one-wire sensor, which tolerates supply variation better, than one with an analog output.

@DC42: Thanks dc42, I will give it a try.