I have a temperature sensor and a pressure sensor for my Arduino Nano and I want to build a gauge. My code is mostly working: it reads data, and the voltage divider gets its power from the 5 V rail. I have some questions.
Pressure sensor range: 0–10 bar, 10–184 Ω.
Temp sensor range: 20–170 °C, 3003–23 Ω.
What pull-up resistor should I use?
Currently I'm using a 10k, but I think the readings could be more precise if I chose a lower value. I read about a rule of thumb of using 10x the resistance you want to read, which would suggest a 30k for the temp sensor since that's 10x its value at 20 °C. My problem is that I want it to be most precise between 70 and 110 °C, which is 368 down to 102 Ω. What difference would it make if I changed from the usual 10k to a 2k resistor in that circuit? Should I just stay with a 10k for both, or go lower?
Both sensors have only one wire for the signal; the ground comes through the engine block where you thread them in. On the desk I just ground the case, the same as it will be in the car.
Strange rule.
You get the highest resolution from the A/D when the sensor's resistance equals the pull-up resistance (at your working temperature/pressure). But that might not be practical if the sensor's resistance there is very low.
Car radiator thermostats open at about 82 °C, so that seems a good point to match the pull-up resistor to the thermistor's value.
Leo..
It seems like I read something stupid on the internet. The temp sensor is going to be an oil temperature sensor, but I get the point: find the sweet spot where I want it to be most precise and use that resistance as the pull-up.