I recently finished a project that required an RTD temperature sensor and extreme accuracy and repeatability. For those of you who are interested, an RTD is basically a 100 Ohm platinum resistor. Its temperature versus resistance curve is well known and there are polynomials available to convert from resistance to temperature. A quick Google will probably tell you more than you really want to know about this device.
My application had some size constraints and a single supply voltage to operate from. There is also a healthy temperature swing during normal operation, on the order of 60 degrees Celsius. RTDs have to be operated at very low current to keep self-heating from affecting their reading. With that constraint, the output from the RTD is similar to what you would get from a 350 Ohm strain gauge device, around 20 mV full scale. Normally, that would warrant an instrumentation amplifier to get the signal up into a range suitable for A/D conversion, say 2.5V.
I had an 8VDC regulated power source to work from, so I decided to use 5VDC as my circuit voltage. If you only need a few mA of extremely precise voltage, the REF19X series from Analog Devices is hard to beat; I've used them several times in the past. Originally designed as precision reference sources for ADCs and DACs, they come in output voltages from 2.048V to 5.000V. I used the 5V version, the REF195. It has an initial accuracy of +/- 2mV, line and load regulation of 4 ppm/V and 4 ppm/mA respectively, and a maximum current rating of 30 mA. With around 1V of dropout over the full range, it makes a pretty decent voltage regulator with the addition of a couple of decoupling capacitors.
Now the interesting part. Way back at the turn of the century, Linear Technology introduced a series of low cost, high performance delta-sigma ADCs. By the way, other manufacturers make them as well. This is not an advertisement for LT, they just happen to be the ones I'm most familiar with. I won't bore you with a detailed theory of operation but basically, the delta-sigma converter is a voltage-controlled oscillator whose output pulses have a precisely known amplitude and duration, so their integral of V dt is known, while the interval between pulses is variable. For a low input voltage the interval is wide, and as the voltage increases the interval shortens, so the input voltage is proportional to the mean of the pulse train over time.
Since the ADC has differential inputs and a differential reference (to which the pulse amplitude is referred), the overall accuracy is proportional to Vref, on the order of 15 ppm of Vref. On the digital side it's basically a counter, so it's guaranteed to have no missing codes over 24 bits.
And 2 to the 24th power is a BIG number: resolution of almost 1 part in 17 million. For 5V full scale, that is roughly 3E-7 V per bit!
So what does that mean? For my 20 mV nominal RTD signal I get a reading of around 67,000 counts, over 16 bits of resolution.
Well, I hope that wasn't too boring. My point is this: if you have very small voltages that you need to measure very accurately, and signal conditioning would just add to your error margin, high resolution delta-sigma A/D conversion might just be the solution.
If anyone is interested in seeing how compact a solution this can be, I have posted the circuit board from this project, and from a couple of other things I'm currently working on, here:
Almost forgot! The interface is a 3-wire protocol compatible with SPI, but it is really easy to bit-bang since the timing is not critical.