Far-seeker:
KeithRB:
That is why they have a thing called a voltage divider.

That's possible of course, but precision is potentially going to take a big hit even before the resolution of the ADC is factored in. From the perspective of the signal's information content, a voltage divider going from 5 to 15 V down to 0 to 5 V means there will be roughly three times less usable information detectable in the output signal. For some applications that's not a problem, for others it will be. Instead, I would check to see if there is an affordable ADC chip/module that can handle the original input voltage and output serial communication of some sort at 5 VDC.
I'm afraid you are wrong in nearly every detail in that paragraph!
Firstly, information is a logarithmic measure, so a loss of a factor of 3 in precision is a loss of about 1.6 bits of information (log2(3) ≈ 1.58). Given that a good ADC can give 20 bits, that would be a loss of only about 8% of the information (18.4 of the 20 bits remain usable).
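Just to make that arithmetic explicit, here is a quick back-of-envelope sketch (my illustration, not from the original post), assuming an otherwise ideal 20-bit converter:

```python
# Back-of-envelope check: dividing a signal's range by 3 before an ideal ADC
# costs log2(3) bits of resolution.
import math

bits_lost = math.log2(3)          # ~1.58 bits sacrificed to the divider
adc_bits = 20                     # assumed "good ADC" resolution from above
remaining = adc_bits - bits_lost  # ~18.4 bits of usable resolution

print(f"bits lost: {bits_lost:.2f}")
print(f"fractional loss: {bits_lost / adc_bits:.1%}")  # ~8%
```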
Secondly, a voltage divider doesn't throw away information like that. To work out the degradation in signal-to-noise ratio you need to know the signal-to-noise ratio and bandwidth of the source, the actual resistance values, and their temperature. If the source is already noisy, the potential divider might have almost no effect; if the source is clean and low-impedance, a high-resistance voltage divider might inject far more noise than the source itself contributes.
Resistors generate thermal (Johnson) noise: the RMS noise voltage is proportional to the square root of absolute temperature, the square root of the resistance value, and the square root of the bandwidth (v_rms = sqrt(4·k·T·R·B)).
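As a rough illustration of how small that thermal noise typically is, here is a sketch using assumed values (a 200 kΩ / 100 kΩ divider to drop 15 V to 5 V, 300 K, 10 kHz bandwidth; none of these figures come from the original posts):

```python
# Illustrative Johnson-noise estimate for a resistive divider.
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # assumed temperature, K
bandwidth = 10e3          # assumed measurement bandwidth, Hz

r_top, r_bottom = 200e3, 100e3                       # assumed 15 V -> 5 V divider
r_source = (r_top * r_bottom) / (r_top + r_bottom)   # Thevenin resistance seen by the ADC

v_noise_rms = math.sqrt(4 * k_B * T * r_source * bandwidth)
print(f"divider thermal noise: {v_noise_rms * 1e6:.2f} uV RMS")  # ~3.3 uV
```

Roughly 3 µV RMS under those assumptions, which is why the source's own noise and the ADC usually matter far more than the divider itself.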
Thirdly, the ADC might be introducing quantisation noise that's far greater than the signal noise, in which case everything else is academic. If you are only using an 8-, 10-, or 12-bit ADC, it's likely to be the dominant source of error.
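For comparison, an ideal ADC's quantisation noise can be estimated with the standard LSB/sqrt(12) model; the 5 V reference and the bit depths below are my assumptions for illustration:

```python
# Quantisation-noise comparison for an ideal ADC (LSB / sqrt(12) model).
import math

v_ref = 5.0                                  # assumed ADC full-scale range, V
for bits in (8, 10, 12, 20):
    lsb = v_ref / 2**bits
    q_noise_rms = lsb / math.sqrt(12)        # RMS quantisation noise of an ideal ADC
    print(f"{bits:2d}-bit ADC: LSB = {lsb * 1e3:.3f} mV, "
          f"quantisation noise = {q_noise_rms * 1e6:.1f} uV RMS")
```

At 10 bits that works out to about 1.4 mV RMS, hundreds of times larger than the few microvolts of divider thermal noise estimated above, which is why it would dominate the error budget.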