Reading voltage for pressure transducer

I have a pressure transducer which outputs 0-10 V. I want to measure 1 mV - 10 V with a resolution of 0.1 mV across the full range. Is this possible, or is it pure fantasy? (I am an electronics amateur, if it isn't obvious.) I see there are 24-bit ADCs with onboard voltage references available as breakout boards (ADS1220), but I have no idea whether these are capable of the accuracy I want, and as far as I can tell they can't handle 0-10 V at full range.

Resolving 0.1 mV steps across 10 V means 100,000 steps, which is about 17-bit resolution. That should be doable with a 24-bit ADC. You can reduce the 0-10 V signal to 0-5 V with a simple voltage divider.
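To sanity-check the step count implied by the original question (0.1 mV resolution over a 0-10 V span), here is a quick back-of-the-envelope in Python:

```python
import math

full_scale_v = 10.0     # transducer output span
resolution_v = 0.0001   # desired resolution: 0.1 mV

steps = full_scale_v / resolution_v   # distinguishable levels needed
bits = math.ceil(math.log2(steps))    # ADC bits required to cover them

print(steps)  # 100000.0
print(bits)   # 17
```

So a 24-bit converter has bits to spare in principle; the practical question is how many of those bits are noise-free.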

Smajdalf:
Resolving 0.1 mV steps across 10 V means 100,000 steps, which is about 17-bit resolution. That should be doable with a 24-bit ADC. You can reduce the 0-10 V signal to 0-5 V with a simple voltage divider.

I'm still reading and trying to understand how ADCs work. Are there minimum voltages required to get "accurate" measurements? In other words, is it okay to put in a 5:1 voltage divider so the sensor output becomes 0-2 V before feeding it into the ADC? Or do I need to add gain once the sensor output falls into the range of, say, 1 mV - 100 mV?

Reducing the signal voltage also reduces the signal-to-noise ratio, so it is best not to attenuate the signal more than necessary.

By adding gain you can get better resolution over part of the range ("poor" resolution from 100 mV to 10 V and "better" resolution from 0 to 100 mV). There is usually no need to do so: by default the ADC's step size is the same at 100 mV as at 1 V.
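As a sketch of why a simple divider still leaves plenty of resolution, here is the arithmetic for one hypothetical front end. The resistor values and the 2.048 V reference are illustrative assumptions, not a verified design:

```python
# Hypothetical divider in front of a 24-bit ADC.
r_top = 39_000      # ohms, from the 10 V source (illustrative value)
r_bottom = 10_000   # ohms, to ground -> divides by about 4.9
ratio = r_bottom / (r_top + r_bottom)

v_in = 10.0
v_adc = v_in * ratio              # voltage the ADC pin actually sees

v_ref = 2.048                     # assumed reference voltage
bits = 24
lsb_at_adc = v_ref / 2**bits      # smallest step at the ADC pin
lsb_at_input = lsb_at_adc / ratio # same step referred back to the 10 V signal

print(round(v_adc, 3))            # stays just under the 2.048 V reference
print(lsb_at_input)               # well below the 0.1 mV target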