220 Ohm is too low a pull-down for a voltage source whose output resistance ranges from 0 to 10K.
The pull-down should be much higher than the maximum output resistance of your source (at least 10 times higher), so that it has no measurable effect on the voltage you read.
If your pot is at, say, 60% from ground, it acts as a resistor divider with a 4K resistor on top and, below, the parallel combination of 6K and your 220 Ohm resistor (I'm assuming you understand how a pot works). With only the pot, the voltage at the wiper (the output voltage, the one you want to measure) is 5V x (10K x 60%) / 10K = 3V. With the 220 Ohm pull-down, the lower leg of the divider becomes 10K x 60% x 220 / (10K x 60% + 220) = 212 Ohm, giving an output voltage of 5V x 212 / (4000 + 212) = 0.25V instead of 3V. I wouldn't use a pull-down (nor a pull-up) smaller than 100K.
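The arithmetic above can be sketched as a small loaded-divider calculation (the 10K pot, 60% position, and 220 Ohm / 100K pull-downs are the example values from this answer, not from any real board):

```python
def divider_out(v_supply, r_top, r_bottom, r_load=None):
    """Wiper voltage of a divider, optionally loaded by r_load to ground."""
    if r_load is not None:
        # pull-down appears in parallel with the lower leg
        r_bottom = r_bottom * r_load / (r_bottom + r_load)
    return v_supply * r_bottom / (r_top + r_bottom)

# 10K pot at 60% from ground, 5V supply: 4K on top, 6K below
print(divider_out(5.0, 4_000, 6_000))             # unloaded: 3.0 V
print(divider_out(5.0, 4_000, 6_000, 220))        # 220 Ohm pull-down: ~0.25 V
print(divider_out(5.0, 4_000, 6_000, 100_000))    # 100K pull-down: ~2.93 V
```

Notice how the 100K pull-down barely disturbs the reading, while 220 Ohm collapses it.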
Imagine you have a pure, theoretical voltage source. Whatever current you draw from it, its voltage remains unchanged. Real-world voltage sources don't behave like this, because all of them have some resistance in series: the current pulled from the source causes a voltage drop across that resistance, so the voltage you measure "outside" varies with the current you pull. This series resistance is the "output impedance". The smaller it is, the more current you can pull from the source without a significant change in its voltage. "Impedance" is a more general term than resistance, covering the fact that real-world circuits have a resistance that varies with the frequency of the signal going through them. For DC signals it's the same as plain resistance. We usually just say "impedance", but we can say "resistance" (which would imply DC signals).
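Modeling that as an ideal source in series with its output impedance, the terminal voltage sags linearly with the current drawn (the 5V / 50 Ohm figures below are made up for illustration):

```python
def terminal_voltage(v_ideal, r_out, i_load):
    """Voltage at the terminals of a real source while drawing i_load amps."""
    return v_ideal - i_load * r_out  # drop across the internal series resistance

# hypothetical 5 V source with 50 Ohm output impedance
for i_ma in (0, 1, 10, 50):
    v = terminal_voltage(5.0, 50, i_ma / 1000)
    print(f"{i_ma:>3} mA -> {v:.2f} V")
```

At 0 mA you see the full 5 V; at 50 mA half the voltage is already lost inside the source.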
Input resistance is kind of similar, but now it's the internal resistance from the input to ground, inside the device - what you would measure if you plugged an Ohm-meter directly across the input. If this resistance is small, you need to drive that input with a source that also has a small output resistance; otherwise the current drawn by the input will change the source's voltage (through its output impedance). High-impedance inputs are more "sensitive", because they can read sources with high output impedance - say, like electromagnetic waves. Note that a source can have hundreds of thousands of volts at its output, but if it has a very high output impedance, any tiny current you pull from it makes those voltages become very small. See the catch? That's why voltmeters have a high input impedance - otherwise they would significantly change the very signal being measured!
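The meter-loading effect is just another divider: the source's output impedance on top, the meter's input resistance below. A quick sketch with hypothetical numbers (1 MOhm source, two imaginary meters):

```python
def measured_voltage(v_source, r_out, r_in):
    """What a meter with input resistance r_in reads from a source with output impedance r_out."""
    return v_source * r_in / (r_out + r_in)  # simple divider model

# 10 V source with 1 MOhm output impedance
print(measured_voltage(10.0, 1e6, 10e6))  # 10 MOhm meter: ~9.09 V
print(measured_voltage(10.0, 1e6, 10e3))  # 10 kOhm meter: ~0.10 V
```

The high-impedance meter still reads most of the true voltage; the low-impedance one nearly shorts the source out.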
Although the ADC has an input impedance (resistance) of many MOhm, it is recommended that the output impedance of the voltage source being measured be at most 10K. The ADC has an input capacitor that samples the input voltage. This capacitor must be charged in a very short time by the source being measured; if the source's output impedance is too high, there isn't enough time for the capacitor to charge up to the source's voltage and you get erroneous readings. The more current the source can provide, the faster the capacitor charges.
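You can estimate this as an RC settling problem: for an n-bit ADC, the sample capacitor should settle to within 1/2 LSB, which takes about tau * ln(2^(n+1)). The 10 pF capacitor and 12-bit resolution below are illustrative assumptions, not values from any particular datasheet:

```python
import math

def settling_time(r_source, c_sample, n_bits):
    """Time for an RC charge to settle within 1/2 LSB of an n-bit ADC."""
    tau = r_source * c_sample
    return tau * math.log(2 ** (n_bits + 1))  # t = tau * ln(2^(n+1))

c = 10e-12  # assumed 10 pF sample capacitor
for r in (10e3, 100e3, 1e6):
    t = settling_time(r, c, 12)
    print(f"R_out = {r/1e3:>5.0f}K -> needs {t*1e6:.2f} us to settle")
```

With 10K source impedance the capacitor settles in under a microsecond; at 1 MOhm it needs roughly a hundred times longer than that, far more than a typical ADC sampling window allows.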
Oops, sorry for the speech.