An oddity about OScope cables

I was checking the resistance of the center lead of an oscilloscope cable, and I discovered that it was about 300 ohms in x1 mode, and about 9 megohms in x10 mode. I don't know much about transmission line theory, but why wouldn't those two resistances be in about a 1:10 ratio?

Thanks in advance for any insights.

This is from Wikipedia's oscilloscope article.

General-purpose oscilloscopes usually present an input impedance of 1 megohm in parallel with a small but known capacitance such as 20 picofarads.

In x1 mode, the input impedance is about 1 megohm (300 ohms + 1 megohm).
In x10 mode, the input impedance is 10 megohms (9 megohms + 1 megohm).
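
As a quick numeric check of that arithmetic, here is a small Python sketch using the figures from the posts above (DC picture only, ignoring cable and input capacitance); the last line shows where the 10:1 attenuation comes from:

    R_probe_x1 = 300        # ohms, measured through the probe in x1 mode
    R_probe_x10 = 9e6       # ohms, measured through the probe in x10 mode
    R_scope = 1e6           # ohms, scope input resistance

    print(R_probe_x1 + R_scope)               # ~1.0003 megohms total in x1 mode
    print(R_probe_x10 + R_scope)              # 10 megohms total in x10 mode
    print(R_scope / (R_probe_x10 + R_scope))  # 0.1, i.e. the scope sees 1/10 of the signal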

OK, that explains it. Thanks.

The resistance of the inner conductor is there, I believe, to damp reflections back and forth along the
cable. The 1M resistance is in the 'scope front-end; this forms a divider with the
9M when in x10 mode. The capacitance of the cable and front-end forms a capacitive divider
with a preset variable capacitor in the probe housing. A small screwdriver is needed
to set this to match the particular 'scope input, using the calibration signal (a square wave
output by the 'scope for the purpose).
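
To make that concrete, here is a sketch of the usual compensation condition for a 10x probe: the divider ratio is flat across frequency when R_probe * C_probe = R_scope * C_scope. The 20 pF echoes the Wikipedia quote above; in a real setup the cable capacitance adds to it, so the numbers are illustrative only:

    R_probe = 9e6            # ohms, series resistor in the probe tip
    R_scope = 1e6            # ohms, scope input resistance
    C_scope = 20e-12         # farads, scope input capacitance (cable adds more in practice)

    # Compensation condition: R_probe * C_probe = R_scope * C_scope
    C_probe = R_scope * C_scope / R_probe
    print(C_probe)           # ~2.2e-12, i.e. a couple of pF across the 9M resistor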

In x10 mode a scope probe forms only a tiny load on the circuit it is measuring; in x1
mode it's a much heavier load - use x10 unless you have a reason not to. And calibrate!
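
As a rough illustration of that loading difference (resistive loading only; the 100 kilohm source impedance is an arbitrary example value, and capacitive loading matters even more at high frequency):

    R_source = 100e3         # ohms, assumed output impedance of the node being probed
    Z_x1 = 300 + 1e6         # ohms, probe plus scope input in x1 mode
    Z_x10 = 9e6 + 1e6        # ohms, probe plus scope input in x10 mode

    for name, Z in (("x1", Z_x1), ("x10", Z_x10)):
        droop = R_source / (R_source + Z)          # fraction of the signal lost to loading
        print(name, round(droop * 100, 1), "%")    # ~9.1% for x1, ~1.0% for x10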