I was checking the resistance of the center lead of an oscilloscope cable, and I discovered that it was about 300 ohms in x1 mode, and about 9 megohms in x10 mode. I don't know much about transmission line theory, but why wouldn't those two resistances be in about a 1:10 ratio?
The resistance of the inner conductor is there to damp reflections back and forth along the cable, I believe; probe cable uses a lossy resistance-wire centre conductor rather than plain copper, and that is most of the ~300 ohms you measured in x1 mode. The 1M resistance is in the 'scope front-end; this forms a 10:1 divider with the 9M resistor in the probe tip when in x10 mode, so in x10 mode you are measuring that 9M plus the cable, which is why the two readings aren't in a 1:10 ratio. The capacitance of the cable and front-end forms a capacitive divider with a preset variable capacitor in the probe housing; a small screwdriver is needed to set this to match the particular 'scope input, using the calibration signal (a square wave output by the 'scope for the purpose).
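To put rough numbers on the resistive divider and the compensation adjustment, here is a minimal sketch; the 9M and 1M values are from the post, while the 20 pF cable-plus-input capacitance is only an assumed typical figure, not a measured one.

```python
# x10 probe divider, sketched with assumed typical values (the 9M / 1M
# resistances are from the post; the capacitance is just a plausible example).
R_probe = 9e6      # series resistor in the probe tip (x10 mode)
R_scope = 1e6      # 'scope front-end input resistance
C_scope = 20e-12   # assumed: 'scope input plus cable capacitance, ~20 pF

dc_ratio = R_scope / (R_probe + R_scope)
print(f"DC attenuation: {dc_ratio:.3f} (divide-by-{1 / dc_ratio:.0f})")

# For a flat response the trimmer capacitor across the 9M must satisfy
# R_probe * C_probe = R_scope * C_scope; adjusting it against the 'scope's
# square-wave calibration output is how this value is found in practice.
C_probe = R_scope * C_scope / R_probe
print(f"Compensation capacitance needed: {C_probe * 1e12:.1f} pF")
```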
In x10 mode a scope probe only forms a tiny load on the circuit it is measuring; in x1 mode it's a much heavier load. Use x10 unless you have a reason not to. And calibrate!
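As a rough illustration of why the loading differs so much, here is a sketch under assumed capacitances (around 110 pF of cable plus input capacitance in x1 mode, around 12 pF at the tip of a typical compensated x10 probe; real probes vary):

```python
import math

def z_parallel_rc(r, c, f):
    """Magnitude of a resistance r in parallel with a capacitance c at frequency f."""
    w = 2 * math.pi * f
    return r / math.sqrt(1 + (w * r * c) ** 2)

f = 1e6  # 1 MHz test signal
z_x1 = z_parallel_rc(1e6, 110e-12, f)    # x1: 1 Mohm with ~110 pF across it (assumed)
z_x10 = z_parallel_rc(10e6, 12e-12, f)   # x10: 10 Mohm with ~12 pF at the tip (assumed)
print(f"|Z| at 1 MHz, x1 probe : {z_x1 / 1e3:.1f} kohm")
print(f"|Z| at 1 MHz, x10 probe: {z_x10 / 1e3:.0f} kohm")
```

With those numbers the x1 probe looks like a couple of kilohms at 1 MHz, while the x10 probe is still several megohms, which is the "tiny load" point above.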