OK, so this is my problem: I think digitally, or at least one-dimensionally. To me, the electromagnetic wave on a wire, measured at a point in time, is a voltage and a current, two values that can be measured fairly precisely with the right equipment (an oscilloscope). For digital, that signal is read as 0 or 1 and swings between two voltages, say 0 V and 5 V, maybe overshooting them a bit, but it is an obvious, easy-to-understand signal. For analog, on an oscilloscope, it can get vastly more complex, but it is still an electromagnetic wave with one voltage and one current at any given instant.
Frequency is repeated change. If the voltage varies in a regular, repeating pattern 1,000 times per second, we say the signal has a frequency of 1 kHz. So far, so good. But real-world signals such as radio or audio are far more complex; on an oscilloscope they look all over the place. Even so, the signal still has one, and only one, voltage at any given moment, does it not?

Yet an audio signal containing 100 Hz, 200 Hz, and 300 Hz tones, though it is one signal with one (rapidly changing) voltage at any instant, clearly produces those three tones when played back. That is what I don't understand. How can we look at this complex wave and say it is actually 100 Hz, 200 Hz, and 300 Hz? How does a spectrum analyzer (which I have no experience with, but which appears to do exactly this job) do it? It feels like the world is multiplexing multiple signals into one signal, that changing voltage on the wire, which can then be demultiplexed at the other end, even by something as simple as a speaker, which just puts the signal into the air for us.
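To make the puzzle concrete: the signal I am imagining is just v(t) = sin(2π·100t) + sin(2π·200t) + sin(2π·300t). Here is a minimal Python sketch of the question (numpy only; the 8 kHz sample rate and unit amplitudes are arbitrary choices of mine). It sums the three tones into one array of voltages, one value per instant, and then an FFT pulls the three frequencies back out:

    import numpy as np

    fs = 8000                          # sample rate in Hz (arbitrary choice)
    t = np.arange(0, 1.0, 1.0 / fs)    # 1 second of time points

    # One signal: at every instant there is exactly one voltage,
    # yet it is built as the sum of three tones.
    v = (np.sin(2 * np.pi * 100 * t)
         + np.sin(2 * np.pi * 200 * t)
         + np.sin(2 * np.pi * 300 * t))

    # The FFT asks: which sine waves, summed, reproduce this signal?
    spectrum = np.abs(np.fft.rfft(v))
    freqs = np.fft.rfftfreq(len(v), 1.0 / fs)

    # The three strongest frequency bins are the original tones.
    peaks = freqs[np.argsort(spectrum)[-3:]]
    print(sorted(peaks))               # -> [100.0, 200.0, 300.0]

If I run that, the three loudest bins come out at exactly 100, 200, and 300 Hz, which, as far as I can tell, is what a spectrum analyzer does continuously in hardware. What I don't get is how it can work at all, given that the input is a single voltage over time.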
Anyway, I don't know what I am asking for here. What can I read on this to clear up my lack of understanding? I guess I just don't understand analog signals at all.