I've been seeing a lot of 110-240VAC adapters around lately. How can the power supply "figure out" the input voltage? I imagined that if a circuit was built for 110VAC --> 5VDC, then 240VAC would give roughly 10.9VDC. But this is not the case. I read on Wikipedia that the mains AC power is first rectified, then brought down to appropriate levels by a small high-frequency transformer, and rectified again. But how is the voltage input universal?
Your Wikipedia reading is correct. What was left out is that the small high-frequency transformer is not fed a standard AC sinusoid, but an on-off switching waveform. The duty cycle of this waveform (the on-time divided by the total switching period) determines how much power is transferred from the input to the output.
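To put rough numbers on it: for an idealized flyback converter running in continuous conduction, the output is approximately Vout = Vdc · n · D/(1 − D), where Vdc is the rectified input, n the transformer turns ratio, and D the duty cycle. The values below (5 V output, a 10:1 turns ratio) are just illustrative assumptions, but they show how the same output falls out of very different inputs simply by choosing a different duty cycle:

```python
import math

def duty_cycle(v_ac, v_out=5.0, n=0.1):
    """Duty cycle an ideal CCM flyback needs to hit v_out.

    From Vout = Vdc * n * D / (1 - D), solving for D gives
    D = Vout / (Vout + n * Vdc).
    """
    # Peak of the rectified mains, ignoring ripple and diode drops.
    v_dc = v_ac * math.sqrt(2)
    return v_out / (v_out + n * v_dc)

for v_ac in (110, 240):
    print(f"{v_ac} VAC -> duty cycle {duty_cycle(v_ac):.1%}")
# 110 VAC needs roughly twice the duty cycle of 240 VAC for the same 5 V out.
```

So a 110VAC input just runs at a wider duty cycle than a 240VAC input; nothing about the circuit itself is tied to one mains voltage.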
So if the power supply sees the output voltage getting too low, the switching duty cycle is increased; if the output voltage is getting too high, the duty cycle is decreased. It doesn't much matter what the input voltage is (within limits) — the feedback loop compensates automatically.
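That feedback loop can be sketched as a crude simulation. This is not how a real controller IC works (real designs use a compensated error amplifier and PWM comparator), but it shows the principle: nudge the duty cycle in proportion to the output error, and the loop settles on the right duty cycle for whatever input it is given. The flyback formula and component values are the same illustrative assumptions as above:

```python
def regulate(v_dc, v_ref=5.0, n=0.1, steps=200, gain=0.001):
    """Crude integral feedback: nudge duty cycle until output reaches v_ref."""
    d = 0.1  # arbitrary starting duty cycle
    v_out = 0.0
    for _ in range(steps):
        v_out = v_dc * n * d / (1 - d)  # ideal CCM flyback output for duty d
        d += gain * (v_ref - v_out)     # too low -> raise D, too high -> lower D
        d = min(max(d, 0.01), 0.9)      # clamp D to a sane operating range
    return d, v_out

for v_ac in (110, 240):
    v_dc = v_ac * 2 ** 0.5  # peak of rectified mains, losses ignored
    d, v_out = regulate(v_dc)
    print(f"{v_ac} VAC: settles at D = {d:.1%}, Vout = {v_out:.2f} V")
```

Feed it either mains voltage and it converges to roughly 5 V, just at a different duty cycle — which is exactly why the same adapter works worldwide.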