If you connect 220 VAC to a capacitor and resistors in series, wouldn't it then carry 220 V, just with the current flowing at a different rate (amps)?
Just for your understanding, this is how I understand it, without the complex AC maths:
If you connect a capacitor to 250 V AC (assuming it's rated for that voltage), it will allow a small AC current to flow.
The current is determined by the voltage, the capacitance and the frequency: I = V · 2πfC.
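As a quick sketch of that relationship (the 100 nF / 50 Hz / 250 V values below are just example numbers, not from the answer):

```python
import math

def cap_current_rms(v_rms, f_hz, c_farad):
    """RMS current through an ideal capacitor on an AC supply.

    Reactance: Xc = 1 / (2*pi*f*C); current: I = V / Xc.
    """
    xc = 1.0 / (2.0 * math.pi * f_hz * c_farad)
    return v_rms / xc

# Example: 250 V RMS, 50 Hz, 100 nF (hypothetical values)
i_rms = cap_current_rms(250.0, 50.0, 100e-9)
print(f"I = {i_rms * 1000:.2f} mA")  # a few mA, as the answer says
```

Note how even 100 nF at mains voltage only passes a handful of milliamps; the current scales linearly with C and with f.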
If you add a resistor in series, you can measure a voltage across the resistor according to V = I * R.
The voltage across the capacitor and the current are not in phase (90° apart in an ideal capacitor), so the capacitor does not produce heat the way a DC voltage × current product would suggest, which is the advantage over a resistive voltage divider.
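The divider action and the phase shift both fall out of a complex-impedance calculation; here is a minimal sketch (the 1 kΩ / 100 nF component values are made-up examples):

```python
import cmath
import math

v_rms = 250.0   # supply voltage (RMS)
f = 50.0        # mains frequency, Hz
c = 100e-9      # series capacitor (hypothetical value)
r = 1000.0      # series resistor (hypothetical value)

w = 2.0 * math.pi * f
z_c = 1.0 / (1j * w * c)   # capacitor impedance, -j/(wC)
z_total = r + z_c

i = v_rms / z_total        # complex current phasor (RMS)
v_r = i * r                # voltage across the resistor, V = I * R

print(f"|I| = {abs(i) * 1000:.2f} mA, "
      f"leading the supply by {math.degrees(cmath.phase(i)):.1f} deg")
print(f"|V_R| = {abs(v_r):.2f} V, "
      f"real power in R = {abs(i) ** 2 * r * 1000:.1f} mW")
```

With these values the current leads the supply voltage by almost 90°, so nearly all the real power ends up in the resistor, and the ideal capacitor dissipates essentially nothing.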
But for currents above a few mA at 250 V / 50 Hz you'd need big, expensive capacitors, so this approach is not useful if you're looking for a real power supply.
And the inrush current at switch-on, depending on the phase of the AC voltage at that moment, can be far above the steady-state current.
Besides, there's no galvanic isolation between the 250 V and your low voltage, so every hint should start with a "Children, Don't Do This At Home" disclaimer.
Rather, use an LTspice simulation to explore this theoretically, if you're interested.
In reality, you can't and shouldn't compete with wall warts, which nowadays use a small transformer/inductor switched at high frequency, running much more efficiently.