Transformer vs Capacitor voltage divider

Hi,

I just learned about a capacitor voltage divider as a way to power low-voltage DC circuits from the mains (http://www.extremecircuits.net/2010/07/mains-powered-white-led-lamp.html).

What is the difference between this method and using a transformer, in terms of power, safety, cost, etc.?
I've been searching all afternoon on Google, but couldn't find a suitable answer...

You can only draw a low current, and the capacitor's ripple current rating has to be able to handle it, which makes suitable capacitors expensive. Normally in a resistive potential divider you run about ten times the current down the chain as you want to take out of the bottom, so it is quite inefficient in terms of power used (see the rough calculation below).
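
As a rough illustration, here is a quick Python estimate of how much power a resistive potential divider would burn compared with what the load actually uses. The 230 V mains, 5 V / 20 mA load, and 10x bleed-current figures are just assumptions for the sketch, not values from the linked circuit:

```python
# Rough comparison of power burned in a resistive potential divider
# versus the power actually delivered to the load.
# All values are illustrative assumptions, not from the original circuit.

V_MAINS = 230.0      # RMS mains voltage (assumed European mains)
V_LOAD = 5.0         # desired output voltage (assumed)
I_LOAD = 0.020       # load current, 20 mA (assumed)
BLEED_RATIO = 10     # rule of thumb: ~10x load current flows down the chain

i_chain = BLEED_RATIO * I_LOAD       # current through the divider chain
p_divider = V_MAINS * i_chain        # power dissipated across the whole divider
p_load = V_LOAD * I_LOAD             # useful power delivered to the load

print(f"Divider chain current  : {i_chain * 1000:.0f} mA")
print(f"Power burned in divider: {p_divider:.1f} W")
print(f"Power used by the load : {p_load:.2f} W")
print(f"Efficiency             : {100 * p_load / p_divider:.1f} %")
```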

The circuit you show is not really a voltage divider. It uses the capacitor the way one might use a series resistor to drop the line voltage. Since the current through a capacitor is out of phase with the voltage across it (by 90 degrees in the ideal case), the capacitor ideally dissipates no power as heat. This technique is widely used where the current demand is low.
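
As a sanity check on that claim, here is a small Python sketch that averages the instantaneous power of a sine-wave voltage and a current shifted by 90 degrees over one cycle; the mean comes out at essentially zero. The amplitudes and frequency are arbitrary illustration values:

```python
import math

# Show that the average power in an ideal capacitor is zero:
# with the current leading the voltage by 90 degrees, the
# instantaneous power v(t)*i(t) averages out over a full cycle.
# Amplitudes and frequency are arbitrary illustration values.

F = 50.0          # mains frequency in Hz (assumed)
V_PEAK = 325.0    # peak of ~230 V RMS mains (assumed)
I_PEAK = 0.030    # 30 mA peak current (assumed)
N = 10_000        # samples over one cycle

period = 1.0 / F
avg_power = 0.0
for k in range(N):
    t = k * period / N
    v = V_PEAK * math.sin(2 * math.pi * F * t)
    i = I_PEAK * math.sin(2 * math.pi * F * t + math.pi / 2)  # current leads by 90 deg
    avg_power += v * i
avg_power /= N

print(f"Average power over one cycle: {avg_power:.6f} W (ideally zero)")
```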

Since there is no effective isolation from the mains voltage, this concept should NEVER be considered where there is ANY possibility of personal contact with ANY of the device components. Therefore it should NEVER be considered as a means of providing a "wallwart"-type power supply.

The capacitor used should also be rated for this service duty, usually specified as a Class X (mains-rated) capacitor.

The reactance (the capacitor's AC "resistance") is frequency dependent and is given by Xc = 1 / (2·pi·f·C), where f = frequency in hertz and C = capacitance in farads.
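
For a feel of the numbers, here is a small Python sketch that evaluates Xc for a given capacitor and estimates the current it would pass from the mains. The 470 nF capacitor and 230 V / 50 Hz figures are assumptions for illustration, not values taken from the linked circuit:

```python
import math

# Capacitive reactance Xc = 1 / (2*pi*f*C) and the rough current
# a series "dropper" capacitor would pass from the mains.
# Values below are illustrative assumptions only.

V_MAINS = 230.0   # RMS mains voltage (assumed European mains)
F = 50.0          # mains frequency in Hz (assumed)
C = 470e-9        # dropper capacitor, 470 nF (assumed)

xc = 1.0 / (2 * math.pi * F * C)   # reactance in ohms
i_approx = V_MAINS / xc            # rough available current, ignoring the load's own voltage drop

print(f"Xc at {F:.0f} Hz for {C * 1e9:.0f} nF: {xc:.0f} ohms")
print(f"Approximate current from {V_MAINS:.0f} V mains: {i_approx * 1000:.1f} mA")
```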

What is the difference between this method and using a transformer, in terms of power, safety, cost, etc.?
I've been searching all afternoon on Google, but couldn't find a suitable answer...

The most important difference is in the safety realm: a transformer offers galvanic isolation from the power mains, while a capacitor does not.

Lefty