# Phase change through transformer?

Hi there

I want to detect 240 V mains zero crossings for a TRIAC-based heater controller. I have a 240 V / 12 V transformer whose output is rectified and regulated to 7.6 V DC, which powers the Arduino.

My question is: will the transformer's output be phase shifted with respect to its input? I plan to feed the 12 VAC lines into a comparator (op-amp) and have the Arduino check the comparator's output to sense the mains zero crossing. Anti-phase is of course no problem. Unfortunately I have no oscilloscope to check directly.

Will this work, or will the transformer induce a phase-shift?

thanx, Socco

Will this work, or will the transformer induce a phase-shift?

Either 0 or 180 degrees of phase shift, depending on which terminal of the secondary winding is used for circuit common. Either way the zero crossings should show no phase difference.

Lefty

Can't recall the reason why, but I believe there is a 30 degree phase shift between input and output. I experienced this in reality when trying to connect a pair of power circuits in parallel many years ago.

The 0 or 180 degrees mentioned previously refers to the start and finish ends of the secondary winding. However, this does not account for the shift from primary to secondary.

If you don't believe this then there is a very simple experiment that can be done to illustrate that there is indeed a phase shift.

Set up two transformer circuits:

Circuit "a" contains only one transformer, say 240 to 12 volts.

Circuit "b" contains two transformers: the first is 240 to 110 and the second is 110 to 12.

If there were no phase shift through the transformers, then the two 12 volt outputs would be in phase.

However you will find they are not in phase; in fact they will be out by 30 degrees.

Each transformer has a 30 degree phase shift, so circuit "a" lags the input by 30 degrees and circuit "b" lags the input by 60 degrees (due to having two transformers).

Doesn't a triac only switch on or off at a zero crossing anyway? I've had to put snubber circuits on triacs so they'd turn off at all with inductive loads.

Firstly, triacs only switch off at zero crossings; they can switch on at any time.

Secondly, a perfect transformer doesn't introduce a phase shift (and its windings have infinite inductance)! In reality a good transformer at full load will act pretty close to a perfect transformer, but if it's at low load the inductance of the primary will start to matter. So expect a small phase shift.

However, this is all moot: the triac switches off at zero current, while the transformer senses the zero voltage crossings. If your AC load is not purely resistive, the load itself introduces a phase shift between the two. Fortunately heaters are usually 100% resistive at mains frequencies.

The other thing is that there is little point implementing phase-angle control with a heater. The thermal time constant will ensure you get just as good a result if you turn it on and off at a rate of, say, one second.

Will the transformer's output be phase shifted with respect to its input?

And here was me thinking the question was what is quoted above

jack

jackrae: Will the transformer's output be phase shifted with respect to its input?

And here was me thinking the question was what is quoted above

jack

Perhaps your memory was in context of 3 phase AC power transformers. The following is something I found while googling the topic:

Delta-Wye transformers appear to have very large (30 degree) phase shifts simply because of the difference in connection between primary and secondary. Through suitable interconnection of different secondary coils from a three-phase primary (you may need more than three secondaries, and they may have different numbers of turns) you could get any phase shift desired. For example, special transformers are sometimes used to generate 18 phases on the secondary, each with its own phase displacement, in order to feed rectifiers and get smoother DC.

But the original question really speaks to phase shift in a single-phase transformer. An ideal single-phase transformer doesn't produce any phase shift; the output signal is a perfect in-phase copy of the input signal. Real transformers have numerous non-ideal features and will produce a small phase shift. This can be minimized, but not eliminated, by using more expensive construction techniques. It becomes an engineering problem to select a transformer with sufficiently small phase shift and distortion for any given application.

Lefty

The thermal time constant will ensure you get just as good a result if you turn it on and off at a rate of, say, one second.

My 2000 degree F kiln project results agree with you

Hi Lefty,

You're probably on the right track with your comments. The original site problem (some 30 years ago) was to bring a back-up LV control voltage into a large AC gas compressor fed from two separate sources from a common plant power system. It was eventually established that there was 30 degrees of difference between the two LV supplies, and a transformer count verified that each supply system involved a different number of transformers in the chain. I am unaware if there were indeed connection variations between phases, though we did use the same phase pair to "eliminate" the obvious problems of out-of-phase supplies. Similarly, I recall when working with on-line drawings it was always essential to ensure that there were the same number of transformers in each leg of back-up or shared supply lines. Perhaps some of our power engineers can clarify the issue.

Either way it must be physically impossible (however minute or large) for output and input voltages to be in phase, irrespective of load. Input current produces magnetic flux, and this must lag the drive voltage (CIVIL) since flux is a reaction to an action. The output voltage can only be induced by the core flux, which is already lagging the drive voltage, and the output current again lags the output voltage. Hence the output voltage and current must lag the input voltage and current.

jack

Either way it must be physically impossible (however minute or large) for output and input voltages to be in phase, irrespective of load. Input current produces magnetic flux, and this must lag the drive voltage (CIVIL) since flux is a reaction to an action. The output voltage can only be induced by the core flux, which is already lagging the drive voltage, and the output current again lags the output voltage. Hence the output voltage and current must lag the input voltage and current.

I don't agree with that. I still say a single-phase transformer imposes no phase shift on its own. If the secondary is wired to a purely resistive load then the secondary voltage will be in phase with the primary voltage. Now if the load wired to the secondary has inductive or capacitive reactance, then all bets are off, as the old "ELI the ICE man" rule applies.

Lefty