So I have continued to work on my power sensing system and I have a question regarding the voltage sensing portion thereof. I have been doing some reading around the web, and I come away with the impression that, in order to make a transformer behave like a perfect transformer (i.e. no or minimal phase lag), the secondary needs to be fully loaded.
I am currently using little ERA transformers for the line voltage step down, 0.08VA with a 6Vac rms output and a 230V input. If there is no load, the output is closer to 9.2Vac. To load this transformer fully, one would presumably have to apply a 450 ohm resistor between the secondary terminals, though using a 470 ohm one might make more sense (availability, tolerances, etc.).
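The 450 ohm figure above follows directly from the nameplate ratings: the resistance that draws the rated apparent power at the rated secondary voltage is R = V²/S. A quick sketch of that arithmetic:

```python
# Full-load termination resistor for a small mains transformer,
# from the nameplate values quoted above (6 V rms secondary, 0.08 VA).
def full_load_resistor(v_secondary_rms: float, va_rating: float) -> float:
    """R = V^2 / S draws the rated apparent power at the rated voltage."""
    return v_secondary_rms ** 2 / va_rating

r = full_load_resistor(6.0, 0.08)
print(f"Full-load resistance: {r:.0f} ohms")  # 450 ohms; 470 is the nearest standard E12 value
```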
I want to minimize phase lag introduced by the transformer because the allowable phase lag that the ADE7753 can compensate for internally is rather limited. The folk at Analog and elsewhere advocate sensing voltage via resistor networks, a topology I find somewhat less suited to hobbyists like myself. But cost and lack of phase lag are two definite advantages of that approach.
So, to terminate a step-down transformer with a 1/4W 470 ohm resistor or not? What would be your reasons one way or the other? Many thanks in advance!
Excellent article on transformers at http://valvewizard2.webs.com/Transformers.pdf which explains that zero phase shift occurs at zero load. Once you apply a load to the secondary, phase shift starts to creep in.
I don't think it's true to say that zero phase shift occurs at zero load. The primary of the transformer has both resistance and inductance, so the magnetising current will be phase shifted from the supply voltage by less than 90 degrees. This in turn will cause the secondary voltage to be a little out of phase with the primary voltage.
In theory it should be possible to compensate for the phase shift (assuming constant mains frequency) by putting a capacitor with a carefully-chosen value in parallel with the secondary.
The point of a transformer is that primary and secondary see the same flux linkage, so the voltage across each turn of each winding is identical. As soon as you load the thing you add IR losses so errors increase (unless the windings are carefully constructed with the exact same current density). Under no load the current in the primary is tiny compared to the short-circuit current of the winding at DC so the IR losses are also tiny. Induced EMF = rate of change of flux linkage.
MarkT:
The point of a transformer is that primary and secondary see the same flux linkage...
Only true for an ideal transformer. Real transformers suffer from flux leakage.
MarkT:
... so the voltage across each turn of each winding is identical.
Even if we ignore flux leakage, that is only true for an ideal transformer. A real transformer has nonzero winding resistance and finite primary inductance [and nonzero leakage inductance caused by flux leakage]. Think of the winding resistance as a resistor in series with each winding of an ideal transformer, the primary inductance as an inductor in parallel with the primary of the ideal transformer, and the leakage inductance as an inductor in series with one or both of the windings. Even with no load on the secondary, the combination of primary inductance and primary winding resistance causes the secondary voltage to be phase-shifted with respect to the primary.
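The no-load phase shift from that simplified model (winding resistance in series with the magnetising inductance, secondary following the voltage across the inductance) is easy to estimate. The component values below are illustrative guesses, not measurements of any particular transformer:

```python
import cmath
import math

# No-load phase shift of the simplified model described above:
# winding resistance Rw in series with magnetising inductance Lm,
# with the secondary voltage proportional to the voltage across Lm.
def no_load_phase_deg(r_winding: float, l_magnetising: float, f_hz: float = 50.0) -> float:
    w = 2 * math.pi * f_hz
    h = (1j * w * l_magnetising) / (r_winding + 1j * w * l_magnetising)
    return math.degrees(cmath.phase(h))  # positive means the secondary leads

# Guessed values: 1 kohm primary resistance, 30 H magnetising inductance, 50 Hz
print(f"No-load phase shift: {no_load_phase_deg(1000.0, 30.0):.2f} degrees")
```

With those guessed values the shift comes out around 6 degrees, well above "minutes of arc", which is why the component values matter so much in this argument.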
In theory it should be possible to compensate for the phase shift (assuming constant mains frequency) by putting a capacitor...
When I did a project like this (a long time ago) I compensated for phase-shift with software and I found the timing experimentally.
Also... Assuming you are looking for the zero-crossing, it might actually be easier to find a different point near the zero-crossing (say 1V) and then compensate/calculate the zero-crossing. It can be difficult to find the exact zero-crossing, especially if you are running from a single-ended power supply.
Thank you all. I have enclosed a proposed circuit for the transformer voltage sensing. It uses the bipolar inputs into the ADE7753 (+/- 0.5VAC) and no common-mode voltage appears to be needed (unlike most ADCs I know).
FWIW, the ADC is represented by the 370kOhm resistor. The 1k resistor and 33pF cap are suggested by Analog for filtering purposes. The 200K-10K-200K resistor network is to bring the output from the transformer to safe levels.
The common-mode voltage here is about 3V. I suppose I could kill that component if I grounded the intersection between one of the 200K resistors and the 10K resistor (i.e. on the IN- side of the voltage divider). Is this what you would do to eliminate the common-mode voltage?
Thoughts about the circuit? Should it produce a good bipolar input into the ADC? I figured the resistances were sufficiently high as not to interfere much with the secondary windings...
MarkT:
The point of a transformer is that primary and secondary see the same flux linkage...
Only true for an ideal transformer. Real transformers suffer from flux leakage.
MarkT:
... so the voltage across each turn of each winding is identical.
Even if we ignore flux leakage, that is only true for an ideal transformer. A real transformer has nonzero winding resistance and finite primary inductance [and nonzero leakage inductance caused by flux leakage]. Think of the winding resistance as a resistor in series with each winding of an ideal transformer, the primary inductance as an inductor in parallel with the primary of the ideal transformer, and the leakage inductance as an inductor in series with one or both of the windings. Even with no load on the secondary, the combination of primary inductance and primary winding resistance causes the secondary voltage to be phase-shifted with respect to the primary.
Standard silicon steel mains transformers have core relative-permeabilities in the thousands, and primary inductance impedance several orders of magnitude higher than the winding resistance (otherwise the primary would melt at full load) - the variation from ideal is tiny. I would be surprised if the phase shift of an unloaded mains transformer was more than the order of minutes of arc (milliradians). How ideal do you want?
MarkT:
Standard silicon steel mains transformers have core relative-permeabilities in the thousands, and primary inductance impedance several orders of magnitude higher than the winding resistance (otherwise the primary would melt at full load) - the variation from ideal is tiny. I would be surprised if the phase shift of an unloaded mains transformer was more than the order of minutes of arc (milliradians). How ideal do you want?
You appear to have an optimistic view of mains transformers. The design of a mains transformer primary is a compromise. Too many turns and the primary inductance will be very high, but its resistance will be high as well. Too few turns and the primary resistance will be low, but the inductance will also be low and the magnetising current flowing through the primary resistance will cause too much heating. The best compromise (from the point of view of size and weight) is for the magnetising current to be a substantial fraction of the full load primary current.
I just measured the following values from a 240V 50Hz 1.8VA mains transformer:
For this transformer, the primary impedance is a little over one order of magnitude greater than the primary winding resistance. Small transformers such as the one the OP is proposing to use have worse regulation than larger transformers, so I suspect it will be less than one order of magnitude for that one, and the phase shift correspondingly greater.
jackrae:
the question was -loaded or unloaded for minimal phase shift
I reason that loaded will result in less phase shift, but only slightly (e.g. I expect the phase shift to be reduced by under 10% for the transformer I quoted, unless the leakage inductance is significant). I wouldn't bother with the load resistor, but either compensate in software or use a C-R network at the secondary to introduce an opposite phase shift.
jackrae:
Guys, in trying to show how clever you are at electrical theory you are losing sight of the question.
the question was -loaded or unloaded for minimal phase shift
Simple, unloaded. Extra current in the primary and current in the secondary both work to increase the phase shift if you work it out on a phasor diagram (modelling copper losses only). Anyway the error is tiny at no load, why complicate a simple circuit?
I think that a resistive divider AND an isolation transformer would be the beginnings of a measurement setup that would enable one to measure the relative phase shift of any transformer. I HEAVILY advocate an isolation or 1:1 ratio transformer because of the VERY HIGH potential for electrocution... not to mention other 'strays'. You'd also need a device to measure the phase shift, typically an oscilloscope.
Grumpy_Mike:
At mains frequencies this will do chuff all about filtering.
Hi Grumpy_Mike,
I believe it has to do with filtering out harmonics associated with sampling the wave-form at up to 29ksps. The RC setup was recommended by the manufacturer, hence I followed their advice.
I wonder if I can impose on you for (hopefully) no more than a moment. Specifically, if you look at p. 15 of the ADE7753 spec manual, you can see how Analog proposes using a 600kOhm array to bring 110VAC down to safe input levels on one side while grounding the other end.
I prefer using an isolation transformer (which I'm modeling as an AC source), you can see the simplified bipolar circuit above. What I noticed though is that the setup I'm proposing will result in a common-mode voltage whereas the Analog spec sheet shows a max input of +/-0.5V on the inputs but is mum re: common-mode voltage. Does this mean I should simply ground one leg of the AC transformer and then follow in the footsteps of the Analog AC design (i.e. a 47kOhm drop-down resistor instead of 600kOhm given that the transformer output is about 9VACrms)? Or do you think my bipolar design will work?
I believe it has to do with filtering out harmonics associated with sampling the wave-form at up to 29ksps. The RC setup was recommended by the manufacturer, hence I followed their advice.
No.
The break frequency of 1K and 33pF is about 4.8MHz, are you sure you have that capacitor value right?
the Analog spec sheet shows a max input of +/-0.5V on the inputs
Can't see that can you point me at the page where it says that?
Constantin:
I prefer using an isolation transformer (which I'm modeling as an AC source), you can see the simplified bipolar circuit above. What I noticed though is that the setup I'm proposing will result in a common-mode voltage whereas the Analog spec sheet shows a max input of +/-0.5V on the inputs but is mum re: common-mode voltage. Does this mean I should simply ground one leg of the AC transformer and then follow in the footsteps of the Analog AC design (i.e. a 47kOhm drop-down resistor instead of 600kOhm given that the transformer output is about 9VACrms)? Or do you think my bipolar design will work?
The circuit you provided does not define the DC common-mode voltage on the inputs of the ADE7753. Since it has an input range of +/- 0.5V for both inputs, I suggest the arrangement shown in the attached schematic. This defines the common-mode voltage as 0V and gives reasonable rejection of common-mode transients. Choose the ratio R1:R2 to give a little under +/- 0.5V peak at each input (remember that the transformer will deliver more than the nominal 6V rms when lightly loaded). Choose R1 + R2 to load the transformer lightly (R1 between about 1K and 10K). Then choose C to cancel out the phase shift in the transformer (see my earlier post) - it will be much larger than 33pF and will also help suppress mains-borne transients. Also I suggest grounding the transformer core.
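The divider ratio arithmetic from that suggestion can be sketched as follows. The 9.2V rms unloaded secondary figure comes from earlier in the thread; the R1 value and the target margin below 0.5V are illustrative choices, not part of dc42's post:

```python
import math

# Rough sizing for the suggested divider: secondary across R1 + R2,
# ADC input taken across R2, aiming a little under 0.5 V peak.
v_sec_rms = 9.2                            # unloaded secondary, from the thread
v_peak = v_sec_rms * math.sqrt(2)          # ~13.0 V peak
v_target = 0.4                             # illustrative margin below the 0.5 V limit

r1 = 4700.0                                # within the suggested 1K-10K range
r2 = r1 * v_target / (v_peak - v_target)   # from v_target = v_peak * R2 / (R1 + R2)
print(f"R2 ~ {r2:.0f} ohms gives ~{v_target} V peak at the input")
```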
My apologies, gentlemen: when I replied I entered the wrong multiplier for the capacitors. It should have been 33nF, not 33pF. Being off by three orders of magnitude most likely explains Grumpy_Mike's question. (Sorry! I'll blame our recent third kid for that malfunction.)
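The three-orders-of-magnitude difference is easy to see from the first-order RC corner frequency, f_c = 1/(2*pi*R*C):

```python
import math

# First-order RC low-pass corner frequency for the two capacitor
# values discussed in the thread, with the 1 kohm series resistor.
def corner_hz(r_ohms: float, c_farads: float) -> float:
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

print(f"1 kohm + 33 pF: {corner_hz(1e3, 33e-12) / 1e6:.1f} MHz")  # ~4.8 MHz, useless at 50 Hz
print(f"1 kohm + 33 nF: {corner_hz(1e3, 33e-9) / 1e3:.1f} kHz")   # ~4.8 kHz, plausible anti-aliasing
```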
See p. 22-25 of the datasheet re: the analog inputs discussion. That's also where Analog shows off its +/- 0.5V input range for the ADC inside the ADE7753.
The voltage source is a nominal 230VACRMS input and 6VACRMS output transformer but the unloaded output is 9.2VACRMS per the spec sheet. Since this transformer is rated for 230VACRMS, I presume I can also run it at 115VACRMS, right? The secondary voltage would then be about 4.5 VACRMS, no?
Here is a drawing of what I think dc42 thought up for me. Could you confirm that I got the architecture right?
The drop-down resistors were chosen BTW to accommodate a 1.41 conversion from RMS to peak as well as a 25% safety factor (i.e. the ADE7753 will be able to read input voltages up to about 290VACRMS applied to the primary side of the transformer).
The chip itself is OK with inputs as high as 6V without sustaining damage, well in excess of what I expect the transformer to ever survive. The transformer is on a circuit with a fuse and a MOV, hopefully that combination will kill any excesses before they can damage either the switchmode power supply or the transformer.
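A quick sanity check of that headroom claim, assuming the 200K-10K-200K divider values from the earlier post and the transformer's unloaded 230:9.2 ratio (both assumptions taken from this thread, not measured):

```python
import math

# Peak voltage at the ADE7753 input for a given primary rms voltage,
# assuming the unloaded 230:9.2 transformer ratio and the
# 200k-10k-200k divider (ADC sees the voltage across the 10k).
turns_ratio = 9.2 / 230.0
divider = 10e3 / (200e3 + 10e3 + 200e3)

def adc_peak(v_primary_rms: float) -> float:
    return v_primary_rms * turns_ratio * math.sqrt(2) * divider

print(f"290 V rms primary -> {adc_peak(290.0):.2f} V peak at the ADC")  # ~0.40 V, inside +/-0.5 V
```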
Yes, that schematic describes what I suggested. You may need to increase the capacitors if you intend to compensate for the phase shift of the transformer.