How, specifically, does an electric meter measure wattage?

David82:
I'm trying to build one and learn a few things in the process. When I try to google it I get the wrong question answered (like how to read an electric meter, etc.). AC just looks like a sine wave on an oscilloscope, so what characteristics of that sine wave are measured to determine wattage used? If that's not what is looked at, then what is?

Standard AC voltage and current measurements are given as equivalent DC (RMS) values, such that if you were to power a 120 ohm resistor from a 120 VAC source and then from a 120 VDC battery, both are said to drive 1 amp of RMS current, and the resistor dissipates 120 watts of power in each case. AC voltage can also be measured in other units like average voltage, peak voltage, and peak-to-peak voltage, all derived from the same identically sized AC voltage waveform.
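To see how those units relate numerically, here's a quick sketch (in Python, just for illustration) that samples one cycle of a 120 V RMS / 60 Hz sine wave and computes the RMS, peak, and average (rectified) values from the samples, the same way a digital meter would:

```python
import math

# One full cycle of a 120 V RMS sine wave, sampled 1000 times.
# Peak = RMS * sqrt(2) for a pure sine wave.
PEAK = 120 * math.sqrt(2)          # ~169.7 V
N = 1000                            # samples per cycle
v = [PEAK * math.sin(2 * math.pi * n / N) for n in range(N)]

# RMS: square each sample, average, take the square root.
rms = math.sqrt(sum(x * x for x in v) / N)

# Average of the rectified (absolute-value) wave: ~0.637 * peak.
avg = sum(abs(x) for x in v) / N

print(round(rms, 1))   # 120.0
print(round(PEAK, 1))  # 169.7
print(round(avg, 1))   # 108.0
```

Note that RMS is computed from the squares of the samples, not from the average, which is why a cheap "average-responding" meter is only accurate on pure sine waves.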

The measurement of true AC power consumption (as measured by your billing meter) involves additional factors like power factor (the phase-angle difference between voltage and current for loads other than pure resistance) and non-sine-wave current flows (the AC voltage provided to you is always a sine wave, but some of your loads don't draw current as a pure sine wave, so the current must be converted to true RMS to obtain true power consumption). But before advancing you should first nail down voltage/current/wattage measurements as they apply to AC and DC circuits, and what the units of measurement really mean. This seems to be a good article on the subject: http://www.allaboutcircuits.com/vol_2/chpt_1/3.html
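The core idea a billing meter implements is simple: sample voltage and current together, multiply each pair of samples, and average. That average is true (real) power regardless of phase shift. A sketch in Python, assuming a hypothetical 30-degree inductive load at 120 V / 1 A RMS:

```python
import math

N = 1000
VPEAK = 120 * math.sqrt(2)          # 120 V RMS
IPEAK = 1.0 * math.sqrt(2)          # 1 A RMS
PHI = math.radians(30)              # assumed 30 deg current lag (inductive load)

v = [VPEAK * math.sin(2 * math.pi * n / N) for n in range(N)]
i = [IPEAK * math.sin(2 * math.pi * n / N - PHI) for n in range(N)]

# Real power: average of the instantaneous v*i products (what you're billed for).
real_power = sum(a * b for a, b in zip(v, i)) / N

# Apparent power: RMS voltage times RMS current (volt-amps, not watts).
vrms = math.sqrt(sum(a * a for a in v) / N)
irms = math.sqrt(sum(b * b for b in i) / N)
apparent = vrms * irms

pf = real_power / apparent          # power factor = cos(30 deg) here

print(round(real_power, 1))  # 103.9  (watts)
print(round(apparent, 1))    # 120.0  (volt-amps)
print(round(pf, 3))          # 0.866
```

Because the multiply-then-average step works on the raw samples, this same approach also handles loads that draw distorted, non-sinusoidal current.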

Secondly, if a 1:1 transformer is used between the meter and the load, would the meter still read correctly?

Yes, the 1:1 ratio tells you the voltage and current levels will be equal between the primary and secondary, minus a few percent of power lost in the transformer in the form of heating of the iron core due to eddy currents.
Lefty