I've got an old electronic clock from the '70s, and after opening it, I noticed that it uses the 50 Hz mains frequency as its time reference.
So, wondering how accurate this frequency could be, I built this simple circuit to measure it:
The input comes from a 12 V AC-to-AC wall transformer.
The first diode acts as a half-wave rectifier, keeping only the positive half of the wave, and the Zener diode clamps it to +5.1 V max.
The MOSFET is perhaps optional, but it helps feed an Arduino pin with a clean +5 V square signal.
The Arduino counts the pulses and, after every 50, increments a seconds counter.
The number of seconds is sent over the USB serial link to a computer (a Raspberry Pi), which checks this value against its own time, synchronized via the Network Time Protocol. The two values are recorded in a text file, so it is possible to compute statistics and draw graphs of mains frequency stability.
Connecting an oscilloscope to point A in the circuit above, I noticed that the AC voltage was not perfectly rectified. There is a residual positive voltage while the AC input is negative:
I also noticed that when I touched the wires, the shape of the curve improved, so I replaced my own body with a 22 kΩ resistor connecting A to ground:
Then I obtained the desired curve:
So I would be pleased if someone with a better understanding of electronics than me could explain why there is such a residual positive voltage without the 22 kΩ resistor...