I've tried to read up on the various techniques for doing this. To date they seem to revolve around adding a dc bias, then sampling the waveform via the Arduino ADC at a high rate to determine/approximate the peak-peak value of the input.
This seems to spend a lot of processor time and effort taking readings, most of which are then discarded (only the peak high and peak low are used). For my application I don't need high resolution (in fact 5% of full scale will be fine), nor do I need good repeatability. I simply want to set a threshold to activate an output, and I want that threshold to be reasonably consistent, say within 2% of full scale.
I intend to do the following, and invite your opinions on its feasibility and possible pitfalls.
- Measure an AC current of 0 to 40 A with an SCT-013 CT, nominal output 5 V at 50 A.
- Take the CT output (nominal 5 V pk-pk) and bias it positively by about 3 V DC.
- Feed this into a full-wave rectifier, where I'll 'lose' about 1.1 V across the diodes.
- Smooth and filter the rectifier output to produce a (gently!) rippling DC signal.
- Feed this signal into a voltage divider to keep the maximum signal below 5 V, and then into the Arduino ADC.
My intention is to read the ADC only once every 200 ms or so (every 10 mains cycles at 50 Hz), which will be perfectly adequate for my needs.
Is it feasible?
Is it sensible?