Staying in phase and measuring sags...

Hi, and thanks to you both for the answers. The equipment we're looking to test is consumer electronics like DVRs that have a fairly moderate power draw (<50 W) but can react quite sensitively to very short power outages or even sags. You can imagine how happy home users are when their DVR misses recording the latest Game of Thrones episode just because of a short hiccup in the line power.

However, I'm not aware of any industry standards for how long a power supply is supposed to last (i.e. ride through an incoming hiccup, high or low). Based on published specifications, I presume most power supplies are tested on a continuous +/- 20% of nominal VAC basis, but I would really appreciate any resources you can recommend.

I'd like to explore three test scenarios:

  1. Continuous brownout (i.e. at what sag voltage does the DVR stop recording?)
  2. Intermittent brownout (i.e. how low can the voltage go during a 0.5 ms, 1 ms, 1.5 ms, etc. sag without affecting the recording?)
  3. Complete dropout (i.e. all the way down to 0 VAC) for 0.5 ms, 1 ms, etc.
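For scenarios 2 and 3, the test campaign is really just a sweep over a matrix of (sag level, duration) pairs. A minimal sketch of building that matrix, assuming a 120 V nominal line and a hypothetical `apply_sag()` helper standing in for whatever commands your programmable AC source actually takes:

```python
# Sketch of a sag/dropout test sweep. apply_sag() is a placeholder for
# real instrument control (e.g. SCPI commands to a programmable AC source);
# the values below are illustrative, not from any standard.

NOMINAL_VAC = 120.0  # assumed nominal; use 230.0 for EU mains

def apply_sag(voltage_vac, duration_ms):
    """Placeholder: dip the AC source to voltage_vac for duration_ms,
    restore nominal, then check whether the DVR kept recording."""
    return None  # real implementation would return pass/fail

def build_test_matrix():
    """Cross sag depths (fractions of nominal, 0.0 = full dropout)
    with the short durations of interest."""
    sag_levels = [0.8, 0.6, 0.4, 0.2, 0.0]
    durations_ms = [0.5, 1.0, 1.5, 2.0, 5.0, 10.0]
    return [(level * NOMINAL_VAC, d) for level in sag_levels
                                     for d in durations_ms]

matrix = build_test_matrix()
for voltage, duration in matrix:
    apply_sag(voltage, duration)
```

Sweeping depth and duration together like this also lets you plot the pass/fail boundary as a curve (duration vs. depth), which is how ride-through tolerance is usually presented.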

I haven't considered spikes yet, and I would appreciate any industry norms you could point me to there as well. I saw an interesting circuit that used an 18 V transformer wired as a buck or boost autotransformer to explore the behavior of the attached equipment at +/- 15% of nominal voltage, with a DPDT switch to flip between bucking and boosting.
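The +/- 15% figure checks out arithmetically, assuming a 120 V nominal line: an 18 V secondary in series with the line either adds to or subtracts from it, depending on the DPDT switch position.

```python
# Quick arithmetic check of the 18 V buck/boost transformer figure,
# assuming a 120 V nominal line (the original post doesn't state this).
NOMINAL_VAC = 120.0
SECONDARY_VAC = 18.0

boost = NOMINAL_VAC + SECONDARY_VAC  # series-aiding: 138 V
buck = NOMINAL_VAC - SECONDARY_VAC   # series-opposing: 102 V
fraction = SECONDARY_VAC / NOMINAL_VAC  # 0.15, i.e. +/- 15%
```

On a 230 V line the same 18 V secondary would only give about +/- 8%, so the transformer voltage has to be chosen for the nominal line you're testing against.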