I have a dimmer circuit and need to look at the 110 VAC side of it. I have a Rigol DS1102E with the probes listed below.
I connected the probe to a wall socket, taking care to ensure the house grounded wire (neutral) was connected to the shield and the house live/hot wire to the probe tip. The probe is set to 10X.
This trips my house breaker. I have not tried swapping the connections, as I verified with my DVM that I have the wires correct.
Any thoughts on what might be causing this? Other forums I have read recommend floating the o-scope, lifting the ground wire, or using a resistor network to bring down the voltage at the input. I have measured much higher voltages on o-scopes before and have a hard time believing any of that is necessary. Maybe I got the wrong hardware?
Any pointers are appreciated.
Scope
The front of the unit states 300 Vrms for all inputs.
Near the end of the document it states:
Maximum input voltage:
- 400 V (DC + AC peak, 1 MΩ input impedance)
- 40 V (DC + AC peak) [2]
Another section states:
Maximum input voltage on analog channel:
- CAT I: 300 Vrms, 1000 Vpk; instantaneous overvoltage 1000 Vpk
- CAT II: 100 Vrms, 1000 Vpk
- RP2200 10:1: CAT II 300 Vrms
- RP3200 10:1: CAT II 300 Vrms
- RP3300 10:1: CAT II 300 Vrms
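
For reference, here is the quick arithmetic I did against these ratings, assuming a nominal 110 VAC line (a 120 VAC line only raises the numbers slightly, to about 170 Vpk):

$$110\ \mathrm{V_{rms}} \times \sqrt{2} \approx 156\ \mathrm{V_{pk}}$$

With the probe at 10X, the channel input should see only about

$$\frac{156\ \mathrm{V_{pk}}}{10} \approx 15.6\ \mathrm{V_{pk}},$$

which, unless I am misreading the spec, is comfortably inside both the 300 Vrms CAT II probe limit and the 400 V (DC + AC peak) channel limit.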
Probes
Specifications
- Attenuation ratio: 10:1 and 1:1, switchable (slide switch on the handle)
- Input resistance: 10 MΩ (10:1) or 1 MΩ (1:1)
- Input capacitance: 14-18 pF (10:1) or 120 pF (1:1)
- Compensation range: 8 pF - 35 pF
- Bandwidth: 100 MHz (10:1)
- Rise time: 3.5 ns (10:1) and 56 ns (1:1)
- Maximum operating voltage: 600 V (10:1) or 300 V (1:1)
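
By the same arithmetic as above, the roughly 156 Vpk at the tip is well below the probe's rated maximum,

$$156\ \mathrm{V_{pk}} \ll 600\ \mathrm{V}\ (10{:}1),$$

so as far as I can tell the probe rating is not the issue either.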