Detecting 110V DC

I have a project where I have to detect the presence of 110V DC. It turns out that the point I'm connected to actually carries anywhere from about 70V to 130V DC. In some of my other projects I use this setup:


According to the datasheet for the optocoupler I use, the max forward current is 50mA:

My guess is that an R1 of around 10K will do the job. The problem is that I don't have a power supply that can produce those voltages to test with, so I am not sure which resistor to use.

Have any of you dealt with this? Or is there some other way to detect those voltages?

50mA is the "absolute maximum" LED current before it possibly burns out... I'd shoot for about 5mA.* Since there are only a couple of volts across the internal LED, we can approximate that the full voltage is dropped across the resistor (R1). Then from Ohm's Law we can calculate 70V / 0.005A = 14K ohms. (You do want to "detect" 70V, right?) Yes, 10K would also work.

We also need to calculate the power (wattage) dissipated by the resistor. (A regular "small" resistor will burn up.) This time you should use the "worst case" of 130V. There are a couple of ways to calculate power... One way is voltage squared divided by R, and I get about 1.2 watts. It's always best to "de-rate" components, so you need a 2W resistor at whatever standard value is close to 14K.

  • I didn't make a calculation for the 5mA, but Figure 6 on the datasheet shows the current transfer ratio. A ratio of 100% means 5mA through the LED gives you 5mA through the transistor. So with the 10K pull-up resistor (R2), the current is limited by the resistor and the transistor is saturated, which is what you want.
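The resistor and wattage sizing above can be double-checked with a quick script. This is only a sketch of the same arithmetic; like the post, it neglects the LED's own forward drop (roughly 1–2V) since it is small next to 70–130V:

```python
# LED series-resistor sizing for the optocoupler input (sketch).
# Assumes the LED forward drop is negligible compared to 70-130V,
# exactly as the post above does.

V_MIN = 70.0    # lowest voltage that must still be detected (V)
V_MAX = 130.0   # worst case for power dissipation (V)
I_LED = 0.005   # target LED current (A), well under the 50mA absolute max

R1 = V_MIN / I_LED           # Ohm's Law: R = V / I  -> 14000 ohms
P_worst = V_MAX ** 2 / R1    # worst-case dissipation: P = V^2 / R -> ~1.21 W

print(f"R1 ≈ {R1:.0f} ohms")
print(f"P  ≈ {P_worst:.2f} W")
```

The ~1.21W result is why the post recommends de-rating to a 2W part.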

You cannot use this circuit because the optocoupler has a maximum reverse voltage of only 6V. You must protect it from reverse voltage. See these circuits, they are more reliable.
AC main detector module
230v/110v AC Mains Detection Module

This thread is about detecting a DC voltage. There may well be no need to protect against any reverse voltage.

As voltages up to 130V are hazardous, I would recommend putting the optocoupler and resistor R1 in a small plastic enclosure.

Sorry I didn't notice DC.

The problem is that the point where I want to detect voltage can carry anywhere from 70-130V DC. I don't need to know what voltage is on that point, just whether there is any. There are only two cases: some voltage, or none. That is why I use an optocoupler.

Thank you for your great response. But with 14K I will be good for 70V at 5mA. There are cases when there will be 110V, or even 130V. Following your math, with 110V and that same 14K I will get about 8mA, which is good enough.

At the store where I buy parts, there is a 15K 2W 2512. I think it will do the job.
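A quick sanity check of the 15K / 2W pick across the whole 70–130V range (a sketch of the same Ohm's-Law arithmetic, again ignoring the small LED drop):

```python
# Check LED current and resistor dissipation for R1 = 15k over 70-130V.
R1 = 15000.0  # chosen part: 15K, 2W, 2512 package

for v in (70.0, 110.0, 130.0):
    i = v / R1          # LED current, ignoring the ~1-2V LED drop
    p = v * v / R1      # resistor dissipation
    print(f"{v:.0f}V: I = {i*1000:.2f} mA, P = {p:.2f} W")
```

At 70V the LED still gets ~4.7mA, and even at 130V the dissipation (~1.13W) stays under the 2W rating.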

The opto transistor only switches 330uA, so about 660uA of LED current would be enough even for a poor-quality optocoupler with a CTR (current transfer ratio) of 50%.
Likely not a problem if you shoot for 1-2mA in the worst case (70 volts).

In other words, 15K, and I am good to go.

Or 47k (~1.5mA@70V).

The PC817 has a minimum current transfer ratio of 50%. This means the receiving transistor can switch a current that is 1/2 of the forward diode current.

At a diode current of 5mA you would be able to conduct 2.5mA with the transistor.

R = E / I

R = 70 / 0.005

R = 14k

This would be your maximum resistor value to guarantee operation. Likely the current transfer ratio is much higher in a typical part.

I assume the collector goes to a 3.3-volt Arduino pin (first post).
A 10k external pull-up (it could be the internal pull-up) gives 330uA of transistor current...
15k will of course work, but anything over 1mA just heats up the LED's current-limiting resistor.

I agree with @Wawa. I should have mentioned that the final values should be predicated on the current needed at the transistor output, then working back to what is required for current in the LED. I would add a 10 to 20% safety factor.

If the 110V DC is not powering things like motors and solenoids, which can generate nasty high-voltage transient spikes, I would simply use a potential divider; something like this:


At least that allows use of ordinary small resistors.

In fact the 4.7M resistor will give a lot of protection anyway.

A voltage divider can only be used if grounds can be shared.
Opto isolation is safer, and distance between devices becomes less of a problem too.

I would never use what you suggested.

We don't know the details of the source of this DC voltage but I suggest you work out how much current will flow through the processor's input protection diode if the voltage were to increase to say 150V for some reason. If you are still concerned about "safety", change the 4.7M resistor to a 10M resistor: you would still get more than good enough readings from the analogue-to-digital converter. The circuit I am proposing is far "safer" than many low voltage connections we see to Arduino inputs.
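The fault-current argument can be sketched with the numbers given above. Only the top resistor is checked here, since the bottom (pin-side) resistor value of the proposed divider isn't stated in the thread; the top resistor alone bounds the current that could reach the pin's protection diode:

```python
# Upper bound on current into the pin's clamp diode during a 150V fault
# (sketch; ignores the divider's bottom resistor, which only reduces it).
V_FAULT = 150.0

for r_top in (4.7e6, 10e6):
    i = V_FAULT / r_top
    print(f"{r_top/1e6:.1f}M top resistor: <= {i*1e6:.1f} uA into the pin")
```

Both values give only tens of microamps, far below what a typical MCU's internal protection diodes are commonly understood to tolerate, which is the basis of the "the 4.7M resistor will give a lot of protection anyway" remark.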

Why is distance a problem? Are you concerned about lightning?

If there is ANY possibility of the input being reverse-connected, I'd go with this suggestion

Only a few extra components, available ready-built, and not expensive.

I already suggested this solution in post #3; it was rejected.

DC, guys. Besides, it will be on a PCB. I don't want any modules.
And if you look a little bit closer, you will see the similarities.

It is not smart to connect unknown voltages directly to an MCU's input pins, by any means.
And all I need is a 1 or a 0. I just made a prototype with a 15K 2W resistor as suggested. It should do the job. I will know by Monday.

You told us that the voltage could be up to 130V DC. My suggestion of using a potential divider does not connect that voltage directly to the MCU's input pins. I am very happy for my suggestion to be criticised if that criticism is based on professional knowledge of electronics.

It's not smart to use a 2W resistor when @Wawa has written: