Hey there!

I have a voltage divider which is supposed to act as crude input protection for an ADC (differential reading). I am familiar with calculating basic voltage dividers, and this is basically just that, but somehow I am stuck now.

The ADC will see 0.24 V, since that is the differential voltage between points A and B. But I am struggling to find the right equation to calculate the actual input voltage. Maybe someone can point me in the right direction.

Thanks, V.

Edit: I've been juggling with the formula a bit and it seems like I've got my solution. The voltage drop across R_2 should be:

U_R2 = U_in * R_2 / (R_1 + R_2 + R_3)

Rearranging that formula gives me the input voltage:

U_in = U_R2 * (R_1 + R_2 + R_3) / R_2

I tried to verify that in practice and built a quick test-setup on a breadboard.

- R_total = 20.74 kOhm (R_1 + R_2 + R_3)
- R_2 = 0.9834 kOhm
- U_R2 = 0.2372 V

The equation gives me 5.003 V, as opposed to the 5.01 V I actually fed into the circuit. But I guess that's okay, since I have to account for some measurement error.
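Plugging the breadboard measurements into the rearranged formula confirms the arithmetic (values taken straight from the list above):

```python
# Measured values from the breadboard test setup
R_total = 20.74e3   # ohms, total of R1 + R2 + R3
R2 = 0.9834e3       # ohms
U_R2 = 0.2372       # volts measured across R2

# Rearranged divider formula: U_in = U_R2 * R_total / R2
U_in = U_R2 * R_total / R2
print(round(U_in, 3))  # 5.003
```

The ~7 mV discrepancy against the applied 5.01 V is well within meter tolerance and resistor measurement error.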