Greetings,

I want to make a digital voltmeter that reads down to microvolts and up to 100 volts (DC). I will use a resistor network to divide the voltage. Using the voltage divider equation, I calculated the required resistor values.

``````
    R1 40kΩ        R2 1kΩ
Vin---/\/\/\---+---/\/\/\---GND
               |
              Vout (to ADC)
``````
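For reference, the divider math can be sketched in a few lines of Python (assuming, as the schematic suggests, that the tap between R1 and R2 feeds the ADC):

```python
def divider_out(vin, r1=40_000, r2=1_000):
    """Voltage at the tap between R1 and R2: Vout = Vin * R2 / (R1 + R2)."""
    return vin * r2 / (r1 + r2)

print(divider_out(100.0))       # 100 V in -> ~2.44 V at the tap
print(divider_out(100.0) * 41)  # reconstruct Vin by multiplying by (R1+R2)/R2 = 41, -> ~100 V
```

So this network scales the input down by a fixed ratio of 41:1.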

What is the upper limit?

Can you explain it to me in more detail, please?

What range are you trying to measure? For example, 10 µA to 1000 µA?

I listened to your recommendation and changed my first post.

I don't have a schematic. I am trying to gather all the required components.

My original post was poor on information. I have rewritten it with the help of @gilshultz. I hope you will read it and help me.

Hi,
Are you trying to measure that range with one potential divider?

Have you wondered why digital multimeters use many ranges to get the one reading you want?

Thanks.. Tom.....

Yes I wondered.

I have been making projects for three years. I started with Arduino. I can build simple circuits without looking anything up. I can design circuit boards, but I don't know anything about filtering and EMI. I can work with transistors, op-amps, resistors, etc. I am working with registers on the ATmega328P. I can write complicated code (my friends think it is complicated). Note: I'm not an electronics engineering student right now; I will be next year.

Hi,
DMMs have ranges because they cannot measure a large span of current or voltage at once.
You will probably need an array of potential dividers, plus auto-ranging, to maintain an accurate and stable measurement.
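To illustrate what auto-ranging means in software, here is a hedged sketch in Python. The range table values and the `pick_range` name are made up for illustration only, not taken from any real meter design:

```python
# Hypothetical range table: (full-scale volts, divider ratio) -- illustrative values only
RANGES = [(0.1, 1), (1.0, 1), (10.0, 10), (100.0, 100)]

def pick_range(estimate_v):
    """Choose the smallest full-scale range that still contains the estimated input."""
    for full_scale, ratio in RANGES:
        if abs(estimate_v) <= full_scale:
            return full_scale, ratio
    raise ValueError("input exceeds highest range")

print(pick_range(2.5))  # -> (10.0, 10): a 2.5 V input belongs on the 10 V range
```

Real auto-ranging firmware would do this iteratively: take a reading, and if it is near full scale or near zero, switch the divider and read again.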

arduino digital multimeter autorange

Thanks.. Tom...

I will Google it as soon as I finish my homework.

Apart from the calculation involving the resistor divider network, the reference voltage, ADC resolution etc., you'll have to think about a means of protecting the circuit if the user selects an incorrect range.
If you are going down to the microvolts ranges, you'll also probably need to amplify the voltage to get a useful result.

Why do I need to amplify the signal? I can just use a 24 bit ADC. Am I wrong?

100V/(2^24) = 5.96uV per step.
ADCs don't work on 100 V signals directly, though, so you have to divide the input down so it doesn't exceed the highest voltage your device can measure: 3.3 V, 5 V, 10 V, 12 V, etc.
The lowest priced 24 bit ADC I could find (that a datasheet would open for) was this one

It only accepts analog input up to 3.6 V, so that 100 V input would need to be divided down by about 33×.
3.6 V / (2^24) ≈ 0.0000002 V/step, about 0.2 µV.
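Those per-step numbers are easy to check yourself. A quick sketch (the 3.6 V figure is that particular ADC's maximum input, as noted above):

```python
def volts_per_step(full_scale, bits):
    """Smallest voltage change one ADC count represents."""
    return full_scale / 2**bits

print(volts_per_step(100.0, 24))  # ~5.96e-06 V (5.96 uV) if an ADC could see 100 V directly
print(volts_per_step(3.6, 24))    # ~2.15e-07 V (~0.2 uV) at the real 3.6 V input range
```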

Not necessarily wrong. But close to the lower limit, in the low-microvolt range, things like the input impedance of the device, leakage, and other factors may start to affect the accuracy of the results, so some sort of buffering/amplification may be useful.
On the subject of input impedance: the resistor network you showed in the OP has a very low input impedance (41 kΩ) compared with most digital voltmeters, where even cheap ones are around 10 MΩ.
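To see why input impedance matters: the meter's input resistance forms an unintended divider with the source's own output resistance, pulling the reading low. A quick sketch; the 10 kΩ source resistance here is just an illustrative assumption:

```python
def loading_error_pct(r_source, r_input):
    """Percentage by which the reading drops because the meter loads the source."""
    measured_fraction = r_input / (r_source + r_input)
    return (1 - measured_fraction) * 100

print(loading_error_pct(10_000, 41_000))      # ~19.6% low with a 41 kOhm input
print(loading_error_pct(10_000, 10_000_000))  # ~0.1% low with a 10 MOhm input
```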

The divide-by-33 input circuit scales the signal to 1/33rd of its amplitude so that it fits the ADC's working voltage range (0 V to 3.6 V). If you wanted to measure 10 µV without the divide-by-33 circuit, the ADC could resolve it (one step is 5.96 µV) and would report at least one count. But WITH the divide-by-33 input circuit, the ADC sees 10 µV / 33 = 0.303 µV, which is well below its minimum step. What this means in plain English is that if you divide by 33 to tame the high voltages, you have to multiply (AMPLIFY) the LOW voltages.

If you think about it, this creates a Catch-22: how can you handle ALL inputs when everything above 3.6 V needs division and every ADC input below 0.109 V needs amplification? An example might help. Say the voltage to be measured is 1.00 V. 1 V / 33 = 0.0303 V; this is the voltage the ADC will see, because you divided the input by 33. If instead you multiplied that input by 33, the ADC input would be 33 V. Obviously, that is not going to end well for an ADC operating on 3.6 V. There is your Catch-22.

What to do? Well, since you're not yet an electronics engineering student, this may be beyond your current level, but for the rest of us it is just another day at the office. You add a comparator on the output of the divide-by-33 circuit with a Vref of 0.100 V. The comparator switches a small DIP relay. (An analog switch would have internal resistance that alters the voltage; a relay does not.) When the comparator detects a voltage BELOW 0.1 V, it switches the relay, rerouting the output of the divide-by-33 voltage divider to the input of a gain-of-33 non-inverting op-amp amplifier (Rin = 1 kΩ, Rf = 32 kΩ, 1%; gain A = 1 + Rf/Rin).
Example: the 1.00 V input.
1 V / 33 = 0.0303 V
0.0303 V < 0.1 V
∴ (therefore) the comparator switches the input to the amplifier, and the ADC sees 0.0303 V × 33 = 1.00 V.
1 V / 5.96 µV = 167,785 counts.
Max count for a 24-bit ADC = 2^24 − 1 = 16,777,215.
167,785 counts / 16,777,215 = 0.01; × 100 = 1% of the ADC's range (167,785 × 100 = 16,778,500).
Obviously, if you read 167,785 counts from the ADC, you need to check the COMPARATOR flag: a separate logic low/high signal from the comparator circuit to a logic input of the processor that reports the comparator/amplifier status, where "1" = AMPLIFIER ENABLED.
Having read a "1" on the amplifier-status input pin, your code would then need to divide this value by 33:
167,785 / 33 = 5,084
5,084 × 5.96 µV = 0.0303 V, the value on the output of the divide-by-33 voltage divider.
You must then multiply this by 33 to compensate for the divide-by-33 voltage divider:
0.0303 V × 33 = 0.9999986 V (round to 1.00 V), the value of the input signal.
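If it helps, the unwind-the-scaling arithmetic above can be written as one routine. This is only a Python sketch of the order of operations; the 3.6 V reference, the ÷33 divider, and the ×33 amplifier gain are taken from the discussion, and the comparator flag is assumed to be read from a GPIO pin elsewhere:

```python
FULL_SCALE = 3.6   # ADC reference voltage, V (per the part discussed above)
BITS = 24
DIVIDER = 33.0     # divide-by-33 input network
GAIN = 33.0        # non-inverting amplifier gain when the comparator switches it in

def counts_to_input_volts(counts, amp_enabled):
    """Recover the original input voltage from raw ADC counts plus the comparator flag."""
    v_adc = counts * FULL_SCALE / 2**BITS                # voltage actually seen by the ADC
    v_divider = v_adc / GAIN if amp_enabled else v_adc   # undo the amplifier if it was in circuit
    return v_divider * DIVIDER                           # undo the divide-by-33 network

# 1.00 V input, amplifier on: the ADC sees ~1.00 V, and we recover ~1.00 V
counts = round(1.0 / FULL_SCALE * 2**BITS)
print(counts_to_input_volts(counts, amp_enabled=True))
```

Note this sketch keeps everything referred to the ADC's own 3.6 V span, so the count values differ from the 5.96 µV/step figures above (which were referred to a 100 V span); the divide-then-multiply steps cancel the same way either way.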
If someone knows an easier way, I would be interested to see it but that's my suggestion,
for what it's worth. I'm only an Electronics Engineering Technician, not a BSEE, so it might
be quite different from what a BSEE would suggest. It is what it is.

This topic was automatically closed 120 days after the last reply. New replies are no longer allowed.