ADC IC recommendation


I want to make a digital voltmeter that measures from microvolts up to 100 V (DC). I will use a resistor network to scale the voltage. Using the voltage divider equation, I calculated the required resistor values.

ADC_IN = Vin * (R2 / (R1 + R2))

I planned to use a delta-sigma ADC because their resolution can go up to 32 bits. As I understand it, if you divide your reference voltage by the number of ADC steps, the result is the size of one ADC step. For example, the ATmega328P has a 10-bit ADC and its ADC reference voltage is 5 V, so 5/1024 ≈ 0.0049 V. If I read an ADC value of 1, this means I am measuring about 0.0049 V at my ADC input. Am I right? My reference voltage for the ADC is 3.3 V. Can someone give me some advice?

    R1 = 40 kohm    R2 = 1 kohm

What is the upper limit?

Thanks for the answer,

Can you explain me more please.

What range are you trying to measure? Such as 10 uA to 1000 uA?

Your question and the terms you used are confusing. We are assuming you are talking about DC, aren't we? Load (current) is measured in amps, or in your case probably uA (microamps). You state "uVolts", which is voltage, not current. As CrossRoads asked: what is the upper current limit? You need to tell us the minimum and maximum voltages, and the minimum and maximum current you need.

You have an ADC, which converts analog voltage to digital. You will have to use a current sensor of some type; it could be a resistor, a Hall-effect device, etc. Using resistors, the ADC can read a lot more than 3.3 V. You state a programmable load: what part is to be programmed, and in what increments? More information, please.

Also, when you post schematics, post real schematics, not Fritzing pictures. It is also important to post links to each hardware device you have, showing technical details. If I search for an ADC I get about 115,000,000 responses; which one do you have?

Thanks for the answer,

I listened to your recommendation and changed my first post.

I don't have a schematic. I am trying to gather all the required components.

Sorry @CrossRoads,

My post was lacking information. I changed it with the help of @gilshultz. I hope you will read it and help me.

Are you trying to measure that range with one potential divider?

Have you wondered why digital multimeters use many ranges to get the one reading you want?

Can you please tell us your electronics, programming, arduino, hardware experience?

Thanks.. Tom..... :grinning: :+1: :coffee: :australia:

Thanks for the answer,

Yes I wondered.

I have been making projects for three years. I started making projects with Arduino. I can make simple circuits without looking anything up. I can design circuit boards, but I don't know anything about filtering and EMI. I can work with transistors, op-amps, resistors, etc. I am working with registers on the ATmega328P. I can write complicated code (my friends think it is complicated). Note: I'm not an electronics engineering student right now; I'll be one next year.

DMMs have ranges because they are not able to measure a large span of current or voltage at once.
You probably will need an array of potential dividers and auto-ranging to maintain an accurate and stable measurement.

Can I suggest you google

arduino digital multimeter autorange

Thanks.. Tom... :grinning: :+1: :coffee: :australia:

Thanks for the answer.

I will Google it as soon as I finish my homework.

Apart from the calculation involving the resistor divider network, the reference voltage, ADC resolution etc., you'll have to think about a means of protecting the circuit if the user selects an incorrect range.
If you are going down to the microvolts ranges, you'll also probably need to amplify the voltage to get a useful result.

Thanks for the answer,

Why do I need to amplify the signal? I can just use a 24-bit ADC. Am I wrong?

100 V / 2^24 = 5.96 uV per step.
ADCs don't work on 100 V signals directly, though, so you have to divide it down so you don't exceed the highest voltage your device can measure: 3.3 V, 5 V, 10 V, 12 V, etc.
The lowest-priced 24-bit ADC I could find (that a datasheet would open for) was this one

It only accepts analog input up to 3.6 V, so that 100 V input would need to be divided down by about 33x.
3.6 V / 2^24 ≈ 0.0000002 V per step, i.e. about 0.2 uV.

These parts can accept up to 10V, but cost 10 times as much!

Thanks for the answer.

Not necessarily wrong. But I guess that close to the lower limit, in the low-microvolts range, things like the input impedance of the device, leakage, and other factors may start influencing the accuracy of the results, so some sort of buffering/amplification may be useful.
On the subject of input impedance, the resistor network you've shown in the OP has a very low input impedance (41k ohms) compared with most digital voltmeters, where even cheap ones are around 10 megohms.

The requirement to divide the ADC input by 33 means that signals within the ADC's working range (0 V to 3.6 V) arrive at 1/33rd their original amplitude. So if you wanted to measure 10 uV without the divide-by-33 circuit, you could resolve 5.96 uV and the ADC would report 1 count. But WITH the divide-by-33 input circuit, the ADC sees 10 uV / 33 = 0.303 uV, which is well below its minimum resolvable step. What this means in plain English is that if you are going to divide by 33 to handle the high voltages, you have to multiply (AMPLIFY) the LOW voltages.

If you think about it, this creates a Catch-22: inputs above 3.6 V need attenuation, while inputs below about 0.109 V (3.6 V / 33) need amplification, so no single fixed gain treats every input correctly. An example might help. Let's say the voltage to be measured is 1.00 V. 1 V / 33 = 0.0303 V; this is the voltage the ADC will see because you divided that input by 33. If instead you amplified the 1 V input by 33, the ADC input would be 33 V. Obviously, that is not going to end well for an ADC operating on 3.6 V. There is your Catch-22.

What to do? Well, since you're not yet an electronics engineering student, this may be beyond your current level, but for the rest of us it is just another day at the office. You add a comparator to the output of the divide-by-33 circuit with a Vref of 0.100 V. The comparator switches a small DIP relay (an analog switch would have internal resistance that alters the voltage; a relay does not). When the comparator detects a voltage BELOW 0.1 V, it switches the relay, rerouting the output of the divide-by-33 voltage divider to the input of a gain-of-33 non-inverting op-amp amplifier (Rin = 1k, Rf = 32k, 1%; gain A = 1 + Rf/Rin).
Example: the 1 V input.
1 V / 33 = 0.0303 V
∴ (therefore)
the comparator switches the input through the amplifier, and the ADC sees 0.0303 V × 33 = 1.00 V.
1 V / 5.96 uV = 167,785 counts.
Max count for a 24-bit ADC = 2^24 − 1 = 16,777,215.
167,785 / 16,777,215 ≈ 0.01, i.e. 1% of the ADC's range.
Obviously, if you read 167,785 counts from the ADC, you need to check the COMPARATOR flag: a separate logic signal (low/high) from the comparator circuit to a logic input of the processor that must be read to know the comparator/amplifier status, where "1" = AMPLIFIER ENABLED.
Having read a "1" on the amplifier-status input pin, your code would then need to divide this value by 33: 167,785 / 33 = 5,084 counts, and 5,084 × 5.96 uV = 0.0303 V, the value on the output of the divide-by-33 voltage divider. You must then multiply this by 33 to compensate for the divide-by-33 voltage divider: 0.0303 V × 33 = 0.9999986 V (round up to 1.00 V), the value of the input signal.
If someone knows an easier way, I would be interested to see it but that's my suggestion,
for what it's worth. I'm only an Electronics Engineering Technician, not a BSEE, so it might
be quite different from what a BSEE would suggest. It is what it is.