# ACS712 - DC current measurement: Problem reading Vout

Hey guys,

I’m using the schematic attached below to start testing the ACS712, 5 A version. Basically, I’m varying the potentiometer to check whether the current and Vout vary according to the sensor’s sensitivity (185 mV/A). That part seems fine: when I vary the current supplied to the load, Vout responds accordingly. However, when I check Vout with a multimeter, I get different values from what the serial monitor reports with the code below (e.g. for I = 0 A, I get Vout = 2.5 V in the serial monitor but Vout = 2.25 V on the multimeter).

```
int adcZero = 511;               // expected ADC reading at 0 A (mid-scale)
int adcRaw;
float outputVoltage, dcCurrent;  // must be float, not int, or the fractions are truncated
//...
outputVoltage = adcRaw * 0.0048828125;    // in volts: 5.0 V / 1024 steps (10-bit ADC)
dcCurrent = (adcRaw - adcZero) * 0.0264;  // in amps: (5.0 V / 1024) / 0.185 V per A
Serial.println(outputVoltage);
//...
```

Should I just ignore this? According to the datasheet, for I = 0 A I should get Vout = 2.5 V, and that is what I see in the serial monitor.

P.S.: My multimeter is fine.

Thanks guys!

Your program assumes that the ADC reference voltage is exactly 5.0 V, which it is not. You can calibrate the ADC so that the two readings agree.

jremington:
Your program assumes that the ADC reference voltage is exactly 5.0 V, which it is not. You can calibrate the ADC so that the two readings agree.

Can you explain a bit further? How would I calibrate this? I'm kinda confused.

The sensor measures currents in a range from -5 A to +5 A, where -5 A corresponds to ADC reading 0 (Vout = 0 V) and +5 A corresponds to ADC reading 1023 (Vout = 5 V). Isn't that right?

Thanks for your response! English is not my first language, so it took me a while to understand what you meant by "reference voltage". Thanks!

edit: Are you talking about Vcc? That makes sense. I'll double-check it today.

edit: Are you talking about Vcc?

Yes. ADC measurements are always made with respect to some reference voltage, and on a 5 V Arduino the default reference is Vcc. Vcc is almost never exactly 5.0 volts.