ACS712 30A current sensor

Hello,

I am trying to build a power consumption meter for various circuits (12 V, 24 V and 48 V). Part of this meter uses the ACS712 (30 A) current sensor, but I am having difficulty getting this part to work.

I am pretty sure my code is working, because when I replace the current sensor output with a potentiometer I get varying "current" values. But I suspect there is a problem with the calculations, as the results are quite far off.

When no current is flowing through the sensor, the output generally hovers around an ADC value of 511, which is to be expected. I connected the sensor in series with two different resistors to check whether it reads correctly, but the values I am getting are not consistent.
The first circuit has a 47 ohm resistor, so by my calculation the current should be 0.106 A. My multimeter reads 0.051 A, while the ACS712 gives 0.72 A.
The second circuit has a 150 ohm resistor. Calculated value: 0.033 A, multimeter value: 0.0191 A, ACS712 value: 0.46 A.
I also tried a circuit with a 10 ohm resistor, with similar results, and a 9 V battery with various resistors to increase the current.
These tests were done with three different ACS712 boards, all giving similar values.

My questions are:
Could these deviations be caused by the relatively small currents compared to the sensor's 30 A range, and would the readings improve at higher currents? Could that also explain why even the multimeter measurements differ so much from the calculated values?
Or would these errors grow larger at higher currents?

Any help would be greatly appreciated,

-Steve

int currentSensor = A0;   //A0 pin as input from current sensor
int adcValue = 0;         //value for adc conversion
int offsetVoltage = 2500;  //offset voltage (mV) of the sensor
int sensitivity = 6.6;     //mV/A for 30A sensor (from datasheet)
double currentRead = 0;     //value that is calculated from adc value and sensitivity
double adcVoltage = 0;      //voltage converted from adc value
double currentSum = 0;      //adding up measurements to get average
double currentAve = 0;      //average of 100 read measurements

void setup() {
  pinMode(currentSensor, INPUT);  //A0 pin as input
  Serial.begin(9600);             //serial communication
    
}

void loop() {
  currentSum = 0;                             //reset the running sum of measurements
  for (int i = 0; i < 100; i++) {             //take 100 measurements and average the results
    adcValue = analogRead(currentSensor);     //read the value from the sensor
    delay(10);                                //small delay between readings
    adcVoltage = (adcValue / 1024.0) * 5120;  //ADC value to millivolts (5.12 V measured on the breadboard power rails)
    currentRead = (adcVoltage - offsetVoltage) / sensitivity;  //convert the ADC voltage to current
    currentSum = currentSum + currentRead;    //accumulate the measurements for averaging
  }
  currentAve = currentSum / 100;              //average of the 100 measurements
  
  Serial.print("adcValue: ");             //print the adc value with a title to serial monitor
  Serial.println(adcValue);
  Serial.print("adcVoltage: ");
  Serial.println(adcVoltage);             //print the adc voltage with a title to serial monitor
  Serial.print("Current: ");
  Serial.println(currentAve);             //Print the current with a title to the serial monitor
  delay(2000);

}

The current through a resistor depends on the voltage across it, which you didn't state. Post a diagram of your test circuit.

int sensitivity = 6.6;     //mV/A for 30A sensor (from datasheet)

The output of the 30 Amp version is 66 mV (0.066 volt) per amp, not 6.6.

Next, consider the best resolution you can get from a 10-bit A/D converter. Then look at the remaining possible error sources.

Ron