How to calibrate a current sensor


I've built a data logger for measuring the performance of an electric go-kart. All seems to be working well, and I'm trying to calibrate my current sensor. When I read the output voltage of the transducer, it reads exactly what my Arduino UNO reads. But when I convert that voltage to current using the info on the datasheet, the result is around 0.4 amps different from a clamp-on amp meter: my measured current is 27.4 A and the amp meter says 27. I'm averaging around 100,000 samples. I tend to put more faith in a "store-bought" meter, but I'm not sure what to believe.

I can easily modify my code to get exact results, but that requires knowing which unit is right. The relationship is linear, so a simple y = mx + b is how I'm calculating the current.
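Roughly, the conversion is just this (plain C++ for illustration; on the UNO the reading would come from analogRead(), and SLOPE/OFFSET are example values for a 50 A transducer with a 1.5 V output span, not my actual calibration):

```cpp
// Example linear conversion, y = m*x + b.
// SLOPE and OFFSET below are illustrative values only.
const float VREF   = 5.0f;      // assumed ADC reference on a 5 V UNO
const float SLOPE  = 33.333f;   // m: amps per volt (example)
const float OFFSET = -83.333f;  // b: amps at 0 V sensor output (example)

float adcToAmps(int adc) {
    float volts = adc * VREF / 1023.0f;  // 10-bit ADC count -> volts
    return SLOPE * volts + OFFSET;       // y = m*x + b
}
```

On the Arduino this would run once per sample before averaging.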

Here's a link to my current transducer

Here's the link to my current meter

Thoughts on how to calibrate current sensors?

Thanks in advance.

Hi, if you believe your clamp meter is accurate, then use it. When you clamp, make sure the jaw is away from other wires; you may have to extend the wire into a loop.

y = mx + c.

Adjust C to get y=0 with no current flow.

Then with current flowing adjust m to get values to agree.

But get zero correct first.
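In code, the two-step adjustment above can be sketched like this (plain C++; the readings passed in are hypothetical examples, not measurements from this thread):

```cpp
// Two-point calibration: fix the offset c with no current flowing,
// then fix the slope m against a known (clamp-meter) reading.
struct Cal { float m, c; };

// volts0: sensor output at 0 A
// volts1: sensor output at a known current
// amps1:  clamp-meter reading taken at volts1
Cal calibrate(float volts0, float volts1, float amps1) {
    Cal cal;
    cal.m = amps1 / (volts1 - volts0);  // slope, amps per volt
    cal.c = -cal.m * volts0;            // forces y = 0 at zero current
    return cal;
}

float toAmps(const Cal& cal, float volts) {
    return cal.m * volts + cal.c;       // y = m*x + c
}
```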

Tom.. :)

The difference is <1.5%. I would be happy with that.
If you’re not, borrow a second clamp meter.

KrisKasprzak: Measured amps is 27.4 and the amp meter says 27.

Those readings are, in fact, equal, since 27.4 rounds down to 27, unless you have dropped a digit from your amp meter reading and it actually shows "27.0" rather than "27".

The transducer's output swings 1.5 volts at full scale. Are you bringing that directly into the Arduino analog input, which spans 5 volts?

If so, you're down to about 300 counts of resolution, effectively a bit over 8 bits rather than 10. With a 50 amp sensor, the best it can do is about 160 mA per count. You can't expect hair-splitting accuracy when throwing away analog resolution.
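A quick sanity check of those numbers (assumed: 5 V ADC reference, 10-bit ADC, 1.5 V sensor output span, 50 A full-scale current):

```cpp
// Back-of-envelope resolution check.
#include <cmath>

const float VREF   = 5.0f;   // assumed ADC reference
const float SPAN_V = 1.5f;   // transducer output swing at full scale
const float FULL_A = 50.0f;  // transducer full-scale current

int usableCounts() {
    // fraction of the 10-bit range the signal actually uses
    return (int)std::lround(1023.0f * SPAN_V / VREF);  // ~307 of 1023
}

float ampsPerCount() {
    return FULL_A / usableCounts();  // ~0.163 A per ADC count
}
```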

Specifications:

- Display Count: 2000
- Auto Range: Yes
- Jaw Capacity: 17 mm
- True RMS: Yes
- Diode: Around 3.0 V
- V.F.C.: Yes
- NCV: Yes
- Data Hold: Yes
- Zero Mode: Yes
- LCD Backlight: Yes
- Auto Power Off: Around 15 minutes
- Continuity Buzzer: Yes
- Low Battery Indication: 2.5 V
- Input Protection: Yes
- Input Impedance for DCV: 10 MΩ
- AC Current (A): 2A/20A/100A (±2.5% + 5)
- DC Current (A): 2A/20A/100A (±2% + 3)
- AC Voltage (V): 2V/20V/200V/600V (±1.0% + 3)
- DC Voltage (V): 200mV/2V/20V/200V/600V (±0.7% + 3)
- Resistance (Ω): 200Ω/2kΩ/20kΩ/200kΩ/2MΩ/20MΩ (±1.0% + 2)
- Capacitance (F): 2nF/20nF/200nF/2µF/20µF/200µF/2mF/20mF (±4% + 5)

General characteristics:

- Product Net Weight: 170 g
- Product Size: 175 mm × 60 mm × 33.5 mm
- Power: AAA 1.5 V × 2 (Included)
- LCD Size: 39.3 mm × 21.5 mm
- Package Size: 18.6 × 7.2 × 5.9 cm / 7.3 × 2.8 × 2.3 in
- Package Weight: 311 g / 11 oz
- Package List: 1 × UNI-T UT210E Mini Clamp Meter, 1 × Test Lead, 1 × English Manual

The meter only claims an accuracy of ±(2% + 3 counts) on DC current.

Give a man a clock, and he knows what time it is.

Give a man two clocks, and he's not sure.

Are the following interpretations of Sensor data sheets correct?

Given the Current Transducer Type: L01Z050S05
Nominal Current = 50A, and this is the Full Scale Current

Sensor output at 0A = 2.5V (ignore the tolerance)
Sensor output at nominal current = 2.5V + 1.5V = 4.0V

Considering the small variation in output (1.5 V) over a large current range (0 - 50 A), we have to go for a fairly high-precision calculation. The following response equation keeps some digits after the decimal point.

A(0A, 2.5V)
B(50A, 4.0V)
C(I, V)                //I = primary current; V = Output DC voltage of the Sensor

==> I = (50/1.5)*(V - 2.5)

==> I = 33.3333333333*V - 83.3333333333

==> I = 33.3333333333*5/1023*ADC - 83.3333333333   //5V Vref for ADC

==> I*10[sup]10[/sup] = 333333333333*5/1023*ADC - 833333333333

==> I*10[sup]10[/sup] = 0x611B8BA9*ADC - 0xC206898D55

Validity Check of the above equation:
At 220V, 100W Load: Current, I = 100/220 = 0.454545 A

V = 2.513636 V

ADC = 1023/5 * 2.513636 = 514.29 → 514 = 0x0202

amplifiedCurrent =  0x611B8BA9*0x0202 - 0xC206898D55

amplifiedCurrent = 0xC2F94E6952 - 0xC206898D55 = 0xF2C4DBFD = 4072987645

normalCurrent = 4072987645/10[sup]10[/sup] = 0.4073 ≈ 0.41 (after rounding)

So, even after scaling the arithmetic by 10[sup]10[/sup], we still have an error of about 8.8%; the error comes from the ADC quantization, not from the arithmetic.
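The same fixed-point arithmetic can be reproduced with 64-bit integers (the hex constants are the scaled values from the derivation above):

```cpp
// Fixed-point conversion: current scaled by 1e10.
//   M = 33.3333333333 * 5 / 1023 * 1e10, truncated to an integer
//   B = 83.3333333333 * 1e10
#include <cstdint>

const uint64_t M = 0x611B8BA9ULL;    // 1629195177
const uint64_t B = 0xC206898D55ULL;  // 833333333333

// Returns current * 1e10 (divide by 1e10 to get amps).
int64_t ampsTimes1e10(uint16_t adc) {
    return (int64_t)(M * adc) - (int64_t)B;
}
```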

We need an ADC with higher resolution; the 12-bit ADC of the Arduino DUE could be used.

That, or an external ADC like the 16-bit ADS1115.

But then you also have to worry about the reference voltage. The Arduino normally uses Vcc as the ADC reference, which is about 5 V. If the sensor's output is ratiometric to that same 5 V supply, there's no problem when using the Arduino's internal ADC, but you do have a problem with the ADS1115, which uses its own absolute reference.

If the sensor instead uses an internal reference to put out an absolute voltage, then a change in the 5 V supply changes what the internal ADC measures. In that case the ADS1115 is the better fit, as it also uses a fixed reference.

If there's a +2.5 V offset, you lose a lot of resolution as well, because you have to measure a 0-4 V range instead of just the 1.5 V span that carries information. By removing the offset and amplifying the remaining signal you can increase the output range, and with it get a higher resolution. This can quite easily be done with an op-amp: subtract the 2.5 V offset, then apply a 3x gain to bring the 0-1.5 V span up to 0-4.5 V, which increases the effective resolution of your built-in ADC as you cover a larger part of its range.

Mind: as you increase your ADC resolution, and with it the precision of the measurement, you do not necessarily improve its accuracy. If that sensor has a 2% error, that's the best you can ever get, and using about 3/10 of the scale of a 10-bit ADC (what you're doing now) is quite good enough, as you have 0.33% steps, well within the error of your input signal. If you bring your signal to a 0-4.5 V range as suggested above, you end up with 0.11% steps. A 12- or 16-bit ADC isn't going to improve on that a single bit.
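To put numbers on that, here is a quick sketch of the step size as a percentage of full scale (assuming the 5 V reference and 10-bit ADC discussed above):

```cpp
// One ADC step (LSB) as a percentage of the measured span,
// for a 10-bit ADC with an assumed 5 V reference.
float stepPercent(float spanVolts) {
    float counts = 1023.0f * spanVolts / 5.0f;  // usable ADC counts
    return 100.0f / counts;                     // one LSB as % of span
}
```

stepPercent(1.5) gives the ~0.33% steps of the current setup; stepPercent(4.5) gives the ~0.11% steps after a 3x gain stage.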