Hi, I've been working on a project that uses current transducers to monitor my home energy consumption. It's based on this project: http://www.picobay.com/projects/2009/01/real-time-web-based-power-charting.html, which used an ioBridge as its gateway rather than an Arduino.
In short, I'm using a current clamp that outputs 10 mV/A and an AD627 instrumentation amplifier (DIP package) to boost the voltage the Arduino sees.
This is the schematic I've been working from: http://www.picobay.com/projects/uploaded_images/Schematic-752398.png (I swapped the AD8220 for an AD627; the pin mappings were a little different).
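To sanity-check the signal chain with the numbers from the sketch below: with the clamp's 10 mV/A and the circuit's gain of 4, a 20 A load should put roughly 0.8 V on top of the 1 V zero-current baseline, so A0 should sit around 1.8 V.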
I can measure the output of the circuit feeding the Arduino's A0 pin with a multimeter, and I have it adjusted to read 1.00 V DC on the nose. Measuring through the Arduino, however, I get 1.09 V as my baseline. Since sensitivity is a priority, that 0.09 V difference converts to a phantom reading in the neighborhood of 200 W (0.09 V × 100/4 = 2.25 A, and 2.25 A × 110 V × 0.75 ≈ 186 W). Maybe someone can take a look and see what I can improve? Here's the sketch:
void setup() {
  Serial.begin(9600);
  analogReference(DEFAULT);  // use the 5 V supply as the ADC reference
}

void loop() {
  delay(1000);

  // Get the analog reading (10-bit ADC, 0-1023)
  float analog_1 = analogRead(A0);
  Serial.print("Analog Reading: ");
  Serial.print(analog_1);
  Serial.print(", ");

  // Convert the reading to a voltage: 1024 steps across the 5 V reference
  float ct1_value = (analog_1 * 5.0) / 1024.0;
  Serial.print("Converted to Voltage: ");
  Serial.print(ct1_value);
  Serial.print(", ");

  // The circuit outputs 1 V at zero current, and the clamp gives 10 mV/A
  // (i.e. 100 A per volt). A 16k resistor sets the amplifier gain to 4,
  // so divide that back out.
  float amps_1 = ((ct1_value - 1.0) * 100.0) / 4.0;

  // Convert to watts, assuming a 0.75 power factor and 110 VAC house voltage
  float watts_1 = 110.0 * amps_1 * 0.75;
  Serial.print(amps_1);
  Serial.print("a ");
  Serial.print(watts_1);
  Serial.println(" watts");
}
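For what it's worth, here's the direction I'm thinking of trying, in case it helps frame the question. My working theory is that the flat 5 V assumption in the conversion is the culprit: if the board is USB-powered and the rail is actually sitting around, say, 4.6 V, a true 1.00 V input would compute out to about 1.09 V, which is the offset I'm seeing. The sketch below is just a rough cut, not something I've verified: it averages a burst of readings to tame ADC noise, and measures the actual supply voltage against the ATmega's internal 1.1 V bandgap instead of assuming 5 V. The register code is the common trick for ATmega168/328-based boards (e.g. an Uno or Duemilanove) and would need different MUX bits on other chips; the name readVcc and the sample count of 50 are arbitrary choices of mine.

long readVcc() {
  // Point the ADC at the internal 1.1 V bandgap, referenced against AVcc.
  // (These MUX bits are for the ATmega168/328; other AVRs differ.)
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                           // let the reference settle
  ADCSRA |= _BV(ADSC);                // start a conversion
  while (bit_is_set(ADCSRA, ADSC));   // wait for it to finish
  long result = ADCL;                 // must read ADCL before ADCH
  result |= ADCH << 8;
  // Vcc in millivolts = 1.1 V * 1023 * 1000 / reading
  return 1125300L / result;
}

void setup() {
  Serial.begin(9600);
  analogReference(DEFAULT);
}

void loop() {
  delay(1000);

  // Average a burst of samples to knock down ADC noise
  const int samples = 50;
  long sum = 0;
  for (int i = 0; i < samples; i++) {
    sum += analogRead(A0);
  }
  float analog_1 = (float)sum / samples;

  // Convert using the measured rail voltage instead of a nominal 5 V
  float vcc = readVcc() / 1000.0;
  float ct1_value = (analog_1 * vcc) / 1024.0;
  float amps_1 = ((ct1_value - 1.0) * 100.0) / 4.0;
  float watts_1 = 110.0 * amps_1 * 0.75;

  Serial.print("Vcc: ");
  Serial.print(vcc);
  Serial.print("v, ");
  Serial.print(ct1_value);
  Serial.print("v, ");
  Serial.print(amps_1);
  Serial.print("a, ");
  Serial.print(watts_1);
  Serial.println(" watts");
}

If the averaged baseline still comes out at 1.09 V with the measured Vcc plugged in, then I'd guess the offset is coming from the amplifier stage rather than the conversion math. Does this seem like a sensible approach, or am I chasing the wrong thing?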