Hello everyone.
We are building a mains power analyser using an Arduino Mega.
A load (bulb + choke) is connected to the AC mains supply.
A step-down transformer is connected in parallel with the load; its output goes through an offset-bias and resistor-divider combination, and the resulting voltage signal finally goes into analog pin A2.
The current sensor is an ACS-series device, and its output goes into analog pin A1.
The code file is attached. (We got it from GitHub; the project idea is from the OpenEnergyMonitor site.)
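For context, the standard OpenEnergyMonitor "voltage and current" example, which this code is most likely based on, looks roughly like this; the calibration constants below are the stock example values, not necessarily the ones in the attached file:

```cpp
#include "EmonLib.h"          // from OpenEnergyMonitor

EnergyMonitor emon1;

void setup()
{
  Serial.begin(9600);
  // voltage(pin, VCAL, PHASECAL): VCAL scales ADC counts to volts,
  // PHASECAL corrects phase error between the V and I channels.
  emon1.voltage(2, 234.26, 1.7);   // stock example values, not ours
  // current(pin, ICAL): ICAL scales ADC counts to amps.
  emon1.current(1, 111.1);         // stock example value, not ours
}

void loop()
{
  emon1.calcVI(20, 2000);  // sample over 20 zero crossings, 2 s timeout
  emon1.serialprint();     // prints real power, apparent power, Vrms, Irms, PF
}
```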
The RMS values of current and voltage are accurate, and so is the apparent power.
But there is a large deviation between the measured REAL power and the actual real power, and the same goes for the power factor.
As far as the circuit is concerned, I don't think there is any issue. Please go through the code and tell us why the power calculation is inaccurate. Waiting for a reply!
EmonLib.cpp (10.6 KB)
EmonLib.h (2.98 KB)
...I didn't look at the code.
Does GitHub have a forum, or is there a way to contact the developer(s)?
You may not have a software problem.
Your step-down transformer may be introducing a "false" phase shift, or you may have some other hardware issue.
If apparent power is correct, the phase measurement is the obvious place to look. It's either the phase measurement or the actual power calculation.
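On the phase-shift point: EmonLib's voltage() takes a third parameter, PHASECAL, which exists precisely to compensate for transformer-induced phase error. One rough way to tune it (a sketch of the idea, not tested against the attached code) is to connect a purely resistive load and sweep PHASECAL until the reported power factor is as close to 1.0 as it gets:

```cpp
// Hypothetical PHASECAL sweep: with a purely resistive load connected,
// watch which value pushes the reported power factor closest to 1.0.
#include "EmonLib.h"

EnergyMonitor emon1;

void setup()
{
  Serial.begin(9600);
  emon1.current(1, 111.1);  // placeholder calibration value
}

void loop()
{
  for (float phasecal = 0.5; phasecal <= 2.5; phasecal += 0.1) {
    emon1.voltage(2, 234.26, phasecal);  // re-register V input with new PHASECAL
    emon1.calcVI(20, 2000);
    Serial.print("PHASECAL=");  Serial.print(phasecal);
    Serial.print("  PF=");      Serial.println(emon1.powerFactor, 3);
  }
}
```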
Oh... Make sure the code is written for the 50 or 60 Hz line frequency where you live.
But there is a large deviation between the measured REAL power and the actual real power, and the same goes for the power factor.
And.... How do you know that? Do you have an independent way of measuring voltage/current/phase?
That actually might be a clue if they share the same error. I.e., if the power factor is calculated first and the actual power is calculated from it, then you don't have to look at the actual-power calculation; you can look for problems in the power-factor calculation, or in the data that's fed into it. (Or vice versa, if the actual power is calculated first and that result is used to find the power factor.)
A couple of things you can try...
- Modify the code to send the voltage, current, and phase information to the serial monitor to see whether it's correct. If the measurements are wrong, there's your problem. If the measurements are OK, the software is making the wrong calculation from those measurements.
- Modify the code to ignore the actual readings. Then generate some "simulated" voltage/current/phase values in software and check whether it makes the correct power calculation (see the sketch after this list). If the software is calculating correctly, then it's not measuring correctly.
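Here is a minimal sketch of that second test, in plain C++ so it can run on a PC (the sample counts and test values are arbitrary): generate clean sine waves with a known phase angle and push them through the same averaging math EmonLib uses for real power. If this reports the expected power factor, the calculation is sound and the measurements are the suspect:

```cpp
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double PI2   = 6.283185307179586;   // 2*pi
    const double freq  = 50.0;                // line frequency (Hz)
    const double Vpk   = 325.0;               // ~230 Vrms
    const double Ipk   = 1.0;
    const double phase = 72.5 * PI2 / 360.0;  // known phase angle -> PF should be cos(72.5 deg)
    const int    N     = 2000;                // samples spanning two whole cycles
    const double dt    = (2.0 / freq) / N;

    double sumP = 0, sumV2 = 0, sumI2 = 0;
    for (int n = 0; n < N; n++) {
        double v = Vpk * sin(PI2 * freq * n * dt);
        double i = Ipk * sin(PI2 * freq * n * dt - phase);
        sumP  += v * i;   // accumulate instantaneous power, as EmonLib does
        sumV2 += v * v;
        sumI2 += i * i;
    }
    double realPower     = sumP / N;            // average of v*i = real power
    double Vrms          = sqrt(sumV2 / N);
    double Irms          = sqrt(sumI2 / N);
    double apparentPower = Vrms * Irms;

    printf("real %.1f W, apparent %.1f VA, PF %.3f (expect %.3f)\n",
           realPower, apparentPower, realPower / apparentPower, cos(phase));
    return 0;
}
```

With the 72.5-degree phase used above, the printout should show PF ≈ 0.30; substitute 0 degrees to check the resistive case.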
We have an RLC meter in the lab. From that we got the actual power factor of the load (choke); the choke's inductance and resistance values were also obtained. The current is measured with an ammeter, so I^2 * R gives the actual real power. The value on screen is far different from this.
From what I understood of the code, the power factor is calculated from the real-power and apparent-power values. The real power is wrong, and so is the power factor.
Also, for load 1 (bulb), the power factor on screen is 0.95, while actually it should be close to 1.
For load 2 (choke), the power factor on screen is 0.06, while the actual value is 0.3 (from the RLC meter in the lab).
Why does the error vary with the load?
The current is measured with an ammeter, so I^2 * R gives the actual real power.
That's correct if the total "R" is correct. I think the RLC meter should measure the resistance of the coil correctly, but a light bulb will measure low when it's not lit. Try calculating the lamp's resistance from its power & voltage rating to see if it agrees with what you're measuring with the RLC meter.
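For example, with hypothetical ratings: a 230 V, 100 W bulb has a hot resistance of about V^2/P = 230^2/100 ≈ 529 ohms, but a cold tungsten filament typically reads only around a tenth of that on an ohmmeter or RLC meter, so an I^2 * R figure based on the cold reading would come out far too low.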
The value on screen is far different from this.
If your circuit is measuring differently from the ammeter, that's a clue. I'd assume the ammeter is correct and your circuit (and/or software) is wrong. That looks like a good place to start troubleshooting.
For load 2 (choke), the power factor on screen is 0.06
That could be right. An ideal inductor would have a power factor of zero. Is it measured at your 50/60 Hz line frequency?
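For reference, with a series R-L model the power factor is PF = R / sqrt(R^2 + (2*pi*f*L)^2). Plugging in hypothetical choke values of R = 40 ohms and L = 0.4 H at 50 Hz gives 2*pi*f*L ≈ 126 ohms, so PF ≈ 40 / sqrt(40^2 + 126^2) ≈ 0.30, in the ballpark of the RLC-meter figure; a reading as low as 0.06 would mean the analyser sees almost no in-phase current at all.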
while the actual value is 0.3 (from the RLC meter in the lab)
Actual power in watts? Does your RLC meter also measure watts?