I have a 3-axis magnetometer and a small plastic box of known size. The magnetometer is fixed to the bottom of the box, and I am trying to track the depth of a magnetic object inside the box. So far I have done this successfully by recording the magnetometer's Z-axis values at known positions, fitting an equation to the resulting calibration curve, and then solving that equation in real time for depth.
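For concreteness, the fit-and-solve step can be sketched like this. This is a minimal Python/numpy example under my own assumptions: the depths, readings, and polynomial degree are placeholders for whatever the real calibration produced, and it fits depth directly as a function of the Z reading so the real-time "solve" is just an evaluation:

```python
import numpy as np

# Placeholder calibration data: Z-axis readings (uT) recorded with
# the object held at known depths (mm). Use your own measurements.
depths_mm = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
bz_uT = np.array([120.0, 95.0, 74.0, 58.0, 46.0, 37.0, 30.0])

# Fit depth as a polynomial in the Z reading (degree is a guess).
depth_from_bz = np.poly1d(np.polyfit(bz_uT, depths_mm, deg=3))

def estimate_depth(bz_reading):
    """Evaluate the calibration curve for a live Z-axis reading."""
    return float(depth_from_bz(bz_reading))

print(estimate_depth(66.0))  # depth estimate for a 66 uT reading
```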
However, the issue I have is that the Z-axis (depth) readings also change if the object moves horizontally (along the X or Y axis). Therefore, unless the object is directly over the position where I made the calibration curve, the depth readings become inaccurate. My question is: how can I keep the depth readings consistently accurate even when the object moves horizontally?
The magnetic field strength at any point in space depends on the x, y, and z coordinates of the magnet, as well as its 3D orientation, so you have at least five parameters to characterize (ignoring rotation about the axis through the poles) if the magnet is completely free to move.
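To make that parameter count concrete: if the magnet is small enough to treat as a point dipole, the field it produces at the sensor follows the standard dipole formula, which depends on exactly those five quantities. A rough numpy sketch (the moment magnitude and the positions are made-up values):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def dipole_field(sensor_pos, magnet_pos, moment):
    """Field (tesla) at sensor_pos from a point dipole at magnet_pos.
    Five free parameters in practice: the magnet's x, y, z plus the
    two angles fixing the direction of its moment vector."""
    m = np.asarray(moment, float)
    r = np.asarray(sensor_pos, float) - np.asarray(magnet_pos, float)
    d = np.linalg.norm(r)
    r_hat = r / d
    return MU0 / (4 * np.pi) * (3.0 * r_hat * np.dot(m, r_hat) - m) / d**3

# Made-up example: magnet 20 mm above the sensor, moment along +Z.
print(dipole_field([0, 0, 0], [0, 0, 0.02], [0, 0, 0.1]))
```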
Can you constrain the magnet's position to a line?
If that is what you were referring to: at best, the tracked object can be constrained to 2 DoF, moving only along the Z-axis (depth) and along one horizontal axis.
Any constraint makes the problem much easier to solve. You will have to calibrate for movements along both axes, and you may find that some positions cannot be distinguished from others based on the Z magnetometer measurement alone.
Adding in the X and Y magnetometer measurements would probably help a lot, but the math is harder.
Judging by the magnetometer readings, I would probably have to use the X and Y values to distinguish the positions. What process would you suggest, and how would I implement it?
All I can suggest is to look for patterns in the magnetometer outputs for various possible movements. I would have to have the thing in hand to do better.
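For what it's worth, one way to do that pattern-matching without harder math is a brute-force lookup: record the full 3-axis vector on a grid of (x, z) positions during calibration, then at run time report the grid point whose stored vector is closest to the live reading. A Python sketch under my own assumptions (the grid spacing and the stand-in readings are placeholders; load your real calibration table instead):

```python
import numpy as np

# Placeholder calibration table: one (x_mm, z_mm) position per row,
# and the 3-axis reading (bx, by, bz) recorded at that position.
positions = np.array([(x, z) for x in range(0, 31, 5)
                             for z in range(0, 31, 5)], dtype=float)
readings = np.random.default_rng(0).normal(size=(len(positions), 3))

def locate(b_live):
    """Return the calibrated (x_mm, z_mm) whose stored 3-axis vector
    is nearest (Euclidean distance) to the live measurement."""
    dists = np.linalg.norm(readings - np.asarray(b_live, float), axis=1)
    return positions[np.argmin(dists)]

x_mm, z_mm = locate([0.1, -0.3, 0.7])
print(f"estimated x = {x_mm} mm, depth = {z_mm} mm")
```

Interpolating between the nearest grid points, or fitting the dipole model above by least squares, could refine the estimate beyond the grid spacing. Positions that the Z channel alone cannot separate may still differ in their X and Y components, which is exactly where the extra axes should pay off.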