IMU Magnetometer Calibration

Interesting. Since I like doing things completely from scratch like a moron, I'm curious about the exact details/math, since I'll be implementing it in Python. The linked post didn't go into much of the math, so I'll ask here.

Let me take an initial stab at the calibration algorithm:

  1. Collect 3D mag readings
  2. Find the average X, Y, and Z offset for the mag readings
  3. Subtract the found offsets from the mag readings (remove hard-iron error)
  4. Find the average magnitude of the zero-mean mag readings
  5. Using least squares, find the matrix that maps the zero-mean readings onto a sphere whose radius equals the average magnitude from step 4
  6. Apply the matrix to the zero-mean readings (remove soft-iron, scaling, and axis non-orthogonality errors)

Is this the correct process?
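Here's a rough Python sketch of how I'd implement the steps above, to make the question concrete. Everything here is my own guess at the details, not taken from the linked post: the hard-iron offset is estimated as the per-axis mean (step 2-3), and step 5 is done by least-squares fitting a symmetric matrix `M` so that `x^T M x ≈ r^2` for every centered sample `x` (an origin-centered ellipsoid fit), with the correction matrix taken as the symmetric square root of `M`. The distortion matrix `S`, offset `h`, and the 50 uT field strength in the demo are made up.

```python
import numpy as np

def calibrate_mag(readings):
    """Steps 2-6: estimate hard-iron offset and soft-iron correction matrix."""
    # Steps 2-3: hard-iron offset = per-axis mean; subtract it
    offset = readings.mean(axis=0)
    centered = readings - offset
    # Step 4: average magnitude of the zero-mean readings
    radius = np.linalg.norm(centered, axis=1).mean()
    # Step 5: least-squares fit of a symmetric matrix M with
    # x^T M x ≈ radius^2 for each centered sample x (ellipsoid fit)
    x, y, z = centered.T
    D = np.column_stack([x*x, y*y, z*z, 2*x*y, 2*x*z, 2*y*z])
    p, *_ = np.linalg.lstsq(D, np.full(len(readings), radius**2), rcond=None)
    M = np.array([[p[0], p[3], p[4]],
                  [p[3], p[1], p[5]],
                  [p[4], p[5], p[2]]])
    # Step 6: with W = sqrtm(M), |W x|^2 = x^T M x ≈ radius^2, so W maps
    # the fitted ellipsoid onto the target sphere
    vals, vecs = np.linalg.eigh(M)
    W = vecs @ np.diag(np.sqrt(vals)) @ vecs.T
    return offset, W, radius

# Demo with synthetic data: points on a 50 uT sphere, distorted by a
# made-up soft-iron matrix S and hard-iron offset h
rng = np.random.default_rng(0)
u = rng.normal(size=(2000, 3))
u = 50.0 * u / np.linalg.norm(u, axis=1, keepdims=True)
S = np.array([[1.2, 0.1, 0.0], [0.1, 0.9, 0.05], [0.0, 0.05, 1.1]])
h = np.array([12.0, -7.0, 3.0])
raw = u @ S.T + h

offset, W, radius = calibrate_mag(raw)
corrected = (raw - offset) @ W.T
mags = np.linalg.norm(corrected, axis=1)
print(mags.std() / mags.mean())  # small spread if the fit worked
```

One thing I'm unsure about: using the per-axis mean as the hard-iron estimate is only exact if the samples cover the sphere uniformly, which is part of what I'm asking about.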