Nicla Sense ME pedestrian dead reckoning PDR / sensor fusion drift

Hello, I already asked in the Bosch forum about the pedestrian dead reckoning (PDR) features of the BHI260AP and was told:

"Currently, the software supporting PDR on Arduino Nicla Sense ME has not been released."

I am trying to get the trajectory, and thus the position/motion, of the sensor. The position comes from double integration of the acceleration; the first step, acceleration to velocity, looks like this:

// Trapezoidal integration of acceleration to velocity
velocity[0] += (acceleration[0] + last_acceleration[0]) / 2.0 * deltatime;
velocity[1] += (acceleration[1] + last_acceleration[1]) / 2.0 * deltatime;
velocity[2] += (acceleration[2] + last_acceleration[2]) / 2.0 * deltatime;
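
The second integration step, velocity to position, works the same way. A minimal sketch of what I mean (assumption: the tx/ty/tz passed to the translate calls below would then be these per-frame position increments):

// Sketch (assumption): second trapezoidal integration, velocity -> position
const position: number[] = [0, 0, 0];
const last_velocity: number[] = [0, 0, 0];

function integratePosition(velocity: number[], deltatime: number): void {
    for (let i = 0; i < 3; i++) {
        // same trapezoidal rule as the velocity update above
        position[i] += (velocity[i] + last_velocity[i]) / 2.0 * deltatime;
        last_velocity[i] = velocity[i];
    }
}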

In three.js I apply this translation to the object, which is oriented by the quaternion from the sensor:

// Remap the sensor quaternion into the three.js coordinate system
nicla.object3D.quaternion.set(-qx, qz, qy, qw);

// Translate along the object's local axes by the per-frame displacement
nicla.object3D.translateX(tx);
nicla.object3D.translateY(ty);
nicla.object3D.translateZ(tz);

Orientation and translation seem to be correct, but unfortunately there is heavy drift and the object moves out of sight within a few seconds.

Do you know how to work with the PDR features of the Nicla Sense ME? Or do you have another suggestion for eliminating the heavy drift?

I am looking at a similar problem with skateboards. I suspect a Kalman filter needs to be added to help smooth out the drift over time.
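
Roughly the kind of thing I mean, per axis: a minimal scalar Kalman sketch with a constant-state (random-walk) model and hand-picked noise values, not tuned for the Nicla:

// Minimal scalar Kalman filter: q = process noise, r = measurement noise (both guesses)
class ScalarKalman {
    private x = 0;   // state estimate
    private p = 1;   // estimate variance
    constructor(private q = 0.001, private r = 0.1) {}

    update(z: number): number {
        this.p += this.q;                       // predict: uncertainty grows
        const k = this.p / (this.p + this.r);   // Kalman gain
        this.x += k * (z - this.x);             // correct towards the measurement
        this.p *= 1 - k;
        return this.x;
    }
}

// e.g. one filter per acceleration axis, applied before integrating
const kalmanX = new ScalarKalman();
const smoothedAx = kalmanX.update(acceleration[0]);

This only smooths the noise, though; it does not remove a constant bias, so some drift will remain.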

Have you had any luck since you posted the question?

I have no solution yet. I was in contact with Bosch, without success: they have a solution, but only for industry partners under NDA, not for universities.

I tried some filtering approaches. The trajectory looks better, but the latency gets too high. We could join forces if you like.

This is not currently feasible with any consumer-grade IMU. The sensor noise and inaccuracy are too large.

A partial explanation of the problem is given in this blog article: Using Accelerometers to Estimate Position and Velocity | CH Robotics
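
To put numbers on it: a constant accelerometer bias b turns into a position error of 0.5 · b · t² after double integration. A quick sketch with an assumed bias of 0.05 m/s² (an order-of-magnitude guess for a consumer IMU, not a measured value):

// Position error caused by a constant accelerometer bias after double integration
const bias = 0.05; // m/s^2, assumed for illustration
for (const t of [1, 5, 10, 30]) {
    console.log(`${t} s -> ${(0.5 * bias * t * t).toFixed(2)} m`);
}
// roughly 0.03 m after 1 s, 2.5 m after 10 s and 22.5 m after 30 s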

Well, it means it's possible to do it :slight_smile:

Yeah, I'd be keen to collaborate. I'm experimenting with visualising skateboarding tricks and eventually using an ML model to analyze the data in real time.

Thanks for the link.
My math is not the best, so I tried GitHub's OpenAI-based code assistant for the first part:

import * as THREE from 'three';

// Create a function that extracts the inertial-frame acceleration,
// i.e. rotates the body-frame acceleration into the world frame.
const extractInertialFrameAcceleration = (
    acceleration: THREE.Vector3,
    rotation: THREE.Quaternion
): THREE.Vector3 => {
    // Build a rotation matrix from the orientation quaternion and invert it.
    // Whether the matrix or its inverse is needed depends on the sensor's
    // convention (body-to-world vs. world-to-body).
    const rotationMatrix = new THREE.Matrix4().makeRotationFromQuaternion(rotation);
    const rotationMatrixInverse = rotationMatrix.clone().invert();
    return acceleration.clone().applyMatrix4(rotationMatrixInverse);
};
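
Once the acceleration is in the world frame, gravity can be subtracted before integrating. A short usage sketch (bodyAcceleration and orientation are placeholder names; I'm assuming a Z-up world frame and units of m/s²):

// Sketch: remove gravity after rotating into the world frame (names are placeholders)
const worldAcceleration = extractInertialFrameAcceleration(bodyAcceleration, orientation);
worldAcceleration.z -= 9.81; // assuming Z is up and the accelerometer reports m/s²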

And this:

function calculateLinearAcceleration(ax, ay, az, gx, gy, gz) {
  // Normalized accelerometer reading (≈ gravity direction while not accelerating)
  const gravity = new THREE.Vector3(ax, ay, az).normalize();

  // Normalized gyroscope reading (angular velocity direction)
  const angularVelocity = new THREE.Vector3(gx, gy, gz).normalize();

  // Cross product of the two unit vectors (cloned so the inputs are not mutated).
  // Note: this is a rough heuristic, not true gravity-compensated linear acceleration.
  const linearAcceleration = angularVelocity.clone().cross(gravity);
  linearAcceleration.normalize();

  return linearAcceleration;
}

const linearAcceleration = calculateLinearAcceleration(ax, ay, az, gx, gy, gz);
// For now the model is translated directly by the acceleration each frame
// (no integration to velocity/position yet)
arduinoModel.translateX(linearAcceleration.x);
arduinoModel.translateY(linearAcceleration.y);
arduinoModel.translateZ(linearAcceleration.z);

This is a little jittery :slight_smile:
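
For reference, this is how I would try to put the pieces together: rotate the body-frame acceleration into the world frame, subtract gravity, integrate twice, and clamp the velocity to zero whenever the board looks still, so the drift cannot grow without bound. A rough sketch, not a tested implementation; the quaternion is assumed to be body-to-world, the world frame Z-up, and the constants are guesses:

import * as THREE from 'three';

const GRAVITY = 9.81;          // m/s², assuming a Z-up world frame
const STILL_THRESHOLD = 0.15;  // m/s², hand-picked "no motion" threshold

const velocity = new THREE.Vector3();
const position = new THREE.Vector3();

// accel: body-frame acceleration in m/s², q: orientation (assumed body-to-world), dt: seconds
function step(accel: THREE.Vector3, q: THREE.Quaternion, dt: number): THREE.Vector3 {
    // 1. rotate into the world frame and remove gravity
    const worldAccel = accel.clone().applyQuaternion(q);
    worldAccel.z -= GRAVITY;

    // 2. crude zero-velocity update: if the residual acceleration is tiny, assume we are still
    if (worldAccel.length() < STILL_THRESHOLD) {
        velocity.set(0, 0, 0);
    } else {
        velocity.addScaledVector(worldAccel, dt);
    }

    // 3. integrate velocity to position
    position.addScaledVector(velocity, dt);
    return position;
}

// per frame, e.g.: arduinoModel.position.copy(step(accelVector, quaternion, deltatime));

The constants would need tuning, and a proper Kalman filter or step/trick detection would do better, but even a crude clamp like this keeps the drift bounded between detected rest phases.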