# IMU 9DOF and quaternions: how to get 3D orientation?

Hello,

I’m working with an Arduino and a 9DOF IMU (3-axis gyroscope, 3-axis accelerometer, and 3-axis magnetometer). I want to get a full 360-degree 3D orientation in the inertial frame.

At first, I used the yaw, pitch and roll angles and ran into their “gimbal lock” limitation around +/-90 degrees. So, I did some reading and now understand quaternions “mathematically”, but I don’t know how to feed the sensor’s raw data into a quaternion representation.
I think I'm a little lost here..

And then, once I'm using quaternions, I will get a 3D orientation vector {X, Y, Z} and a rotation angle “W” around that vector. Are those vector coordinates in the inertial frame?

Any help is welcome,

The RTIMULib library package works very well for a number of 9DOF sensors.

Orientation is defined in an inertial reference frame.

I'm still trying to get my head around quaternions.

Here are a couple of interesting sites on the subject.

This one is directed toward game developers. It has the advantage of including some computer code showing the conversion between Euler angles and quaternions.

This one is directed toward robot builders/programmers. I like how they use the more intuitive NED (north, east, down) coordinate system.

This is the best overview and explanation of quaternion math I have found. It is also aimed at game developers, but assumes no knowledge of advanced math and is very clear.

For a complete introduction to using quaternion math to stabilize the attitude of a quadrotor, study this paper.

The key property of unit quaternions is that you just multiply them to compose 3D rotations, much
like with DCMs, except it takes fewer operations and renormalization is much easier and more stable.

The x, y, z of the quaternion describe the vector about which the 3D rotation happens, and its magnitude
is the sine of half the rotation angle. This makes it easy to convert a gyro output value to a quaternion
representing the rotation delta.

Hi, thank you all for replying,
Sorry for responding late, I was busy reading all the links above, and others...

So, after getting my head around quaternions, I managed to compute q0, q1, q2, q3 from the 9DOF accel/gyro/mag raw data (using the "MadgwickQuaternionUpdate" function).

Now I have a 3D vector (q1, q2, q3) and q0 encoding the rotation about that vector, all relative to the inertial frame.

Then I used these formulas to get 3 angles from the quaternion:

```cpp
x_angle = ( atan2 (q[3] , q[1] ) ) * 180 / PI ;
y_angle = ( atan2 (q[3] , q[2] ) ) * 180 / PI ;
z_angle = ( atan2 (sqrt ( q[1]*q[1] + q[2]*q[2] ) , q[3]) ) * 180 / PI ;
```

The idea was to get readable data that I could interface with Processing and see the sensor's orientation in a real-time simulation... I didn't know how to do this using just quaternions.
..If there is a better solution, I'm glad to hear it!

One last thing: I get very unstable data (x_angle, y_angle, z_angle) even though I'm not moving the sensor at all.. does anyone know how to correct this?!

Thanks !!

Those quaternion-to-Euler equations are very unlikely to be correct. Take a look here for the usual ones: Conversion between quaternions and Euler angles - Wikipedia

One last thing, I get very unstable data (x_angle , y_angle , z_angle), yet I'm not moving the sensor around

Have you calibrated the gyro and the magnetometer?

Actually, I had already used Euler angles and a Kalman filter to validate the sensor's data, and I saw the Euler-angle limitation (gimbal lock).

Now I want a 3D orientation of the sensor with no limitation (no gimbal lock whatsoever)... as I understand it, that's not possible with any three-angle representation.

So, with quaternions, how can I actually visualize the sensor's measurements, using a Processing simulation with a 3D rendering of the sensor?
(for example, with Euler angles I used "RotateX" and "RotateY")

And this is basically what I did for gyro and magnetometer calibration,
in void setup():

```cpp
for (i = 0; i < 100; i++)
{
  gyroX = IMU_getData (MPU9250_ADDRESS, GYRO_XOUT_H, GYRO_XOUT_L);
  gyroY = IMU_getData (MPU9250_ADDRESS, GYRO_YOUT_H, GYRO_YOUT_L);
  gyroZ = IMU_getData (MPU9250_ADDRESS, GYRO_ZOUT_H, GYRO_ZOUT_L);
  magX = IMU_getData (MAG_ADDRESS, MAG_XOUT_H, MAG_XOUT_L);
  magY = IMU_getData (MAG_ADDRESS, MAG_YOUT_H, MAG_YOUT_L);
  magZ = IMU_getData (MAG_ADDRESS, MAG_ZOUT_H, MAG_ZOUT_L);
  gyroXbias += gyroX ;
  gyroYbias += gyroY ;
  gyroZbias += gyroZ ;
  magXbias += magX ;
  magYbias += magY ;
  magZbias += magZ ;

  delay (10);
}
gyroXbias = gyroXbias / 100 ;
gyroYbias = gyroYbias / 100 ;
gyroZbias = gyroZbias / 100 ;
magXbias = magXbias / 100 ;
magYbias = magYbias / 100 ;
magZbias = magZbias / 100 ;
```

in void loop():

```cpp
AccelXdata = IMU_getData (MPU9250_ADDRESS, ACCEL_XOUT_H, ACCEL_XOUT_L);
AccelYdata = IMU_getData (MPU9250_ADDRESS, ACCEL_YOUT_H, ACCEL_YOUT_L);
AccelZdata = IMU_getData (MPU9250_ADDRESS, ACCEL_ZOUT_H, ACCEL_ZOUT_L);
GyroXdata = IMU_getData (MPU9250_ADDRESS, GYRO_XOUT_H, GYRO_XOUT_L);
GyroYdata = IMU_getData (MPU9250_ADDRESS, GYRO_YOUT_H, GYRO_YOUT_L);
GyroZdata = IMU_getData (MPU9250_ADDRESS, GYRO_ZOUT_H, GYRO_ZOUT_L);
MagXdata = IMU_getData (MAG_ADDRESS, MAG_XOUT_H, MAG_XOUT_L);
MagYdata = IMU_getData (MAG_ADDRESS, MAG_YOUT_H, MAG_YOUT_L);

AccelXdata = AccelXdata / accel_sensitivity ;
AccelYdata = AccelYdata / accel_sensitivity ;
AccelZdata = AccelZdata / accel_sensitivity ;
GyroXdata = ( GyroXdata - gyroXbias ) / gyro_sensitivity ;
GyroYdata = ( GyroYdata - gyroYbias ) / gyro_sensitivity ;
GyroZdata = ( GyroZdata - gyroZbias ) / gyro_sensitivity ;
MagXdata = ( MagXdata - magXbias ) * XY_mag_adj ;
MagYdata = ( MagYdata - magYbias ) * XY_mag_adj ;
MagZdata = ( MagZdata - magZbias ) * Z_mag_adj ;

MadgwickQuaternionUpdate (AccelXdata , AccelYdata , AccelZdata , GyroXdata , GyroYdata , GyroZdata , MagXdata , MagYdata , MagZdata );
```

I get ASAX, ASAY and ASAZ from the magnetometer registers (using Fuse ROM access mode):
ASAX = ASAY = 171
ASAZ = 161

According to the MPU-9250 register map:
XY_mag_adj = ( ( ASAX - 128 ) * 0.5 / 128 ) + 1
Z_mag_adj = ( ( ASAZ - 128 ) * 0.5 / 128 ) + 1

Any idea? Can anyone help?

Using quaternions, how can I actually see the sensor's rotation in Processing?

I mean, using Euler angles and a Kalman filter, I get angles as outputs (drawback: gimbal lock).

But using quaternions (no gimbal lock), I get the inertial-frame coordinates of a 3D vector and a rotation around that vector.. However, I don't know how to use them in Processing to simulate the sensor's rotation (which is what I need to do).

using quaternions, how can I really see sensor's rotation in processing ?

Converting the quaternion into proper Euler angles would be a start. The formulas in reply #5 are not correct.

But when using Euler angles I'll face gimbal lock.. how do I get around gimbal lock?

ALL angular systems have ambiguities. There is simply no unique way to use 3 angles to specify a 3D orientation.

You will have to learn to live with it.

That was very clear, thanks !!

So, to summarize, I have only two options:

• Euler angles (3 angles), with gimbal lock
• Quaternions (full 3D orientation), with no gimbal lock, but then I have to work with the four components directly (no angles)