How to calibrate an accelerometer?

I'm confused about a few points on how to best calibrate an accelerometer, whose data will be used in an orientation sensor fusion algorithm.

As a summary, the most common approaches I've seen take measurements in six different orientations (1 g along +x, -x, +y, -y, +z, -z) to arrive at a max and min measured value per axis. Offsets are then calculated as the midpoint of each axis's range, (max + min)/2, and then a scaling factor is calculated. There are a few aspects of this that I'm unclear about, and I was hoping it's okay to ask them in a single question, since they are related.

1) The examples calculate the average as written above, rather than as the sum of all values divided by N. Isn't that approach more error-prone?

2) Should the offset per axis be calculated once for each orientation, and then summed?

3) What's the best approach to calculate the scale bias?

Example 1 (pseudo-code):

chordlengthX = (maxX - minX)/2
chordlengthY = (maxY - minY)/2
chordlengthZ = (maxZ - minZ)/2

avg_rad = (chordlengthX + chordlengthY + chordlengthZ)/3

scaleX = avg_rad/chordlengthX

calibratedX = (x - offsetX)*scaleX
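To make the Example 1 scheme concrete, here is a minimal Python sketch of the six-orientation calibration. The min/max readings below are made-up values in g (a hypothetical sensor with a +0.05 g bias on x and a 2% gain error on z); note the offset is the midpoint (max + min)/2, while the half-range ("chord length") drives the scale factors.

```python
def six_point_calibration(mins, maxs):
    """mins, maxs: per-axis min/max readings taken with +/-1 g
    applied along each axis in turn."""
    offsets, scales = {}, {}
    # Half-range per axis ("chord length" / radius).
    radii = {ax: (maxs[ax] - mins[ax]) / 2.0 for ax in "xyz"}
    avg_rad = sum(radii.values()) / 3.0
    for ax in "xyz":
        offsets[ax] = (maxs[ax] + mins[ax]) / 2.0   # midpoint, not (max - min)/2
        scales[ax] = avg_rad / radii[ax]            # normalize each axis to the mean radius
    return offsets, scales

def apply_calibration(raw, offsets, scales):
    return {ax: (raw[ax] - offsets[ax]) * scales[ax] for ax in "xyz"}

# Hypothetical readings in g:
mins = {"x": -0.95, "y": -1.00, "z": -1.02}
maxs = {"x":  1.05, "y":  1.00, "z":  1.02}
offsets, scales = six_point_calibration(mins, maxs)
print(offsets)  # x offset comes out as 0.05 g, y and z as 0
```

One design note: because each axis is scaled to the *average* radius rather than to exactly 1 g, the axes end up mutually consistent even if the absolute sensitivity is slightly off.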

Example 2:

rawrange = maxX - minX
refrange = refmax - refmin // 1G??

calibratedX = (((x - minX)*refrange)/rawrange) + refmin
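For what it's worth, Example 2 is just a linear range remap: with refmin = -1 g and refmax = +1 g it is algebraically the same as subtracting the midpoint offset and dividing by the half-range. A small sketch with made-up numbers:

```python
def remap(x, mn, mx, refmin=-1.0, refmax=1.0):
    """Linearly map a raw reading from [mn, mx] onto [refmin, refmax]."""
    rawrange = mx - mn
    refrange = refmax - refmin
    return ((x - mn) * refrange) / rawrange + refmin

# Hypothetical biased axis whose raw readings span -0.9 g .. +1.1 g:
print(remap(1.1, -0.9, 1.1))   # top of the raw range maps to 1.0
print(remap(0.1, -0.9, 1.1))   # midpoint of the raw range maps to 0.0
```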

Something else altogether (as you can probably tell, I need to revise my maths)?

4) If my surface is not completely level, can I not trust my measurements, since my 1 g reference is not accurate?

A much, much better approach is described here: Tutorial: How to calibrate a compass (and accelerometer) with Arduino | Underwater Arduino Data Loggers


Thanks for responding, jremington. I saw responses from you on similar posts several times on Google, and I see you mentioned in that post as well! I've actually come across it before, but gave up as it looked too complicated. After your suggestion though, I gave it another shot and this is the summary of my understanding (as far as accelerometer calibration is concerned, I was using MotionCal for the magnetometer):

  1. Install magneto
  2. record accelerometer data in mG (I'm not sure whether I should be moving the sensor about, or recording data in several(?) static orientations)
  3. enter a value of 1000 mG (milli-g) as the “norm” for the gravitational field in magneto (if I'm getting data in g, I just multiply by 1000 to get the value in mG?)
  4. adjust your uncalibrated accelerometer data by the scale factor matrix (default positive?) and bias that you get from magneto
  5. test again in magneto with the calibrated data (hoping to get a scale matrix close to the identity matrix). I can't draw charts in Python yet, so it's difficult to verify the outcome of the calibration, but so it goes for the time being.
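If it helps anyone, step 4 above boils down to applying corrected = A · (raw − b), where A is the 3×3 correction matrix and b the bias vector that magneto prints. The A and b values below are made-up placeholders, not real magneto output:

```python
def correct(raw, A, b):
    """Apply a 3x3 correction matrix A and bias vector b to one raw sample."""
    d = [raw[i] - b[i] for i in range(3)]                       # remove bias first
    return [sum(A[i][j] * d[j] for j in range(3)) for i in range(3)]

# Placeholder calibration values (same raw units as the recorded data):
A = [[1.02, 0.00, 0.00],
     [0.00, 0.99, 0.01],
     [0.00, 0.01, 1.01]]
b = [15.0, -8.0, 22.0]

print(correct([1015.0, -8.0, 22.0], A, b))   # a 1000-unit reading on x, debiased and rescaled
```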

Apologies for the stupid questions.

The procedure is a bit complicated, but if you are careful, it works much better than any other.

The measurement units don't matter. Record the raw data, as the corrections are applied to those.

Orient the sensor in all possible directions, taking 200 to 300 measurements. Ideally the directions should cover the entire 3D sphere uniformly. The sensor should be still while each measurement is made.

For the "norm", enter a value that is roughly the maximum reading (or the average vector length). You want the diagonal elements of the correction matrix to be about 1.
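One simple way to get that "norm" value is to compute the average vector length of the recorded samples, so the fitted correction matrix comes out close to identity. A short sketch with made-up raw samples:

```python
from math import sqrt

def average_norm(samples):
    """Average Euclidean length of (x, y, z) samples; a reasonable 'norm' input."""
    return sum(sqrt(x*x + y*y + z*z) for x, y, z in samples) / len(samples)

# Hypothetical raw readings (arbitrary units), sensor held still in 3 orientations:
samples = [(980.0, 40.0, 30.0), (-10.0, 1005.0, 20.0), (5.0, -15.0, 995.0)]
print(average_norm(samples))   # roughly 994 in these units
```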


Thank you again for taking the time to respond. It's a huge help! I really, really appreciate it. I'll test this tomorrow and report back how it goes 🙂

I did some tests today and just wanted to write up my results. I gave the programme data in m/s² and these are the results I got.

After inputting those calibration values into a sketch these are the "calibrated" results I got on a second test.

And this is just some stats about the uncalibrated, and later calibrated data from the accelerometer.

Uncalibrated:

Sum     193.74
Average   0.461   0.818  -0.170   0.736
Median    0.060
Noise    59.410  19.630  19.820  19.960
Mid           /   0.205  -0.310  -0.170

Calibrated:

Sum      44.78
Average   0.083   0.090   0.051   0.108
Median    0.080
Noise    59.030  19.650  19.670  19.710
Mid           /   0.005   0.035  -0.025

The lower total averages make me believe there was a positive improvement? Although it seems the sensor was fairly accurate to begin with? I'm wondering now whether I did things properly, and whether all the floating-point math is too much for an ATSAMD21G?

Good job!

The initial calibration shows that there were significant offsets (up to 0.3 m/s² instead of 0) and that the X and Z axes differed in sensitivity by more than 1%.

Those errors have been corrected, so the effort will make a big difference, for example, much better directional accuracy when used in navigation.

If you do the same with a magnetometer, you will see much larger corrections.


I don't know if you'll see this message, but I wanted to say again I am truly grateful for your help. Thank you so much, jremington!

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.