Hi all. I’m building a roving autonomous tank-like vehicle, and I’m trying to use an accelerometer to measure the robot’s acceleration along the X, Y and Z axes. Integrating these accelerations over time should tell me how far the robot has moved, rotated, climbed, and so on.
I am using the GadgetShield (http://ruggedcircuits.com/html/gadget_shield.html) from Rugged Circuits. It includes a 3-Axis Orientation/Motion Detection Sensor: The MMA7660FC is a ±1.5 g 3-Axis Accelerometer with Digital Output accessed via I2C.
I am reading from the accelerometer using the standard GadgetShield.h functions as shown below:
// Refresh the acceleration measurement samples
GS.AccelSample();

// Get the accelerations
uint8_t ResultX = GS.AccelResultX();
uint8_t ResultY = GS.AccelResultY();
uint8_t ResultZ = GS.AccelResultZ();
When the robot is physically still, the above code returns ResultX=31, ResultY=36 and ResultZ=54. I expected ResultX and ResultY to both be 32 (zero acceleration).
Question 1: Why are they not 32? Why are they not the same?
A follow-on question: how do I accurately convert, say, ResultX to an acceleration in metres per second squared? This is my current understanding:
The GadgetShield::s2u function means that the AccelResultX/Y/Z functions return results in the range 0 to 63, where:
Result of 0 represents -1.5g
Result of 32 represents 0g (no acceleration)
Result of 63 represents +1.453g
Hence the accelerations can be calculated (in m/s²) as:
Accel = ( AccelResult - 32 ) * 1.5 * 9.80665 / 32
Question 2: Is this the right approach?
Thanks in advance for your help.