Searched through the threads and couldn't find anything helpful.
I'm building a g-force meter for my car using a LIS3DH on my UNO R3. The data displays correctly, but the problem I'm having is that the accelerometer itself has to be level to get accurate readings.
I've had a couple of ideas to work around it, but I wasn't sure of the best course of action.
My ideas:
When level, the readings are approximately x: 0.0, y: 0.0, z: 1.0. I'd save the x/y/z offsets for the current mounting position, subtract them from the readings, and display the result so it still shows 0.0, 0.0, 1.0. In other words, offset the values by the current position.
Use a gyroscope to track orientation and calibrate it to a reference position?
Not sure where to go from here; any thoughts are appreciated. I'm new to Arduino, as this is my first project, but I do have programming experience.
As you know, accelerometers measure the acceleration due to gravity in addition to accelerations caused by other forces. If the accelerometer is not perfectly level (for example, if the Z axis is not kept perfectly vertical), some component of g will appear on the X and Y axes, introducing error into the X and Y measurements.
Unfortunately, without expensive equipment (like a genuine gyro, not a MEMS rate gyro), it is almost impossible to stabilize and maintain that perfectly vertical orientation in a vehicle that is accelerating. That is why IMUs that are used for true inertial monitoring cost in the range of $10K - $50K.
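To put a number on that: the gravity component that leaks onto a "horizontal" axis is roughly sin(tilt) in units of g, so even a 5 degree mounting tilt puts about 0.09 g of error onto X or Y. A quick sketch (no sensor needed, just the serial monitor) that prints a few values:

```cpp
// Quick illustration (no sensor needed): how much of 1 g leaks onto a
// "horizontal" axis when the board is tilted by a few degrees.
void setup() {
  Serial.begin(9600);
  for (int tiltDeg = 1; tiltDeg <= 10; tiltDeg++) {
    float leak = sin(tiltDeg * DEG_TO_RAD);   // fraction of 1 g showing up on X or Y
    Serial.print(tiltDeg);
    Serial.print(" deg tilt -> ");
    Serial.print(leak, 3);
    Serial.println(" g of gravity on a horizontal axis");
  }
}

void loop() {}
```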
So, say I wanted to mount an accelerometer in my car, but it wasn't mounted perfectly level. Couldn't I just offset the values in the code so it would display 0.0, 0.0, 1.0? Maybe use a button to calibrate/recalibrate?
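Something like this is what I had in mind (untested sketch; I'm assuming the Adafruit LIS3DH library and a pushbutton wired from pin 2 to GND, neither of which I mentioned above):

```cpp
// Untested sketch of the "zero it out on a button press" idea.
// Assumes the Adafruit LIS3DH library and a pushbutton from pin 2 to GND.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_LIS3DH.h>

Adafruit_LIS3DH lis = Adafruit_LIS3DH();
const int calButtonPin = 2;

// Offsets captured while the car is parked (in g)
float offX = 0, offY = 0, offZ = 0;

void calibrate() {
  // Average a burst of readings taken while the car is stationary
  float sx = 0, sy = 0, sz = 0;
  const int n = 50;
  for (int i = 0; i < n; i++) {
    sensors_event_t e;
    lis.getEvent(&e);                        // acceleration in m/s^2
    sx += e.acceleration.x / SENSORS_GRAVITY_STANDARD;
    sy += e.acceleration.y / SENSORS_GRAVITY_STANDARD;
    sz += e.acceleration.z / SENSORS_GRAVITY_STANDARD;
    delay(10);
  }
  // Store whatever deviates from the ideal level reading (0, 0, 1)
  offX = sx / n;
  offY = sy / n;
  offZ = sz / n - 1.0;
}

void setup() {
  Serial.begin(9600);
  pinMode(calButtonPin, INPUT_PULLUP);
  if (!lis.begin(0x18)) {                    // 0x18 is the default I2C address
    Serial.println("LIS3DH not found");
    while (1);
  }
  lis.setRange(LIS3DH_RANGE_4_G);
  calibrate();                               // zero once at startup
}

void loop() {
  if (digitalRead(calButtonPin) == LOW) {    // button pressed -> recalibrate
    calibrate();
  }
  sensors_event_t e;
  lis.getEvent(&e);
  float x = e.acceleration.x / SENSORS_GRAVITY_STANDARD - offX;
  float y = e.acceleration.y / SENSORS_GRAVITY_STANDARD - offY;
  float z = e.acceleration.z / SENSORS_GRAVITY_STANDARD - offZ;
  Serial.print(x, 2); Serial.print("  ");
  Serial.print(y, 2); Serial.print("  ");
  Serial.println(z, 2);
  delay(100);
}
```

I realize this only zeros out the reading while the car is sitting still; it wouldn't fix the gravity-leakage problem you described once the car is actually accelerating.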
What exactly are you trying to measure? Forward acceleration of the vehicle, or the overall acceleration magnitude being experienced?
Your accelerometer will generate 3 outputs relating to the 3 spatial dimensions (x, y, z). Why don't you just combine these to get an overall acceleration vector? For example...
The total acceleration is best described by a vector consisting of the x, y, and z components that you have from the accelerometer. The magnitude of the acceleration is the square root of the sum of the squares of each component, i.e. sqrt(x^2 + y^2 + z^2).
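In code that's a single line once you have the three readings; x, y and z here are placeholder names for your readings in g, however you happen to get them:

```cpp
// x, y, z: accelerometer readings in g (already read from the sensor)
float magnitude = sqrt(x * x + y * y + z * z);   // ~1.0 g when parked, higher while cornering or braking
```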
I'm measuring g-force in each spatial dimension. The goal is to have a meter in my car that shows g-force on the X, Y, and Z axes, to use for reference while cornering and braking. jreminton's point made me really question my methods though.
It is possible to subtract the acceleration due to gravity from the total acceleration, giving what people (incorrectly) call "linear acceleration".
In practice that doesn't work with consumer grade sensors, because you need to know the 3D orientation of the accelerometer very accurately in order to do the subtraction.
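Just to illustrate what the subtraction would look like if you did know the orientation (say, roll and pitch in radians), here is a sketch; the variable names and the axis/sign convention are my own and may not match your setup, and getting those angles accurately in a moving car is exactly the hard part:

```cpp
#include <math.h>

// Sketch only: subtract the expected gravity vector given a known roll/pitch.
// ax, ay, az are measured accelerations in g; lx, ly, lz are the "linear"
// (gravity-removed) outputs. One common tilt convention is used here.
void removeGravity(float ax, float ay, float az,
                   float rollRad, float pitchRad,
                   float &lx, float &ly, float &lz) {
  // Gravity as seen in the sensor frame for this roll/pitch
  float gX = -sin(pitchRad);
  float gY =  sin(rollRad) * cos(pitchRad);
  float gZ =  cos(rollRad) * cos(pitchRad);
  lx = ax - gX;
  ly = ay - gY;
  lz = az - gZ;
}
```

Even a degree or two of error in those angles puts a few hundredths of a g back onto X and Y, which is the practical problem with consumer-grade parts.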