I am reading voltage values from an accelerometer and want to display the X-axis tilt in degrees on the serial monitor. I tried a couple of things, but they did not work.

When the accelerometer is lying flat on the table, the X axis produces 1.67 V. When I tilt it 28 degrees, the voltage becomes 1.82 V.

I wanted to convert the "temp" value to a voltage and then find the tilt angle. For example, as I mentioned in my original post, the accelerometer produces 1.67 V on the X axis when lying flat on the table.

The "temp" variable holds a decimal number (I read 328). I wanted to convert this number into volts using the following formula:

Xvolt = (370 x 5) / 1024 ≈ 1.81 V

Then I subtract: A = Xvolt - 1.67 V ≈ 0.14 V

Then B = A / 0.3 ≈ 0.455, since 0.3 V/g is the sensitivity of the accelerometer.

Then arctan(B) should give the angle in degrees:

arctan(0.455) ≈ 24.48 degrees

But I am not getting the correct value of the angle. I tilted the X axis of the accelerometer 30 degrees, but the result is not right. The arctan should give about 24 degrees, but it does not. Any thoughts?

My code is as follows

float temp = analogRead(ACC_X);
float cal_x = (temp * 5.0) / 1024.0;  // ADC reading to voltage
float cal_int_x = 1.67;               // accelerometer output when lying flat on the table
cal_x = (cal_x - cal_int_x) / 0.3;    // subtract the offset, divide by the sensitivity (0.3 V/g)
float cal_x_1 = atan(cal_x);          // should be 30 degrees but I get -0.27 (atan() returns radians, not degrees)

But now I have one more question. With this change I only get angles up to 52 degrees. Even if I keep increasing the tilt, the "degree" value stays at 52, even when I tilt to 90 degrees. My code is as follows:

float temp = analogRead(ACC_X);
float cal_x = (temp * 5.0) / 1024.0;   // ADC reading to voltage
float cal_int_x = 1.67;                // accelerometer output when lying flat on the table
cal_x = (cal_x - cal_int_x) / 0.3;     // subtract the offset, divide by the sensitivity (0.3 V/g)
float cal_x_1 = atan(cal_x);           // tilt angle in radians
float degree = (180.0 / PI) * cal_x_1; // convert radians to degrees


jremington:
The method you are using is not correct.

As already stated in reply #2, to calculate a tilt angle in the X,Z plane (Z vertical), you need accelerometer measurements along the X and Z axes. Then, with a possible change of sign, the tilt angle in radians is atan2(X,Z).

Hi,

I am really, really a beginner. Can you change the code and point out where I am making the mistake? I have been trying to get this to work for the past couple of days.

The thing is, I never get 90 degrees. The program goes from 10 degrees to 52 degrees and saturates even if I increase the tilt to 90 degrees. Why does the code not recognize angles greater than 50 degrees? My code is as below now. I really appreciate the help!

It seems the minimum value is 39 degrees and the maximum is 52 degrees. The code starts at 39 degrees with the X axis lying flat on the table and goes to 52 degrees when the accelerometer reaches 90 degrees with respect to the ground. If I increase the angle past 90 degrees, the value decrements back toward 39 degrees.

I apologize for misleading you. I forgot that with an analog accelerometer, you need to subtract the offset, that is, the reading you get when the axis is perfectly horizontal (the 0 g reading).

For a 5 V accelerometer and 5 V ADC, that will be an ADC reading of about 512, but you must determine the offset for each axis individually by experimentation.

Once suitable values (for example, X_offset and Z_offset) are determined, the code should look something like this:

Finally, if the sensitivity (the reading for 1 g acceleration after subtracting the offset) is not the same along each axis, you should also apply a scale factor to one of the axes, so that the 1 g values are the same for each axis.

If the sensitivities are the same for each axis, then no scale factors are needed, as atan2 will cancel them.

jremington:
I apologize for misleading you. I forgot that with an analog accelerometer, you need to subtract the offset, that is, the reading you get when the axis is perfectly horizontal (the 0 g reading).

For a 5 V accelerometer and 5 V ADC, that will be an ADC reading of about 512, but you must determine the offset for each axis individually by experimentation.

Once suitable values (for example, X_offset and Z_offset) are determined, the code should look something like this:

Finally, if the sensitivity (the reading for 1 g acceleration after subtracting the offset) is not the same along each axis, you should also apply a scale factor to one of the axes, so that the 1 g values are the same for each axis.
If the sensitivities are the same for each axis, then no scale factors are needed, as atan2 will cancel them.

This is how the code looks right now. It takes the offset and sensitivity into account, but it still does not detect 90 degrees or angles greater than 90 degrees. It still does not show zero when the accelerometer is lying flat on the table, or 90 degrees when it is at 90 degrees with respect to the table. I am using a protractor to measure the angle and test the code. Is that the right approach?