I have an ADXL345 sensor.
The software is, or at least I tend to think, well built.
When I put the sensor on a flat surface and start the program on the Arduino, it gives me values somewhere around (-7, 7) on the x and y axes, and on the z axis a fairly constant 250, which means 1 g. So far, ok.
Let's say I tilt it to 45 degrees on the y axis. It gives me somewhere around 120 on that axis, which again is fine.
But when I reset it (the Arduino) on a non-level surface (x and y), for instance at 45 degrees on the y axis, instead of giving me 120 on that axis, it gives me that interval of values highlighted above, with a 250 value on the z axis.
Is this how the sensor should behave, or is something wrong with my code?
Unless it's just a quirk in how the values are displayed, tell me and I'll post the code to be analyzed.
I'm perplexed too. I have no code to look at, nor can I grok:
"instead of giving me 120 on that axis, it gives me that interval of values highlighted above, with a 250 value on the z axis."
What highlighted values?
Tell us some actual values for each case and the code used to generate that output - then we don't have to guess. Show us what you have and what you think is wrong, don't just try to describe code and circuits in words!
That is not how the sensor should respond. If you reset the Arduino then the sensor should produce values appropriate to its three axis angles. We would have to see your code and wiring to know what is happening.
These sensors are programmed internally to have zero output values for zero g and +/-256 counts for one g. However, there is a tolerance on these numbers. The z axis may have a zero offset as large as +/-20 in a typical case, or several times higher in extreme cases. Also, the count per g can vary between 232 and 286. To determine the calibration values, you will need to rotate the device slowly through its full range, then calculate the zero offset and gain value. You can apply the corrections in the Arduino sketch or input the values into the device as gain and offset for each axis. When properly calibrated these devices are quite good.
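As a rough illustration of the rotate-and-record idea (only a sketch: the zraw line is a placeholder for however you actually read the raw z count from your ADXL345, and you would repeat the same for x and y):

int zMin = 32767, zMax = -32768;        // running min/max while you slowly rotate the board

void setup() {
  Serial.begin(9600);
  // ... initialise the ADXL345 here ...
}

void loop() {
  int zraw = 0;                         // placeholder: replace with your actual raw z reading
  if (zraw < zMin) zMin = zraw;
  if (zraw > zMax) zMax = zraw;
  int   zOffset = (zMax + zMin) / 2;    // counts reported at 0 g
  float zGain   = (zMax - zMin) / 2.0;  // counts per 1 g (expect roughly 232 to 286)
  Serial.print("z offset: "); Serial.print(zOffset);
  Serial.print("   z gain: "); Serial.println(zGain);
  delay(100);
}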
I can guarantee my wiring is perfect. It has been tested over and over again on the breadboard; then I made a lot of schematics of the circuit, simulated it again, built the PCB, and tested it again and again, and I could go on like this for a long time.
The only source of problem is then the code.
You'll have to pay attention to which functions I'm using from the class.
Firstly, everything starts in the sketch's setup function with the init() function.
The setOffsetDig function is used to calibrate the sensor using its built-in features.
Then I do another calibration from the Arduino with the setOffsetInt function, which averages 100 received values, saves that calibration into some variables, and then subtracts these values from the data that comes in afterwards. You can call it the 2nd calibration. I used it because I noticed the sensor wasn't giving exact measurements.
And secondly, from within the loop function I'm calling the getValues method.
Let's say, if I weren't using the 2nd calibration, the sensor would give me something around 40 on the x axis, -30 on the y axis and 330 on the z axis. I couldn't let these values be processed, obviously.
I have attached 2 files to this reply. One is the .h file and the other one is the .cpp file.
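Roughly, the structure is this (just a sketch of the flow; I'm leaving out the actual arguments and assuming the header is called ADXL345.h, the real signatures are in the attached files):

#include "ADXL345.h"           // the attached .h file

ADXL345 accel;                 // the sensor object from the attached library

void setup() {
  Serial.begin(9600);
  accel.init();                // start the sensor
  accel.setOffsetDig();        // 1st calibration: the chip's built-in offset registers
  accel.setOffsetInt();        // 2nd calibration: average 100 readings and store them
}

void loop() {
  accel.getValues();           // read x, y, z with both calibrations already subtracted
  // ... print / process the values ...
}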
The two attached files are from the ADXL345 library. We need the sketch you wrote to control the board.
If you're running the calibrate function from a tilted position then that is likely the problem. I would expect the calibrate to determine gain and offset values for each axis after being positioned correctly. It would take a test with each axis pointing up or down. The values would be read from the board then entered as #define statements for entry back into the board. The calibrate function would be done only once during testing. Thereafter, the board would be loaded with these values as part of the setup() routine.
You may have to go through the ADXL345 user manual in PDF form to get all the details. I get my calibrate values from a separate sketch then include them in the calculations when the board is being used.
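Something along these lines, for example (the numbers below are only placeholders for whatever your calibration sketch reports):

// Measured once with a separate calibration sketch, then hard-coded here.
#define X_OFFSET   12
#define Y_OFFSET   -8
#define Z_OFFSET   18
#define X_GAIN     258.0
#define Y_GAIN     249.0
#define Z_GAIN     266.0

void setup() {
  // ... initialise the ADXL345 as usual, but do NOT run the calibrate function here ...
  // Either write these values into the chip's offset registers,
  // or keep them as constants and apply them to the raw readings in loop().
}

void loop() {
  int xraw = 0, yraw = 0, zraw = 0;       // placeholders: your actual raw readings
  float xg = (xraw - X_OFFSET) / X_GAIN;  // corrected value in g
  float yg = (yraw - Y_OFFSET) / Y_GAIN;
  float zg = (zraw - Z_OFFSET) / Z_GAIN;
}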
Excuse the distortions in the graph, but my accelerometer is quite "big" and therefore quite difficult to move.
Now, what we are interested in is just the accelerometer values, which are in the left column. The red line is the x axis, the white one is the y axis and the green one is the z axis.
You'll see that when a line is flat, the accelerometer isn't moving on that axis but is standing still.
But when I reset it (the Arduino) on a non-level surface (x and y), for instance at 45 degrees on the y axis, instead of giving me 120 on that axis, it gives me that interval of values highlighted above, with a 250 value on the z axis.
As Arctic_Eddie pointed out, your init function is calibrating the offset on startup for your x, y, and z axes. This will result in whatever orientation you start up in becoming your 0, 0, 250 position, since that is the position you are calibrating to. Either do as he recommended (use a separate calibration sketch, and load those calibrated values in your primary sketch), or make sure the unit only ever starts up in its proper orientation. If the latter isn't practical, then the former is the method you need to use.
Sounds like I found the problem. The calibration.
But honestly I don't fully understand how this method of calibration works, so can you explain to me in detail how to do the math?
I have to calibrate some very noisy and cranky values (instead of giving something close to 0, it gives me something close to 80, for example), with the second offset turned off.
In brief there are 6 orientations where an accelerometer axis is physically pointing upwards, 2 for each axis. In each orientation (+X, -X, +Y, -Y, +Z, -Z) you want to measure the relevant value for that axis. So for +Z (the normal orientation) you'd get a positive value for Z, for -Z (upside down) you'd get a negative value. You would hope these are equal and opposite, but there will be some variations. We call these values zplus and zminus:
float zsensitivity = (zplus - zminus) / 2.0 ; // the value from the z-axis detector corresponding to 1g acceleration
int zoffset = (zplus + zminus) / 2 ; // the value from the z-axis detector when seeing 0g
So subsequently you can correct readings from the z-axis thus:
int zraw = analogRead (...) ;
zraw -= zoffset ; // correct for offset
float zg = zraw / zsensitivity ; // value in g, multiply by 9.8 for m/s/s
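For instance, with made-up numbers: if zplus came out as 260 and zminus as -240, then zsensitivity = (260 - (-240)) / 2 = 250 counts per g and zoffset = (260 + (-240)) / 2 = 10. A later raw reading of 135 would then correct to (135 - 10) / 250 = 0.5 g. Repeat the same measurements and corrections for the x and y axes with their own pairs of readings.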