I had some basic questions about the data from these devices, but as I was posting, it seems I'm now just rambling on or musing...
First, it would help if the tutorial listed the actual model numbers somewhere rather than just "ADXL3xx" or "ADXL330 series", so Google indexes the actual parts. If they're there, then chalk it up to my poor Googling skills... http://arduino.cc/en/Tutorial/ADXL3xx
I was using some sample code, I think from Adafruit. It defines a max and a min value and uses a calibration function to establish the full range of the device, and I don't really understand how it's supposed to work. For example, both limits are initially set to 512. Then the device is placed with each axis in turn parallel to gravity, and the current reading replaces that axis's min or max if it falls beyond the stored value. Subsequent readings are then mapped from that min/max range onto +/-1000 and converted to a g-force.
I was surprised to find that the min and max outputs differ by only ~130 counts (a range from ~400 to ~530). Since that span covers +/-1g, the per-count resolution works out to about 0.015g (2g / 130 counts), even though I see forces up to +/-5g. I further suspect that the sampling code, by averaging 10 readings, gives a false sense of increased resolution (I see values of +/-0.00, 0.01, 0.02, ...).
I'm just wondering if I'm on track with my thinking, and whether the max/min range seems too small (I suspect the data is 10-bit and could potentially range from 0 to 1023). I'm a little confused because if a value of 400 is mapped to the minimum (-1000), that's -1g, but what happens when a reading of 399 comes back? That's outside the mapped range.