I am currently working on building a motion capture suit using MPU sensors and an ESP32. In our prototype, we are trying to first capture the right hand. We currently have 3 MPU9250 sensors whose magnetometers don't work, so we are using only the accelerometer and gyroscope values. Our ESP32 code reads the raw values, calibrates them, converts them to quaternions, and sends them to the PC over Bluetooth. On the PC we then map the quaternions onto a model in Blender.
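For context, the PC-side receive step is roughly like the sketch below. The COM port name and the comma-separated `w,x,y,z` line format are placeholders for illustration, not our exact protocol.

```python
# Rough sketch of the PC-side receiver (placeholder port name and packet format).
# Assumption: the ESP32 streams one quaternion per line as "w,x,y,z" over
# Bluetooth SPP, which shows up as a virtual serial port on the PC.
import serial  # pyserial

port = serial.Serial("COM5", 115200, timeout=1)  # placeholder port name

def read_quaternion():
    """Read one line from the ESP32 and parse it into (w, x, y, z), or None."""
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        return None
    try:
        w, x, y, z = (float(v) for v in line.split(","))
    except ValueError:
        return None  # skip malformed packets
    return (w, x, y, z)
```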
So far, the Blender model is responsive to sensor movements, but the arm moves very strangely. We suspect this is because the sensors' axes don't match Blender's coordinate conventions.
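To make the axis question concrete, the mapping step in Blender looks roughly like the sketch below. The `SENSOR_TO_BONE` rotation is a placeholder for whatever fixed remap our sensor mounting actually needs; this is the part we suspect is wrong.

```python
# Sketch of applying an incoming sensor quaternion to a pose bone in Blender.
# SENSOR_TO_BONE is a guess: a fixed rotation that maps the sensor's axes
# (e.g. X toward the fingers) onto Blender's bone convention (+Y along the
# bone, right-handed, Z-up world).
import bpy
from mathutils import Quaternion

# Placeholder: 90-degree rotation about X as an example remap; the real one
# depends on how the sensor is strapped to the arm.
SENSOR_TO_BONE = Quaternion((1.0, 0.0, 0.0), 1.5708)

def apply_sensor_quaternion(armature_name, bone_name, q_sensor):
    """Rotate a pose bone by the sensor quaternion, re-expressed in bone axes."""
    bone = bpy.data.objects[armature_name].pose.bones[bone_name]
    bone.rotation_mode = 'QUATERNION'
    q = Quaternion(q_sensor)  # (w, x, y, z) as received from the ESP32
    # Change of basis: express the sensor rotation in the bone's frame.
    bone.rotation_quaternion = SENSOR_TO_BONE @ q @ SENSOR_TO_BONE.inverted()
```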
- Has anyone ever tried something like this, or worked with aligning the axes of these sensors before?
- How should we orient the sensors on the arm (hand, forearm, and upper arm)? For example, should the X axis printed on the sensor point towards the fingers?
- Would it be better if we switched to the MPU6050?