Fixed the Processing sketch that attempts to measure distances based on acceleration. (It was off by a factor of 10. Guess who didn’t convert from g to m/s^2.) Anyhow, instead of compensating for tilting in code, I built a rail:
Of course, this doesn’t mean I won’t have to take the orientation into account at some point. Still, with the rail I can roughly measure the distance travelled, since the acceleration happens almost exclusively along the x axis.
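For reference, the core of the fixed integration loop looks roughly like this. It’s a simplified sketch, not the actual code: readRawAccelX() is a placeholder for however the raw x-axis value gets into Processing, and the 16384 LSB per g scale assumes the MPU-6050’s default +/- 2 g range.

```
// Minimal sketch of the integration step. readRawAccelX() is a placeholder
// for the serial/I2C readout; 16384 LSB per g assumes the default +/- 2 g range.
final float LSB_PER_G = 16384.0;
final float G_TO_MS2  = 9.81;   // the conversion I forgot

float velocity = 0;   // m/s
float distance = 0;   // m
int lastTime;         // ms

void setup() {
  lastTime = millis();
}

void draw() {
  float dt = (millis() - lastTime) / 1000.0;  // seconds since last frame
  lastTime = millis();

  // raw counts -> g -> m/s^2
  float accel = readRawAccelX() / LSB_PER_G * G_TO_MS2;

  // double integration: acceleration -> velocity -> distance
  velocity += accel * dt;
  distance += velocity * dt;

  println(distance * 100 + " cm");
}

float readRawAccelX() {
  // placeholder: in the real sketch this comes from the MPU-6050 readout
  return 0;
}
```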
Drift is quite bad, though. I’ve got a good 15 cm of travel on the rail, and movement made right after the measurement starts (within a couple of seconds) comes out accurate to within +/- 1 cm. But with a completely stationary sensor, the measured position can wander off by more than 40 cm over 30 seconds. Currently, I’m fresh out of ideas on how to counter this.
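The numbers themselves aren’t that surprising, though (back-of-the-envelope only, not measured): because the acceleration gets integrated twice, even a tiny constant offset grows quadratically with time. A bias of around 0.001 m/s^2, which is nothing for a sensor like this, already lands in the observed ballpark:

```
// Back-of-the-envelope: position error from a constant accelerometer
// bias b after t seconds is 0.5 * b * t^2 (double integration).
float bias = 0.001;                          // m/s^2, assumed tiny offset
float t    = 30;                             // seconds, stationary sensor
float driftError = 0.5 * bias * t * t;       // = 0.45 m
println(driftError * 100 + " cm of drift");  // ~45 cm
```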
I figured that building a data glove that can track the hand’s position in 3D space may be quite difficult if you only measure the acceleration of the glove itself. Because the MPU-6050 can produce orientation data (as a quaternion), one could of course track the hand’s position from the shoulder onward. The shoulder would be a “fixed point” with a ball joint. With sensors on the upper arm and forearm measuring the direction each segment is pointing (and knowing one’s arm’s length), one could calculate the location of the hand in relation to the shoulder. It’s not ideal for motion capture either, since the shoulder is capable of subtle movements of its own, but I think it might be the way to go.
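To make that idea a bit more concrete, here’s a rough Processing sketch of the shoulder-anchored geometry: one quaternion per arm segment, known segment lengths, hand position = rotated upper arm + rotated forearm relative to the shoulder. The quaternion values, the helper names and the “segments point along x in the neutral pose” convention are all made up for illustration, not taken from any existing code.

```
// Rough sketch: hand position from two segment orientations and lengths.
// Assumes unit quaternions (w, x, y, z) like the MPU-6050's DMP outputs,
// and that both segments point along the local x axis in the neutral pose.

PVector rotate(float w, float x, float y, float z, PVector v) {
  // rotate v by the unit quaternion (w, x, y, z): v' = q * v * q^-1
  PVector u = new PVector(x, y, z);
  PVector t = PVector.mult(u.cross(v), 2);
  return PVector.add(PVector.add(v, PVector.mult(t, w)), u.cross(t));
}

PVector handPosition(float[] qUpper, float[] qFore,
                     float upperLen, float foreLen) {
  PVector upper = rotate(qUpper[0], qUpper[1], qUpper[2], qUpper[3],
                         new PVector(upperLen, 0, 0));
  PVector fore  = rotate(qFore[0], qFore[1], qFore[2], qFore[3],
                         new PVector(foreLen, 0, 0));
  return PVector.add(upper, fore);   // relative to the shoulder "fixed point"
}

void setup() {
  // made-up values: upper arm rotated 90 degrees about z, forearm straight
  float[] qUpper = { cos(HALF_PI / 2), 0, 0, sin(HALF_PI / 2) };
  float[] qFore  = { 1, 0, 0, 0 };
  PVector hand = handPosition(qUpper, qFore, 0.30, 0.28);  // lengths in m
  println(hand);   // where the hand sits relative to the shoulder
}
```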