started 14July08 MarkCutkosky
The Human Motion Tracking and Haptic Display project was started in June 2008 under a seed grant from KAUST to the Computer Science Dept. at Stanford. The goal is to create a closed-loop system that can monitor, analyze, and provide immediate physical feedback to subjects regarding the motions, forces, and muscle activity of their limbs. Applications include rehabilitation (e.g., relearning how to walk after a stroke) and athletics (e.g., perfecting a complex motion sequence). The research draws upon and integrates new results in three areas:
* fast algorithms for computing and matching the dynamics of complex human movements to the measured velocities of optical markers (Khatib lab);
* musculoskeletal simulations that predict the muscle forces and velocities associated with human movements (Delp and Besier labs);
* wearable tactile devices that use a combination of skin stretch and vibration to give users an enhanced perception of joint movement and/or muscle force (Besier and Cutkosky labs).

See the HapticsForGaitRetraining page for the continuation of this work.
We are using a Vicon Motion System (http://www.vicon.com) for motion capture. To drive the haptic device in real time, we need to stream motion data out as each frame is captured, rather than collecting a full trial and post-processing it. For this we use Vicon's real-time SDK:
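The real-time pipeline can be sketched as follows. This is a minimal, hypothetical illustration, not the actual Vicon SDK API: `mock_frame_source` stands in for the real-time client (which would block on the network until the next frame arrives), and `stream_velocities` shows the key idea of pushing each marker sample to the haptic-feedback callback immediately instead of buffering frames for later analysis.

```python
import math

def mock_frame_source(n_frames, dt=0.01):
    """Hypothetical stand-in for the Vicon real-time client: yields one
    (timestamp, marker_position) tuple per captured frame. Positions in
    metres; a real client would block until the next frame arrives."""
    for i in range(n_frames):
        t = i * dt
        # One marker moving along x at 1 m/s.
        yield t, (t * 1.0, 0.0, 0.0)

def stream_velocities(frames, on_sample):
    """Finite-difference the marker velocity between successive frames
    and hand each sample to the haptic callback as soon as it is ready,
    with no post-processing pass over a stored trial."""
    prev_t, prev_p = None, None
    for t, p in frames:
        if prev_t is not None:
            v = tuple((a - b) / (t - prev_t) for a, b in zip(p, prev_p))
            on_sample(t, p, v)  # immediate feedback, no buffering
        prev_t, prev_p = t, p

samples = []
stream_velocities(mock_frame_source(5),
                  lambda t, p, v: samples.append(v))
# x-velocity of the simulated marker is recovered as ~1.0 m/s
```

In the real system the callback would drive the tactile display; the point of the sketch is only the per-frame data flow that the real-time SDK makes possible.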