Human Motion Tracking and Haptic Display

Started 14 July 2008 by Mark Cutkosky.

(Figure: KaustFig1.jpg)

The Human Motion Tracking and Haptic Display project was started in June 2008 under a seed grant from KAUST to the Computer Science Dept. at Stanford. The goal is to create a closed-loop system that can monitor, analyze, and provide immediate physical feedback to subjects regarding the motions, forces, and muscle activity of their limbs. Applications include rehabilitation (e.g., relearning how to walk after a stroke) and athletics (e.g., perfecting a complex motion sequence). The research draws upon and integrates new results in three areas:

  • fast algorithms for computing and matching the dynamics of complex human movements to the measured velocities of optical markers (Khatib lab);
  • musculoskeletal simulations that predict the muscle forces and velocities associated with human movements (Delp, Besier labs);
  • wearable tactile devices that utilize a combination of skin stretch and vibration to provide users with an enhanced perception of joint movement and/or muscle force (Besier and Cutkosky labs). See the HapticsForGaitRetraining page for continuation of this work.
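As a minimal illustration of the first item, the velocities of optical markers can be estimated from successive motion-capture frames by finite differences. This sketch uses synthetic data and an assumed 120 Hz frame rate; the function and array layout are illustrative, not the Khatib lab's actual algorithm:

```python
import numpy as np

def marker_velocities(positions, dt):
    """Estimate marker velocities (m/s) from consecutive frames.

    positions: (n_frames, n_markers, 3) array of positions in meters.
    dt: time between frames in seconds (e.g. 1/120 for 120 Hz capture).
    Returns an (n_frames - 1, n_markers, 3) array of velocities.
    """
    return np.diff(positions, axis=0) / dt

# Synthetic example: one marker moving at a constant 0.5 m/s along x.
dt = 1.0 / 120.0
frames = np.zeros((4, 1, 3))
frames[:, 0, 0] = 0.5 * dt * np.arange(4)
vel = marker_velocities(frames, dt)
```

In the full system these velocity estimates would be matched against the predicted dynamics of the human model rather than used directly.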


System Setup

Vicon Motion Capture

We are using a Vicon motion capture system. In order to operate our haptic device in real time, we need to stream motion data out as it is captured, rather than collecting data and post-processing it. To develop this we use Vicon's real-time SDK.
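A sketch of the real-time pipeline this requires: poll the capture system for the latest frame and hand marker positions to the haptic controller immediately, with no intermediate file. The `StubViconClient` below is a stand-in with hypothetical method names (Vicon's actual SDK classes and calls differ), so the loop can run anywhere:

```python
from collections import deque

class StubViconClient:
    """Stand-in for a real-time motion capture client.

    Method names are hypothetical; it returns synthetic marker
    positions so the streaming loop below is runnable as-is.
    """
    def __init__(self):
        self._frame = 0

    def get_frame(self):
        self._frame += 1
        # One marker drifting along x at 0.5 m/s, sampled at 120 Hz.
        return self._frame, [(0.5 * self._frame / 120.0, 0.0, 0.0)]

def stream_markers(client, n_frames, on_frame):
    """Poll the client and pass each new frame to a callback right away,
    instead of saving data for post-processing."""
    for _ in range(n_frames):
        frame_number, markers = client.get_frame()
        on_frame(frame_number, markers)

received = deque()
stream_markers(StubViconClient(), 3, lambda n, m: received.append((n, m)))
```

In practice `on_frame` would compute the motion or muscle quantities of interest and drive the tactile device, so loop latency must stay well below the capture period.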

Wearable Tactile Devices

The wearable tactile devices are based on:
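Since the devices combine skin stretch and vibration, one simple (hypothetical) mapping from a tracked joint-angle error to actuator commands is sketched below. The gains, travel limits, and threshold are illustrative assumptions, not the labs' actual values:

```python
def tactile_command(angle_error_deg, stretch_gain=0.1, vib_threshold=5.0):
    """Map a joint-angle error (degrees) to device commands.

    Skin stretch encodes the signed error continuously; vibration
    fires only when the error exceeds a threshold, as an alert.
    All gains and limits here are illustrative, not measured values.
    """
    # Skin-stretch displacement in mm, saturated at +/- 2 mm of travel.
    stretch_mm = max(-2.0, min(2.0, stretch_gain * angle_error_deg))
    # Vibration amplitude in [0, 1], active only for large errors.
    if abs(angle_error_deg) > vib_threshold:
        vibration = min(1.0, abs(angle_error_deg) / 30.0)
    else:
        vibration = 0.0
    return stretch_mm, vibration

small = tactile_command(3.0)   # small error: stretch only
large = tactile_command(15.0)  # large error: stretch plus vibration
```

The design intent is that continuous skin stretch conveys direction and magnitude of joint movement, while vibration serves as a discrete cue for muscle force or large deviations.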

Hardware for Control of Tactile Devices


