Gesture recognition is becoming a more common interaction tool in the fields of ubiquitous and wearable computing. Designing a system to perform gesture recognition, however, can be a cumbersome task. Hidden Markov models (HMMs), a pattern recognition technique commonly used in speech recognition, can be used for recognizing certain classes of gestures. Existing HMM toolkits for speech recognition can be adapted to perform gesture recognition, but doing so requires significant knowledge of the speech recognition literature and its relation to gesture recognition. Thus, we introduce the Georgia Tech Gesture Toolkit (GT2k), which leverages Cambridge University's speech recognition toolkit, HTK, to provide tools that support gesture recognition research. GT2k provides capabilities for training models and allows for both real-time and off-line recognition.
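The core idea behind HMM-based recognition as described above is to train one model per gesture class and classify a new observation sequence by the model under which it is most likely. The following minimal sketch (not GT2k's or HTK's actual API; the gesture names, toy models, and discretized observation symbols are hypothetical) illustrates this maximum-likelihood classification using the scaled forward algorithm:

```python
import math

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm (with per-step rescaling to
    avoid numerical underflow on long sequences).
    pi: initial state probabilities, A: state transition matrix,
    B: emission probabilities (B[state][symbol]), obs: symbol indices."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    scale = sum(alpha)
    log_lik = math.log(scale)
    alpha = [a / scale for a in alpha]
    for t in range(1, len(obs)):
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
        scale = sum(alpha)
        log_lik += math.log(scale)
        alpha = [a / scale for a in alpha]
    return log_lik

def classify(models, obs):
    """Return the gesture label whose HMM assigns obs the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(*models[g], obs))

# Two toy gesture models over a 2-symbol alphabet; in practice the
# parameters would be learned from training examples (e.g. Baum-Welch).
pi = [0.5, 0.5]
A = [[0.7, 0.3], [0.3, 0.7]]
models = {
    "wave":  (pi, A, [[0.9, 0.1], [0.8, 0.2]]),  # mostly emits symbol 0
    "point": (pi, A, [[0.1, 0.9], [0.2, 0.8]]),  # mostly emits symbol 1
}

print(classify(models, [0, 0, 1, 0]))  # classified as "wave"
```

A real pipeline would first quantize sensor features into such symbols (or use continuous-density emissions, as HTK does) and fit the model parameters from labeled examples; only the decision rule is shown here.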
Georgia Tech Gesture Toolkit: Supporting Experiments in Gesture Recognition
Tracy Westeyn, Helene Brashear, Amin Atrash, and Thad Starner.
To appear in the International Conference on Perceptive and Multimodal User Interfaces, 2003.