Finger Gesture Tracking for Interactive Applications: A Pilot Study with Sign Languages

This paper presents FinGTrAC, a system that demonstrates the feasibility of fine-grained finger gesture tracking using a minimally intrusive wearable sensor platform (a smart ring worn on the index finger and a smartwatch worn on the wrist). The key contribution is scaling gesture recognition to hundreds of gestures while using only a sparse wearable sensor set, where prior work has been able to detect only tens of hand gestures. Such sparse sensors are convenient to wear but cannot track all fingers, and hence provide under-constrained information. However, application-specific context can fill the gap in sparse sensing and improve the accuracy of gesture classification. Rich context exists in a number of applications such as user interfaces, sports analytics, medical rehabilitation, and sign language translation. This paper shows the feasibility of exploiting such context in an application of American Sign Language (ASL) translation. Noisy sensor data, variations in gesture performance across users, and the inability to capture data from all fingers introduce non-trivial challenges. FinGTrAC exploits a number of opportunities in data preprocessing, filtering, pattern matching, and the context of an ASL sentence to systematically fuse the available sensory information into a Bayesian filtering framework. The framework culminates in a Hidden Markov Model, over which a Viterbi decoding scheme detects finger gestures and the corresponding ASL sentences in real time. Extensive evaluation on 10 users shows a recognition accuracy of 94.2% for the 100 most frequently used ASL finger gestures across different sentences. When the dictionary is extended to 200 words, the accuracy degrades gracefully to 90%, indicating the robustness and scalability of the multi-stage optimization framework.
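The abstract's core inference step, Viterbi decoding over a Hidden Markov Model, can be illustrated with a minimal sketch. The states, transition/emission probabilities, and observation symbols below are invented toy values for illustration; they are not taken from the paper, whose actual model operates on sensor features and ASL sentence context.

```python
# Illustrative Viterbi decoding over a toy HMM.
# All probabilities and symbols here are hypothetical, not from FinGTrAC.

def viterbi(states, start_p, trans_p, emit_p, observations):
    """Return the most likely state sequence for the observations."""
    # V[t][s] = probability of the best path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][observations[0]] for s in states}]
    back = [{}]
    for t in range(1, len(observations)):
        V.append({})
        back.append({})
        for s in states:
            # Pick the predecessor state that maximizes the path probability
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][observations[t]], p)
                for p in states
            )
            V[t][s] = prob
            back[t][s] = prev
    # Backtrack from the most probable final state
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(observations) - 1, 0, -1):
        path.insert(0, back[t][path[0]])
    return path

# Toy example: two "gesture" states and discrete sensor symbols
states = ["A", "B"]
start_p = {"A": 0.6, "B": 0.4}
trans_p = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
emit_p = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}

print(viterbi(states, start_p, trans_p, emit_p, ["x", "x", "y"]))  # ['A', 'A', 'B']
```

In the paper's setting, the hidden states would correspond to candidate finger gestures and the emissions to (noisy) ring and watch sensor readings, with sentence-level ASL context shaping the transition probabilities.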


  • slr_imwut.pdf

    size: 3.69 MB | mime_type: application/pdf | date: 2022-07-19 | sha256: 313350e


Work Title Finger Gesture Tracking for Interactive Applications: A Pilot Study with Sign Languages
Open Access
Creators
  1. Yilin Liu
  2. Fengyang Jiang
  3. Mahanth Gowda
Keywords
  1. IoT
  2. Wearable
  3. Gesture
  4. Bayesian Inference
License In Copyright (Rights Reserved)
Work Type Article
  1. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Publication Date September 4, 2020
Publisher Identifier (DOI)
Deposited July 19, 2022




This resource is currently not in any collection.

Work History

Version 1

  • Created
  • Added slr_imwut.pdf
  • Added Creator Yilin Liu
  • Added Creator Fengyang Jiang
  • Added Creator Mahanth Gowda
  • Published
  • Updated Work Title, Keyword, Subtitle
    Work Title
    • Finger Gesture Tracking for Interactive Applications
    • Finger Gesture Tracking for Interactive Applications: A Pilot Study with Sign Languages
    Keyword
    • IoT, Wearable, Gesture, Bayesian Inference
    Subtitle
    • A Pilot Study with Sign Languages
  • Updated