Indian Sign Language Translator
Near-real-time Indian Sign Language translation with face activation, hand detection, and sequence recognition.
Why I Built This
I wanted to build something with direct accessibility impact, not just a model that performs well on curated samples. The project focused on Indian Sign Language translation that can handle realistic variation in background and lighting. The motivation was to contribute a practical communication aid while learning robust vision-pipeline design end to end.
Pipeline
- Face detection as the activation mechanism.
- YOLOv3-based hand detection for frame filtering and crop generation.
- HSV + YCbCr skin segmentation to reduce background noise.
- Transfer-learned SqueezeNet classifier over a cleaned ISL dataset.

Sketches of each stage are shown below.
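Face activation gate. A minimal sketch of the activation step, assuming OpenCV's bundled Haar cascade as the face detector (the project's actual detector may differ); the rest of the pipeline runs only when a face is visible in the frame.

```python
import cv2

# Haar cascade face detector shipped with OpenCV (assumed stand-in for the
# project's actual face detector).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def is_activated(frame_bgr) -> bool:
    """Return True when at least one face is present, gating the rest of the pipeline."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0
```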
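Hand detection. A sketch of YOLOv3 hand detection through OpenCV's DNN module; the config/weights filenames (`yolov3-hand.cfg`, `yolov3-hand.weights`), input size, and thresholds are assumptions, and the repository may load YOLOv3 through a different backend.

```python
import cv2
import numpy as np

# Hypothetical paths: a YOLOv3 config and weights fine-tuned for a single "hand" class.
net = cv2.dnn.readNetFromDarknet("yolov3-hand.cfg", "yolov3-hand.weights")

def detect_hands(frame_bgr, conf_threshold=0.5, nms_threshold=0.4):
    """Return [x, y, w, h] boxes for detected hands; used to filter frames and crop classifier inputs."""
    h, w = frame_bgr.shape[:2]
    blob = cv2.dnn.blobFromImage(frame_bgr, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    boxes, scores = [], []
    for output in outputs:
        for det in output:
            confidence = float(det[5:].max())  # single-class "hand" score
            if confidence > conf_threshold:
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                scores.append(confidence)

    # Non-maximum suppression to drop overlapping detections of the same hand.
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_threshold, nms_threshold)
    return [boxes[i] for i in np.array(keep).flatten()] if len(keep) else []
```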
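Skin segmentation. A sketch of the HSV + YCbCr masking step on a hand crop; the threshold ranges are common illustrative values, not the project's exact ones, and OpenCV labels the color space `YCrCb` (channel order Y, Cr, Cb).

```python
import cv2
import numpy as np

def skin_mask(crop_bgr):
    """Keep pixels that look like skin in both HSV and YCrCb, suppressing background clutter."""
    hsv = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2HSV)
    ycrcb = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2YCrCb)

    # Illustrative threshold ranges; tune per camera and lighting.
    mask_hsv = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 255, 255]))
    mask_ycrcb = cv2.inRange(ycrcb, np.array([0, 135, 85]), np.array([255, 180, 135]))

    # Require agreement between the two color spaces, then clean up small speckles.
    mask = cv2.bitwise_and(mask_hsv, mask_ycrcb)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    return cv2.bitwise_and(crop_bgr, crop_bgr, mask=mask)
```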
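Classifier. A sketch of the transfer-learning setup, assuming a PyTorch/torchvision implementation (the original codebase may use another framework): SqueezeNet's final 1x1 convolution is swapped for one sized to the ISL classes, and the pretrained trunk is frozen for the initial fine-tuning pass.

```python
import torch.nn as nn
from torchvision import models

def build_isl_classifier(num_classes: int) -> nn.Module:
    """SqueezeNet 1.1 pretrained on ImageNet, with its classifier head replaced for ISL signs."""
    model = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.IMAGENET1K_V1)
    model.classifier[1] = nn.Conv2d(512, num_classes, kernel_size=1)
    model.num_classes = num_classes

    # Freeze the convolutional trunk so only the new head is trained at first.
    for param in model.features.parameters():
        param.requires_grad = False
    return model
```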
Outcome
A near-real-time translator prototype designed for robustness to background and illumination variation, built as a final-year capstone project with an open-source codebase.