Automatic recognition of Indian Sign Language (ISL) gestures holds transformative potential for empowering individuals with hearing and speech impairments by eliminating reliance on human interpreters. This paper presents a robust, bidirectional ISL communication framework that translates hand gestures to text and text to sign, built on a camera-based acquisition module, a comprehensive ISL lexicon database, and a deep neural network classifier. During operation, live gesture samples are captured and preprocessed, then matched against stored ISL alphabets and vocabulary with 96% recognition accuracy. In reverse mode, typed text is synthesized into dynamic sign animations to facilitate understanding. An integrated offline learning platform, backed by an extensive gesture repository and interactive tutorials, helps new users acquire ISL proficiency. Implemented in Python for algorithmic processing and standard web technologies for the user interface, the system delivers real-time recognition and educational value, demonstrating its promise for accessible communication solutions.
Keywords: ISL (Indian Sign Language), Fingerspelling, CNN (Convolutional Neural Network), FSDC (Frame Stream Density Compression), SSIM (Structural Similarity Index Measure)
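The bidirectional flow described in the abstract (gesture-to-text via a classifier matched against a lexicon, and text-to-sign via animation lookup) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the lexicon entries, the nearest-prototype stand-in for the deep neural network classifier, and the animation file names are all hypothetical.

```python
# Hypothetical sketch of the bidirectional ISL pipeline.
# LEXICON stands in for the paper's ISL lexicon database; the
# classifier below is a nearest-prototype placeholder for the
# deep neural network described in the abstract.
LEXICON = {
    "A": "sign_A.gif",
    "B": "sign_B.gif",
    "HELLO": "sign_hello.gif",
}

def classify_gesture(frame_features):
    """Placeholder classifier: returns the label whose stored
    prototype vector is nearest to the preprocessed features."""
    prototypes = {"A": [0.9, 0.1], "B": [0.1, 0.9], "HELLO": [0.5, 0.5]}
    return min(
        prototypes,
        key=lambda lbl: sum(
            (f - p) ** 2 for f, p in zip(frame_features, prototypes[lbl])
        ),
    )

def gesture_to_text(frame_features):
    """Forward mode: captured, preprocessed gesture -> recognized gloss."""
    return classify_gesture(frame_features)

def text_to_sign(text):
    """Reverse mode: typed text -> sequence of sign animations,
    falling back to a fingerspelling marker for unknown words."""
    return [LEXICON.get(w, f"fingerspell:{w}") for w in text.upper().split()]

print(gesture_to_text([0.85, 0.2]))
print(text_to_sign("hello B"))
```

In the actual system the prototype matcher would be replaced by the trained CNN, and the lookup table by the full ISL alphabet and vocabulary database.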