The modular design of the system makes it suitable for integration into assistive communication tools for the
hearing- and speech-impaired communities. Its scalability and efficiency also enable deployment on low-cost
devices such as smartphones and embedded boards.
Future Work
Although the system performs well for static gestures, several areas can be explored further:
- **Dynamic Gesture Recognition:** Extending the model to support real-time recognition of dynamic gestures, including full words or sentences.
- **Real-Time Implementation:** Integrating the system with a webcam or mobile camera for live gesture recognition (a minimal sketch of such a capture loop is given below).
- **ISL Grammar Translation:** Converting sequences of recognized gestures into grammatically correct ISL sentences.
- **Multilingual Support:** Translating recognized ISL gestures into multiple spoken and written languages.
- **User Feedback Loop:** Adding interactive feedback for correcting predictions or retraining the model with custom gestures.
By addressing these areas, the system can be transformed into a complete, real-time ISL translation solution.
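The sketch below illustrates the real-time implementation direction only at a high level: an OpenCV webcam loop that crops a fixed region of interest, preprocesses it, and feeds it to a trained CNN classifier. The model file name, 64×64 grayscale input size, A–Z label set, and ROI coordinates are assumptions for illustration, not details taken from this work.

```python
# Sketch of a live ISL recognition loop (assumed setup: Keras CNN, grayscale 64x64 input, A-Z labels).
import cv2
import numpy as np
from tensorflow.keras.models import load_model

MODEL_PATH = "isl_cnn.h5"          # hypothetical path to the trained CNN
IMG_SIZE = 64                      # assumed input resolution of the CNN
LABELS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]  # assumed class set

model = load_model(MODEL_PATH)
cap = cv2.VideoCapture(0)          # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Fixed region of interest where the user shows the gesture.
    x1, y1, x2, y2 = 100, 100, 300, 300
    roi = frame[y1:y2, x1:x2]

    # Preprocess: grayscale, resize, scale to [0, 1], add batch/channel dims.
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    resized = cv2.resize(gray, (IMG_SIZE, IMG_SIZE))
    inp = resized.astype("float32") / 255.0
    inp = inp.reshape(1, IMG_SIZE, IMG_SIZE, 1)

    # Predict the gesture class and overlay it on the live frame.
    probs = model.predict(inp, verbose=0)[0]
    pred = LABELS[int(np.argmax(probs))]
    conf = float(np.max(probs))
    cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.putText(frame, f"{pred} ({conf:.2f})", (x1, y1 - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow("ISL recognition (sketch)", frame)

    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

A fixed ROI keeps the sketch simple; a practical system would replace it with hand detection or segmentation so the gesture can appear anywhere in the frame.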