Silent Expressions: Two-Handed Indian Sign Language Recognition Using MediaPipe and Machine Learning


Authors : Riya Awalkar; Aditi Sah; Renuka Barahate; Yash Kharche; Ashwini Magar

Volume/Issue : Volume 10 - 2025, Issue 3 - March


Google Scholar : https://tinyurl.com/ymyvvvux

Scribd : https://tinyurl.com/2vxkvwsb

DOI : https://doi.org/10.38124/ijisrt/25mar598

Note : A published paper may take 4-5 working days from the publication date to appear in PlumX Metrics, Semantic Scholar, and ResearchGate.

Note : Google Scholar may take 15 to 20 days to display the article.


Abstract : Indian Sign Language (ISL) is an essential communication medium for individuals with hearing and speech impairments. This research introduces an efficient ISL recognition system that integrates deep learning with real-time hand tracking. Utilizing MediaPipe Hands for landmark detection and a Convolutional Neural Network (CNN) for classification, the model enhances recognition accuracy by incorporating two-hand detection. Additionally, pyttsx3 is used for speech synthesis, providing audio output for detected gestures. The system is designed to function in diverse environments, ensuring accessibility. Experimental evaluations demonstrate high accuracy, and the framework is adaptable for future enhancements, such as multi-language recognition and dynamic gesture interpretation.

Keywords : Indian Sign Language, Deep Learning, MediaPipe, LSTM, CNN, Sign Recognition.
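
Illustrative sketch (not taken from the paper's released code): the pipeline described in the abstract can be approximated with MediaPipe Hands configured for two hands, a flattened landmark feature vector, a trained classifier, and pyttsx3 for audio output. The classify() function below is a hypothetical placeholder standing in for the paper's trained CNN.

import cv2
import mediapipe as mp
import numpy as np
import pyttsx3

# Hand-landmark detector configured for up to two hands, as in the described system.
hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=2,
                                 min_detection_confidence=0.5, min_tracking_confidence=0.5)
engine = pyttsx3.init()  # offline text-to-speech engine for speaking the recognised sign

def classify(features):
    # Hypothetical stand-in for the trained CNN: maps a 126-value landmark vector to a sign label.
    return "hello"  # placeholder; replace with e.g. model.predict(features[None, :])

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # 2 hands x 21 landmarks x (x, y, z) = 126 features; zero-padded if only one hand is visible.
        features = np.zeros(2 * 21 * 3, dtype=np.float32)
        for i, hand in enumerate(results.multi_hand_landmarks[:2]):
            pts = np.array([[lm.x, lm.y, lm.z] for lm in hand.landmark], dtype=np.float32)
            features[i * 63:(i + 1) * 63] = pts.ravel()
        label = classify(features)
        engine.say(label)
        engine.runAndWait()
cap.release()

The landmark-vector layout (one 63-value slot per hand) is an assumption chosen to keep the example self-contained; the actual feature representation and model input shape used in the paper may differ.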

References :

  1. Agarwal, S.R.; Agrawal, S.B.; Latif, A.M. Article: Sentence Formation in NLP Engine on the Basis of Indian Sign Language using Hand Gestures. Int. J. Comput. Appl. 2015, 116, 18–22.
  2. Chen, J.K. Sign Language Recognition with Unsupervised Feature Learning; CS229 Project Final Report; Stanford University: Stanford, CA, USA, 2011.
  3. Manware, A.; Raj, R.; Kumar, A.; Pawar, T. Smart Gloves as a Communication Tool for the Speech Impaired and Hearing Impaired. Int. J. Emerg. Technol. Innov. Res. 2017, 4, 78–82.
  4. Mekala, P.; Gao, Y.; Fan, J.; Davari, A. Real-time sign language recognition based on neural network architecture. In Proceedings of the IEEE 43rd Southeastern Symposium on System Theory, Auburn, AL, USA, 14–16 March 2011.
  5. Ministry of Statistics & Programme Implementation. Available online: https://pib.gov.in/PressReleasePage.aspx?PRID=1593253 (accessed on 5 January 2022).
  6. Nandy, A.; Prasad, J.; Mondal, S.; Chakraborty, P.; Nandi, G. Recognition of Isolated Indian Sign Language Gesture in Real Time. Commun. Comput. Inf. Sci. 2010, 70, 102–107.
  7. Papastratis, I.; Chatzikonstantinou, C.; Konstantinidis, D.; Dimitropoulos, K.; Daras, P. Artificial Intelligence Technologies for Sign Language. Sensors 2021, 21, 5843.
  8. Sharma, M.; Pal, R.; Sahoo, A. Indian sign language recognition using neural networks and KNN classifiers. J. Eng. Appl. Sci. 2014, 9, 1255–1259.
  9. Shivashankara, S.; Srinath, S. American Sign Language Recognition System: An Optimal Approach. Int. J. Image Graph. Signal Process. 2018, 10, 18–30.
  10. Wadhawan, A.; Kumar, P. Sign language recognition systems: A decade systematic literature review. Arch. Comput. Methods Eng. 2021, 28, 785–813.
  11. Wazalwar, S.S.; Shrawankar, U. Interpretation of sign language into English using NLP techniques. J. Inf. Optim. Sci. 2017, 38, 895–910.
  12. Camgoz, N.C.; Hadfield, S.; Koller, O.; Ney, H.; Bowden, R. Neural Sign Language Translation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–22 June 2018; IEEE: Piscataway, NJ, USA, 2018.
  13. Muthu Mariappan, H.; Gomathi, V. Real-Time Recognition of Indian Sign Language. In Proceedings of the International Conference on Computational Intelligence in Data Science, Haryana, India, 6–7 September 2019.
  14. Mittal, A.; Kumar, P.; Roy, P.P.; Balasubramanian, R.; Chaudhuri, B.B. A Modified LSTM Model for Continuous Sign Language Recognition Using Leap Motion. IEEE Sens. J. 2019, 19, 7056–7063.
  15. De Coster, M.; Herreweghe, M.V.; Dambre, J. Sign Language Recognition with Transformer Networks. In Proceedings of the Conference on Language Resources and Evaluation (LREC 2020), Marseille, France, 13–15 May 2020; pp. 6018–6024.
  16. Liao, Y.; Xiong, P.; Min, W.; Min, W.; Lu, J. Dynamic Sign Language Recognition Based on Video Sequence with BLSTM-3D Residual Networks. IEEE Access 2019, 7, 38044–38054.
  17. Adaloglou, N.; Chatzis, T. A Comprehensive Study on Deep Learning-based Methods for Sign Language Recognition. IEEE Trans. Multimed. 2022, 24, 1750–1762.
