American Sign Language Recognition using Deep Learning

  • Unique Paper ID: 154907
  • Volume: 8
  • Issue: 12
  • PageNo: 579-583
  • Abstract:
  • ASL (American Sign Language) is a complex language defined by a unique standard of hand gestures. The hands convey these signs, with facial expression and body posture providing support. ASL is the primary language of deaf and hard-of-hearing persons in North America and other parts of the world. This paper proposes the use of deep learning to recognise static ASL gestures. The contribution is a complete problem-solving approach: a Convolutional Neural Network (CNN) is used to classify the 24 static alphabetic letters of ASL. The classification accuracy is 99.68 percent, with a loss function error of 0.32. Compared with other approaches such as CNN, SVM, and ANN, training is fast and produces excellent results.
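
The abstract describes a CNN that classifies the 24 static ASL alphabet letters (J and Z are excluded because they involve motion). The sketch below shows what such a classifier can look like in Keras; the input size (28x28 grayscale, as in Sign Language MNIST-style data), layer configuration, and training settings are illustrative assumptions, not the architecture reported in the paper.

# Minimal sketch of a CNN classifier for the 24 static ASL letters.
# Image size, layer counts, and hyperparameters are assumptions for
# illustration, not the authors' exact configuration.
import numpy as np
from tensorflow.keras import layers, models

NUM_CLASSES = 24          # static ASL letters A-Y, skipping J and Z
IMG_SHAPE = (28, 28, 1)   # assumed grayscale input size

def build_model():
    model = models.Sequential([
        layers.Input(shape=IMG_SHAPE),
        layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Placeholder random data stands in for a real ASL hand-gesture dataset.
    x_train = np.random.rand(256, *IMG_SHAPE).astype("float32")
    y_train = np.random.randint(0, NUM_CLASSES, size=(256,))
    model = build_model()
    model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=1)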

Copyright & License

Copyright © 2025. Authors retain the copyright of this article. This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{154907,
        author = {A P Purushotham and Jayanth A and Kiran Kumar S and Akilesh N S},
        title = {American Sign Language Recognition using Deep Learning},
        journal = {International Journal of Innovative Research in Technology},
        year = {},
        volume = {8},
        number = {12},
        pages = {579-583},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=154907},
        abstract = {ASL (American Sign Language) is a complex language defined by a unique standard of hand gestures. The hands convey these signs, with facial expression and body posture providing support. ASL is the primary language of deaf and hard-of-hearing persons in North America and other parts of the world. This paper proposes the use of deep learning to recognise static ASL gestures. The contribution is a complete problem-solving approach: a Convolutional Neural Network (CNN) is used to classify the 24 static alphabetic letters of ASL. The classification accuracy is 99.68 percent, with a loss function error of 0.32. Compared with other approaches such as CNN, SVM, and ANN, training is fast and produces excellent results.},
        keywords = {ASL, American Sign Language, CNN, Deep Learning},
        month = {},
        }
