American Sign Language Recognition using Deep Learning

  • Unique Paper ID: 154907
  • Volume: 8
  • Issue: 12
  • PageNo: 579-583
  • ISSN: 2349-6002
  • Abstract:
  • American Sign Language (ASL) is a complex language defined by a distinctive set of hand gestures; the signs are conveyed by the hands, with facial expression and body posture providing additional cues. ASL is the primary language of deaf and hard-of-hearing people in North America and other parts of the world. This paper proposes the use of deep learning to recognise static ASL gestures. The proposed approach uses a Convolutional Neural Network (CNN) to classify the 24 static alphabetic letters of ASL. The classification accuracy is 99.68 percent, with a loss of 0.32. Compared with comparable studies that used CNN, SVM, and ANN, training is fast and produces excellent results.
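The abstract does not reproduce the network architecture, but the kind of CNN classifier it describes can be sketched with a small convolutional model. The sketch below is illustrative only: the 28×28 grayscale input (as used by datasets such as Sign Language MNIST), the layer sizes, the dropout rate, and the Adam optimizer are assumptions, not the authors' published configuration.

```python
# Minimal sketch of a CNN classifier for the 24 static ASL letters
# (J and Z are excluded because they involve motion).
# Input shape, layer widths, and optimizer are assumptions for illustration.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 24            # static ASL alphabet letters (A-Y, excluding J and Z)
INPUT_SHAPE = (28, 28, 1)   # assumed grayscale input resolution

def build_asl_cnn():
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_asl_cnn()
    model.summary()
    # Training on labelled gesture images would then follow, e.g.:
    # model.fit(x_train, y_train, epochs=10, validation_data=(x_val, y_val))
```

Any such model is trained on labelled images of the static hand signs; the reported 99.68 percent accuracy refers to the authors' own setup, not to this sketch.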
