American Sign Language Recognition using Deep Learning
Author(s):
A P Purushotham, Jayanth A, Kiran Kumar S, Akilesh N S
Keywords:
ASL, American Sign Language, CNN, Deep Learning
Abstract
American Sign Language (ASL) is a complex language in which meaning is conveyed through distinct hand gestures, supported by facial expressions and body posture. ASL is the primary language of deaf and hard-of-hearing people in North America and other parts of the world. This paper proposes the use of deep learning to recognise static ASL gestures. The contribution is a problem-solving approach in which a Convolutional Neural Network (CNN) is used to classify the 24 static alphabetic letters of ASL. The classification accuracy is 99.68 percent, with a loss-function error of 0.32. Compared with other approaches such as CNN, SVM, and ANN classifiers, training is fast and produces excellent results.
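The abstract names a CNN classifier for the 24 static ASL letters but does not publish the network itself. The following is a minimal sketch of such a classifier, assuming a Keras/TensorFlow stack, 28x28 grayscale hand-gesture images, and layer sizes chosen purely for illustration rather than taken from the paper.

```python
# Hypothetical sketch of a CNN for classifying 24 static ASL letters
# (J and Z are excluded because they require motion). Input size and
# layer choices are assumptions for illustration, not the paper's model.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 24            # static ASL letters A-Y, excluding J and Z
INPUT_SHAPE = (28, 28, 1)   # assumed grayscale gesture images

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=INPUT_SHAPE),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training would then use labelled gesture images, e.g.:
# model.fit(x_train, y_train, epochs=10, validation_data=(x_val, y_val))
```

Such a model would be trained on a labelled dataset of static hand-gesture images; the dropout layer is a common regularisation choice and is not stated in the abstract.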
Article Details
Unique Paper ID: 154907

Publication Volume & Issue: Volume 8, Issue 12

Page(s): 579 - 583