Sign Language Interpreter using Deep Learning
Author(s):
Avishkar More, Roshan Pandav, Dr. Sangita Bharkad, Saurabh More, Riya Sonwane, Sarang Shete
Keywords:
Abstract
Sign language has long been an important means of communication for people with hearing and speech impairments, often referred to as deaf and mute. For them, signing is the primary way of sharing their thoughts, so it is important that others are able to understand them. This work recognizes finger gestures according to American Sign Language [Fig. 1] using neural networks. In our method, each image frame first passes through a filter and is then fed to a classifier that predicts the gesture class. The application also helps users learn the language. The data used is the Indian Sign Language dataset. The application, which can be used in schools or anywhere else, facilitates communication between disabled and non-disabled people and can make learning the language easier.
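
A minimal sketch of the frame-to-class pipeline the abstract describes (filter an image frame, then classify it with a neural network). The library choices (OpenCV, TensorFlow/Keras), the specific filtering steps (grayscale, Gaussian blur, adaptive thresholding), the input size, and the CNN layout are all assumptions for illustration; the paper does not specify them here.

```python
# Illustrative sketch only: frame -> filter -> CNN classifier that predicts a sign class.
# All parameters below (image size, number of classes, layer sizes) are assumed.
import cv2
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26   # assumed: one class per fingerspelled letter
IMG_SIZE = 64      # assumed input resolution

def build_classifier():
    """Small CNN mapping a filtered hand image to a sign class (assumed architecture)."""
    model = models.Sequential([
        layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def filter_frame(frame):
    """Filtering step (assumed): grayscale, blur, adaptive threshold, resize, normalize."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    thresh = cv2.adaptiveThreshold(blur, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY_INV, 11, 2)
    resized = cv2.resize(thresh, (IMG_SIZE, IMG_SIZE))
    return resized.astype("float32")[None, ..., None] / 255.0  # shape (1, H, W, 1)

if __name__ == "__main__":
    model = build_classifier()  # in practice, trained on the Indian Sign Language dataset
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a captured camera frame
    probs = model.predict(filter_frame(frame))
    print("predicted class index:", int(np.argmax(probs)))
```

In use, the frame would come from a live camera feed and the predicted class index would be mapped back to the corresponding letter or sign before being shown to the user.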
Article Details
Unique Paper ID: 160384

Publication Volume & Issue: Volume 10, Issue 1

Page(s): 804 - 810