Sign Language Interpreter using Deep Learning
Avishkar More, Roshan Pandav, Dr. Sangita Bharkad, Saurabh More, Riya Sonwane, Sarang Shete
Sign language has long been the primary means of communication for the hearing- and speech-impaired. For these people to share their thoughts, others must be able to understand their signs. This work recognizes finger gestures according to American Sign Language [Fig.1] using neural networks: each image frame first passes through a filter, and the filtered image is then fed to a classifier that predicts the gesture class. The data used is the Indian Sign Language dataset. The resulting application, which can be used in schools or anywhere else, facilitates communication between the hearing- and speech-impaired and the non-impaired, and can also be used to make learning sign language easier.
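The abstract describes a two-stage pipeline: a frame is filtered, then a classifier predicts the gesture class. The paper's actual model is not shown here, so the following is only a minimal sketch of that pipeline shape, assuming a hypothetical threshold filter and a placeholder nearest-centroid classifier standing in for the neural network (all names, sizes, and labels are illustrative, not the authors' implementation):

```python
import numpy as np

GESTURE_CLASSES = ["A", "B", "C"]  # illustrative subset of sign labels

def filter_frame(frame):
    """Stage 1: reduce an RGB frame to a binary hand mask (hypothetical filter)."""
    gray = frame.mean(axis=2)               # naive grayscale conversion
    return (gray > 127).astype(np.float32)  # simple threshold segmentation

def predict_class(mask, centroids):
    """Stage 2: stand-in for the neural network; picks the nearest class centroid."""
    flat = mask.ravel()
    dists = [np.linalg.norm(flat - c) for c in centroids]
    return GESTURE_CLASSES[int(np.argmin(dists))]

# Demo with random data; a real system would train on sign-language dataset frames.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float32)
centroids = [rng.random(64 * 64) for _ in GESTURE_CLASSES]
mask = filter_frame(frame)
label = predict_class(mask, centroids)
print(label)
```

In the paper's setting, `predict_class` would be replaced by a trained convolutional network, but the frame-in, filter, class-out flow stays the same.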
Article Details
Unique Paper ID: 160384

Publication Volume & Issue: Volume 10, Issue 1

Page(s): 804 - 810