AR Canvas with Python

  • Unique Paper ID: 155987
  • PageNo: 489-495
  • Abstract:
  • Writing is a fundamental form of communication that allows us to convey our ideas effectively, and typing and handwriting remain today's standard means of recording information. In air writing, characters or words are traced in empty space with a marker or a finger; unlike conventional writing, the pen never moves up and down against a surface. With the development of smart wearable devices, human gestures can now control the digital world: these wearables are capable of recognising and interpreting human actions. Gesture recognition is the process of identifying and interpreting a continuous stream of gestures from a sequence of input data; gestures are non-verbal cues that help computers understand what a user intends. Human motion is perceived visually, and computer vision is used to analyse these gestures. This project addresses that gap by focusing on the development of a motion-to-text converter that could serve as software for smart wearables, allowing users to write in the air. The system employs computer vision to track the path of the finger, enabling air writing. The generated text can be used in a variety of applications, such as sending messages and e-mails, and it offers a powerful communication channel for the deaf. It is an efficient approach that eliminates the need for pen and paper and reduces reliance on mobile phones and laptops.
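The core of the pipeline the abstract describes is rendering a tracked fingertip trajectory as a stroke on a virtual canvas. The paper does not publish its code, so the following is only a minimal sketch of that stroke-drawing step, with the per-frame fingertip detection stubbed out as a list of simulated (x, y) points; in a real system those points would come from colour masking or a hand-landmark model on the webcam feed.

```python
import numpy as np

def draw_stroke(canvas, points, value=255):
    """Render a fingertip trajectory onto the canvas by connecting
    consecutive tracked points with interpolated line segments."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        # Sample enough points along the segment that it appears solid.
        n = max(abs(x1 - x0), abs(y1 - y0), 1)
        xs = np.linspace(x0, x1, n + 1).round().astype(int)
        ys = np.linspace(y0, y1, n + 1).round().astype(int)
        canvas[ys, xs] = value  # canvas is indexed [row, col] = [y, x]
    return canvas

canvas = np.zeros((100, 100), dtype=np.uint8)
# Simulated fingertip path: a horizontal then a vertical stroke.
path = [(10, 10), (40, 10), (40, 40)]
draw_stroke(canvas, path)
```

The drawn canvas can then be handed to a character-recognition stage, as the abstract suggests; interpolating between per-frame detections keeps strokes continuous even when the camera frame rate is low relative to hand speed.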

Copyright & License

Copyright © 2026. The authors retain the copyright of this article. This article is an open-access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{155987,
        author = {Piyush Garg and Yashika Choudhary and Utkarsh Pandita and Samarth Singh Thakur and Veena Jadhav and Rohini Jadhav},
        title = {AR Canvas with Python},
        journal = {International Journal of Innovative Research in Technology},
        year = {},
        volume = {9},
        number = {2},
        pages = {489--495},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=155987},
        abstract = {Writing is a fundamental form of communication that allows us to convey our ideas effectively, and typing and handwriting remain today's standard means of recording information. In air writing, characters or words are traced in empty space with a marker or a finger; unlike conventional writing, the pen never moves up and down against a surface. With the development of smart wearable devices, human gestures can now control the digital world: these wearables are capable of recognising and interpreting human actions. Gesture recognition is the process of identifying and interpreting a continuous stream of gestures from a sequence of input data; gestures are non-verbal cues that help computers understand what a user intends. Human motion is perceived visually, and computer vision is used to analyse these gestures. This project addresses that gap by focusing on the development of a motion-to-text converter that could serve as software for smart wearables, allowing users to write in the air. The system employs computer vision to track the path of the finger, enabling air writing. The generated text can be used in a variety of applications, such as sending messages and e-mails, and it offers a powerful communication channel for the deaf. It is an efficient approach that eliminates the need for pen and paper and reduces reliance on mobile phones and laptops.},
        keywords = {Character Recognition, Object Detection, Real-Time Gesture Control System, Smart Wearables},
        month = {},
        }

Cite This Article

Garg, P., Choudhary, Y., Pandita, U., Thakur, S. S., Jadhav, V., & Jadhav, R. (n.d.). AR Canvas with Python. International Journal of Innovative Research in Technology (IJIRT), 9(2), 489–495.
