THE DESIGN OF HAND GESTURE CONTROLLED VIRTUAL MOUSE USING CONVOLUTIONAL NEURAL NETWORK TECHNIQUE
Author(s):
Niharika K, K Bhuvanesh, Mohan M, Muhammad Zidan K M, Nagaraj M. Lutimath
Keywords:
Computer Vision, Deep Learning, Human-Computer Interaction (HCI), MediaPipe, PyAutoGUI
Abstract
This exploratory paper proposes a method that uses computer vision and deep learning models to implement cursor functionality through hand gestures. These models aim to track the natural motion of the human hand and hold the key to further improving the performance of such computer vision solutions. Among the many technologies evolving in today's environment, the human-machine interface stands out as a particularly promising one. The concept is to emulate mouse functionality on the screen without any additional hardware, using only finger motions, a process known as gesture recognition. In this paper, we introduce a novel Human-Computer Interaction (HCI) strategy. The system is implemented in Python, with OpenCV, MediaPipe, and PyAutoGUI as its main dependencies.
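The abstract names OpenCV, MediaPipe, and PyAutoGUI as the core dependencies. The following is a minimal sketch of how such a pipeline is commonly wired together, not the authors' actual implementation: the index-fingertip landmark from MediaPipe Hands is mapped to screen coordinates, and a thumb-index pinch is used as a hypothetical click gesture. The pinch threshold (0.03) and confidence values are illustrative assumptions.

```python
# Minimal sketch of a gesture-controlled virtual mouse (assumed pipeline,
# not the paper's exact method): webcam frames -> MediaPipe hand landmarks
# -> PyAutoGUI cursor control.
import cv2
import mediapipe as mp
import pyautogui

screen_w, screen_h = pyautogui.size()              # target screen resolution
hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.7,
                                 min_tracking_confidence=0.7)

cap = cv2.VideoCapture(0)                           # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                      # mirror for natural control
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    results = hands.process(rgb)

    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        tip = lm[mp.solutions.hands.HandLandmark.INDEX_FINGER_TIP]
        # Landmark coordinates are normalized to [0, 1]; map them to pixels.
        pyautogui.moveTo(int(tip.x * screen_w), int(tip.y * screen_h))

        # Hypothetical click rule: thumb tip close to index tip fires a click.
        thumb = lm[mp.solutions.hands.HandLandmark.THUMB_TIP]
        if abs(tip.x - thumb.x) < 0.03 and abs(tip.y - thumb.y) < 0.03:
            pyautogui.click()

    cv2.imshow("Virtual Mouse", frame)
    if cv2.waitKey(1) & 0xFF == 27:                 # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In practice, a smoothing step (e.g. exponential averaging of the fingertip position) is usually added so the cursor does not jitter with small hand tremors; the sketch above omits it for brevity.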
Article Details
Unique Paper ID: 159999
Publication Volume & Issue: Volume 9, Issue 12
Page(s): 944 - 947