Facial Gesture and Head Movement-Based Control System for Accessibility

  • Unique Paper ID: 170443
  • PageNo: 449-453
  • Abstract:
  • The integration of advanced computer vision into human-computer interaction (HCI) has opened new possibilities for accessibility solutions. This paper introduces a robust and non-invasive system for hands-free computer control, utilizing facial gestures and head movements. The system employs MediaPipe's Face Mesh and OpenCV to capture and interpret real-time facial landmarks, enabling functionalities such as cursor control, mouse clicks via smile detection, scrolling activation and direction through eye closures and head tilts, and window closure triggered by tongue gestures. Designed with accessibility as its core focus, this system offers an intuitive, cost-effective alternative for individuals with limited mobility. Experimental evaluations highlight its high accuracy and responsiveness across diverse environments, emphasizing its adaptability and potential for enhancing digital inclusivity. The proposed approach combines precision, efficiency, and practicality, making it a promising solution for accessible HCI applications.
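The paper's abstract does not specify how eye closures are detected, but a common landmark-based approach for this kind of system is the eye aspect ratio (EAR). The sketch below is an illustrative assumption, not the authors' implementation: given six 2D eye-landmark points (as MediaPipe's Face Mesh would provide after converting normalized coordinates to pixels), it flags a closed eye when the ratio of lid separation to eye width drops below a threshold.

```python
# Hypothetical sketch of eye-closure detection via the eye aspect ratio (EAR),
# a standard measure used with facial-landmark output such as MediaPipe Face Mesh.
# Landmark ordering follows the usual EAR convention:
# p1, p4 = eye corners; p2, p3 = upper lid; p5, p6 = lower lid.
import math

def eye_aspect_ratio(pts):
    """pts: six (x, y) tuples [p1..p6]. Returns lid separation / eye width."""
    p1, p2, p3, p4, p5, p6 = pts
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(p2, p6) + dist(p3, p5)   # combined upper/lower lid gap
    horizontal = dist(p1, p4)                # corner-to-corner eye width
    return vertical / (2.0 * horizontal)

EAR_CLOSED = 0.2  # assumed threshold; a real system would tune this per user

def eye_closed(pts):
    return eye_aspect_ratio(pts) < EAR_CLOSED

# Example: a wide-open eye vs. a nearly shut one.
open_eye   = [(0, 0), (1, 1),   (3, 1),   (4, 0), (3, -1),   (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (3, 0.1), (4, 0), (3, -0.1), (1, -0.1)]
print(eye_aspect_ratio(open_eye))  # 0.5
print(eye_closed(open_eye))        # False
print(eye_closed(closed_eye))      # True
```

In a live pipeline, the landmark coordinates would come from Face Mesh per video frame, and the trigger would typically be debounced over several consecutive below-threshold frames to avoid firing on ordinary blinks.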

Copyright & License

Copyright © 2026. Authors retain the copyright of this article. This article is an open-access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{170443,
        author = {Vijitha, B. and Lavanya, M. and Varshith Sai, M. and Saiteja, N.},
        title = {Facial Gesture and Head Movement-Based Control System for Accessibility},
        journal = {International Journal of Innovative Research in Technology},
        year = {2024},
        volume = {11},
        number = {7},
        pages = {449--453},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=170443},
        abstract = {The integration of advanced computer vision into human-computer interaction (HCI) has opened new possibilities for accessibility solutions. This paper introduces a robust and non-invasive system for hands-free computer control, utilizing facial gestures and head movements. The system employs MediaPipe's Face Mesh and OpenCV to capture and interpret real-time facial landmarks, enabling functionalities such as cursor control, mouse clicks via smile detection, scrolling activation and direction through eye closures and head tilts, and window closure triggered by tongue gestures. Designed with accessibility as its core focus, this system offers an intuitive, cost-effective alternative for individuals with limited mobility. Experimental evaluations highlight its high accuracy and responsiveness across diverse environments, emphasizing its adaptability and potential for enhancing digital inclusivity. The proposed approach combines precision, efficiency, and practicality, making it a promising solution for accessible HCI applications.},
        keywords = {Data Acquisition, Facial Landmark Detection, Gesture Interpretation, Command Execution},
        month = {December},
}

Cite This Article

Vijitha, B., Lavanya, M., Varshith Sai, M., & Saiteja, N. (2024). Facial Gesture and Head Movement-Based Control System for Accessibility. International Journal of Innovative Research in Technology (IJIRT), 11(7), 449–453.
