CineFeel: Artificial Intelligence (AI) Driven Multisensory Cinema App for the Visually Impaired

  • Unique Paper ID: 196327
  • Volume: 12
  • Issue: 11
  • PageNo: 2953-2964
  • Abstract:
  • Movie viewing is often a difficult experience for people who cannot see, since they are deprived of the rich, detailed experience that films typically provide. Although existing solutions such as audio description offer some assistance, they usually have a number of limitations and do not establish a strong emotional connection with films. This paper proposes an AI-based multisensory cinema application that aims to improve the movie experience of individuals with visual disabilities. Using AI techniques such as natural language processing, speech synthesis, and computer vision, the system generates real-time audio descriptions, emotional descriptions, and sensory information. In doing so, CineFeel gives users a better understanding of scenes, character emotions, and what is happening on the screen. Test findings indicate that the new system greatly increases user engagement and emotional attachment compared with audio description alone. Ultimately, CineFeel aims to make the experience of watching movies more comfortable, entertaining, and pleasant for those with impaired vision.

Copyright & License

Copyright © 2026. The authors retain the copyright of this article. This article is an open-access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{196327,
        author = {Mr. Shriyansh R. Fasate and Ms. Vaibhavi V. Raibole and Mr. Durvesh K. Hukare and Ms. Himani S. More},
        title = {CineFeel: Artificial Intelligence (AI) Driven Multisensory Cinema App for the Visually Impaired},
        journal = {International Journal of Innovative Research in Technology},
        year = {2026},
        volume = {12},
        number = {11},
        pages = {2953--2964},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=196327},
        abstract = {Movie viewing is often a difficult experience for people who cannot see, since they are deprived of the rich, detailed experience that films typically provide. Although existing solutions such as audio description offer some assistance, they usually have a number of limitations and do not establish a strong emotional connection with films. This paper proposes an AI-based multisensory cinema application that aims to improve the movie experience of individuals with visual disabilities. Using AI techniques such as natural language processing, speech synthesis, and computer vision, the system generates real-time audio descriptions, emotional descriptions, and sensory information. In doing so, CineFeel gives users a better understanding of scenes, character emotions, and what is happening on the screen. Test findings indicate that the new system greatly increases user engagement and emotional attachment compared with audio description alone. Ultimately, CineFeel aims to make the experience of watching movies more comfortable, entertaining, and pleasant for those with impaired vision.},
        keywords = {Artificial Intelligence, Visually impaired, Multisensory experience, Computer vision, Audio description, Accessibility},
        month = {April},
}

Cite This Article

Fasate, S. R., Raibole, V. V., Hukare, D. K., & More, H. S. (2026). CineFeel: Artificial Intelligence (AI) Driven Multisensory Cinema App for the Visually Impaired. International Journal of Innovative Research in Technology (IJIRT), 12(11), 2953–2964.
