Ultrasonic Lidar-Based 2D Room Mapping

  • Unique Paper ID: 187738
  • PageNo: 6655-6660
  • Abstract:
  • Environmental mapping is essential for robotics, automation, and spatial awareness, but traditional Lidar systems are costly and complex, requiring specialized hardware and computational power. This paper presents a budget-friendly alternative using ultrasonic sensors, an Arduino microcontroller, and Python-based visualization to create a functional 2D map of a room. The system uses two HC-SR04 ultrasonic sensors mounted on an SG90 servo motor, enabling 360-degree rotational scanning to capture distance data across an environment. As the servo moves, the sensors take readings, transmitting the data via serial communication, where Python processes the inputs, converts them into Cartesian coordinates, and dynamically visualizes them using Matplotlib. While ultrasonic sensors have limitations such as reflections, angular resolution constraints, and noise artifacts, their affordability makes them ideal for foundational research and educational projects. Experimental results confirm significant improvements over traditional fixed-time mapping methods, offering real-time adaptability and optimized environmental representation. Combining affordable hardware with intelligent software processing, this research showcases an accessible alternative for environmental mapping, making advanced spatial awareness techniques available to students, researchers, and hobbyists.
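The pipeline the abstract describes — read angle/distance pairs over serial, convert them from polar to Cartesian coordinates, then plot with Matplotlib — can be sketched roughly as below. The `"angle,distance"` serial line format and the sample values are assumptions for illustration, not taken from the paper; plotting is left as a comment so the sketch runs without a display.

```python
import math

def polar_to_cartesian(angle_deg, distance_cm):
    """Convert a servo angle and ultrasonic range reading to x/y coordinates."""
    theta = math.radians(angle_deg)
    return distance_cm * math.cos(theta), distance_cm * math.sin(theta)

def parse_reading(line):
    """Parse one assumed 'angle,distance' line from the Arduino's serial output."""
    angle_str, dist_str = line.strip().split(",")
    return float(angle_str), float(dist_str)

# Simulated scan data; with two back-to-back HC-SR04 sensors, the rear
# sensor's reading would be plotted at angle + 180 deg to cover 360 degrees.
points = []
for raw in ["0,100.0", "90,150.0", "180,100.0"]:
    angle, dist = parse_reading(raw)
    points.append(polar_to_cartesian(angle, dist))

# To visualize (as the paper does with Matplotlib):
# import matplotlib.pyplot as plt
# xs, ys = zip(*points)
# plt.scatter(xs, ys); plt.axis("equal"); plt.show()
```

In a live setup, the loop would instead read lines from `pyserial` (e.g. `serial.Serial("/dev/ttyUSB0", 9600)`) and redraw the scatter plot as readings arrive.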

Copyright & License

Copyright © 2026. The authors retain the copyright of this article. This article is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

BibTeX

@article{187738,
        author = {Atharva Kavade and Ashutosh Marathe and Hardik Khade and Vivek Kendre and Sushant Katare},
        title = {Ultrasonic Lidar-Based 2D Room Mapping},
        journal = {International Journal of Innovative Research in Technology},
        year = {2025},
        volume = {12},
        number = {6},
        pages = {6655-6660},
        issn = {2349-6002},
        url = {https://ijirt.org/article?manuscript=187738},
        abstract = {Environmental mapping is essential for robotics, automation, and spatial awareness, but traditional Lidar systems are costly and complex, requiring specialized hardware and computational power. This paper presents a budget-friendly alternative using ultrasonic sensors, an Arduino microcontroller, and Python-based visualization to create a functional 2D map of a room. The system uses two HC-SR04 ultrasonic sensors mounted on an SG90 servo motor, enabling 360-degree rotational scanning to capture distance data across an environment. As the servo moves, the sensors take readings, transmitting the data via serial communication, where Python processes the inputs, converts them into Cartesian coordinates, and dynamically visualizes them using Matplotlib. While ultrasonic sensors have limitations such as reflections, angular resolution constraints, and noise artifacts, their affordability makes them ideal for foundational research and educational projects. Experimental results confirm significant improvements over traditional fixed-time mapping methods, offering real-time adaptability and optimized environmental representation. Combining affordable hardware with intelligent software processing, this research showcases an accessible alternative for environmental mapping, making advanced spatial awareness techniques available to students, researchers, and hobbyists.},
        keywords = {Ultrasonic sensors, HC-SR04, Arduino, Servo motor, 2D room mapping, Lidar, Serial communication, Matplotlib},
        month = {November},
        }

Cite This Article

Kavade, A., Marathe, A., Khade, H., Kendre, V., & Katare, S. (2025). Ultrasonic Lidar-Based 2D Room Mapping. International Journal of Innovative Research in Technology (IJIRT), 12(6), 6655–6660.
