Sensor Fusion for Localization and Mapping
Parth Malvi, Prof. Dhanashri Patil, Dr. Vitthal S. Gutte
Keywords: Sensor Fusion, Localization, Mapping, Lidar, Cameras, IMU, Wheel Encoders, Extended Kalman Filter, Region Proximity Method, Autonomous Navigation
In robotics and unmanned-systems navigation, accurate localization and mapping are essential for efficient operation, especially in dynamic and complex environments. Relying on a single sensor, however, often yields poor accuracy and robustness. To address this issue, this article explores sensor fusion for localization and mapping. By combining multiple sensors such as lidar, cameras, IMUs, and wheel encoders, a more comprehensive perception of the surroundings can be obtained. This paper analyzes fusion algorithms such as the extended Kalman filter and the region proximity method, and integrates them so that the strengths of each sensor compensate for the weaknesses of the others. The performance of the proposed sensor fusion methodology in enhancing localization accuracy and mapping integrity is evaluated through simulation experiments as well as real-life implementations. The findings underscore the significance of multi-sensor fusion in overcoming the constraints of single-sensor systems and advancing the capabilities of autonomous robots in challenging environments.
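To illustrate the kind of fusion the abstract describes, the following is a minimal sketch (not the paper's actual implementation) of an extended Kalman filter that fuses wheel-encoder odometry with an absolute position fix such as one derived from lidar scan matching. The 2D pose state, the unicycle motion model, and the noise matrices are all illustrative assumptions.

```python
import numpy as np

def ekf_predict(x, P, u, Q, dt):
    """Predict step: propagate pose [x, y, theta] using wheel-encoder
    odometry u = (v, omega) through a unicycle motion model."""
    v, w = u
    theta = x[2]
    # Nonlinear motion model
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, R):
    """Update step: correct the pose with an absolute (x, y) position
    measurement, e.g. from lidar scan matching (a linear model here)."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Example: start at the origin, drive forward 1 m, then fuse a noisy fix.
x, P = np.zeros(3), np.eye(3) * 0.1
x, P = ekf_predict(x, P, (1.0, 0.0), np.eye(3) * 0.01, dt=1.0)
x, P = ekf_update(x, P, np.array([1.05, 0.0]), np.eye(2) * 0.05)
```

Each update shrinks the position covariance, which is how the fusion compensates for encoder drift with the (assumed) lidar-derived fix.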
Article Details
Unique Paper ID: 164128

Publication Volume & Issue: Volume 10, Issue 12

Page(s): 1929 - 1936