In robotics and unmanned-systems navigation, accurate localization and mapping are essential for efficient operation, especially in dynamic and complex environments. Relying on a single sensor, however, often yields poor accuracy and robustness. To address this issue, this article explores sensor fusion for localization and mapping. By combining multiple sensors such as lidar, cameras, IMUs, and wheel encoders, a more complete perception of the surroundings can be obtained. The paper analyzes and integrates several fusion algorithms, such as the extended Kalman filter and the region proximity method, so that the strengths of each sensor compensate for the weaknesses of the others. The performance of the proposed sensor fusion methodology in improving localization accuracy and mapping integrity is evaluated through simulation experiments as well as real-life implementations. The findings underscore the significance of multi-sensor fusion in overcoming the constraints of single-sensor systems and advancing the ability of autonomous robots to navigate challenging environments.
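The abstract names the extended Kalman filter as one of the fusion algorithms. A minimal sketch of the idea is shown below: a 2-D pose EKF that uses wheel-encoder odometry in the prediction step and an absolute IMU heading in the correction step. This is an illustrative example, not the paper's implementation; all noise parameters and the unicycle motion model are assumptions.

```python
import numpy as np

# Illustrative EKF sketch: fuse wheel-encoder odometry (prediction)
# with an IMU heading measurement (correction) for 2-D localization.
# Noise values Q and R below are assumed for demonstration only.

def wrap_angle(a):
    """Normalize an angle to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

class EKFLocalizer:
    def __init__(self):
        self.x = np.zeros(3)                    # state: [x, y, theta]
        self.P = np.eye(3) * 0.1                # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])    # process noise (assumed)
        self.R = np.array([[0.005]])            # IMU heading noise (assumed)

    def predict(self, v, w, dt):
        """Propagate the pose with a unicycle motion model driven by
        encoder-derived forward velocity v and yaw rate w."""
        theta = self.x[2]
        self.x += np.array([v * np.cos(theta) * dt,
                            v * np.sin(theta) * dt,
                            w * dt])
        self.x[2] = wrap_angle(self.x[2])
        # Jacobian of the motion model w.r.t. the state
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0, 1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_heading(self, z):
        """Correct the heading with an absolute IMU yaw measurement z."""
        H = np.array([[0.0, 0.0, 1.0]])   # measurement selects theta
        y = wrap_angle(z - self.x[2])     # innovation (angle residual)
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += (K @ np.array([y])).ravel()
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P

# Usage: drive straight at 1 m/s for 1 s while the IMU reports heading 0.
ekf = EKFLocalizer()
for _ in range(100):
    ekf.predict(v=1.0, w=0.0, dt=0.01)
    ekf.update_heading(0.0)
print(ekf.x)  # pose estimate near [1, 0, 0]
```

The design point the abstract makes is visible here: the encoders drift-prone odometry drives the prediction, while the IMU's heading bounds the orientation error, each sensor compensating for the other's weakness.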
Article Details
Unique Paper ID: 164128
Publication Volume & Issue: Volume 10, Issue 12
Page(s): 1929 - 1936