Sensor Fusion for Localization and Mapping

  • Unique Paper ID: 164128
  • Volume: 10
  • Issue: 12
  • PageNo: 1929-1936
  • Abstract: In the field of robotics and unmanned-systems navigation, accurate localization and mapping are essential for efficient operation, especially in dynamic and complex environments. However, relying on a single sensor often results in poor accuracy and robustness. To address this issue, this article explores sensor fusion for localization and mapping. By combining multiple sensors such as lidar, cameras, IMUs, and wheel encoders, a comprehensive perception of the surroundings can be obtained. Various fusion algorithms, such as the extended Kalman filter and the region proximity method, are analyzed and integrated so that the strengths of each sensor compensate for the weaknesses of the others. The performance of the proposed sensor fusion methodology in enhancing localization accuracy and mapping integrity is evaluated through simulation experiments as well as real-life implementations. The findings underscore the significance of multi-sensor fusion in overcoming the constraints of single-sensor systems and advancing the capabilities of autonomous robots navigating challenging environments.
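To illustrate the kind of fusion the abstract describes, here is a minimal extended Kalman filter sketch for 2D robot localization. It is not the paper's implementation: the motion model, noise covariances, and the assumption that wheel-encoder odometry drives the prediction step while a lidar-style absolute position fix drives the update are all illustrative choices.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Predict with a unicycle motion model (dt fixed at 1.0).
    x = [px, py, theta]; u = [v, w] from wheel encoders (assumed)."""
    px, py, th = x
    v, w = u
    x_pred = np.array([px + v * np.cos(th),
                       py + v * np.sin(th),
                       th + w])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(th)],
                  [0.0, 1.0,  v * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Update with a direct position measurement z = [px, py],
    e.g. a lidar-derived fix (assumed measurement model)."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])    # measurement Jacobian
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

# Example: one predict step from odometry, one update from a position fix
x, P = np.zeros(3), np.eye(3)
x, P = ekf_predict(x, P, u=np.array([1.0, 0.0]), Q=0.01 * np.eye(3))
x, P = ekf_update(x, P, z=np.array([1.1, 0.0]), R=0.1 * np.eye(2))
```

The update pulls the odometry-only estimate toward the measurement and shrinks the position covariance, which is the mechanism by which one sensor compensates for another's drift.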

Cite This Article

  • ISSN: 2349-6002

