Intensity-SLAM: Intensity Assisted Localization and Mapping for Large Scale Environment
Abstract
Simultaneous Localization And Mapping (SLAM) is the task of estimating a robot's location and reconstructing the environment from sensor observations such as LIght Detection And Ranging (LiDAR) and camera data. It is widely used in robotic applications such as autonomous driving and drone delivery. Traditional LiDAR-based SLAM algorithms mainly leverage geometric features from the scene context, while the intensity information returned by the LiDAR is ignored. Some recent deep-learning-based SLAM algorithms consider intensity features and train the pose estimation network in an end-to-end manner; however, they require significant data collection effort and their generalizability to environments other than the trained one remains unclear. In this letter we introduce intensity features into a SLAM system and propose a novel full SLAM framework that leverages both geometric and intensity features. The proposed SLAM framework includes intensity-based front-end odometry estimation and intensity-based back-end optimization. Thorough experiments are performed, covering both outdoor autonomous driving and indoor warehouse robot manipulation, and the results show that the proposed method outperforms existing geometric-only LiDAR SLAM methods.
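As a rough illustration of the idea summarized above (not the authors' actual pipeline), the sketch below shows one way a scan-matching residual could combine a geometric point-to-plane term with a LiDAR intensity-difference term. The function name `combined_residual`, the weight `lambda_intensity`, and the data layout are assumptions made purely for illustration.

```python
# Minimal sketch (assumed, not the paper's implementation): augment a
# geometric point-to-plane residual with an intensity residual so that both
# geometry and LiDAR return intensity constrain the scan alignment.
import numpy as np

def combined_residual(src_point, src_intensity,
                      tgt_point, tgt_normal, tgt_intensity,
                      lambda_intensity=0.1):
    """Return a 2-vector: [geometric residual, weighted intensity residual]."""
    # Point-to-plane distance between the source point and the plane through
    # the matched target point with unit normal tgt_normal.
    geometric = float(np.dot(src_point - tgt_point, tgt_normal))
    # Difference of (normalized) LiDAR return intensities at the matched points,
    # scaled by an assumed weighting factor.
    intensity = lambda_intensity * (src_intensity - tgt_intensity)
    return np.array([geometric, intensity])

# Example usage with made-up values:
r = combined_residual(
    src_point=np.array([1.02, 0.48, 0.0]), src_intensity=0.62,
    tgt_point=np.array([1.00, 0.50, 0.0]), tgt_normal=np.array([0.0, 1.0, 0.0]),
    tgt_intensity=0.55,
)
print(r)  # -> [-0.02, 0.007]
```

In such a formulation, residuals of this kind would be summed over matched point pairs and minimized over the pose, so the intensity term simply adds an extra constraint alongside the usual geometric one.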
BibTeX
@article{Wang-2021-126651,
  author  = {Han Wang and Chen Wang and Lihua Xie},
  title   = {Intensity-SLAM: Intensity Assisted Localization and Mapping for Large Scale Environment},
  journal = {IEEE Robotics and Automation Letters},
  year    = {2021},
  month   = {April},
  volume  = {6},
  number  = {2},
  pages   = {1715--1721},
}