GR-LOAM: LiDAR-based sensor fusion SLAM for ground robots on complex terrain
Update time: 2021-12-31

Simultaneous localization and mapping (SLAM) is a fundamental process in robot navigation. For ground robots traveling on complex terrain, we address this process with GR-LOAM, a LiDAR-centric method that estimates robot ego-motion by fusing LiDAR, inertial measurement unit (IMU), and encoder measurements in a tightly coupled scheme. First, we derive an odometer increment model that fuses the IMU and encoder measurements to estimate the robot pose variation on a manifold. Then, we apply point cloud segmentation and feature extraction to obtain distinctive edge and planar features. Moreover, we propose an evaluation algorithm for the sensor measurements that detects abnormal data and reduces its weight during optimization. By jointly optimizing the costs derived from the LiDAR, IMU, and encoder measurements in a local window, we obtain low-drift odometry even on complex terrain. We use the estimated relative pose in the local window to reevaluate the matching distance across features and to remove dynamic objects and outliers, thus refining the features before they are fed to the mapping thread and increasing mapping efficiency. In the back end, GR-LOAM uses the refined point cloud and tightly couples the IMU and encoder measurements with ground constraints to further refine the estimated pose by aligning the features on a global map. Results from extensive experiments performed in indoor and outdoor environments using a real ground robot demonstrate the high accuracy and robustness of the proposed GR-LOAM for state estimation of ground robots.
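To illustrate the idea behind the odometer increment model, the sketch below propagates a simplified planar (x, y, yaw) pose using an encoder-derived linear velocity and an IMU yaw rate. This is not the paper's formulation (which operates on a manifold in full 3D); the function name, signatures, and constant-rate assumption are illustrative only.

```python
import math

def odometry_increment(v_enc, omega_imu, dt, pose):
    """Propagate a planar (x, y, yaw) pose over one time step.

    v_enc     -- linear speed from wheel encoders (m/s), assumed constant over dt
    omega_imu -- yaw rate from the IMU gyroscope (rad/s)
    dt        -- integration step (s)
    pose      -- current (x, y, yaw) tuple
    """
    x, y, yaw = pose
    # Translate along the current heading using the encoder speed,
    # then update the heading from the IMU yaw rate.
    x_new = x + v_enc * dt * math.cos(yaw)
    y_new = y + v_enc * dt * math.sin(yaw)
    yaw_new = yaw + omega_imu * dt
    return (x_new, y_new, yaw_new)

# Example: drive straight at 1 m/s for 1 s, integrated at 100 Hz.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = odometry_increment(v_enc=1.0, omega_imu=0.0, dt=0.01, pose=pose)
```

In the actual method, increments of this kind are accumulated between LiDAR scans and fused with the LiDAR residuals in the tightly coupled local-window optimization.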


This work was published in Robotics and Autonomous Systems 140 (2021): 1-13.

Copyright © 2003 - 2013. Shenyang Institute of Automation (SIA), Chinese Academy of Sciences
All rights reserved. Reproduction in whole or in part without permission is prohibited.