PaGO-LOAM
A LiDAR odometry framework that makes it easy to test ground-segmentation algorithms.
This is the page for PaGO-LOAM: Robust Ground-Optimized LiDAR Odometry, a LiDAR odometry framework based on LeGO-LOAM in which it is easy and straightforward to test whether ground segmentation algorithms help feature extraction and improve SLAM performance. By leveraging the state-of-the-art ground segmentation method Patchwork, the system takes in both the raw point cloud and the ground point cloud to estimate the odometry.
Requirements
- Install GTSAM 4.0.0-alpha2:
wget -O ~/Downloads/gtsam.zip https://github.com/borglab/gtsam/archive/4.0.0-alpha2.zip
cd ~/Downloads/ && unzip gtsam.zip -d ~/Downloads/
cd ~/Downloads/gtsam-4.0.0-alpha2/
mkdir build && cd build
cmake ..
sudo make install
- If you want to test the ground segmentation method Patchwork as we do, also install the jsk packages:
sudo apt-get install ros-melodic-jsk-recognition
sudo apt-get install ros-melodic-jsk-common-msgs
sudo apt-get install ros-melodic-jsk-rviz-plugins
Build
- In utility.h, there are five parameters you can set:
extern const string pointCloudTopic = "/benchmark/ground_estimate";
extern const bool alterGround = true;
extern const bool loopClosureEnableFlag = false;
std::string RESULT_PATH = "/data/bagfiles/KITTI_BAG/pago_pose_result.txt";
std::string TIME_PATH = "/data/bagfiles/KITTI_BAG/pago_pose_time.txt";
- pointCloudTopic should have the following format (refer to cloud_msgs/ground_estimate.msg):
std_msgs/Header header
sensor_msgs/PointCloud2 curr
sensor_msgs/PointCloud2 ground
curr is the raw point cloud and ground is the ground-segmented point cloud (a publisher sketch is given after this list).
- alterGround decides whether to use an external ground segmentation method such as Patchwork or the one from the baseline LeGO-LOAM.
- loopClosureEnableFlag enables or disables loop closure.
- RESULT_PATH and TIME_PATH specify the output files for the odometry estimation results.
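For reference, below is a minimal sketch of how your own ground segmentation node could publish this message. It is hypothetical code, not part of this repository; it assumes the clouds are available as pcl::PointCloud<pcl::PointXYZI> and that the publisher is advertised on pointCloudTopic.
#include <ros/ros.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <cloud_msgs/ground_estimate.h>

// pub = nh.advertise<cloud_msgs::ground_estimate>("/benchmark/ground_estimate", 1);
void publishGroundEstimate(ros::Publisher& pub,
                           const pcl::PointCloud<pcl::PointXYZI>& raw_cloud,
                           const pcl::PointCloud<pcl::PointXYZI>& ground_cloud,
                           const ros::Time& stamp, const std::string& frame_id)
{
    cloud_msgs::ground_estimate msg;
    msg.header.stamp = stamp;
    msg.header.frame_id = frame_id;
    pcl::toROSMsg(raw_cloud, msg.curr);       // 'curr': the raw point cloud
    pcl::toROSMsg(ground_cloud, msg.ground);  // 'ground': the ground-segmented point cloud
    msg.curr.header = msg.header;             // keep stamps consistent across the two clouds
    msg.ground.header = msg.header;
    pub.publish(msg);
}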
- Compile
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/url-kaist/AlterGround-LeGO-LOAM
cd .. && catkin build pago_loam
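Then source the workspace so that ROS can find the pago_loam package (skip this if it is already in your ~/.bashrc):
source ~/catkin_ws/devel/setup.bash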
Warning
Our code is modified to use LiDAR points from the KITTI dataset, which are already de-skewed. To use other datasets, you should either de-skew the point clouds first or restore/change the de-skewing and KITTI-specific code (i.e., the code is changed as follows):
- utility.h
extern const string pointCloudTopic = "/kitti/velo/pointcloud"; <- you should check your own bag file topic
// params for the 64-channel Velodyne (KITTI)
extern const int N_SCAN = 64;
extern const int Horizon_SCAN = 1800;
extern const float ang_res_x = 0.2;
extern const float ang_res_y = 0.427;
extern const float ang_bottom = 24.9;
extern const int groundScanInd = 50;
- featureAssociation.cpp
float s = 10 * (pi->intensity - int(pi->intensity)); -> float s = 1;
// all the code that corrects point cloud distortion is deleted:
TransformToEnd(&cornerPointsLessSharp->points[i], &cornerPointsLessSharp->points[i]); -> removed
TransformToEnd(&surfPointsLessFlat->points[i], &surfPointsLessFlat->points[i]); -> removed
These modifications follow those in: https://github.com/Mitchell-Lee-93/kitti-lego-loam
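For non-deskewed data, roughly the following baseline LeGO-LOAM logic would have to be restored in featureAssociation.cpp (a sketch reconstructed from the lines above; check the upstream LeGO-LOAM source for the exact context and variable names):
// recover each point's relative time within the sweep from the intensity channel
float s = 10 * (pi->intensity - int(pi->intensity));

// re-enable re-projecting the feature clouds to the end of the sweep
for (size_t i = 0; i < cornerPointsLessSharp->points.size(); ++i)
    TransformToEnd(&cornerPointsLessSharp->points[i], &cornerPointsLessSharp->points[i]);
for (size_t i = 0; i < surfPointsLessFlat->points.size(); ++i)
    TransformToEnd(&surfPointsLessFlat->points[i], &surfPointsLessFlat->points[i]);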
KITTI Example
- Download the KITTI Odometry dataset
- Convert the LiDAR .bin files to rosbag files using lidar2rosbag_KITTI (if you are one of the urobot students, just use the pre-made bag files!)
- Run the ground segmentation algorithm (e.g., Patchwork):
roslaunch patchwork pub_for_legoloam.launch
- Run PaGO-LOAM and play the KITTI bag file:
roslaunch pago_loam run.launch
rosbag play /${Your dataset}/${kitti_bagfile}.bag --clock
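If nothing shows up, you can verify that the ground-segmented input is actually arriving (the topic name must match pointCloudTopic in utility.h):
rostopic hz /benchmark/ground_estimate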
Odometry Estimation
Using the evo package, you can evaluate the pose errors, e.g.:
- LeGO-LOAM inherently does not output the first 2~3 poses because it does not estimate odometry while it is initializing. So please compare the number of poses in the ground truth and in the estimated odometry, and remove the first 2~3 poses of the ground truth (a trimming example is given below).
evo_ape kitti KITTI_00_gt.txt pago_pose_result.txt -va --plot --plot_mode xz --save_results results/KITTI_pago.zip --align
- Alternatively, by converting the result to TUM format using the attached file kitti_to_tum.ipynb, you can run evo without removing the poses.
evo_ape tum KITTI_00_gt.txt pago_pose_result.txt -va --plot --plot_mode xz --save_results results/KITTI_pago.zip --align
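For the first option, assuming the estimated trajectory is three poses shorter than the ground truth, you could trim the ground-truth file, for example, with:
tail -n +4 KITTI_00_gt.txt > KITTI_00_gt_trimmed.txt
and then pass KITTI_00_gt_trimmed.txt to evo_ape instead.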
The result on Seq.02 of the KITTI dataset