R3LIVE

A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

1. Introduction

R3LIVE is a novel LiDAR-inertial-visual sensor fusion framework that leverages measurements from LiDAR, inertial, and visual sensors to achieve robust and accurate state estimation. R3LIVE is built upon our previous work R2LIVE and consists of two subsystems: the LiDAR-inertial odometry (LIO) and the visual-inertial odometry (VIO). The LIO subsystem (FAST-LIO) uses the measurements from the LiDAR and inertial sensors to build the geometric structure of the global map (i.e., the positions of the 3D points), while the VIO subsystem uses the data of the visual-inertial sensors to render the map's texture (i.e., the colors of the 3D points).

The source code of this package is released under the GPLv2 license and is free for personal and academic use only. For commercial use, please contact me <ziv.lin.ljrATgmail.com> and Dr. Fu Zhang <fuzhangAThku.hk> to negotiate a different license.

1.1 Our paper

Our paper has been accepted to ICRA 2022; it is available online on this page and can be downloaded here.

1.2 Our accompanying videos

Our accompanying videos are now available on YouTube and Bilibili (parts 1 and 2).


1.3 Our associated dataset: R3LIVE-dataset

Our associated dataset, R3LIVE-dataset, which we use for evaluation, is also available online. You can access and download it via this GitHub repository.

1.4 Our open-source hardware design

All of the mechanical modules of the handheld device we use for data collection are designed to be FDM-printable, and the schematics of the design are also open-sourced in this GitHub repository.

2. What can R3LIVE do?

2.1 Strong robustness in various challenging scenarios

R3LIVE is robust enough to work well in various LiDAR-degenerated scenarios (see the following figures):

It even works in environments that are simultaneously LiDAR-degenerated and visually texture-less (see Experiment-1 of our paper).


2.2 Real-time RGB maps reconstruction

R3LIVE is able to reconstruct precise, dense, RGB-colored 3D maps of the surrounding environment in real time (watch this video).


2.3 Ready for 3D applications

To make R3LIVE more extensible, we also provide a series of offline utilities for reconstructing and texturing meshes, which further reduce the gap between R3LIVE and various 3D applications (watch this video).


3. Prerequisites

3.1 ROS

Follow this ROS Installation guide to install ROS and its additional packages:

sudo apt-get install ros-XXX-cv-bridge ros-XXX-tf ros-XXX-message-filters ros-XXX-image-transport*

NOTICE: remember to replace "XXX" in the above command with your ROS distribution. For example, if you use ROS Kinetic, the command should be:

sudo apt-get install ros-kinetic-cv-bridge ros-kinetic-tf ros-kinetic-message-filters ros-kinetic-image-transport*

3.2 livox_ros_driver

Follow this livox_ros_driver Installation.

3.3 CGAL and pcl_viewer (optional)

sudo apt-get install libcgal-dev pcl-tools

3.4 OpenCV >= 3.3

You can use the following command to check your OpenCV version. If it is lower than 3.3 and you meet errors when compiling our code, we recommend updating your OpenCV; otherwise, skip this step ^_^

pkg-config --modversion opencv

We have successfully tested our algorithm with versions 3.3.1, 3.4.16, 4.2.1, and 4.5.3.
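Since 3.3 is the minimum supported version, the check can also be scripted. The sketch below compares the version reported by pkg-config against 3.3 using `sort -V`; note that on some systems with a packaged OpenCV 4, the pkg-config module is named `opencv4` rather than `opencv`, so adjust the module name for your setup:

```shell
# Report whether the installed OpenCV is new enough for R3LIVE (>= 3.3).
# ver_ge A B succeeds when version string A >= version string B.
ver_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Fall back to "0" when pkg-config or the opencv module is missing.
opencv_ver="$(pkg-config --modversion opencv 2>/dev/null || echo 0)"
if ver_ge "$opencv_ver" "3.3"; then
    echo "OpenCV $opencv_ver is new enough"
else
    echo "OpenCV $opencv_ver is too old (or not found); please upgrade"
fi
```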

Notice: We have noticed that a large number of users experience a crash after launching our package due to a mismatch of OpenCV versions (see issue #11, issue #20, issue #23, etc.). If you meet similar problems, please make sure that the OpenCV version you compiled against is the same as the one you run with. This is very important for launching R3LIVE correctly.

4. Build R3LIVE on ROS

Clone this repository and catkin_make:

cd ~/catkin_ws/src
git clone https://github.com/hku-mars/r3live.git
cd ../
catkin_make
source ~/catkin_ws/devel/setup.bash

5. Run our examples

5.1 Download our rosbag files (r3live_dataset)

Our datasets for evaluation can be downloaded from our Google Drive or Baidu-NetDisk [百度网盘] (extraction code: wwxw). We have released a total of 9 rosbag files for evaluating R3LIVE; an introduction to these datasets can be found on this page.

5.2 Run our examples

After you have downloaded our bag files, you can now run our example ^_^

roslaunch r3live r3live_bag.launch
rosbag play YOUR_DOWNLOADED.bag

If everything is correct, you will get results that match our paper and the results posted on this page.

5.3 Save the maps to your disk

R3LIVE allows you to save the maps you build at any time you want: just click on the "Control panel" and press the 'S' or 's' key.


5.4 Reconstruct and texture your mesh

After you have saved your offline map to disk (by default in the directory ${HOME}/r3live_output), you can launch our utility to reconstruct and texture your mesh.

roslaunch r3live r3live_reconstruct_mesh.launch

5.5 Visualize your saved maps

By default, your offline map (and reconstructed mesh) will be saved in the directory ${HOME}/r3live_output; you can open the map with pcl_viewer (and the mesh with meshlab).

Install pcl_viewer and meshlab:

sudo apt-get install pcl-tools meshlab

Visualizing your offline point cloud maps (with suffix *.pcd):

cd ${HOME}/r3live_output
pcl_viewer rgb_pt.pcd

Visualizing your reconstructed mesh (with suffix *.ply):

cd ${HOME}/r3live_output
meshlab textured_mesh.ply

6. Sample and run your own data

6.1 Livox-ros-driver for R2/R3LIVE

The LiDAR and IMU data published by the official livox_ros_driver are stamped in LiDAR time (starting from 0 in each recording), while images are usually stamped with the operating system's time, so the two are not on the same time base. To make them work under the same time base, we modified the source code of livox_ros_driver; our modified driver is available here. We suggest you replace the official driver with ours when sampling your own data for R3LIVE.
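The effect of the mismatch is easy to see with a toy calculation: a LiDAR stamp counted from zero only becomes comparable with system-stamped images after adding the recording's system start time. The values below are made up for illustration; the actual offsetting happens inside our modified driver, not in a script:

```shell
# Toy illustration of putting a LiDAR-relative stamp onto the system clock.
recording_start=1650000000.000   # system-clock time when recording began (s)
lidar_rel_stamp=12.345           # LiDAR stamp, counted from 0 at start (s)

# Shift the relative stamp by the start time to get a system-time stamp,
# which can then be compared directly with image timestamps.
lidar_sys_stamp=$(awk "BEGIN { printf \"%.3f\", $recording_start + $lidar_rel_stamp }")
echo "$lidar_sys_stamp"
```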

6.2 Sensor calibration

In order to launch R3LIVE on your own hardware setup, you need a careful calibration of the extrinsics among the LiDAR, camera, and IMU. We recommend using the following repo to calibrate your sensors:

livox_camera_calib: A robust, high accuracy extrinsic calibration tool between high resolution LiDAR (e.g. Livox) and camera in targetless environment.

7. Support of the spinning LiDAR

Even though our proposed method is agnostic to the kind of LiDAR you use, it is impossible for us to make R3LIVE compatible with every existing LiDAR. To launch R3LIVE with a spinning LiDAR, you will need to put some effort into modifying the source code of our LiDAR front-end (see LiDAR_front_end.cpp). Here we give an example of testing our LIO subsystem with an Ouster OS2-64 spinning LiDAR.

7.1 Example-1: Ouster OS2-64

Download our recorded rosbag file from here.

roslaunch r3live r3live_bag_ouster.launch 
rosbag play ouster_example_for_LIO_test.bag 

Notice: We manually disabled our VIO subsystem in this example due to missing calibration files.

We are still working on making R3LIVE compatible with more spinning LiDARs; more examples will be released in the future.

8. Access our open source hardware design

To make it easier for our users to reproduce our work, we have also made our hardware design publicly available; you can download all of our CAD source files in rxlive_handheld.

9. Report our problems and bugs

We know our package might not be totally stable at this stage, and we keep working on improving the performance and reliability of our code. So if you meet any bug or problem, please feel free to open an issue, and I will respond ASAP.

When reporting problems and bugs, please attach both your hardware and software environments if possible (as printed by R3LIVE at startup), which will be a great help for me in locating your problem.


10. Acknowledgments

In the development of R3LIVE, we stand on the shoulders of the following repositories:

  1. R2LIVE: A robust, real-time tightly-coupled multi-sensor fusion package.
  2. FAST-LIO: A computationally efficient and robust LiDAR-inertial odometry package.
  3. ikd-Tree: A state-of-the-art dynamic KD-Tree for 3D kNN search.
  4. livox_camera_calib: A robust, high accuracy extrinsic calibration tool between high resolution LiDAR (e.g. Livox) and camera in targetless environment.
  5. LOAM-Livox: A robust LiDAR Odometry and Mapping (LOAM) package for Livox-LiDAR.
  6. openMVS: A library for computer-vision scientists and especially targeted to the Multi-View Stereo reconstruction community.
  7. VCGlib: An open source, portable, header-only Visualization and Computer Graphics Library.
  8. CGAL: A C++ Computational Geometry Algorithms Library.

License

The source code of this package is released under the GPLv2 license and is free for personal and academic use only. For commercial use, please contact me <ziv.lin.ljrATgmail.com> and Dr. Fu Zhang <fuzhangAThku.hk> to negotiate a different license.

We are still working on improving the performance and reliability of our code. For any technical issues, please contact me via email: Jiarong Lin <ziv.lin.ljrATgmail.com>.

If you use any code of this repo in your academic research, please cite at least one of our papers:

[1] Lin, Jiarong, and Fu Zhang. "R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package." 
[2] Xu, Wei, et al. "Fast-lio2: Fast direct lidar-inertial odometry."
[3] Lin, Jiarong, et al. "R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping." 
[4] Xu, Wei, and Fu Zhang. "Fast-lio: A fast, robust lidar-inertial odometry package by tightly-coupled iterated kalman filter."
[5] Cai, Yixi, Wei Xu, and Fu Zhang. "ikd-Tree: An Incremental KD Tree for Robotic Applications."
[6] Lin, Jiarong, and Fu Zhang. "Loam-livox: A fast, robust, high-precision LiDAR odometry and mapping package for LiDARs of small FoV."
