R-VIO2

Square-Root Robocentric Visual-Inertial Odometry with Online Spatiotemporal Calibration

R-VIO2 is a novel square-root information-based robocentric visual-inertial navigation algorithm that uses a monocular camera and a single IMU for consistent 3D motion tracking. It builds on our robocentric VIO model but, unlike our previous work R-VIO, derives and uses i) a square-root robocentric formulation and ii) a QR-based update combined with back substitution to improve the numerical stability and computational efficiency of the estimator. Moreover, the spatiotemporal calibration is performed online to keep the estimator robust in the presence of unknown parameter errors. In particular, this implementation can run in two modes, VIO or SLAM: the former does not estimate any map points during navigation (our RA-L 2022 paper), while the latter estimates a small set of map points in favor of localization (the frontend developed for our ICRA 2021 paper).
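
To make the QR-based update concrete, below is a minimal, generic Eigen sketch of a square-root information measurement update with back substitution. It is not the R-VIO2 implementation: the robocentric state layout, measurement whitening, and the online calibration states are omitted, and all names are illustrative only.

// Minimal sketch, assuming the prior is kept as an upper-triangular factor
// R_prior with right-hand side r_prior, and the measurement has already been
// whitened to Jacobian H and residual z. Not the R-VIO2 code.
#include <Eigen/Dense>

Eigen::VectorXd SquareRootQrUpdate(const Eigen::MatrixXd &R_prior,  // n x n, upper triangular
                                   const Eigen::VectorXd &r_prior,  // n x 1
                                   const Eigen::MatrixXd &H,        // m x n
                                   const Eigen::VectorXd &z)        // m x 1
{
    const int n = R_prior.cols();

    // Stack the prior factor on top of the measurement Jacobian.
    Eigen::MatrixXd A(n + H.rows(), n);
    A << R_prior, H;
    Eigen::VectorXd b(n + z.rows());
    b << r_prior, z;

    // QR factorization: the top n rows of the R factor become the updated
    // square-root information matrix, and Q^T b the updated right-hand side.
    Eigen::HouseholderQR<Eigen::MatrixXd> qr(A);
    Eigen::MatrixXd R_new = qr.matrixQR().topRows(n).triangularView<Eigen::Upper>();
    Eigen::VectorXd b_new = (qr.householderQ().transpose() * b).head(n);

    // Back substitution: solve the triangular system R_new * dx = b_new
    // for the state correction dx.
    return R_new.triangularView<Eigen::Upper>().solve(b_new);
}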

If you find this work relevant to or use it for your research, please consider citing the following papers:

  • Zheng Huai and Guoquan Huang, Square-Root Robocentric Visual-Inertial Odometry with Online Spatiotemporal Calibration, IEEE Robotics and Automation Letters (RA-L), 2022: download.
@article{huai2022square,
  title     = {Square-root robocentric visual-inertial odometry with online spatiotemporal calibration},
  author    = {Huai, Zheng and Huang, Guoquan},
  journal   = {IEEE Robotics and Automation Letters},
  volume    = {7},
  number    = {4},
  pages     = {9961--9968},
  year      = {2022},
  publisher = {IEEE}
}
  • Zheng Huai and Guoquan Huang, Markov Parallel Tracking and Mapping for Probabilistic SLAM, IEEE International Conference on Robotics and Automation (ICRA), 2021: download.
@inproceedings{huai2021markov,
  title     = {Markov parallel tracking and mapping for probabilistic SLAM},
  author    = {Huai, Zheng and Huang, Guoquan},
  booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},
  pages     = {11661--11667},
  year      = {2021}
}

1. Prerequisites

ROS

Download and install instructions can be found at: http://wiki.ros.org/kinetic/Installation/Ubuntu.

Eigen

Download and install instructions can be found at: http://eigen.tuxfamily.org. Tested with v3.1.0.

OpenCV

Download and install instructions can be found at: http://opencv.org. Tested with v3.3.1.
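
As a convenience, on Ubuntu 16.04 the prerequisites can typically be installed from the package manager; the package names below are an assumption and may differ on your system:

sudo apt-get install ros-kinetic-desktop-full
sudo apt-get install libeigen3-dev libopencv-dev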

2. Build and Run

First, git clone the repository and build it with catkin_make. In particular, rvio2_mono is used to run with a rosbag in real time, while rvio2_mono_eval is used for evaluation purposes and preloads the rosbag, reading it as a txt file. A config file and a launch file are required to run R-VIO2 (for example, rvio2_euroc.yaml and euroc.launch are for the EuRoC dataset). The default mode is VIO; you can switch to SLAM mode by setting the maximum number of SLAM features to a nonzero value in the config file (see rvio2_euroc.yaml). To visualize the outputs, please use rviz.
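
For example, assuming an existing catkin workspace at ~/catkin_ws (the workspace path and repository URL below are placeholders):

cd ~/catkin_ws/src
git clone <REPOSITORY_URL> R-VIO2
cd ~/catkin_ws
catkin_make
source devel/setup.bash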

Start ROS:

Terminal 1: roscore
Terminal 2: rviz (AND OPEN rvio2_rviz.rviz IN THE CONFIG FOLDER)

Run rvio2_mono:

Terminal 3: rosbag play --pause V1_01_easy.bag (AND SKIP SOME DATA IF NEEDED)
Terminal 4: roslaunch rvio2 euroc.launch

Run rvio2_mono_eval:

Terminal 3: roslaunch rvio2 euroc_eval.launch (PRESET PATH_TO_ROSBAG IN euroc_eval.launch)

Note that this implementation currently requires the sensor platform to be stationary at the start. Therefore, when testing the Machine Hall sequences you should skip the wiggling phase at the beginning. In particular, if you would like to run rvio2_mono_eval, the amount of rosbag data to skip can be set in the config file (see rvio2_euroc.yaml).
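
For instance, with rvio2_mono the beginning of a Machine Hall bag can be skipped directly when playing it back (the 40-second offset below is only an illustrative value):

Terminal 3: rosbag play -s 40 MH_01_easy.bag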

3. License

This code is released under GNU General Public License v3 (GPL-3.0).

More Repositories

1. open_vins (C++, 2,111 stars): An open source platform for visual-inertial navigation research.
2. R-VIO (C++, 737 stars): Robocentric Visual-Inertial Odometry
3. kalibr_allan (MATLAB, 577 stars): IMU Allan standard deviation charts for use with Kalibr and inertial kalman filters.
4. MINS (C++, 427 stars): An efficient and robust multisensor-aided inertial navigation system with online calibration that is capable of fusing IMU, camera, LiDAR, GPS/GNSS, and wheel sensors. Use cases: VINS/VIO, GPS-INS, LINS/LIO, multi-sensor fusion for localization and mapping (SLAM). This repository also provides multi-sensor simulation and data.
5. cpi (C++, 249 stars): Closed-form Preintegration for Graph-based Visual-Inertial Navigation
6. ov_plane (C++, 197 stars): A monocular plane-aided visual-inertial odometry
7. calc (Python, 190 stars): Convolutional Autoencoder for Loop Closure
8. lips (MATLAB, 150 stars): LiDAR-Inertial 3D Plane Simulator
9. suo_slam (Python, 133 stars): Symmetry and Uncertainty-Aware Object SLAM for 6DoF Object Pose Estimation
10. android-camera-calibration (C++, 118 stars): Updated (opencv3 and camera2 API) android camera calibration application
11. calc2.0 (Python, 95 stars): CALC2.0: Combining Appearance, Semantic and Geometric Information for Robust and Efficient Visual Loop Closure
12. vicon2gt (C++, 95 stars): Vicon-IMU fusion for groundtruth trajectory generation.
13. ov_maplab (C++, 82 stars): Interface for OpenVINS with the maplab project
14. ocekf-slam (MATLAB, 79 stars): Observability-Constrained (OC)-EKF for 2D SLAM
15. android-dataset-recorder (Java, 60 stars): Dataset collection app that will collect both IMG and IMU measurements for offline processing
16. icalib.github.io (60 stars): Inertial Aided Multi-Sensor Calibration
17. ov_secondary (C++, 59 stars): Secondary posegraph adapted for interfacing with OpenVINS, based on VINS-Mono / VINS-Fusion.
18. ar_table_dataset (Python, 53 stars): Small-scale indoor table AR visual-inertial datasets with 6DoF groundtruth.
19. reach_ros_node (Python, 36 stars): ROS driver for the Reach RTK GNSS module by Emlid
20. mast_project (C++, 36 stars): Underwater Camera and Sonar SLAM (Kevin and Linde's MAST class project)
21. pointgrey_ladybug (C++, 30 stars): ROS Driver for Pointgrey Ladybug Cameras
22. clatt (MATLAB, 28 stars): Cooperative localization and target tracking
23. android_sensors_driver (Java, 26 stars): ROS Driver for Android Sensors (opencv3 and camera1 API)
24. xsens_standalone (Python, 19 stars): Python Standalone library for use with the xsens IMU
25. rosbags (Python, 18 stars): Github mirror of https://gitlab.com/ternaris/rosbags
26. kitti_parser (C++, 15 stars): C++ parser for the RAW KITTI dataset, with callbacks
27. gps_path_pub (C++, 13 stars): Handy publishing of a path and a frame from a single GPS sensor.
28. CSO (C++, 12 stars): Calibration of the rigid transformation between the stereo and odometry
29. img_imu_record (C++, 9 stars): Easy recording of image and imu data to disk
30. orb_slam_mapmerge (C++, 7 stars): A Versatile and Accurate Monocular SLAM (with map merging)
31. apriltags-cpp (C++, 7 stars): A simple ros wrapper for apriltag-cpp
32. vins_source (CMake, 6 stars): Helper package with launch files for getting imu and video sources.
33. microstrain_comm (C, 5 stars): IMU driver for the Microstrain 3DM-GX3®-25. Converted to run on the ROS framework.
34. Microstrain-3DM-GX3-35 (C, 3 stars): ROS driver for Microstrain 3DM-GX3-35 IMU
35. firefly (1 star): AscTec Firefly Documentation
36. ZED_Odom_Grabber (C++, 1 star): Grab zed stereo images and odometry from turtlebot at the same time.
37. dvs128-viewer (Java, 1 star): A driver for dvs128 camera