ImMesh
ImMesh: An Immediate LiDAR Localization and Meshing Framework
1. Introduction
ImMesh is a novel LiDAR(-inertial) odometry and meshing framework that takes LiDAR data as input and achieves simultaneous localization and meshing in real time. ImMesh comprises four tightly coupled modules: receiver, localization, meshing, and broadcaster. The localization module takes the preprocessed sensor data from the receiver, estimates the sensor pose online by registering LiDAR scans to the map, and dynamically grows the map. The meshing module then takes each registered LiDAR scan and incrementally reconstructs the triangle mesh on the fly. Finally, the real-time odometry, map, and mesh are published via our broadcaster.
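The data flow above can be sketched as a per-scan pipeline. This is an illustrative sketch only, not the actual ImMesh API: all names, the dummy pose update, and the "one triangle per three points" meshing stand-in are assumptions made for the example.

```python
# Hypothetical sketch of ImMesh's four tightly coupled modules:
# receiver preprocesses raw scans, localization registers them against
# the map and grows it, meshing incrementally updates the triangle
# count, and the broadcaster publishes odometry, map, and mesh.

def receiver(raw_scans):
    """Preprocess raw LiDAR packets into per-scan point lists."""
    return ([tuple(p) for p in scan] for scan in raw_scans)

def localization(scan, pose, global_map):
    """Stand-in for scan-to-map registration: grow map, update pose."""
    global_map.extend(scan)        # dynamically grow the map
    return (pose[0] + 1, pose[1])  # dummy pose update

def meshing(scan, triangles):
    """Stand-in for incremental meshing: one 'triangle' per 3 points."""
    return triangles + len(scan) // 3

def broadcaster(pose, global_map, triangles):
    """Publish the current odometry, map, and mesh state."""
    return {"pose": pose, "map_size": len(global_map), "triangles": triangles}

def run(raw_scans):
    pose, global_map, triangles = (0, 0), [], 0
    outputs = []
    for scan in receiver(raw_scans):       # process each scan as it arrives
        pose = localization(scan, pose, global_map)
        triangles = meshing(scan, triangles)
        outputs.append(broadcaster(pose, global_map, triangles))
    return outputs

out = run([[(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(2, 0, 0)]])
```

In the real system each stage runs online; the sketch only shows how the four modules hand data to one another, one scan at a time.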
Any questions?
For any technical questions about this work, please feel free to contact me via www.jiaronglin.com :)
Date of code release
Our paper is currently under review, and the code of ImMesh will be released once our work is accepted. However, our previous SLAM works R3LIVE, VoxelMap, FAST-LIO, R2LIVE, and ikd-Tree are already available on our GitHub.
- R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package
- VoxelMap: An efficient and probabilistic adaptive (coarse-to-fine) voxel mapping method for 3D LiDAR
- FAST-LIO: A computationally efficient and robust LiDAR-inertial odometry (LIO) package
- R2LIVE: A robust, real-time, tightly-coupled multi-sensor fusion framework
- ikd-Tree: An incremental k-d tree designed for robotic applications
1.1 Our paper
Our preprint paper can now be downloaded here.
1.2 Our accompanying videos
Our accompanying videos are now available on YouTube (click the images below to open) and Bilibili (1, 2, 3).
2. What can ImMesh do?
2.1 Simultaneous LiDAR localization and mesh reconstruction on the fly
2.2 ImMesh for LiDAR point cloud reinforcement
2.3 ImMesh for rapid, lossless texture reconstruction
3. Contact us
If you have any questions about this work, please feel free to contact me via www.jiaronglin.com or Dr. Fu Zhang via email at <fuzhangAThku.hk>.