GMMLoc
Dense Map Based Visual Localization.
Related publication:
@article{huang2020gmmloc,
  title={GMMLoc: Structure Consistent Visual Localization with Gaussian Mixture Models},
  author={Huang, Huaiyang and Ye, Haoyang and Sun, Yuxiang and Liu, Ming},
  journal={IEEE Robotics and Automation Letters},
  volume={5},
  number={4},
  pages={5043--5050},
  year={2020},
  publisher={IEEE}
}
Prerequisites
We have tested this library on Ubuntu 18.04 with ROS Melodic. Prerequisites for installation:
- OpenCV:
apt-get install libopencv-dev
- misc. tools (wstool, catkin tools):
apt-get install python-wstool python-catkin-tools
- evo (optional, for evaluation):
pip install evo --upgrade --no-binary evo
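As an optional sanity check before building, you can confirm the tools above are available (these commands only query package status and versions; none of them are required by gmmloc itself):
dpkg -s libopencv-dev | grep Version   # OpenCV development package
which catkin wstool                    # from python-catkin-tools / python-wstool
catkin --version
evo_ape --help                         # only if evo was installed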
Installation
Initialize a workspace:
mkdir -p /EXAMPLE/CATKIN/WORK_SPACE
cd /EXAMPLE/CATKIN/WORK_SPACE
mkdir src
catkin init
catkin config --extend /opt/ros/melodic
catkin config --cmake-args -DCMAKE_BUILD_TYPE=Release
catkin config --merge-devel
Clone the code:
cd src
git clone git@github.com:hyhuang1995/gmmloc.git
If using SSH keys for GitHub, prepare the dependencies via:
wstool init . ./gmmloc/gmmloc_ssh.rosinstall
wstool update
or using https instead:
wstool init . ./gmmloc/gmmloc_https.rosinstall
wstool update
Compile with:
catkin build gmmloc_ros
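Before launching anything from this workspace, source it (assuming the example workspace path and the merged devel layout configured above):
cd /EXAMPLE/CATKIN/WORK_SPACE
source devel/setup.bash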
Running Examples
We provide examples on the EuRoC Vicon Room sequences. To run the demo on V1_03_difficult:
- Download the sequence (ASL format).
- Replace /PATH/TO/EUROC/DATASET/ in v1.launch with the directory where the sequence is decompressed (a quick layout check is shown after these steps):
<param name="data_path" value="/PATH/TO/EUROC/DATASET/$(arg seq)/mav0/" />
- Launch:
roslaunch gmmloc_ros v1.launch seq:=V1_03_difficult
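As a rough check of the expected ASL layout, list the mav0 directory of the decompressed sequence (the listing below illustrates a typical EuRoC sequence and is not produced by gmmloc):
ls /PATH/TO/EUROC/DATASET/V1_03_difficult/mav0/
# typically contains: body.yaml  cam0  cam1  imu0  state_groundtruth_estimate0  vicon0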
Evaluation
If evo is installed, we provide a script for evaluating on the Vicon Room sequences:
roscd gmmloc_ros
./scripts/evaluate_euroc.sh
The results will be saved to gmmloc_ros/expr. By default, we follow the evaluation protocol of DSO and run the evaluation without multi-threading. If you would like to run the script in online mode, uncomment this line in the script:
rosparam set /gmmloc/online True
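A single run can also be evaluated by hand with evo against the EuRoC ground truth. The estimate file name below is only an assumption about what the script writes to gmmloc_ros/expr (in TUM trajectory format); adjust it to the actual output on your machine:
# hypothetical estimate file under gmmloc_ros/expr, TUM format
evo_ape euroc /PATH/TO/EUROC/DATASET/V1_03_difficult/mav0/state_groundtruth_estimate0/data.csv \
    expr/V1_03_difficult.txt -va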
Credits
Our implementation is built on top of ORB-SLAM2. We thank Raúl Mur-Artal et al. for their great work.