
Simulation-based Lidar Super-resolution for Ground Vehicles

Lidar Super-resolution

This repository contains code for lidar super-resolution for ground vehicles driving on roadways, which relies on a driving simulator to enhance the apparent resolution of a physical lidar. To increase the resolution of the point cloud captured by a sparse 3D lidar, we convert the problem from 3D Euclidean space into an image super-resolution problem in 2D image space, which is solved using a deep convolutional neural network: by projecting a point cloud onto a range image, we can efficiently enhance the resolution of that image with a deep neural network. We train the network purely on computer-generated data (i.e., data from the CARLA simulator). A video of the package can be found here.
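
The core idea, projecting a 3D point cloud onto a 2D range image, can be sketched as follows. This is a minimal illustration, not the repository's exact implementation; the 64-row by 1024-column image size and the roughly +/-16.6-degree vertical field of view are assumptions modeled on an Ouster OS1-64-like sensor.

import numpy as np

def pointcloud_to_range_image(points, rows=64, cols=1024,
                              fov_up_deg=16.6, fov_down_deg=-16.6):
    # points: (N, 3) array of x, y, z coordinates in the sensor frame
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2) + 1e-6

    yaw = np.arctan2(y, x)      # horizontal angle, [-pi, pi]
    pitch = np.arcsin(z / r)    # vertical angle

    fov_up = np.radians(fov_up_deg)
    fov_down = np.radians(fov_down_deg)

    # map angles to pixel coordinates
    u = ((yaw + np.pi) / (2.0 * np.pi) * cols).astype(np.int32) % cols
    v = np.clip(((fov_up - pitch) / (fov_up - fov_down) * rows).astype(np.int32),
                0, rows - 1)

    image = np.zeros((rows, cols), dtype=np.float32)
    image[v, u] = r             # last point per pixel wins; fine for a sketch
    return image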

Dependency

The package depends on NumPy, TensorFlow (Keras), and ROS. ROS is used only for visualization.
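
For reference, the Python dependencies can be installed with pip, for example:

pip install numpy tensorflow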

Compile

Download the package to your workspace and compile the code with catkin_make (needed for visualization only).
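
For example, assuming a standard catkin workspace at ~/catkin_ws (the repository URL below is given as an assumption):

cd ~/catkin_ws/src
git clone https://github.com/RobustFieldAutonomyLab/lidar_super_resolution.git
cd ~/catkin_ws
catkin_make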

Data Download

Download the demo data into the Documents folder in your home directory, so that the data directory looks like /home/$USER/Documents/SuperResolution. SuperResolution is the project directory and contains the .npy files. You can also change the directory settings in data.py. We simulate an Ouster OS1-64 lidar in the CARLA simulator; the simulated point clouds are projected onto range images and used for training. We test the trained neural network on real Ouster data to evaluate its performance.

Demo data:

carla_ouster_range_image.npy # simulated dataset for network training (Ouster OS1-64 lidar simulated in CARLA; used to train 16-beam-to-64-beam upsampling)
ouster_range_image.npy # real-world dataset for testing, e.g., OS1-16 data whose resolution you want to increase to 64 beams
ouster_range_image-from-16-to-64_prediction.npy # predicted high-resolution data produced by the network trained on the simulated dataset
weights.h5 # an example weight file for the 16-to-64-beam neural network
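
A quick sanity check, assuming the directory layout described above, is to load the arrays and print their shapes:

import os
import numpy as np

data_dir = os.path.join(os.path.expanduser('~'), 'Documents', 'SuperResolution')
for name in ('carla_ouster_range_image.npy',
             'ouster_range_image.npy',
             'ouster_range_image-from-16-to-64_prediction.npy'):
    arr = np.load(os.path.join(data_dir, name))
    print(name, arr.shape, arr.dtype)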

Train Neural Network

Run run.py to start training the neural network with the provided data or your own data.

python run.py
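
run.py defines the actual training pipeline. Purely as an illustration of the shapes involved in 16-beam-to-64-beam upsampling, a minimal Keras sketch could look like the following; the layer choices, the 1024-column width, and the commented data handling are assumptions, not the repository's architecture.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

low_res, high_res, width = 16, 64, 1024   # assumed beam counts and image width

inputs = keras.Input(shape=(low_res, width, 1))
x = layers.Conv2D(64, 3, padding='same', activation='relu')(inputs)
x = layers.UpSampling2D(size=(high_res // low_res, 1))(x)   # 16 rows -> 64 rows
x = layers.Conv2D(64, 3, padding='same', activation='relu')(x)
outputs = layers.Conv2D(1, 3, padding='same')(x)            # predicted range values
model = keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='mae')

# Example fit on the simulated CARLA data (array shapes are assumptions):
# high = np.load('.../carla_ouster_range_image.npy')   # e.g., (N, 64, 1024, 1)
# low = high[:, ::4, :, :]                              # keep every 4th beam
# model.fit(low, high, batch_size=4, epochs=10)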

Training Tips

You may want to train a network and then perform inference on your own data once you can run the sample data on your computer. To achieve the best performance, the simulated high-resolution lidar should be mounted in the simulator the same way your low-resolution lidar is mounted in the real world. For example, if your real lidar is mounted horizontally at a height of 1.5 meters, the simulated lidar should be placed the same way in the simulator. If your real lidar is mounted horizontally but your simulated lidar is mounted vertically, you will not get good performance, as the views of the two lidars are completely different. The field of view of the real lidar and the simulated lidar should also be the same.

Data augmentation greatly improves the performance of the network. When you collect data by simulating a high-resolution lidar, you can apply operations such as scaling the range values or flipping the range image.
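
For instance, simple augmentations of a batch of range images could look like the sketch below; the scale bounds and flip probability are arbitrary choices.

import numpy as np

def augment(range_images, rng=None):
    # range_images: (N, rows, cols) array of range values
    if rng is None:
        rng = np.random.default_rng()
    out = range_images.astype(np.float32, copy=True)

    # randomly scale the range values of each sample
    out *= rng.uniform(0.95, 1.05, size=(out.shape[0], 1, 1))

    # randomly mirror each sample horizontally
    flip = rng.random(out.shape[0]) < 0.5
    out[flip] = out[flip, :, ::-1]
    return out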

Prepare Your Own Data

A script, rosbag2npy.py, is provided in this package; it converts point cloud messages in a rosbag into range images. Detailed usage can be found in the comments of the script.
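
Conceptually, the conversion resembles the sketch below; the bag path and topic name are placeholders, and the projection helper is the one sketched earlier in this README. Refer to rosbag2npy.py for the actual usage.

import numpy as np
import rosbag
from sensor_msgs import point_cloud2

bag = rosbag.Bag('my_lidar_data.bag')   # placeholder path
images = []
for _, msg, _ in bag.read_messages(topics=['/os1_cloud_node/points']):   # assumed topic
    points = np.array(list(point_cloud2.read_points(
        msg, field_names=('x', 'y', 'z'), skip_nans=True)))
    images.append(pointcloud_to_range_image(points))   # helper sketched above
bag.close()
np.save('ouster_range_image.npy', np.stack(images))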

Visualization

Run the following to visualize your predictions:

roslaunch lidar_super_resolution visualize.launch

Cite

Thank you for citing our paper if you use any of this code:

@article{superresolution2020shan,
  title={Simulation-based Lidar Super-resolution for Ground Vehicles},
  author={Shan, Tixiao and Wang, Jinkun and Chen, Fanfei and Szenher, Paul and Englot, Brendan},
  journal={arXiv preprint arXiv:2004.05242},
  year={2020}
}

More Repositories

1. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain (C++, 2,336 stars)
2. DiSCo-SLAM (C++, 223 stars)
3. turtlebot_exploration_3d: Autonomous exploration package for a Turtlebot equipped with an RGB-D sensor (Kinect, Xtion) (C++, 151 stars)
4. la3dm: Learning-aided 3D mapping (C++, 123 stars)
5. DRL_graph_exploration: Autonomous Exploration Under Uncertainty via Deep Reinforcement Learning on Graphs (C++, 115 stars)
6. jackal_dataset_20170608: Bag files captured using a Clearpath Jackal robot equipped with a Velodyne VLP-16 and a low-end IMU sensor; the published point cloud topic is /velodyne_points and the published IMU topic is /imu/data (65 stars)
7. DRL_robot_exploration: Self-Learning Exploration and Mapping for Mobile Robots via Deep Reinforcement Learning (Python, 64 stars)
8. Distributional_RL_Navigation: [IROS 2023] Robust Unmanned Surface Vehicle Navigation with Distributional Reinforcement Learning (Python, 43 stars)
9. spin_hokuyo: Spinning Hokuyo to form a 3D point cloud (C++, 41 stars)
10. em_exploration: Autonomous Exploration with Expectation-Maximization (C++, 40 stars)
11. Multi_Robot_Distributional_RL_Navigation: [ICRA 2024] Decentralized Multi-Robot Navigation for Autonomous Surface Vehicles with Distributional Reinforcement Learning (Python, 39 stars)
12. bluerov: BlueROV2 ROS package (C, 28 stars)
13. Stochastic_Road_Network: [UR 2023] Robust Route Planning with Distributional Reinforcement Learning in a Stochastic Road Network Environment (Python, 16 stars)
14. Mission-Oriented-GP-Motion-Planning (C++, 12 stars)
15. waypoint_navigation (C++, 8 stars)
16. Multi-Robot-EM-Exploration (Python, 6 stars)
17. RobustFieldAutonomyLab.github.io: Robust Field Autonomy Lab website (HTML, 5 stars)
18. videoray (C++, 3 stars)
19. motion_hokuyo (C++, 3 stars)
20. loam: loam package, adopted... (C++, 3 stars)
21. spin_hokuyo-release (2 stars)
22. explo_turtlebot_dev: Shawn (C++, 2 stars)
23. eth_icp_mapper_indigo: eth_icp_mapper adapted to work in ROS Indigo (C++, 1 star)
24. turtlebot_exploration_3d-release: release for the turtlebot_exploration_3d package in ROS, RFAL 2016 (1 star)
25. multi_robot_exploration (EmberScript, 1 star)