PointRNN: Point Recurrent Neural Network for Moving Point Cloud Processing
Figures (images omitted): the PointRNN structure, the PointGRU and PointLSTM units, and examples of moving point cloud prediction.
Installation
The code is tested with Red Hat Enterprise Linux Workstation release 7.7 (Maipo), g++ (GCC) 5.3.1, TensorFlow v1.12, CUDA 9.0 and cuDNN v7.4.
Install TensorFlow v1.12:
pip install tensorflow-gpu==1.12
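To verify that the GPU build is usable before compiling the custom ops, a quick sanity check (an optional suggestion, not part of the original instructions) can be run in Python:

# Optional sanity check for the TensorFlow 1.12 GPU build.
import tensorflow as tf
print(tf.__version__)                 # expected: 1.12.0
print(tf.test.is_built_with_cuda())   # True for the tensorflow-gpu package
print(tf.test.is_gpu_available())     # True if CUDA 9.0 and cuDNN 7.4 are set up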
Compile the CUDA layers from PointNet++, which we use for farthest point sampling (FPS) and radius-based neighbourhood search, as well as the layers for Chamfer Distance (CD) and Earth Mover's Distance (EMD). Run the following from the repository root:
(cd modules/tf_ops/3d_interpolation && make)
(cd modules/tf_ops/approxmatch && make)
(cd modules/tf_ops/grouping && make)
(cd modules/tf_ops/nn_distance && make)
(cd modules/tf_ops/sampling && make)
Before compiling, please set CUDA_HOME and CUDNN_HOME correctly in each Makefile under the 3d_interpolation, approxmatch, grouping, nn_distance and sampling directories, for example:
CUDA_HOME := /usr/local/cuda-9.0
CUDNN_HOME := /usr/local/cudnn7.4-9.0
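After make finishes, one minimal way to confirm that the shared libraries were built is to load them directly. This is only a sketch; the .so file names below follow the upstream PointNet++ convention and may differ in this repository.

import tensorflow as tf
# The .so names follow the upstream PointNet++ code; adjust them if the
# files produced by make in this repository are named differently.
for so in ['modules/tf_ops/sampling/tf_sampling_so.so',
           'modules/tf_ops/grouping/tf_grouping_so.so',
           'modules/tf_ops/3d_interpolation/tf_interpolate_so.so',
           'modules/tf_ops/nn_distance/tf_nndistance_so.so',
           'modules/tf_ops/approxmatch/tf_approxmatch_so.so']:
    tf.load_op_library(so)  # raises tf.errors.NotFoundError if the build failed
    print('loaded', so)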
Datasets
We provide the test sets for evaluating moving point cloud prediction:
1. Moving MNIST Point Cloud (1 digit)
2. Moving MNIST Point Cloud (2 digits)
3. Argoverse
4. nuScenes
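Predictions on these test sets are typically scored with the CD and EMD layers compiled above. For reference, a symmetric Chamfer Distance can be sketched in a few lines of NumPy; this is only an illustration of the metric (conventions such as sum vs. mean and squared vs. unsquared distances vary), not the CUDA implementation used by the code:

import numpy as np

def chamfer_distance(pred, gt):
    # Symmetric Chamfer Distance between two point sets of shape (n, 3) and (m, 3):
    # for each point, the squared distance to its nearest neighbour in the other
    # set, averaged over both directions. The paper's exact convention may differ.
    diff = pred[:, None, :] - gt[None, :, :]   # (n, m, 3) pairwise differences
    dist = np.sum(diff ** 2, axis=-1)          # (n, m) squared distances
    return dist.min(axis=1).mean() + dist.min(axis=0).mean()

# Toy usage with random point clouds standing in for a predicted and a ground-truth frame.
pred = np.random.rand(128, 3).astype(np.float32)
gt = np.random.rand(128, 3).astype(np.float32)
print(chamfer_distance(pred, gt))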
License
The code is released under the MIT License.
Citation
If you find our work useful in your research, please consider citing:
@article{fan19pointrnn,
author = {Hehe Fan and Yi Yang},
title = {PointRNN: Point Recurrent Neural Network for Moving Point Cloud Processing},
journal = {arXiv},
volume = {1910.08287},
year = {2019}
}
Related Repos
- PointRNN PyTorch implementation: https://github.com/hehefan/PointRNN-PyTorch
- PointNet++ TensorFlow implementation: https://github.com/charlesq34/pointnet2
Visualization
Qualitative comparison (animated GIFs omitted): for each dataset (1-digit Moving MNIST, 2-digit Moving MNIST, two Argoverse sequences and two nuScenes sequences), the table showed the input sequence, the ground truth, and the predictions produced by PointRNN, PointGRU and PointLSTM.