Neural-Pull: Learning Signed Distance Functions from Point Clouds by Learning to Pull Space onto Surfaces (ICML 2021)
Personal Web Pages | Pytorch-Version
This repository contains the code to reproduce the results from the paper.
You can find detailed usage instructions for training your own models and using pretrained models below.
If you find our code or paper useful, please consider citing
```
@inproceedings{NeuralPull,
    title = {Neural-Pull: Learning Signed Distance Functions from Point Clouds by Learning to Pull Space onto Surfaces},
    author = {Baorui, Ma and Zhizhong, Han and Yu-Shen, Liu and Matthias, Zwicker},
    booktitle = {International Conference on Machine Learning (ICML)},
    year = {2021}
}
```
Pytorch Version
This work was originally implemented in TensorFlow; we have also implemented a PyTorch version of Neural-Pull that is easier to use. The PyTorch version is provided as a reference for researchers who are interested in PyTorch. If PyTorch is more accessible to you, please use the PyTorch repository and star it, thanks.
This PyTorch version is the original version provided by Baorui Ma, refactored by JunSheng Zhou to make it easier to understand.
Surface Reconstruction Demo
Single Image Reconstruction Demo
Installation
First, make sure that you have all dependencies in place. The simplest way to do so is to use anaconda.
You can create an anaconda environment called `tensorflow1` using
```
conda env create -f NeuralPull.yaml
conda activate tensorflow1
```
Next, for evaluation of the models, compile the extension modules, which are provided by Occupancy Networks. You can do this via
```
python setup.py build_ext --inplace
```
To compile the dmc extension, you need a CUDA-enabled device set up. If you experience any errors, you can simply comment out the dmc_* dependencies in setup.py. You should then also comment out the dmc imports in im2mesh/config.py.
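For reference, disabling the dmc extension amounts to removing its entries from the ext_modules list in setup.py. A minimal sketch of what this could look like, assuming the extension module names used in the Occupancy Networks setup.py (verify them against your copy of the file):
```python
# setup.py (excerpt) -- sketch only; module names are assumed from the
# Occupancy Networks repository and may differ in your copy.
ext_modules = [
    pykdtree,
    mcubes_module,
    triangle_hash_module,
    mise_module,
    simplify_mesh_module,
    voxelize_module,
    # dmc_pred2mesh_module,  # commented out: requires a CUDA device to build
    # dmc_cuda_module,       # commented out: requires a CUDA device to build
]
```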
Dataset and pretrained model
- You can download our preprocessed data and pretrained models. Included in the link:
  - Our pre-trained models on the ABC and FAMOUS datasets.
  - Preprocessed data for ABC and FAMOUS (sample points and ground-truth points).
  - Our reconstruction results.
- To make it easier for you to test the code, we have prepared example data in the exmaple_data folder.
Building the dataset
Alternatively, you can also preprocess the dataset yourself. To this end, follow these steps:
- Put your own point-cloud files in the 'input_dir' folder, with each point cloud in a separate .xyz.npy file (see the sketch after this list for one way to produce such files).
- Set an empty folder 'out_dir' to hold the processed data. Note that the folder needs to be empty, because it will be deleted before the program runs.
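If your point clouds are plain-text .xyz files, a minimal sketch of the conversion, assuming a file with one x y z triple per line (the file names here are placeholders, not part of the repository):
```python
# Sketch: convert a plain-text .xyz point cloud into the .xyz.npy format
# expected in 'input_dir'. File names are placeholders for illustration.
import numpy as np

points = np.loadtxt('shape_0001.xyz')              # (N, 3) array of x, y, z coordinates
np.save('input_dir/shape_0001.xyz.npy', points)    # written as input_dir/shape_0001.xyz.npy
```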
You are now ready to build the dataset:
```
python sample_query_point --out_dir /data1/mabaorui/AtlasNetOwn/data/plane_precompute_2/ --CUDA 0 --dataset other --input_dir ./data/abc_noisefree/04_pts/
```
Training
To train a new network from scratch, run:
- Surface Reconstruction
```
python NeuralPull.py --data_dir /data1/mabaorui/AtlasNetOwn/data/plane_precompute_2/ --out_dir /data1/mabaorui/AtlasNetOwn/plane_cd_sur/ --class_idx 02691156 --train --dataset shapenet
```
- Single Image Reconstruction
```
python NeuralPull_SVG.py --data_dir /data1/mabaorui/AtlasNetOwn/data/plane_precompute_2/ --out_dir /data1/mabaorui/AtlasNetOwn/plane_cd_sur/ --class_idx 02691156 --train --class_name plane
```
- Train on your own dataset
```
python NeuralPull.py --data_dir /data1/mabaorui/AtlasNetOwn/data/plane_precompute_2/ --out_dir /data1/mabaorui/AtlasNetOwn/plane_cd_sur/ --class_idx 02691156 --train --dataset other
```
Evaluation
To evaluate the models and generate meshes using a trained model, run:
- Surface Reconstruction
```
python NeuralPull.py --data_dir /data1/mabaorui/AtlasNetOwn/data/plane_precompute_2/ --out_dir /data1/mabaorui/AtlasNetOwn/plane_cd_sur/ --class_idx 02691156 --dataset shapenet
```
- Single Image Reconstruction
```
python NeuralPull_SVG.py --data_dir /data1/mabaorui/AtlasNetOwn/data/plane_precompute_2/ --out_dir /data1/mabaorui/AtlasNetOwn/plane_cd_sur/ --class_idx 02691156 --class_name plane
```
- Evaluate on your own dataset
```
python NeuralPull.py --data_dir /data1/mabaorui/AtlasNetOwn/data/plane_precompute_2/ --out_dir /data1/mabaorui/AtlasNetOwn/plane_cd_sur/ --class_idx 02691156 --dataset other
```
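Once evaluation has written a mesh to out_dir, a quick sanity check can be done with trimesh. This is a hedged sketch, not part of the repository; the output file name below is a placeholder, and the actual names and formats depend on the dataset and class being evaluated:
```python
# Sketch: inspect a reconstructed mesh written to out_dir.
# The path below is a placeholder; substitute an actual output file.
import trimesh

mesh = trimesh.load('/data1/mabaorui/AtlasNetOwn/plane_cd_sur/example_mesh.off')
print('vertices:', len(mesh.vertices))
print('faces:', len(mesh.faces))
print('watertight:', mesh.is_watertight)
```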
Script Parameters Explanation
| Parameters | Description |
| --- | --- |
| train | train a network when set; otherwise test. |
| data_dir | directory of the preprocessed data. |
| out_dir | directory to store network parameters when training, or to load pretrained network parameters from when testing. |
| class_idx | the class ID to train or test when using the shapenet dataset (e.g. 02691156); for other datasets, keep the default. |
| class_name | the class name to train or test when using the shapenet dataset (e.g. plane); for other datasets, keep the default. |
| dataset | shapenet, famous, ABC, or other (your own dataset). |
Pytorch Implementation of Neural-Pull
Notably, the code of the PyTorch implementation was not released by the official lab; it is the result of @wzxshgz123's diligent work. His intention is only to provide a reference for researchers who are interested in a PyTorch implementation of Neural-Pull. There is no doubt that his unconditional dedication should be appreciated.