
IRON: Inverse Rendering by Optimizing Neural SDFs and Materials from Photometric Images

Note: this repo is still under construction.

Project page: https://kai-46.github.io/IRON-website/


Usage

Create environment

git clone https://github.com/Kai-46/iron.git && cd iron && . ./create_env.sh

Download data

. ./download_data.sh

Training and testing

. ./train_scene.sh drv/dragon

Once training is done, the recovered mesh and materials are written to ./exp_iron_stage2/drv/dragon/mesh_and_materials_50000/, and the rendered test images to ./exp_iron_stage2/drv/dragon/render_test_50000/.
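To sanity-check the output, you can load the recovered mesh with a generic mesh library. The snippet below is a minimal sketch assuming trimesh is installed; the mesh filename inside mesh_and_materials_50000/ is a placeholder and may differ.

```python
# Minimal sketch: inspect the recovered mesh with trimesh (not part of this repo).
# The mesh filename below is a placeholder; adjust it to whatever the folder contains.
import trimesh

mesh = trimesh.load(
    "./exp_iron_stage2/drv/dragon/mesh_and_materials_50000/mesh.obj",
    force="mesh",  # make sure we get a single Trimesh even if materials are present
)
print("vertices:", mesh.vertices.shape, "faces:", mesh.faces.shape)
print("bounding box extents:", mesh.bounding_box.extents)
```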

Relight the 3D assets using envmaps

Check test_mitsuba/render_rgb_envmap_mat.py.
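For orientation, the snippet below is a minimal Mitsuba 3 sketch of rendering under an environment map. It is not the repo's script (which may target a different Mitsuba version), and scene.xml is a hypothetical scene description referencing the exported mesh, materials, and envmap.

```python
# Minimal Mitsuba 3 sketch (NOT render_rgb_envmap_mat.py; the repo's script may
# target a different Mitsuba version). scene.xml is a hypothetical scene file
# referencing the exported mesh/materials and an environment-map emitter.
import mitsuba as mi

mi.set_variant("scalar_rgb")        # pick any variant available in your build
scene = mi.load_file("scene.xml")   # hypothetical scene description
image = mi.render(scene, spp=64)    # 64 samples per pixel
mi.util.write_bitmap("relit.png", image)
```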

Evaluation

Check evaluation/eval_mesh.py and evaluation/eval_image_folder.py.
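If you just want a quick image-metric check, a folder-to-folder PSNR comparison looks roughly like the sketch below. This is not evaluation/eval_image_folder.py; the ground-truth folder path is hypothetical and filenames are assumed to match between the two folders.

```python
# Minimal sketch of an image-folder PSNR comparison (not the repo's
# evaluation/eval_image_folder.py). Assumes 8-bit images with matching filenames.
import glob, os
import numpy as np
import imageio.v2 as imageio

def psnr(pred, gt):
    mse = np.mean((pred.astype(np.float64) - gt.astype(np.float64)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)

pred_dir = "./exp_iron_stage2/drv/dragon/render_test_50000"
gt_dir = "./data/drv/dragon/test/image"   # hypothetical ground-truth folder

scores = []
for pred_path in sorted(glob.glob(os.path.join(pred_dir, "*.png"))):
    gt_path = os.path.join(gt_dir, os.path.basename(pred_path))
    if os.path.exists(gt_path):
        scores.append(psnr(imageio.imread(pred_path), imageio.imread(gt_path)))

print("mean PSNR over", len(scores), "images:", np.mean(scores))
```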

Render synthetic data using Mitsuba

Check render_synthetic_data/render_rgb_flash_mat.py. To make renderings shinier, try scaling up the specular albedo and scaling down the specular roughness; to make them more diffuse, try the opposite.
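As an illustration of that adjustment, the sketch below scales a hypothetical material-parameter dict; the actual parameter names used in render_rgb_flash_mat.py may differ.

```python
# Illustration of the shininess adjustment described above, using a
# hypothetical material-parameter dict (actual parameter names may differ).
def make_shinier(mat, factor=2.0):
    # More specular reflection, tighter highlights.
    return {**mat,
            "specular_albedo": min(1.0, mat["specular_albedo"] * factor),
            "specular_roughness": mat["specular_roughness"] / factor}

def make_more_diffuse(mat, factor=2.0):
    # The opposite adjustment.
    return {**mat,
            "specular_albedo": mat["specular_albedo"] / factor,
            "specular_roughness": min(1.0, mat["specular_roughness"] * factor)}

material = {"diffuse_albedo": 0.6, "specular_albedo": 0.2, "specular_roughness": 0.4}
print(make_shinier(material))
print(make_more_diffuse(material))
```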

Camera parameters convention

We use the OpenCV camera convention, just like NeRF++; you might want to use the camera visualization and debugging tools in that codebase to check whether there are any issues with your camera parameters. Note that we also assume the objects lie inside the unit sphere.
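Concretely, the OpenCV convention means x points right, y points down, and z points forward from the camera, and a world point is mapped to pixels via the world-to-camera pose [R | t] and intrinsics K. The sketch below uses placeholder matrices and also asserts the unit-sphere assumption.

```python
# Minimal sketch of the OpenCV pinhole convention (x right, y down, z forward).
# K, R, t below are illustrative placeholders, not values from this repo.
import numpy as np

K = np.array([[800.0,   0.0, 400.0],   # fx,  0, cx
              [  0.0, 800.0, 300.0],   #  0, fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # world-to-camera rotation
t = np.array([0.0, 0.0, 2.0])          # world-to-camera translation

x_world = np.array([0.1, -0.2, 0.3])
assert np.linalg.norm(x_world) < 1.0, "object points are assumed to lie inside the unit sphere"

x_cam = R @ x_world + t                # camera coordinates (z > 0 means in front of the camera)
u, v = (K @ x_cam)[:2] / x_cam[2]      # perspective projection to pixel coordinates
print(u, v)
```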

Citations

@inproceedings{iron-2022,
  title={IRON: Inverse Rendering by Optimizing Neural SDFs and Materials from Photometric Images},
  author={Zhang, Kai and Luan, Fujun and Li, Zhengqi and Snavely, Noah},
  booktitle={IEEE Conf. Comput. Vis. Pattern Recog.},
  year={2022}
}

Example results

(Video: dragon.mp4, plus an example results image.)

Acknowledgements

We would like to thank the authors of IDR and NeuS for open-sourcing their projects.
