NeRF-pytorch

NeRF (Neural Radiance Fields) is a method that achieves state-of-the-art results for synthesizing novel views of complex scenes. Pre-trained models are provided below.

This project is a faithful PyTorch implementation of NeRF that reproduces the results while running 1.3 times faster. The code is based on the authors' TensorFlow implementation and has been tested to match it numerically.

Installation

git clone https://github.com/yenchenlin/nerf-pytorch.git
cd nerf-pytorch
pip install -r requirements.txt
Dependencies

  • PyTorch 1.4
  • matplotlib
  • numpy
  • imageio
  • imageio-ffmpeg
  • configargparse

The LLFF data loader requires ImageMagick.

You will also need the LLFF code (and COLMAP) set up to compute poses if you want to run on your own real data.
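Once LLFF's pose computation has run, it writes a poses_bounds.npy file per scene, which this repo's LLFF data loader consumes. A minimal sketch of parsing that file, assuming the standard LLFF layout (each row is a flattened 3x5 pose matrix plus near/far depth bounds; the dummy zero array below stands in for a real np.load of a scene file):

```python
import numpy as np

def parse_poses_bounds(pb: np.ndarray):
    """Split LLFF's (N, 17) poses_bounds array into poses and depth bounds."""
    poses = pb[:, :-2].reshape(-1, 3, 5)  # 3x4 camera-to-world + [H, W, focal] column
    bounds = pb[:, -2:]                   # per-image near/far depth bounds
    return poses, bounds

# Dummy data standing in for np.load("data/nerf_llff_data/fern/poses_bounds.npy"):
pb = np.zeros((20, 17))
poses, bounds = parse_poses_bounds(pb)
print(poses.shape, bounds.shape)  # (20, 3, 5) (20, 2)
```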

How To Run?

Quick Start

Download the data for two example datasets, lego and fern:

bash download_example_data.sh

To train a low-res lego NeRF:

python run_nerf.py --config configs/lego.txt

After training for 100k iterations (~4 hours on a single 2080 Ti), you can find the following video at logs/lego_test/lego_test_spiral_100000_rgb.mp4.
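The configs/*.txt files passed via --config are read with configargparse, which treats key = value lines in the file as if they were CLI flags. A rough standard-library sketch of that mapping (the flag names and parsing here are illustrative, not the script's actual behavior):

```python
import argparse

def config_to_flags(text: str):
    """Turn 'key = value' config lines into a flat list of CLI flags."""
    flags = []
    for line in text.splitlines():
        line = line.split("#")[0].strip()  # drop comments and blanks
        if line:
            k, v = [s.strip() for s in line.split("=", 1)]
            flags += [f"--{k}", v]
    return flags

# Hypothetical excerpt of a configs/lego.txt-style file:
sample = "expname = lego_test\nN_samples = 64\n"
p = argparse.ArgumentParser()
p.add_argument("--expname")
p.add_argument("--N_samples", type=int)
args = p.parse_args(config_to_flags(sample))
print(args.expname, args.N_samples)  # lego_test 64
```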


To train a low-res fern NeRF:

python run_nerf.py --config configs/fern.txt

After training for 200k iterations (~8 hours on a single 2080 Ti), you can find the following videos at logs/fern_test/fern_test_spiral_200000_rgb.mp4 and logs/fern_test/fern_test_spiral_200000_disp.mp4.


More Datasets

To play with other scenes presented in the paper, download the data here. Place the downloaded dataset according to the following directory structure:

├── configs
│   ├── ...
│
├── data
│   ├── nerf_llff_data
│   │   ├── fern
│   │   ├── flower  # downloaded llff dataset
│   │   ├── horns   # downloaded llff dataset
│   │   └── ...
│   ├── nerf_synthetic
│   │   ├── lego
│   │   ├── ship    # downloaded synthetic dataset
│   │   └── ...

To train NeRF on different datasets:

python run_nerf.py --config configs/{DATASET}.txt

Replace {DATASET} with trex | horns | flower | fortress | lego | etc.


To test NeRF trained on different datasets:

python run_nerf.py --config configs/{DATASET}.txt --render_only

Replace {DATASET} with trex | horns | flower | fortress | lego | etc.

Pre-trained Models

You can download the pre-trained models here. Place the downloaded directory in ./logs in order to test it later. See the following directory structure for an example:

├── logs
│   ├── fern_test
│   ├── flower_test  # downloaded logs
│   ├── trex_test    # downloaded logs

Reproducibility

Tests that ensure the results of all functions and the training loop match the official implementation are contained in a separate branch, reproduce. One can check it out and run the tests:

git checkout reproduce
py.test
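Conceptually, these tests assert that corresponding functions in the two implementations agree numerically within floating-point tolerance. A toy sketch of such a check, using an illustrative positional-encoding function (not the repo's actual code) and comparing float64 against float32 inputs:

```python
import numpy as np

def positional_encoding(x, n_freqs=4):
    """Toy NeRF-style positional encoding: [x, sin(2^i x), cos(2^i x), ...]."""
    feats = [x]
    for i in range(n_freqs):
        feats += [np.sin(2.0**i * x), np.cos(2.0**i * x)]
    return np.concatenate(feats, axis=-1)

# Numerical-equivalence check in the spirit of the reproduce-branch tests:
x = np.linspace(-1.0, 1.0, 16)
a = positional_encoding(x)                      # float64 path
b = positional_encoding(x.astype(np.float32))   # float32 path
assert np.allclose(a, b, atol=1e-5)
```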

Method

NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis
Ben Mildenhall*¹, Pratul P. Srinivasan*¹, Matthew Tancik*¹, Jonathan T. Barron², Ravi Ramamoorthi³, Ren Ng¹
¹UC Berkeley, ²Google Research, ³UC San Diego
*denotes equal contribution

A neural radiance field is a simple fully connected network (weights are ~5 MB) trained to reproduce input views of a single scene using a rendering loss. The network maps directly from spatial location and viewing direction (5D input) to color and opacity (4D output), acting as the "volume", so we can use volume rendering to differentiably render new views.
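The 5D-to-4D mapping and the volume-rendering step described above can be sketched with a toy radiance function standing in for the trained MLP. Everything here is illustrative (no positional encoding, a made-up radiance field, plain NumPy), not the repo's code:

```python
import numpy as np

def toy_radiance_field(xyz, viewdir):
    """Stand-in for the trained MLP: maps 5D input to RGB color + density."""
    rgb = 0.5 * (np.sin(xyz) + 1.0)                # fake colors in [0, 1]
    sigma = np.exp(-np.linalg.norm(xyz, axis=-1))  # fake density, decays with distance
    return rgb, sigma

def render_ray(origin, direction, near=2.0, far=6.0, n_samples=64):
    """Classic volume-rendering quadrature along a single ray."""
    t = np.linspace(near, far, n_samples)
    pts = origin + t[:, None] * direction          # sample points along the ray
    rgb, sigma = toy_radiance_field(pts, direction)
    delta = np.diff(t, append=t[-1] + (t[1] - t[0]))
    alpha = 1.0 - np.exp(-sigma * delta)           # opacity of each segment
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # transmittance
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)    # composited ray color

color = render_ray(np.zeros(3), np.array([0.0, 0.0, 1.0]))
```

Because the per-sample weights are non-negative and sum to at most one, the composited color stays in [0, 1]; the real implementation does the same computation batched over rays in PyTorch.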

Citation

Kudos to the authors for their amazing results:

@misc{mildenhall2020nerf,
    title={NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis},
    author={Ben Mildenhall and Pratul P. Srinivasan and Matthew Tancik and Jonathan T. Barron and Ravi Ramamoorthi and Ren Ng},
    year={2020},
    eprint={2003.08934},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}

However, if you find this implementation or the pre-trained models helpful, please consider citing:

@misc{lin2020nerfpytorch,
  title={NeRF-pytorch},
  author={Yen-Chen, Lin},
  publisher={GitHub},
  journal={GitHub repository},
  howpublished={\url{https://github.com/yenchenlin/nerf-pytorch/}},
  year={2020}
}
