
PyTorch NeRF and pixelNeRF

Minimal PyTorch implementations of NeRF and pixelNeRF.

NeRF: Open NeRF in Colab

Tiny NeRF: Open Tiny NeRF in Colab

pixelNeRF: Open pixelNeRF in Colab

This repository contains minimal PyTorch implementations of the NeRF model described in "NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis" and the pixelNeRF model described in "pixelNeRF: Neural Radiance Fields from One or Few Images". While there are other PyTorch implementations out there (e.g., this one and this one for NeRF, and the authors' official implementation for pixelNeRF), I personally found them somewhat difficult to follow, so I decided to do a complete rewrite of NeRF myself. I tried to stay as close to the authors' text as possible, and I added comments in the code referring back to the relevant sections/equations in the paper. The final result is a tight 355 lines of heavily commented code (301 sloc, "source lines of code", on GitHub) all contained in a single file. For comparison, this PyTorch implementation has approximately 970 sloc spread across several files, while this PyTorch implementation has approximately 905 sloc.
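As a taste of how closely code can track the paper, here is a minimal sketch of the positional encoding from Equation (4) of the NeRF paper (written for this overview; it is not the repository's code verbatim):

import torch

def positional_encoding(p, L=10):
    # NeRF Eq. (4): gamma(p) = (sin(2^0 pi p), cos(2^0 pi p), ...,
    # sin(2^(L-1) pi p), cos(2^(L-1) pi p)), applied to each coordinate.
    # p: (..., D) -> output: (..., 2 * L * D).
    freqs = 2.0 ** torch.arange(L, dtype=p.dtype, device=p.device) * torch.pi
    angles = p[..., None] * freqs  # (..., D, L)
    return torch.cat([angles.sin(), angles.cos()], dim=-1).flatten(start_dim=-2)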

run_tiny_nerf.py trains a simplified NeRF model inspired by the "Tiny NeRF" example provided by the NeRF authors. This model skips hierarchical ("fine") sampling and uses a smaller MLP, but the code is otherwise identical to the full model's. At only 153 sloc, it might be a good place to start for people who are completely new to NeRF. If you prefer your code more object-oriented, check out run_nerf_alt.py and run_tiny_nerf_alt.py.
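Both the full and tiny scripts ultimately come down to the volume rendering quadrature from Equation (3) of the paper. A rough sketch of that compositing step (again mine, not the repository's exact code):

import torch

def render_rays(sigmas, rgbs, ts):
    # NeRF Eq. (3): C_hat(r) = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    # with T_i = exp(-sum_{j<i} sigma_j * delta_j).
    # sigmas: (num_rays, num_samples), rgbs: (num_rays, num_samples, 3),
    # ts: (num_rays, num_samples) sample depths along each ray.
    deltas = ts[..., 1:] - ts[..., :-1]  # distances between adjacent samples
    deltas = torch.cat([deltas, 1e10 * torch.ones_like(deltas[..., :1])], dim=-1)
    alphas = 1.0 - torch.exp(-sigmas * deltas)  # per-sample opacity
    # Accumulated transmittance: T_i = prod_{j<i} (1 - alpha_j).
    trans = torch.cumprod(1.0 - alphas + 1e-10, dim=-1)
    trans = torch.cat([torch.ones_like(trans[..., :1]), trans[..., :-1]], dim=-1)
    weights = alphas * trans  # (num_rays, num_samples)
    return (weights[..., None] * rgbs).sum(dim=-2)  # (num_rays, 3)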

A Colab notebook for the full model can be found here, while a notebook for the tiny model can be found here. The generate_nerf_dataset.py script was used to generate the training data of the ShapeNet car (see "Generating the ShapeNet datasets" for additional details).

For the following test view:

run_nerf.py generated the following after 20,100 iterations (a few hours on a P100 GPU):

Loss: 0.00022201683896128088

while run_tiny_nerf.py generated the following after 19,600 iterations (~35 minutes on a P100 GPU):

Loss: 0.0004151524917688221

The advantages of streamlining NeRF's code become readily apparent when trying to extend it. For example, training a pixelNeRF model only required making a few changes to run_nerf.py, bringing it to 368 sloc (notebook here). For comparison, the official pixelNeRF implementation has approximately 1,300 pixelNeRF-specific (i.e., not related to the image encoder or dataset) sloc spread across several files. The generate_pixelnerf_dataset.py script was used to generate the training data of ShapeNet cars (see "Generating the ShapeNet datasets" for additional details).
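The core pixelNeRF change is conditioning the MLP on encoder features sampled where each 3D query point projects into the source image. A sketch of that sampling step, with assumed shapes and a hypothetical function name (not the repository's API):

import torch
import torch.nn.functional as F

def sample_pixel_features(feats, uv):
    # feats: (1, C, H, W) feature map from the CNN image encoder.
    # uv: (num_points, 2) pixel coordinates of the 3D query points projected
    # into the source view, normalized to [-1, 1] as grid_sample expects.
    grid = uv.view(1, -1, 1, 2)  # (1, num_points, 1, 2)
    sampled = F.grid_sample(feats, grid, align_corners=True)  # (1, C, num_points, 1)
    return sampled[0, :, :, 0].t()  # (num_points, C)

Each per-point feature vector is then fed to the NeRF MLP alongside the positionally encoded query point.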

For the following source object and view:

and target view:

run_pixelnerf.py generated the following after 73,243 iterations (~12 hours on a P100 GPU; the full pixelNeRF model was trained for 400,000 iterations, which took six days):

Loss: 0.004468636587262154

The "smearing" is an artifact caused by the bounding box sampling method.

Generating the ShapeNet datasets

  1. Download the data (the ShapeNet server is pretty slow, so this will take a while):
SHAPENET_BASE_DIR=<path/to/your/shapenet/root>
nohup wget --quiet -P ${SHAPENET_BASE_DIR} http://shapenet.cs.stanford.edu/shapenet/obj-zip/ShapeNetCore.v2.zip > shapenet.log &
  2. Unzip the data:
cd ${SHAPENET_BASE_DIR}
nohup unzip -q ShapeNetCore.v2.zip > shapenet.log &
  3. After the file is done unzipping, remove the ZIP:
rm ShapeNetCore.v2.zip
  4. Change the SHAPENET_DIR variable in generate_nerf_dataset.py and generate_pixelnerf_dataset.py to <path/to/your/shapenet/root>/ShapeNetCore.v2 (see the example below).
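After step 4, the top of each script would contain something like the following (an illustration, not the scripts' verbatim contents; the angle bracket placeholder must still be replaced with your actual ShapeNet root):

# In generate_nerf_dataset.py and generate_pixelnerf_dataset.py:
SHAPENET_DIR = "<path/to/your/shapenet/root>/ShapeNetCore.v2"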

More Repositories

  1. Deep-Semantic-Similarity-Model (Python, 519 stars): My Keras implementation of the Deep Semantic Similarity Model (DSSM)/Convolutional Latent Semantic Model (CLSM) described here: http://research.microsoft.com/pubs/226585/cikm2014_cdssm_final.pdf.
  2. Michael-s-Data-Science-Curriculum (395 stars): This is the companion curriculum to my guide to becoming a data scientist.
  3. RankNet (Python, 246 stars): My (slightly modified) Keras implementation of RankNet and PyTorch implementation of LambdaRank.
  4. Recurrent-Convolutional-Neural-Network-Text-Classifier (Python, 184 stars): My (slightly modified) Keras implementation of the Recurrent Convolutional Neural Network (RCNN) described here: http://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/view/9745.
  5. Solr-LTR (174 stars): From Zero to Learning to Rank in Apache Solr.
  6. batter-pitcher-2vec (Jupyter Notebook, 80 stars): A model for learning distributed representations of MLB players.
  7. strike-with-a-pose (Python, 77 stars): A simple GUI tool for generating adversarial poses of objects.
  8. baller2vec (Python, 63 stars): A multi-entity Transformer for multi-agent spatiotemporal modeling.
  9. Michael-s-Guide-to-Becoming-a-Data-Scientist (40 stars): I was once asked about transitioning to a career in data science by three different UChicago grad students over a short period of time, so I decided to put together this outline in case anyone else was curious.
  10. baller2vecplusplus (Python, 38 stars): A look-ahead multi-entity Transformer for modeling coordinated agents.
  11. pytorch-geodesic-loss (Python, 31 stars): A PyTorch criterion for computing the distance between rotation matrices.
  12. Color-Names (Python, 24 stars): An improved version of the color name model described here: http://lewisandquark.tumblr.com/post/160776374467/new-paint-colors-invented-by-neural-network.
  13. vqvae-pytorch (Python, 21 stars): A minimal PyTorch implementation of the VQ-VAE model described in "Neural Discrete Representation Learning".
  14. LMIR (Python, 13 stars): Pure Python implementations of the language models for information retrieval surveyed here: https://dl.acm.org/doi/10.1145/383952.384019.
  15. Football-o-Genetics (Java, 10 stars): An application for "evolving" near-optimal offensive play calling strategies.
  16. paved2paradise (Python, 8 stars): Cost-effective and scalable LiDAR simulation by factoring the real world.
  17. deformer (Python, 6 stars): An order-agnostic distribution estimating Transformer.
  18. shallow-deep-learning (Python, 6 stars): The code and slides for my "A Shallow Introduction to Deep Learning" workshop.
  19. pytorch-ipdf (Python, 5 stars): Minimal PyTorch implementation of implicit-PDF.
  20. Hangouts-NLP (Python, 5 stars): A program that performs a number of different natural language processing analyses on Google Hangouts instant messaging data.
  21. Sequences-With-Sentences (Python, 4 stars): A convolutional recurrent neural network that can handle data sequences containing a mixture of fixed size and variable size (e.g., text) inputs at each time step.
  22. ScatterPlot3D (Java, 2 stars): An application for visualizing and exploring three-dimensional scatter plot data.
  23. boformer (Python, 2 stars).
  24. pytorch-volume-rotator (Python, 2 stars): Applies explicit 3D transformations to feature volumes in PyTorch.
  25. aquamam (Python, 1 star): An autoregressive, quaternion manifold model for rapidly estimating complex SO(3) distributions.
  26. parking-lot-pointnetplusplus (Python, 1 star): Train a PointNet++ bounding box regression model on parking lot samples obtained by following the Paved2Paradise protocol.