Noise Conditional Score Networks (NeurIPS 2019, Oral)

Generative Modeling by Estimating Gradients of the Data Distribution

This repo contains the official implementation for the NeurIPS 2019 paper Generative Modeling by Estimating Gradients of the Data Distribution,

by Yang Song and Stefano Ermon, Stanford AI Lab.

Note: The method has been greatly stabilized by the subsequent work Improved Techniques for Training Score-Based Generative Models (code) and more recently extended by Score-Based Generative Modeling through Stochastic Differential Equations (code). This codebase is therefore no longer recommended for new projects.


We describe a new method of generative modeling based on estimating the derivative of the log density function (a.k.a. the Stein score) of the data distribution. We first perturb our training data with Gaussian noise of progressively smaller variances. Next, we estimate the score function of each perturbed data distribution by training a single shared neural network, the Noise Conditional Score Network (NCSN), using score matching. We can then directly produce samples from our NCSN with annealed Langevin dynamics.
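For intuition, the sampling loop of annealed Langevin dynamics (Algorithm 1 in the paper) can be sketched as below. This is a minimal illustration, not the repo's implementation; the score_net signature, the noise schedule, and the hyperparameter values are assumptions.

import math
import torch

def annealed_langevin_dynamics(score_net, x, sigmas, n_steps=100, step_lr=2e-5):
    # sigmas: list of noise levels in decreasing order (e.g. a geometric
    # sequence from 1.0 down to 0.01); x: initial samples, e.g. uniform noise.
    for i, sigma in enumerate(sigmas):
        # Per-level step size alpha_i = step_lr * (sigma_i / sigma_L)^2,
        # as in Algorithm 1 of the paper.
        step_size = step_lr * (sigma / sigmas[-1]) ** 2
        for _ in range(n_steps):
            labels = torch.full((x.shape[0],), i, dtype=torch.long, device=x.device)
            grad = score_net(x, labels)  # estimated score at noise level i
            noise = torch.randn_like(x)
            x = x + step_size * grad + math.sqrt(2 * step_size) * noise
    return x

A decreasing schedule such as sigmas = [1.0 * (0.01 / 1.0) ** (i / 9) for i in range(10)] (ten geometrically spaced levels, mirroring the paper's CIFAR-10 setup) lets the chain explore broadly at high noise before refining details at low noise.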

Dependencies

  • PyTorch

  • PyYAML

  • tqdm

  • pillow

  • tensorboardX

  • seaborn
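These can typically be installed with pip (an assumption about your environment; pick the PyTorch build matching your CUDA version):

pip install torch pyyaml tqdm pillow tensorboardX seaborn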

Running Experiments

Project Structure

main.py is the common gateway to all experiments. Run python main.py --help to see its usage:

usage: main.py [-h] [--runner RUNNER] [--config CONFIG] [--seed SEED]
               [--run RUN] [--doc DOC] [--comment COMMENT] [--verbose VERBOSE]
               [--test] [--resume_training] [-o IMAGE_FOLDER]

optional arguments:
  -h, --help            show this help message and exit
  --runner RUNNER       The runner to execute
  --config CONFIG       Path to the config file
  --seed SEED           Random seed
  --run RUN             Path for saving running related data.
  --doc DOC             A string for documentation purpose
  --comment COMMENT     A string for experiment comment
  --verbose VERBOSE     Verbose level: info | debug | warning | critical
  --test                Whether to test the model
  --resume_training     Whether to resume training
  -o IMAGE_FOLDER, --image_folder IMAGE_FOLDER
                        The directory of image outputs

There are four runner classes.

  • AnnealRunner: the main runner class for experiments related to NCSN and annealed Langevin dynamics.
  • BaselineRunner: compared to AnnealRunner, this runner does not anneal the noise; it uses a single fixed noise variance instead.
  • ScoreNetRunner: the runner class for reproducing the experiment of Figure 1 (Middle, Right).
  • ToyRunner: the runner class for reproducing the experiments of Figure 2 and Figure 3.

Configuration files are stored in configs/. For example, the configuration file of AnnealRunner is configs/anneal.yml. Log files are commonly stored in run/logs/doc_name, and tensorboard files are in run/tensorboard/doc_name. Here doc_name is the value fed to option --doc.
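As a mental model, the gateway can resolve --runner and --config to a class and a parsed YAML file along the following lines. This is a hedged sketch, not the repo's actual main.py; the runners module name and the runner interface are assumptions.

import argparse
import importlib
import yaml

parser = argparse.ArgumentParser()
parser.add_argument('--runner', default='AnnealRunner')
parser.add_argument('--config', default='anneal.yml')
args, _ = parser.parse_known_args()

with open('configs/' + args.config) as f:
    config = yaml.safe_load(f)  # PyYAML (a listed dependency) parses the config

module = importlib.import_module('runners')          # assumed package name
runner = getattr(module, args.runner)(args, config)  # resolve the class by name
runner.train()                                       # assumed entry point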

Training

The usage of main.py is largely self-explanatory. For example, we can train an NCSN by running

python main.py --runner AnnealRunner --config anneal.yml --doc cifar10

The model will then be trained according to the configuration file configs/anneal.yml. Log files will be stored in run/logs/cifar10, and tensorboard logs in run/tensorboard/cifar10.
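Training progress can be monitored with TensorBoard (assuming it is installed in your environment) by pointing it at the directory above:

tensorboard --logdir run/tensorboard/cifar10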

Sampling

Suppose the log files are stored in run/logs/cifar10. We can then generate samples into the folder samples by running

python main.py --runner AnnealRunner --test -o samples

Checkpoints

We provide pretrained checkpoints in run.zip. Extract the archive to the root folder of the repository. Using this checkpoint, you should be able to produce samples like the following.

[Animations: the annealed Langevin dynamics sampling procedure on MNIST, CelebA, and CIFAR-10.]
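To reproduce this, assuming a standard unzip tool and that the archive unpacks into run/ at the repository root:

unzip run.zip
python main.py --runner AnnealRunner --test -o samples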

Evaluation

Please refer to Appendix B.2 of our paper for details on hyperparameters and model selection. When computing Inception and FID scores, we first generate images from our model, then use the official code from OpenAI and the original code from the TTUR authors to obtain the scores.
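For orientation, the FID itself is the Fréchet distance between Gaussians fitted to Inception activations of real and generated images. Below is a minimal NumPy/SciPy sketch of that closed-form distance, an illustration rather than the official TTUR code; mu1, sigma1, mu2, sigma2 are assumed precomputed activation means and covariances.

import numpy as np
from scipy import linalg

def frechet_distance(mu1, sigma1, mu2, sigma2):
    # FID = ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 * (sigma1 @ sigma2)^(1/2))
    diff = mu1 - mu2
    covmean, _ = linalg.sqrtm(sigma1 @ sigma2, disp=False)
    covmean = covmean.real  # drop tiny imaginary parts from numerical error
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))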

References

Large parts of the code are derived from this GitHub repo (the official implementation of the sliced score matching paper).

If you find the code or the ideas inspiring for your research, please consider citing the following:

@inproceedings{song2019generative,
  title={Generative Modeling by Estimating Gradients of the Data Distribution},
  author={Song, Yang and Ermon, Stefano},
  booktitle={Advances in Neural Information Processing Systems},
  pages={11895--11907},
  year={2019}
}

and / or

@inproceedings{song2019sliced,
  author    = {Yang Song and
               Sahaj Garg and
               Jiaxin Shi and
               Stefano Ermon},
  title     = {Sliced Score Matching: {A} Scalable Approach to Density and Score
               Estimation},
  booktitle = {Proceedings of the Thirty-Fifth Conference on Uncertainty in Artificial
               Intelligence, {UAI} 2019, Tel Aviv, Israel, July 22-25, 2019},
  pages     = {204},
  year      = {2019},
  url       = {http://auai.org/uai2019/proceedings/papers/204.pdf},
}
