The official PyTorch implementation for NCSNv2 (NeurIPS 2020)

Improved Techniques for Training Score-Based Generative Models

This repo contains the official implementation for the paper Improved Techniques for Training Score-Based Generative Models.

by Yang Song and Stefano Ermon, Stanford AI Lab.

Note: The method has been extended by the subsequent work Score-Based Generative Modeling through Stochastic Differential Equations (code), which achieves better sample quality and enables exact log-likelihood computation.


We significantly improve the method proposed in Generative Modeling by Estimating Gradients of the Data Distribution. Score-based generative models are flexible neural networks trained to capture the score function of an underlying data distribution, a vector field pointing in the directions where the data density increases most rapidly. We present new techniques that improve the performance of score-based generative models and scale them to high-resolution images that were previously out of reach. Without requiring adversarial training, they can produce sharp and diverse image samples that rival GANs.
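To make the training objective concrete, below is a minimal sketch of denoising score matching over multiple noise scales, the objective family these models are trained with. This is not the repo's actual training code: score_net is a hypothetical network taking noisy images and noise-level indices, and sigmas is an assumed 1-D tensor of noise levels sorted from largest to smallest.

import torch

def anneal_dsm_loss(score_net, x, sigmas):
    # Draw one noise level per example from the ladder `sigmas`.
    labels = torch.randint(0, len(sigmas), (x.shape[0],), device=x.device)
    sigma = sigmas[labels].view(x.shape[0], *([1] * (x.dim() - 1)))
    noise = torch.randn_like(x)
    x_noisy = x + sigma * noise
    # Score of the Gaussian perturbation kernel: -(x_noisy - x) / sigma^2 = -noise / sigma.
    target = -noise / sigma
    score = score_net(x_noisy, labels)
    # A sigma^2 weighting balances the loss magnitude across noise levels.
    per_example = 0.5 * ((score - target) ** 2).flatten(1).sum(dim=1)
    return (per_example * (sigma.flatten() ** 2)).mean()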

(From left to right: our samples on FFHQ 256px, LSUN bedroom 128px, LSUN tower 128px, LSUN church_outdoor 96px, and CelebA 64px.)

Running Experiments

Dependencies

Run the following to install all the necessary Python packages for our code.

pip install -r requirements.txt

Project structure

main.py is the file that you should run for both training and sampling. Execute python main.py --help to get its usage description:

usage: main.py [-h] --config CONFIG [--seed SEED] [--exp EXP] --doc DOC
               [--comment COMMENT] [--verbose VERBOSE] [--test] [--sample]
               [--fast_fid] [--resume_training] [-i IMAGE_FOLDER] [--ni]

optional arguments:
  -h, --help            show this help message and exit
  --config CONFIG       Path to the config file
  --seed SEED           Random seed
  --exp EXP             Path for saving running related data.
  --doc DOC             A string for documentation purpose. Will be the name
                        of the log folder.
  --comment COMMENT     A string for experiment comment
  --verbose VERBOSE     Verbose level: info | debug | warning | critical
  --test                Whether to test the model
  --sample              Whether to produce samples from the model
  --fast_fid            Whether to do fast fid test
  --resume_training     Whether to resume training
  -i IMAGE_FOLDER, --image_folder IMAGE_FOLDER
                        The folder name of samples
  --ni                  No interaction. Suitable for Slurm Job launcher

Configuration files are in config/. You don't need to include the config/ prefix when specifying --config. All files generated when running the code are stored under the directory specified by --exp. They are structured as follows:

<exp> # a folder named by the argument `--exp` given to main.py
├── datasets # all dataset files
├── logs # contains checkpoints and samples produced during training
│   └── <doc> # a folder named by the argument `--doc` specified to main.py
│       ├── checkpoint_x.pth # the checkpoint file saved at the x-th training iteration
│       ├── config.yml # the configuration file for training this model
│       ├── stdout.txt # all outputs to the console during training
│       └── samples # all samples produced during training
├── fid_samples # contains all samples generated for fast fid computation
│   └── <i> # a folder named by the argument `-i` specified to main.py
│       └── ckpt_x # a folder of image samples generated from checkpoint_x.pth
├── image_samples # contains generated samples
│   └── <i>
│       └── image_grid_x.png # samples generated from checkpoint_x.pth
└── tensorboard # tensorboard files for monitoring training
    └── <doc> # this is the log_dir of tensorboard

Training

For example, we can train an NCSNv2 on LSUN bedroom by running the following

python main.py --config bedroom.yml --doc bedroom

Log files will be saved in <exp>/logs/bedroom.
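If a run is interrupted, training can be resumed from the saved checkpoints with the --resume_training flag documented in the usage above, keeping the same --doc:

python main.py --config bedroom.yml --doc bedroom --resume_training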

Sampling

If we want to sample from NCSNv2 on LSUN bedroom, we can edit bedroom.yml to specify ckpt_id under the sampling group, and then run the following

python main.py --sample --config bedroom.yml -i bedroom

Samples will be saved in <exp>/image_samples/bedroom.
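Conceptually, sampling runs annealed Langevin dynamics over a sequence of decreasing noise levels, as described in the paper. Below is a minimal sketch of that procedure, not the repo's actual sampler; score_net, sigmas, eps, and T are illustrative names, and the default values are assumptions.

import torch

@torch.no_grad()
def annealed_langevin(score_net, shape, sigmas, eps=2e-5, T=100, device="cpu"):
    x = torch.rand(shape, device=device)  # initialize from uniform noise
    for i, sigma in enumerate(sigmas):    # sigmas ordered from largest to smallest
        alpha = eps * (sigma / sigmas[-1]) ** 2  # step size shrinks with the noise level
        labels = torch.full((shape[0],), i, dtype=torch.long, device=device)
        for _ in range(T):                # T Langevin steps per noise level
            z = torch.randn_like(x)
            x = x + alpha / 2 * score_net(x, labels) + (alpha ** 0.5) * z
    return x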

We can interpolate between different samples (see the paper for more details). Just set interpolation to true and pick an appropriate n_interpolations under the sampling group in bedroom.yml. We can also perform other tasks such as inpainting; usage should be clear from the code and configuration files.
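For reference, the relevant portion of bedroom.yml might look like the sketch below. Only ckpt_id, interpolation, and n_interpolations are named above; the values shown are hypothetical, and the real sampling group contains additional fields.

sampling:
  ckpt_id: 100000       # checkpoint iteration to sample from (hypothetical value)
  interpolation: true   # enable interpolation between samples
  n_interpolations: 10  # number of interpolation points (hypothetical value)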

Computing FID values quickly for a range of checkpoints

We can specify begin_ckpt and end_ckpt under the fast_fid group in the configuration file. For example, running the following command generates a small number of samples per checkpoint in the range begin_ckpt to end_ckpt for a quick (and rough) FID evaluation.

python main.py --fast_fid --config bedroom.yml -i bedroom

You can find samples in <exp>/fid_samples/bedroom.
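The corresponding part of the configuration file might look like the following sketch; only begin_ckpt and end_ckpt are named above, and the values are hypothetical.

fast_fid:
  begin_ckpt: 50000   # first checkpoint to evaluate (hypothetical value)
  end_ckpt: 100000    # last checkpoint to evaluate (hypothetical value)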

Pretrained Checkpoints

Link: https://drive.google.com/drive/folders/1217uhIvLg9ZrYNKOR3XTRFSurt4miQrd?usp=sharing

You can use these checkpoints to produce samples on all datasets we tested in the paper. The code assumes the --exp argument is set to exp.
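For example, assuming the downloaded bedroom checkpoint files are extracted under exp/logs/bedroom (an assumption about the archive layout; adjust the paths to wherever you place them), sampling would use only the flags documented above:

python main.py --sample --config bedroom.yml --exp exp --doc bedroom -i bedroom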

References

If you find the code/idea useful for your research, please consider citing

@inproceedings{song2020improved,
  author    = {Yang Song and Stefano Ermon},
  editor    = {Hugo Larochelle and
Marc'Aurelio Ranzato and
               Raia Hadsell and
               Maria{-}Florina Balcan and
               Hsuan{-}Tien Lin},
  title     = {Improved Techniques for Training Score-Based Generative Models},
  booktitle = {Advances in Neural Information Processing Systems 33: Annual Conference
               on Neural Information Processing Systems 2020, NeurIPS 2020, December
               6-12, 2020, virtual},
  year      = {2020}
}

and/or our previous work

@inproceedings{song2019generative,
  title={Generative Modeling by Estimating Gradients of the Data Distribution},
  author={Song, Yang and Ermon, Stefano},
  booktitle={Advances in Neural Information Processing Systems},
  pages={11895--11907},
  year={2019}
}
