Variational Autoencoder & Conditional Variational Autoencoder on MNIST in PyTorch

VAE paper: Auto-Encoding Variational Bayes

CVAE paper: Semi-supervised Learning with Deep Generative Models
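
For orientation, a minimal sketch of the model family these papers describe is shown below. It assumes a flattened 784-dimensional MNIST input, a small fully connected encoder/decoder and one-hot class conditioning; the layer sizes and names are illustrative assumptions and do not mirror this repository's code.

```python
# Minimal (C)VAE sketch. Layer sizes, names and the one-hot conditioning
# scheme are illustrative assumptions, not the repository's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=256, z_dim=2, n_classes=10, conditional=False):
        super().__init__()
        self.conditional = conditional
        c_dim = n_classes if conditional else 0
        self.enc = nn.Sequential(nn.Linear(x_dim + c_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim + c_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim), nn.Sigmoid())

    def forward(self, x, c=None):
        h = self.enc(torch.cat([x, c], dim=1) if self.conditional else x)  # q(z|x[,c])
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        if self.conditional:
            z = torch.cat([z, c], dim=1)                         # p(x|z,c)
        return self.dec(z), mu, logvar

def elbo_loss(x_hat, x, mu, logvar):
    # Negative ELBO: reconstruction term plus KL(q(z|x) || N(0, I)), summed over the batch.
    bce = F.binary_cross_entropy(x_hat, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld
```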


To run the conditional variational autoencoder, add --conditional to the command. Check the other command-line options in the code for hyperparameter settings (learning rate, batch size, encoder/decoder layer depth and size); a rough sketch of such an interface follows below.
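
For orientation only, the option parsing might look roughly like this; apart from --conditional, which is named above, every flag name and default shown here is an assumption, not the repository's actual interface.

```python
# Hypothetical option parsing; only --conditional is taken from the README above,
# the remaining flag names and defaults are illustrative assumptions.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--conditional", action="store_true",
                    help="train the conditional VAE instead of the plain VAE")
parser.add_argument("--epochs", type=int, default=10)
parser.add_argument("--batch_size", type=int, default=64)
parser.add_argument("--learning_rate", type=float, default=1e-3)
parser.add_argument("--latent_size", type=int, default=2)
parser.add_argument("--encoder_layer_sizes", type=int, nargs="+", default=[784, 256])
parser.add_argument("--decoder_layer_sizes", type=int, nargs="+", default=[256, 784])
args = parser.parse_args()
```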


Results

All plots were obtained after 10 epochs of training. Hyperparameters were left at the defaults in the code and were not tuned.

z ~ q(z|x) and q(z|x,c)

The modeled latent distribution after 10 epochs, with 100 samples per digit.

[Figure: latent space scatter plots for the VAE (left) and the CVAE (right)]
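
A plot of this kind can be reproduced roughly as sketched below, assuming `model` is a trained instance of the hypothetical CVAE class above with a 2-D latent space; the torchvision data handling is likewise an assumption.

```python
# Encode 100 test images per digit and scatter their sampled z ~ q(z|x).
# Assumes a trained `model` as in the sketch above (2-D latent, non-conditional).
import torch
import matplotlib.pyplot as plt
from torchvision import datasets

test_set = datasets.MNIST("data", train=False, download=True)
model.eval()
with torch.no_grad():
    for digit in range(10):
        idx = (test_set.targets == digit).nonzero(as_tuple=True)[0][:100]
        x = test_set.data[idx].float().div(255).view(-1, 784)
        _, mu, logvar = model(x)                                  # pass c as well for the CVAE
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        plt.scatter(z[:, 0].numpy(), z[:, 1].numpy(), s=5, label=str(digit))
plt.legend()
plt.show()
```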

p(x|z) and p(x|z,c)

Randomly sampled z and their decoded outputs. For the CVAE, each class label c was given as input once.

[Figure: decoded samples from random z for the VAE (left) and the CVAE (right)]
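
A sketch of how such outputs can be generated, assuming a trained conditional instance of the hypothetical CVAE class above (2-D latent space, one-hot labels); for the plain VAE the label input is simply dropped.

```python
# Decode randomly sampled z from the prior, giving each class label c once.
# Assumes a trained conditional `model` as in the sketch above.
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt

with torch.no_grad():
    z = torch.randn(10, 2)                                   # z ~ N(0, I)
    c = F.one_hot(torch.arange(10), num_classes=10).float()  # each digit class once
    x_hat = model.dec(torch.cat([z, c], dim=1))               # p(x|z,c)

fig, axes = plt.subplots(1, 10, figsize=(15, 2))
for i, ax in enumerate(axes):
    ax.imshow(x_hat[i].view(28, 28).numpy(), cmap="gray")
    ax.axis("off")
plt.show()
```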