GAN implementations using MNIST data
This repo is a collection of implementations of many GAN variants. To keep the code easy to read and follow, each model is kept minimal and trained on the same MNIST dataset.
What does the MNIST data look like?
Toy implementations are organized as follows:
1. Base Methods
2. Loss or Structure Modifications
- Least Squares GAN (LSGAN)
- Wasserstein GAN (WGAN)
- Self-Attention GAN (SAGAN)
- Progressive-Growing GAN (PGGAN)
3. Conditional Generation
4. Image-to-Image Translation
Installation

```shell
$ git clone https://github.com/MorvanZhou/mnistGANs
$ cd mnistGANs/
$ pip3 install -r requirements.txt
```
GAN
Generative Adversarial Nets
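The vanilla GAN objective pits a discriminator D against a generator G: D learns to score real digits high and generated digits low, while G learns to fool D. A minimal NumPy sketch of the two losses (function names are mine, not this repo's API; the repo wraps these in a full training loop):

```python
import numpy as np

def d_loss(d_real, d_fake):
    # discriminator maximizes log D(x) + log(1 - D(G(z)));
    # written here as a loss to minimize
    return -np.mean(np.log(d_real) + np.log(1.0 - d_fake))

def g_loss(d_fake):
    # non-saturating generator loss: maximize log D(G(z))
    return -np.mean(np.log(d_fake))

d_real = np.array([0.9, 0.8])  # discriminator scores on real digits
d_fake = np.array([0.1, 0.2])  # scores on generated digits
print(d_loss(d_real, d_fake), g_loss(d_fake))
```

Both losses shrink as the respective network does its job: a perfect discriminator drives `d_loss` toward 0, and a generator that fools D drives `g_loss` toward 0.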
DCGAN
Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks
LSGAN
Least Squares Generative Adversarial Networks
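LSGAN replaces the log-loss above with a least-squares objective, which penalizes samples far from the decision boundary and stabilizes training. A rough NumPy sketch of the objectives (names are mine; the repo's implementation may differ in details):

```python
import numpy as np

def lsgan_d_loss(d_real, d_fake):
    # push scores on real images toward 1, scores on fakes toward 0
    return 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake ** 2)

def lsgan_g_loss(d_fake):
    # generator pushes scores on its fakes toward 1
    return 0.5 * np.mean((d_fake - 1.0) ** 2)

print(lsgan_d_loss(np.array([0.9]), np.array([0.2])))  # small: D is doing well
```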
WGAN
Wasserstein GAN
WGANgp
Improved Training of Wasserstein GANs
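WGAN-GP replaces WGAN's weight clipping with a penalty that pushes the critic's gradient norm toward 1 at random interpolates between real and fake samples. A toy NumPy sketch of just the penalty term (finite differences stand in for the autodiff a real framework would use; all names are mine):

```python
import numpy as np

def input_gradient(f, x, h=1e-5):
    # finite-difference gradient of scalar-valued f at x
    # (a real implementation gets this from the framework's autodiff)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e.flat[i] = h
        g.flat[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def gradient_penalty(critic, x_real, x_fake, lam=10.0, seed=0):
    # WGAN-GP term: lam * E[(||grad critic(x_hat)|| - 1)^2],
    # where x_hat are random interpolates between real and fake samples
    eps = np.random.default_rng(seed).random((x_real.shape[0], 1))
    x_hat = eps * x_real + (1.0 - eps) * x_fake
    norms = [np.linalg.norm(input_gradient(critic, row)) for row in x_hat]
    return lam * np.mean([(n - 1.0) ** 2 for n in norms])

# toy linear critic: its input gradient is w everywhere, so ||grad|| = ||w||
w = np.array([0.6, 0.8])  # ||w|| = 1, so the penalty is ~0
print(gradient_penalty(lambda x: x @ w, np.ones((4, 2)), np.zeros((4, 2))))
```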
WGANdiv
Wasserstein Divergence for GANs
SAGAN
Self-Attention Generative Adversarial Networks
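SAGAN adds a self-attention layer so every spatial position can draw on features from every other position, not just its local convolutional neighborhood. A minimal NumPy sketch of that layer on a flattened feature map (projection names follow the paper's f/g/h convention; this is not the repo's exact code):

```python
import numpy as np

def softmax(a, axis=-1):
    e = np.exp(a - a.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wf, Wg, Wh, gamma=0.0):
    # x: (N, C) feature map flattened to N spatial positions, C channels
    # Wf/Wg project to key/query space, Wh to the value space
    f, g, h = x @ Wf, x @ Wg, x @ Wh
    attn = softmax(g @ f.T, axis=-1)  # (N, N): each position attends to all others
    return gamma * (attn @ h) + x     # residual; gamma is learned, starting at 0

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))  # a 4x4 map with 8 channels
Wf, Wg = rng.standard_normal((8, 4)), rng.standard_normal((8, 4))
Wh = rng.standard_normal((8, 8))
print(self_attention(x, Wf, Wg, Wh, gamma=0.5).shape)  # (16, 8)
```

Because gamma starts at 0, the layer is initially an identity mapping and gradually learns how much non-local evidence to mix in.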
PGGAN
Progressive Growing of GANs for Improved Quality, Stability, and Variation
CGAN
Conditional Generative Adversarial Nets
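CGAN conditions both networks on a class label, so on MNIST you can ask the generator for a specific digit. The usual trick is simply concatenating a one-hot label onto the inputs; a NumPy sketch (dimensions and names are illustrative, not the repo's exact ones):

```python
import numpy as np

def one_hot(labels, num_classes=10):
    return np.eye(num_classes)[labels]

def conditional_input(z, labels, num_classes=10):
    # generator input: noise concatenated with the desired class label;
    # the discriminator similarly sees the image together with the label
    return np.concatenate([z, one_hot(labels, num_classes)], axis=1)

z = np.random.default_rng(0).standard_normal((4, 100))  # noise batch
labels = np.array([3, 1, 4, 1])                         # digits to generate
print(conditional_input(z, labels).shape)  # (4, 110)
```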
ACGAN
Conditional Image Synthesis with Auxiliary Classifier GANs
InfoGAN
InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets
StyleGAN
A Style-Based Generator Architecture for Generative Adversarial Networks
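StyleGAN injects the style code through adaptive instance normalization (AdaIN): each feature map is normalized, then scaled and shifted by statistics predicted from the style. A minimal NumPy sketch of that operation (not the repo's exact code):

```python
import numpy as np

def adain(x, y_scale, y_bias, eps=1e-8):
    # adaptive instance normalization: normalize each feature map of x,
    # then scale and shift it with style-derived statistics
    # x: (N, C, H, W); y_scale, y_bias: (N, C)
    mu = x.mean(axis=(2, 3), keepdims=True)
    sigma = x.std(axis=(2, 3), keepdims=True)
    x_norm = (x - mu) / (sigma + eps)
    return y_scale[:, :, None, None] * x_norm + y_bias[:, :, None, None]

x = np.random.default_rng(0).standard_normal((2, 3, 4, 4))
out = adain(x, y_scale=np.full((2, 3), 2.0), y_bias=np.full((2, 3), 5.0))
print(out.mean(axis=(2, 3)))  # each feature map now has mean ~5
```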
CCGAN
Semi-Supervised Learning with Context-Conditional Generative Adversarial Networks
Pix2Pix
Image-to-Image Translation with Conditional Adversarial Networks
CycleGAN
Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks
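CycleGAN trains without paired examples by requiring that translating to the other domain and back reconstructs the input: F(G(x)) ≈ x and G(F(y)) ≈ y. A toy NumPy sketch of that cycle-consistency term (translators here are trivial stand-ins for the two generator networks):

```python
import numpy as np

def l1(a, b):
    return np.mean(np.abs(a - b))

def cycle_consistency_loss(x, y, G, F, lam=10.0):
    # going x -> G(x) -> F(G(x)) should land back on x, and likewise for y
    return lam * (l1(F(G(x)), x) + l1(G(F(y)), y))

# toy invertible "translators": a perfect cycle gives zero loss
G = lambda x: x + 1.0
F = lambda y: y - 1.0
print(cycle_consistency_loss(np.zeros((2, 3)), np.ones((2, 3)), G, F))  # 0.0
```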
SRGAN
Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network