
BigGAN-PyTorch

PyTorch implementation of "Large Scale GAN Training for High Fidelity Natural Image Synthesis" (BigGAN)

Train on ImageNet

For 128×128×3 resolution:

python main.py --batch_size 64 --dataset imagenet --adv_loss hinge --version biggan_imagenet --image_path /data/datasets

Train on LSUN

python main.py --batch_size 64 --dataset lsun --adv_loss hinge --version biggan_lsun --image_path /data1/datasets/lsun/lsun

Multi-GPU training with TensorBoard logging:

python main.py --batch_size 64 --dataset lsun --adv_loss hinge --version biggan_lsun --parallel True --gpus 0,1,2,3 --use_tensorboard True
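The --adv_loss hinge flag selects the hinge adversarial loss used in BigGAN. A minimal pure-Python sketch of the two objectives (the repo's actual implementation operates on PyTorch tensors; the function names below are illustrative):

```python
def relu(x):
    # max(0, x), the ramp used by the hinge loss
    return max(0.0, x)

def d_hinge_loss(d_real_scores, d_fake_scores):
    # Discriminator hinge loss: push real scores above +1
    # and fake scores below -1; zero loss beyond the margin.
    loss_real = sum(relu(1.0 - s) for s in d_real_scores) / len(d_real_scores)
    loss_fake = sum(relu(1.0 + s) for s in d_fake_scores) / len(d_fake_scores)
    return loss_real + loss_fake

def g_hinge_loss(d_fake_scores):
    # Generator hinge loss: maximize the discriminator's
    # score on generated samples (so minimize its negation).
    return -sum(d_fake_scores) / len(d_fake_scores)
```

With scores already past the margin (real = 2.0, fake = -2.0) the discriminator loss is zero, which is why hinge-loss GANs stop penalizing confidently separated samples.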

Differences from the paper

  • Does not use cross-replica BatchNorm (Ioffe & Szegedy, 2015) in G
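Cross-replica BatchNorm synchronizes batch statistics across all GPU replicas; this implementation's G instead uses standard BatchNorm, where each device normalizes with statistics from its own local batch only. A simplified, illustrative sketch of that local computation:

```python
def local_batch_norm(x, eps=1e-5):
    # Standard (single-replica) BatchNorm statistics: mean and
    # variance come from the local batch x only, NOT aggregated
    # across devices as cross-replica BatchNorm would do.
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / (var + eps) ** 0.5 for v in x]
```

With small per-device batches, these local statistics are noisier than the cross-replica version, which is one reason the paper uses synchronized statistics at large scale.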

Compatibility

  • CPU
  • GPU
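Supporting both CPU and GPU typically comes down to selecting a device at startup. A hedged sketch of such a helper (this exact function is not part of the repo; the `gpu_ids` parameter mirrors the --gpus flag):

```python
def pick_device(gpu_ids=None):
    """Return a torch.device, preferring CUDA when available.

    `gpu_ids` (e.g. "0,1,2,3") mirrors the --gpus flag; this helper
    is a hypothetical illustration, not the repo's actual API.
    """
    import torch  # lazy import keeps the sketch importable anywhere
    if torch.cuda.is_available():
        first = (gpu_ids or "0").split(",")[0]
        return torch.device(f"cuda:{first}")
    return torch.device("cpu")
```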

Pretrained Models

LSUN pretrained model: Download

The paper describes several techniques to avoid model collapse; please refer to the paper and retrain your model accordingly.
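Loading a downloaded checkpoint into the generator usually looks like the following. This is a hedged sketch, not the repo's actual loading code; the checkpoint path and function name are hypothetical:

```python
def load_pretrained(generator, ckpt_path, device="cpu"):
    """Load generator weights from a checkpoint file.

    `ckpt_path` is a placeholder for wherever the downloaded LSUN
    checkpoint was saved; the real file name may differ.
    """
    import torch  # lazy import keeps the sketch importable anywhere
    state = torch.load(ckpt_path, map_location=device)
    generator.load_state_dict(state)
    generator.eval()  # inference mode for sampling
    return generator
```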

Performance

  • In fact, as mentioned in the paper, the model will eventually collapse
  • Training on the LSUN dataset may cause worse performance, since the classroom class is more complex than the ImageNet classes

Results

LSUN dataset (two classes): classroom and church_outdoor

  • iter 82200 (128x128) batch_size 64
  • iter 128200
  • iter 365000
  • iter 800000
  • iter 900000