  • Stars: 1,427
  • Rank: 32,971 (Top 0.7%)
  • Language: Lua
  • License: Other
  • Created: almost 9 years ago
  • Updated: over 3 years ago

Repository Details

A Torch implementation of DCGAN (http://arxiv.org/abs/1511.06434)

DCGAN.torch: Train your own image generator

  1. Train your own network
    1. Train a face generator using the Celeb-A dataset
    2. Train Bedrooms, Bridges, Churches etc. using the LSUN dataset
    3. Train a generator on your own set of images.
    4. Train on the ImageNet dataset
  2. Use a pre-trained generator to generate images.
    1. Generate samples of 64x64 pixels
    2. Generate large artsy images (tried up to 4096 x 4096 pixels)
    3. Walk in the space of samples
  3. Vector Arithmetic of images in latent space

Prerequisites

  • Computer with Linux or OSX
  • Torch-7
  • For training, an NVIDIA GPU is strongly recommended for speed. CPU is supported but training is very slow.

Installing dependencies

Without GPU

  • Install Torch: http://torch.ch/docs/getting-started.html#_

With NVIDIA GPU

  • Install CUDA, and preferably CuDNN (optional).
  • Install Torch: http://torch.ch/docs/getting-started.html#_
  • Install optnet to reduce the memory footprint for large images: luarocks install optnet
  • Optionally, if you installed CuDNN, install the cudnn bindings with luarocks install cudnn

Display UI

Optionally, for displaying images during training and generation, we will use the display package.

  • Install it with: luarocks install https://raw.githubusercontent.com/szym/display/master/display-scm-0.rockspec
  • Then start the server with: th -ldisplay.start
  • Open this URL in your browser: http://localhost:8000

You can see training progress in your browser window. It will look something like this: [image: display]
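
If you want to push images to the display UI from your own Torch code, the display package also exposes an image call. A minimal sketch, assuming the server above is running on localhost:8000 and that your display version accepts a title option (the tensor here is just random data for illustration):

   require 'torch'
   local display = require 'display'

   local img = torch.rand(3, 64, 64)              -- any 3 x H x W image tensor
   display.image(img, {title = 'test image'})     -- appears in the browser window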

1. Train your own network

1.1. Train a face generator using the Celeb-A dataset

Preprocessing

mkdir celebA; cd celebA

Download img_align_celeba.zip from http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html under the link "Align&Cropped Images".

unzip img_align_celeba.zip; cd ..
DATA_ROOT=celebA th data/crop_celebA.lua

Training

DATA_ROOT=celebA dataset=folder th main.lua

1.2. Train Bedrooms, Bridges, Churches etc. using the LSUN dataset

The LSUN dataset is shipped as an LMDB database. First, install LMDB on your system.

  • On OSX with Homebrew: brew install lmdb
  • On Ubuntu: sudo apt-get install liblmdb-dev

Then install a couple of Torch packages.

luarocks install lmdb.torch
luarocks install tds

Preprocessing (with bedroom class as an example)

Download bedroom_train_lmdb from the LSUN website.

Generate an index file:

DATA_ROOT=[path_to_lmdb] th data/lsun_index_generator.lua

Training

DATA_ROOT=[path-to-lmdb] dataset=lsun th main.lua

The LSUN data loader is hardcoded for the bedroom class. Change the class name in the data loader to another LSUN class to train on other categories.
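
The change itself is a one-liner in the LSUN data loader. A hypothetical sketch of what it looks like (the exact file and variable name may differ in your checkout; check the loader under data/):

   -- the LSUN loader hardcodes the class list, roughly:
   -- local classes = {'bedroom'}
   -- switch it to another LSUN category, for example:
   local classes = {'church_outdoor'}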

1.3. Train a generator on your own set of images.

Preprocessing

  • Create a folder called myimages.
  • Inside that folder, create a folder called images and place all your images inside it (see the sketch just below).
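
A quick way to set this up from the shell, with a hypothetical source path (any folder of jpg/png images works):

mkdir -p myimages/images
cp /path/to/your/photos/*.jpg myimages/images/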

Training

DATA_ROOT=myimages dataset=folder th main.lua

1.4. Train on the ImageNet dataset

Preprocessing

Follow instructions from this link.

Training

DATA_ROOT=[PATH_TO_IMAGENET]/train dataset=folder th main.lua

All training options:

   dataset = 'lsun',       -- imagenet / lsun / folder
   batchSize = 64,
   loadSize = 96,
   fineSize = 64,
   nz = 100,               -- #  of dim for Z
   ngf = 64,               -- #  of gen filters in first conv layer
   ndf = 64,               -- #  of discrim filters in first conv layer
   nThreads = 1,           -- #  of data loading threads to use
   niter = 25,             -- #  of iter at starting learning rate
   lr = 0.0002,            -- initial learning rate for adam
   beta1 = 0.5,            -- momentum term of adam
   ntrain = math.huge,     -- #  of examples per epoch. math.huge for full dataset
   display = 1,            -- display samples while training. 0 = false
   display_id = 10,        -- display window id.
   gpu = 1,                -- gpu = 0 is CPU mode. gpu=X is GPU mode on GPU X
   name = 'experiment1',
   noise = 'normal',       -- uniform / normal
   epoch_save_modulo = 1,  -- save a checkpoint every # epochs
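
All of these options are passed as environment variables on the command line, just like the training commands above. For example, an illustrative celebA run with the display UI enabled and a custom experiment name (the values are only examples):

DATA_ROOT=celebA dataset=folder gpu=1 niter=25 display=1 name=celeba_run th main.lua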

2. Use a pre-trained generator to generate images.

The generate script can operate in CPU or GPU mode.

To run it on the CPU, use:

gpu=0 net=[checkpoint-path] th generate.lua

To run it on a GPU, use:

gpu=1 net=[checkpoint-path] th generate.lua

Pre-trained generator networks can be downloaded from the links in the original repository.

2.1. Generate samples of 64x64 pixels

gpu=0 batchSize=64 net=celebA_25_net_G.t7 th generate.lua

The batchSize parameter controls the number of images to generate. If you have display running, the image will be shown there. The image is also saved to generation1.png in the same folder.

[image: faces_pregen]
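
For reference, a rough Lua sketch of what sample generation with a checkpoint boils down to, assuming the celebA_25_net_G.t7 file above, nz = 100 and normal noise (the real generate.lua has more options, and a checkpoint trained with cudnn/cunn layers needs those packages required as well):

   require 'torch'
   require 'nn'
   require 'image'

   local net = torch.load('celebA_25_net_G.t7')    -- pre-trained generator
   net:evaluate()

   local nz = 100                                  -- latent dimension used at training time
   local noise = torch.Tensor(64, nz, 1, 1):normal(0, 1)

   local samples = net:forward(noise)              -- 64 generated 3x64x64 images
   samples:add(1):div(2)                           -- map tanh output from [-1,1] to [0,1]
   image.save('generation1.png', image.toDisplayTensor{input = samples, padding = 2})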

2.2. Generate large artsy images (tried up to 4096 x 4096 pixels)

gpu=0 batchSize=1 imsize=10 noisemode=linefull net=bedrooms_4_net_G.t7 th generate.lua

The imsize parameter controls the size of the output image: the larger the imsize, the larger the output image.

[image: line2d_pregen]

2.3. Walk in the space of samples

gpu=0 batchSize=16 noisemode=line net=bedrooms_4_net_G.t7 th generate.lua

Controlling the batchSize parameter changes how big of a step you take.

[image: interp_pregen]
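
Under the hood this walk is a straight line between two points in latent space. A minimal sketch of the idea, assuming nz = 100 and a generator net loaded as in the sketch above (generate.lua builds such a line itself via its noisemode option):

   local nz, steps = 100, 16
   local z1 = torch.Tensor(nz, 1, 1):normal(0, 1)  -- start point in latent space
   local z2 = torch.Tensor(nz, 1, 1):normal(0, 1)  -- end point in latent space

   local noise = torch.Tensor(steps, nz, 1, 1)
   for i = 1, steps do
      local t = (i - 1) / (steps - 1)
      noise[i]:copy(z1 * (1 - t) + z2 * t)         -- linear interpolation between z1 and z2
   end

   local walk = net:forward(noise)                 -- a smooth sequence of generated images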

3. Vector Arithmetic

net=[modelfile] gpu=0 qlua arithmetic.lua

[image: vector_arithmetic]
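
Conceptually, the script performs arithmetic directly on latent vectors and decodes the result with the generator. A simplified sketch with hypothetical vectors za, zb, zc and a generator net loaded as in the earlier sketch (the real arithmetic.lua builds these vectors from generated samples rather than random ones):

   local nz = 100
   -- latent vectors for three concepts, e.g. "smiling woman", "neutral woman", "neutral man"
   local za = torch.Tensor(nz, 1, 1):normal(0, 1)
   local zb = torch.Tensor(nz, 1, 1):normal(0, 1)
   local zc = torch.Tensor(nz, 1, 1):normal(0, 1)

   -- za - zb + zc: the classic DCGAN vector arithmetic in latent space
   local zres = (za - zb + zc):view(1, nz, 1, 1)
   local result = net:forward(zres)                -- decoded result image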

More Repositories

  1. ganhacks: starter from "How to Train a GAN?" at NIPS2016 (10,908 stars)
  2. convnet-benchmarks: Easy benchmarking of all publicly accessible implementations of convnets (Python, 2,675 stars)
  3. cvpr2015 (Jupyter Notebook, 869 stars)
  4. cudnn.torch: Torch-7 FFI bindings for NVIDIA CuDNN (Lua, 408 stars)
  5. imagenet-multiGPU.torch: an imagenet example in torch (Lua, 395 stars)
  6. torch-android: Torch-7 for Android (CMake, 275 stars)
  7. talks (Jupyter Notebook, 261 stars)
  8. net2net.torch: Implementation of http://arxiv.org/abs/1511.05641 that lets one build a larger net starting from a smaller one (Lua, 159 stars)
  9. imagenetloader.torch: some old code that i wrote, might be useful to others (Shell, 88 stars)
  10. deepmind-atari (Lua, 67 stars)
  11. lua---audio: Module for torch to support audio i/o as well as do common operations like dFFT, generate spectrograms etc. (C, 67 stars)
  12. inception.torch: Torch port of https://github.com/google/inception (Jupyter Notebook, 66 stars)
  13. torch-signal: Signal processing toolbox for Torch 7 (Lua, 48 stars)
  14. cuda-convnet2.torch: Torch7 bindings for cuda-convnet2 kernels! (Cuda, 40 stars)
  15. matio-ffi.torch: A LuaJIT FFI interface to MATIO and simple bindings for torch (Lua, 39 stars)
  16. galaxyzoo: Entry for GalaxyZoo challenge (Lua, 35 stars)
  17. eyescream (JavaScript, 35 stars)
  18. nextml (35 stars)
  19. examplepackage.torch: A hello-world for torch packages (CMake, 23 stars)
  20. sunfish.lua: tiny and basic chess engine for lua. Port of https://github.com/thomasahle/sunfish (Lua, 20 stars)
  21. kaggle_retinopathy_starter.torch: A starter kit in Torch for Kaggle Diabetic Retinopathy Detection (Lua, 19 stars)
  22. neon.torch: Nervana Neon kernels in Torch (Lua, 18 stars)
  23. torch-ship-binaries: A page describing how to ship torch binaries without sharing the source code of your scripts (17 stars)
  24. nnjs (JavaScript, 16 stars)
  25. deep_gitstats: Based on SciPy's normalized git stats, adapted for Deep Learning frameworks (Jupyter Notebook, 16 stars)
  26. cifar.torch (Lua, 15 stars)
  27. torch.js: nodejs bindings for libTH (tensor library that powers torch), for fun! (JavaScript, 14 stars)
  28. fakecuda: A convenient package for the lazy torch programmer to leave all your :cuda() calls as-is when running on CPU (Lua, 14 stars)
  29. rgbd_streamer (Python, 12 stars)
  30. mscoco.torch (Lua, 11 stars)
  31. torch-docker: Dockerfile to create an image for Torch7 (Shell, 10 stars)
  32. NeuralNetworks.jl: hacking torch-like neural networks in Julia (Julia, 10 stars)
  33. torch-cheatsheet: A quick page for everything Torch (9 stars)
  34. fftw3-ffi: A LuaJIT FFI interface to FFTW3 (Lua, 5 stars)
  35. thnb: iTorch notebooks (4 stars)
  36. lzmqstatic: Self-contained statically linked zeromq bindings for lua (C++, 3 stars)
  37. nvblog_rnnlstm (HTML, 3 stars)
  38. fairmark1 (Lua, 2 stars)
  39. cunnsparse (Lua, 2 stars)
  40. yasa: Yet another Sentiment analyzer. This one uses convolution networks. (Lua, 1 star)
  41. cunnCUDA: some depreceated, ugly and old modules (Cuda, 1 star)
  42. housenumbers_classifier: An attempt on the Stanford Housenumbers dataset (Lua, 1 star)
  43. Bar__ZEbulLonX22L.torch: wtf (1 star)