
Torch implementations of various types of autoencoders

Autoencoders

This repository is a Torch version of *Building Autoencoders in Keras*, but contains code only for reference; please refer to the original blog post for an explanation of autoencoders. Training hyperparameters have not been adjusted. The following models are implemented:

  • AE: Fully-connected autoencoder
  • SparseAE: Sparse autoencoder
  • DeepAE: Deep (fully-connected) autoencoder
  • ConvAE: Convolutional autoencoder
  • UpconvAE: Upconvolutional autoencoder - also known by several other names (bonus)
  • DenoisingAE: Denoising (convolutional) autoencoder [1, 2]
  • CAE: Contractive autoencoder (bonus) [3]
  • Seq2SeqAE: Sequence-to-sequence autoencoder
  • VAE: Variational autoencoder [4, 5]
  • CatVAE: Categorical variational autoencoder (bonus) [6, 7]
  • AAE: Adversarial autoencoder (bonus) [8]
  • WTA-AE: Winner-take-all autoencoder (bonus) [9]
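
For readers without a Torch installation, the idea behind the simplest model (AE) can be sketched in a few lines of NumPy; the layer sizes, activation, and learning rate below are illustrative stand-ins, not the repository's actual architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 64 samples of 16-dimensional inputs in [0, 1].
X = rng.random((64, 16))

# Weights for a 16 -> 4 -> 16 fully-connected autoencoder.
W_enc = rng.normal(0, 0.1, (16, 4))
W_dec = rng.normal(0, 0.1, (4, 16))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def forward(X):
    h = sigmoid(X @ W_enc)          # latent code
    return h, sigmoid(h @ W_dec)    # reconstruction

lr = 0.5
losses = []
for _ in range(200):
    h, X_hat = forward(X)
    err = X_hat - X                  # gradient of 0.5 * squared error
    losses.append(float((err ** 2).mean()))
    # Backpropagate through the two sigmoid layers.
    g_dec = err * X_hat * (1 - X_hat)
    g_enc = (g_dec @ W_dec.T) * h * (1 - h)
    W_dec -= lr * (h.T @ g_dec) / len(X)
    W_enc -= lr * (X.T @ g_enc) / len(X)
```

The reconstruction loss falls as the bottleneck layer learns a compressed code for the inputs; every other model in the list above elaborates on this encode-then-decode structure.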

Different models can be chosen using `th main.lua -model <modelName>`.

The denoising criterion can be used to replace the standard (autoencoder) reconstruction criterion by using the `-denoising` flag. For example, a denoising AAE (DAAE) [10] can be set up using `th main.lua -model AAE -denoising`. The corruption process is additive Gaussian noise ~ N(0, 0.5).
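
The corruption step is straightforward to reproduce; the NumPy sketch below assumes the 0.5 in N(0, 0.5) denotes the standard deviation (the notation could equally mean the variance) and stands in for the repository's Torch code.

```python
import numpy as np

rng = np.random.default_rng(42)

def corrupt(x, std=0.5):
    """Additive Gaussian corruption applied to inputs before encoding."""
    return x + rng.normal(0.0, std, size=x.shape)

clean = np.zeros((1000, 28 * 28))   # stand-in for a batch of MNIST images
noisy = corrupt(clean)
# The denoising criterion trains the network to reconstruct `clean`
# from `noisy`, rather than reconstructing its own (corrupted) input.
```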

MCMC sampling [10] can be used for VAEs, CatVAEs and AAEs with `th main.lua -model <modelName> -mcmc <steps>`. To see the effects of MCMC sampling with this simple setup it is best to choose a large standard deviation, e.g. `-sampleStd 5`, for the Gaussian distribution from which the initial samples are drawn.
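
The MCMC procedure of [10] amounts to repeatedly decoding a latent sample and re-encoding the result, so that an initial draw from a broad Gaussian drifts towards regions the model assigns high density. The toy NumPy sketch below uses a hypothetical contractive linear encoder/decoder pair in place of the repository's trained networks, purely to show the shape of the loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in linear decoder (hypothetical, not the repo's networks); its
# composition with the identity encoder contracts samples towards the
# origin, mimicking how a trained model pulls samples towards the data.
A = 0.8 * np.eye(2)

def decode(z):
    return z @ A

def encode(x):
    return x  # identity encoder keeps the toy example minimal

def mcmc_step(z):
    # One Markov-chain step: decode, then re-encode the reconstruction.
    return encode(decode(z))

z = rng.normal(0.0, 5.0, size=(1, 2))  # broad initial draw, as with -sampleStd 5
for _ in range(50):
    z = mcmc_step(z)
```

With a real model each step moves the sample closer to the learned latent distribution, which is why starting from a deliberately broad Gaussian makes the effect visible.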

Requirements

The following luarocks packages are required:

  • mnist
  • dpnn (for DenoisingAE)
  • rnn (for Seq2SeqAE)
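
Assuming a working Torch installation with `luarocks` on the path, the dependencies can be installed from the command line (package names taken from the list above):

```shell
# Required for loading the MNIST data
luarocks install mnist
# Only needed for the corresponding models
luarocks install dpnn   # DenoisingAE
luarocks install rnn    # Seq2SeqAE
```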

Citation

If you find this library useful and would like to cite it, the following would be appropriate:

```
@misc{Autoencoders,
  author = {Arulkumaran, Kai},
  title = {Kaixhin/Autoencoders},
  url = {https://github.com/Kaixhin/Autoencoders},
  year = {2016}
}
```

References

[1] Vincent, P., Larochelle, H., Bengio, Y., & Manzagol, P. A. (2008, July). Extracting and composing robust features with denoising autoencoders. In Proceedings of the 25th international conference on Machine learning (pp. 1096-1103). ACM.
[2] Vincent, P., Larochelle, H., Lajoie, I., Bengio, Y., & Manzagol, P. A. (2010). Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion. Journal of Machine Learning Research, 11(Dec), 3371-3408.
[3] Rifai, S., Vincent, P., Muller, X., Glorot, X., & Bengio, Y. (2011). Contractive auto-encoders: Explicit invariance during feature extraction. In Proceedings of the 28th international conference on machine learning (ICML-11) (pp. 833-840).
[4] Kingma, D. P., & Welling, M. (2013). Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114.
[5] Rezende, D. J., Mohamed, S., & Wierstra, D. (2014). Stochastic Backpropagation and Approximate Inference in Deep Generative Models. In Proceedings of The 31st International Conference on Machine Learning (pp. 1278-1286).
[6] Jang, E., Gu, S., & Poole, B. (2016). Categorical Reparameterization with Gumbel-Softmax. arXiv preprint arXiv:1611.01144.
[7] Maddison, C. J., Mnih, A., & Teh, Y. W. (2016). The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables. arXiv preprint arXiv:1611.00712.
[8] Makhzani, A., Shlens, J., Jaitly, N., Goodfellow, I., & Frey, B. (2015). Adversarial autoencoders. arXiv preprint arXiv:1511.05644.
[9] Makhzani, A., & Frey, B. J. (2015). Winner-take-all autoencoders. In Advances in Neural Information Processing Systems (pp. 2791-2799).
[10] Arulkumaran, K., Creswell, A., & Bharath, A. A. (2016). Improving Sampling from Generative Autoencoders with Markov Chains. arXiv preprint arXiv:1610.09296.
