• This repository was archived on 10 Nov 2022
• Stars: 240
• Rank: 167,298 (top 4%)
• Language: Python
• License: Apache License 2.0
• Created: about 6 years ago
• Updated: about 6 years ago



Adversarially Constrained Autoencoder Interpolation (ACAI)

Code for the paper "Understanding and Improving Interpolation in Autoencoders via an Adversarial Regularizer" by David Berthelot, Colin Raffel, Aurko Roy, and Ian Goodfellow.

This is not an officially supported Google product.
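
At a high level, ACAI interpolates between two inputs in latent space, decodes the mixture, and trains a critic to recover the interpolation coefficient alpha from the decoded image; the autoencoder is in turn regularized to fool that critic, which pushes interpolants toward realistic-looking data. The snippet below is only an illustration of those two objectives, using NumPy and tiny linear stand-ins for the convolutional encoder, decoder, and critic in acai.py; the coefficients lam=0.5 and gamma=0.2 follow the paper, and everything else is made up for the example.

import numpy as np

rng = np.random.default_rng(0)
d_in, d_latent = 784, 64  # hypothetical sizes, e.g. flattened 28x28 images

# Tiny linear stand-ins for the real convolutional networks.
W_enc = rng.normal(scale=0.01, size=(d_in, d_latent))
W_dec = rng.normal(scale=0.01, size=(d_latent, d_in))
w_crit = rng.normal(scale=0.01, size=(d_in,))

def encoder(x): return x @ W_enc
def decoder(z): return z @ W_dec
def critic(x): return x @ w_crit   # one scalar "alpha guess" per example

def acai_losses(x, lam=0.5, gamma=0.2):
    """Illustrative ACAI objectives for one batch x of shape [n, d_in]."""
    n = x.shape[0]
    z = encoder(x)
    x_rec = decoder(z)

    # Mix each latent with a reversed copy of the batch, alpha in [0, 0.5].
    alpha = rng.uniform(0.0, 0.5, size=(n, 1))
    z_mix = alpha * z + (1.0 - alpha) * z[::-1]
    x_mix = decoder(z_mix)

    # Autoencoder: reconstruct, and fool the critic into predicting alpha = 0.
    ae_loss = np.mean((x - x_rec) ** 2) + lam * np.mean(critic(x_mix) ** 2)

    # Critic: recover alpha from interpolants, and output 0 on a blend of
    # real data and reconstructions (the gamma term from the paper).
    x_blend = gamma * x + (1.0 - gamma) * x_rec
    critic_loss = (np.mean((critic(x_mix) - alpha[:, 0]) ** 2)
                   + np.mean(critic(x_blend) ** 2))
    return ae_loss, critic_loss

ae_loss, critic_loss = acai_losses(rng.normal(size=(8, d_in)))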

Setup

Configure a virtualenv

sudo apt install virtualenv

cd <path_to_code>
virtualenv --system-site-packages env2
. env2/bin/activate
pip install -r requirements.txt

Configure environment variables

Choose a folder in which to save the datasets, for example ~/Data:

export AE_DATA=~/Data

Installing datasets

python create_datasets.py
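
As a quick sanity check, you can confirm that the datasets landed in the folder pointed to by AE_DATA. The exact file names written by create_datasets.py are not listed here, so treat the output below as whatever the script produced.

import os

# AE_DATA must point at the folder chosen above (e.g. ~/Data).
data_dir = os.path.expanduser(os.environ["AE_DATA"])

# List whatever create_datasets.py wrote (dataset names such as celeba32
# are assumptions based on the training example further down).
for name in sorted(os.listdir(data_dir)):
    print(name)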

Training

CUDA_VISIBLE_DEVICES=0 python acai.py \
--train_dir=TEMP \
--latent=16 --latent_width=2 --depth=16 --dataset=celeba32
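
The bottleneck size follows from these flags; assuming --latent is the number of latent channels and --latent_width the spatial side of the latent map (an interpretation of the flag names, check their definitions in acai.py), the command above gives a 2x2x16 code:

# Assumed meaning of the flags; verify against the flag help text in acai.py.
latent, latent_width = 16, 2                   # channels, spatial side
print(latent_width * latent_width * latent)    # 64 floats per image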

All the training commands used for the paper can be found in the runs folder.

Models

These are the maintained models:

  • aae.py
  • acai.py
  • baseline.py
  • denoising.py
  • dropout.py
  • vae.py
  • vqvae.py

Classifiers / clustering

  • classifier_fc.py: fully connected single layer from raw pixels, see runs/classify.sh for examples.
  • Auto-encoder classification is trained at the same time as the auto-encoder.
  • cluster.py: K-means clustering, see runs/cluster.sh for examples (an illustrative sketch follows below).
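
For intuition only, clustering latent codes with scikit-learn might look like the sketch below; cluster.py has its own implementation and evaluation, and the random latents here merely stand in for codes produced by a trained encoder.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical latent codes for a test set, shape [num_examples, num_dims];
# in practice these would come from a trained encoder.
latents = np.random.default_rng(0).normal(size=(1000, 64))

# One cluster per class, e.g. 10 for MNIST-like data.
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(latents)
print(np.bincount(labels))    # cluster sizes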

Utilities

  • create_datasets.py: see Installing datasets for more info.

Unofficial implementations

  • Kyle McDonald created a PyTorch version of ACAI here.

More Repositories

  • self-attention-gan (Python, 976 stars)
  • realistic-ssl-evaluation: Open source release of the evaluation benchmark suite described in "Realistic Evaluation of Deep Semi-Supervised Learning Algorithms" (Python, 452 stars)
  • guided-evolutionary-strategies: Guided Evolutionary Strategies (Jupyter Notebook, 263 stars)
  • mpnn: Open source implementation of "Neural Message Passing for Quantum Chemistry" (Python, 220 stars)
  • tensorfuzz: A library for performing coverage guided fuzzing of neural networks (Python, 204 stars)
  • nngp: Deep neural network kernel for Gaussian process (Python, 194 stars)
  • l2hmc: TensorFlow implementation for training MCMC samplers from the paper "Generalizing Hamiltonian Monte Carlo with Neural Networks" (Jupyter Notebook, 180 stars)
  • deep-molecular-massspec: Mass Spectrometry for Small Molecules using Deep Learning (Python, 110 stars)
  • long-term-video-prediction-without-supervision: Implementation of Hierarchical Long-term Video Prediction without Supervision (Python, 91 stars)
  • data-linter: The Data Linter identifies potential issues (lints) in your ML training data (Python, 84 stars)
  • conv-sv: The Singular Values of Convolutional Layers (Python, 71 stars)
  • ncp: Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors (Python, 63 stars)
  • mean-field-cnns (Jupyter Notebook, 35 stars)
  • mirage-rl: Code to reproduce the experiments in "The Mirage of Action-Dependent Baselines in Reinforcement Learning" (Python, 17 stars)
  • LeaveNoTrace: Leave No Trace is an algorithm for safe reinforcement learning (Python, 15 stars)
  • fisher-rao-regularization (Python, 10 stars)
  • wip-lambada-lm: LSTM language model on the LAMBADA dataset (Python, 9 stars)
  • hyperbolictext: TensorFlow source code for learning embeddings of text sequences in an unsupervised manner (Python, 8 stars)
  • wip-constrained-extractor: Work in progress inference, learning, and evaluation code for extractive summarization (Python, 6 stars)
  • flying-shapes: A potentially infinite dataset of coloured shapes which bounce around on a black background (Python, 4 stars)
  • metaq (Python, 3 stars)