• Stars: 287
• Rank: 144,232 (Top 3%)
• Language: Jupyter Notebook
• License: MIT License
• Created: almost 8 years ago
• Updated: almost 8 years ago


Repository Details

Unsupervised clustering with (Gaussian mixture) VAEs

VAE-Clustering

A collection of experiments that shine light on VAEs with discrete latent variables as clustering algorithms.

We evaluate the unsupervised clustering performance of three closely-related sets of deep generative models:

  1. Kingma's M2 model
  2. A modified-M2 model that implicitly contains a non-degenerate Gaussian mixture latent layer
  3. An explicit Gaussian Mixture VAE model

Details about the three models, and the motivation for comparing them, are provided in this blog post.
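
For concreteness, the generative structure that model 3 makes explicit can be sketched by ancestral sampling: draw a cluster, draw a latent code from that cluster's Gaussian, then decode. This is a minimal numpy sketch; the dimensions, mixture parameters, and stand-in decoder below are hypothetical illustrations, not the code in this repository:

    import numpy as np

    # Hypothetical sizes for illustration only.
    K, D_Z, D_X = 10, 64, 784            # clusters, latent dim, data dim

    # Mixture parameters (learned in practice).
    pi = np.full(K, 1.0 / K)             # prior over clusters p(y)
    mu = np.random.randn(K, D_Z)         # per-cluster Gaussian means
    log_var = np.zeros((K, D_Z))         # per-cluster Gaussian log-variances

    def decode(z):
        """Stand-in for a learned decoder network p(x|z)."""
        W = np.random.randn(D_Z, D_X) * 0.01
        return 1.0 / (1.0 + np.exp(-z @ W))   # Bernoulli means

    # Ancestral sampling: y ~ Cat(pi), z ~ N(mu_y, diag(sigma_y^2)), x ~ p(x|z).
    y = np.random.choice(K, p=pi)
    z = mu[y] + np.exp(0.5 * log_var[y]) * np.random.randn(D_Z)
    x = np.random.binomial(1, decode(z))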

Results

M2 performs poorly as an unsupervised clustering algorithm. We suspect this is attributable to a conflict between using the categorical variable as part of the generative model and using it as part of the inference model. By implicitly enforcing a hidden layer to follow a proper Gaussian mixture distribution, the modified-M2 model tips the scale in favor of using the categorical variable in the generative model. The explicit Gaussian Mixture VAE model enables better inference, which leads to greater training stability and an even stronger incentive to use the categorical variable in the generative model.
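
For reference, the unsupervised M2 objective marginalizes the categorical variable out of the ELBO: the bound is the q(y|x)-weighted average of the per-class ELBOs plus the entropy of q(y|x). A minimal sketch of that reduction, assuming the per-class ELBO values have already been computed (the function name is ours):

    import numpy as np

    def unsupervised_elbo(elbo_per_class, q_y):
        """M2-style unsupervised bound: E_{q(y|x)}[ELBO(x, y)] + H(q(y|x)).

        elbo_per_class: shape (K,), the ELBO with the categorical variable
                        fixed to each of the K classes.
        q_y:            shape (K,), the inferred class posterior q(y|x).
        """
        entropy = -np.sum(q_y * np.log(q_y + 1e-10))
        return np.sum(q_y * elbo_per_class) + entropy

Nothing in this bound forces the generative model to make use of y, which is why the modified-M2 and Gaussian Mixture VAE variants change the latent structure so that ignoring y becomes costly.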

Code set-up

The experiments are implemented in TensorFlow. Since the three models share very similar formulations, the shared subgraphs are factored out into shared_subgraphs.py. The utils.py file contains some additional functions used during training. The remaining *.py files implement the three main model classes and other variants that we tried.
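
To illustrate the shared-subgraph pattern (written against the TensorFlow 1.x API of the time; the function below is a hypothetical example, not the repository's actual interface):

    import tensorflow as tf

    def gaussian_encoder(x, z_dim, scope='encoder', reuse=None):
        # Hypothetical shared subgraph: an amortized Gaussian posterior
        # q(z|x) that all three models could reuse unchanged.
        with tf.variable_scope(scope, reuse=reuse):
            h = tf.layers.dense(x, 512, activation=tf.nn.relu)
            z_mean = tf.layers.dense(h, z_dim)
            z_log_var = tf.layers.dense(h, z_dim)
        return z_mean, z_log_var

Because the models differ mainly in how the latent layer is wired, factoring out subgraphs like this keeps the per-model files small.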

We recommend first reading the Jupyter Notebook on nbviewer in the Chrome browser.

Dependencies

  1. tensorflow
  2. tensorbayes
  3. numpy
  4. scipy
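
All four packages are available on PyPI. Note that this code predates TensorFlow 2, so a TensorFlow 1.x installation is most likely required.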

More Repositories

1. dirt-t (Python, 174 stars): A DIRT-T Approach to Unsupervised Domain Adaptation (ICLR 2018)
2. nn-bayesian-optimization (Python, 105 stars): We use a modified neural network instead of a Gaussian process for Bayesian optimization.
3. cvae (Jupyter Notebook, 102 stars): Conditional variational autoencoder implementation in Torch
4. tensorsketch (Python, 66 stars): A lightweight library for TensorFlow 2.0
5. vae-experiments (Lua, 62 stars): Code for some of the experiments I did with variational autoencoders on multi-modality and Atari video prediction. Atari video prediction is work in progress.
6. micro-projects (Jupyter Notebook, 58 stars): A collection of small code snippets for learning how to code
7. tensorbayes (Python, 56 stars): Deep variational inference in TensorFlow
8. began (Python, 15 stars): Boundary equilibrium GAN implementation in TensorFlow
9. kaos (Python, 15 stars): Deep variational inference library for Keras
10. fast-style-transfer (Python, 14 stars): Fast style transfer in TensorFlow
11. tensorflow-gp (Jupyter Notebook, 11 stars): Implementation of Gaussian processes and Bayesian optimization in TensorFlow
12. one-bit-vae (Jupyter Notebook, 11 stars): A silly and weirdly useful experiment where I attempt to encode one bit of information with a VAE
13. variational-autoencoder (Jupyter Notebook, 9 stars): Basic implementation of variational autoencoders in Torch
14. acgan-biased (Python, 9 stars): Experiments verifying that AC-GAN downsamples points near the decision boundary (NIPS BDL 2017)
15. deep-generative-models (Python, 6 stars): Deep generative models in TensorFlow
16. ConvFeFe (Python, 4 stars): The best neural network
17. bcde (Python, 4 stars): Bottleneck Conditional Density Estimation (ICML 2017)
18. vda-hax (Python, 3 stars): Simple tricks to improve visual domain adaptation for MNIST -> SVHN