
Repository Details

Latent Normalizing Flows for Discrete Sequences

This code provides an implementation of, and reproduces results from, Latent Normalizing Flows for Discrete Sequences, Zachary Ziegler and Alexander Rush, ICML 2019.

Dependencies

The code was tested with Python 3.6, PyTorch 0.4.1, torchtext 0.2.3, and CUDA 9.2.
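A minimal environment sketch consistent with those versions (hypothetical; the environment name is arbitrary, and pip may no longer serve prebuilt wheels for these legacy releases on every platform, so a matching PyTorch build and the CUDA 9.2 toolkit may need manual installation):

conda create -n latent-flows python=3.6
conda activate latent-flows
pip install torch==0.4.1 torchtext==0.2.3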

Character-level language modeling:

PTB data with Mikolov preprocessing is checked in.

Baseline and proposed models can be trained with:

python main.py --dataset ptb --run_name charptb_baselinelstm --model_type baseline --dropout_p 0.1 --optim sgd --lr 20
python main.py --dataset ptb --run_name charptb_discreteflow_af-af 
python main.py --dataset ptb --run_name charptb_discreteflow_af-scf --hiddenflow_scf_layers
python main.py --dataset ptb --run_name charptb_discreteflow_iaf-scf --dropout_p 0 --hiddenflow_flow_layers 3 --hiddenflow_scf_layers --prior_type IAF
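(Reading the flags above, the run-name suffixes appear to encode the prior and hidden-flow types: the defaults give an AF prior with AF hidden-flow layers, --hiddenflow_scf_layers switches the hidden flow to SCF, and --prior_type IAF switches the prior to an IAF.)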

Evaluation on the test set is run with, for example:

python main.py --dataset ptb --run_name charptb_baselinelstm --model_type baseline --dropout_p 0.1 --optim sgd --lr 20 --load_dir output/charptb_baselinelstm/saves/<save> --evaluate
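By analogy, evaluating one of the flow models presumably reuses its training flags together with --load_dir and --evaluate; a hypothetical invocation for the af-af model (the <save> placeholder as above):

python main.py --dataset ptb --run_name charptb_discreteflow_af-af --load_dir output/charptb_discreteflow_af-af/saves/<save> --evaluate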

Polyphonic music:

Commands to train the baseline and proposed models are:

Nottingham:

python main.py --indep_bernoulli --dataset nottingham --run_name nottingham_baselinelstm --model_type baseline --dropout_p 0.1 --optim sgd --lr 20 --patience 4
python main.py --indep_bernoulli --dataset nottingham --run_name nottingham_discreteflow_af-af --patience 2 --ELBO_samples 1 --B_train 5 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15
python main.py --indep_bernoulli --dataset nottingham --run_name nottingham_discreteflow_af-scf --patience 2 --ELBO_samples 1 --B_train 5 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --hiddenflow_scf_layers
python main.py --indep_bernoulli --dataset nottingham --run_name nottingham_discreteflow_iaf-scf --patience 2 --ELBO_samples 1 --B_train 5 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --hiddenflow_scf_layers --prior_type IAF

Piano_midi:

python main.py --indep_bernoulli --dataset piano_midi --run_name piano_midi_baselinelstm --model_type baseline --dropout_p 0.1 --optim sgd --lr 20 --patience 4
python main.py --indep_bernoulli --dataset piano_midi --run_name piano_midi_discreteflow_af-af --patience 2 --ELBO_samples 1 --B_train 1 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --nll_samples 4 --grad_accum 8
python main.py --indep_bernoulli --dataset piano_midi --run_name piano_midi_discreteflow_af-scf --patience 2 --ELBO_samples 1 --B_train 1 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --nll_samples 4 --grad_accum 8 --hiddenflow_scf_layers
python main.py --indep_bernoulli --dataset piano_midi --run_name piano_midi_discreteflow_iaf-scf --patience 2 --ELBO_samples 1 --B_train 1 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --nll_samples 4 --grad_accum 8 --hiddenflow_scf_layers --prior_type IAF
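(Note: with --B_train 1 and --grad_accum 8, gradients appear to be accumulated over 8 sequences per optimizer step, i.e. an effective batch size of 1 × 8 = 8 while holding only one sequence in memory at a time; the exact semantics are defined in main.py.)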

Musedata:

python main.py --indep_bernoulli --dataset muse_data --run_name muse_data_baselinelstm --model_type baseline --dropout_p 0.1 --optim sgd --lr 20 --patience 4
python main.py --indep_bernoulli --dataset muse_data --run_name muse_data_discreteflow_af-af --patience 2 --ELBO_samples 1 --B_train 1 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --nll_samples 4 --grad_accum 8
python main.py --indep_bernoulli --dataset muse_data --run_name muse_data_discreteflow_af-scf --patience 2 --ELBO_samples 1 --B_train 1 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --nll_samples 4 --grad_accum 8 --hiddenflow_scf_layers
python main.py --indep_bernoulli --dataset muse_data --run_name muse_data_discreteflow_iaf-scf --patience 2 --ELBO_samples 1 --B_train 1 --B_val 1 --initial_kl_zero 20 --kl_rampup_time 15 --nll_samples 4 --grad_accum 8 --hiddenflow_scf_layers --prior_type IAF

JSB_chorales:

python main.py --indep_bernoulli --dataset jsb_chorales --run_name jsb_chorales_baselinelstm --model_type baseline --dropout_p 0.1 --optim sgd --lr 20 --patience 4
python main.py --indep_bernoulli --dataset jsb_chorales --run_name jsb_chorales_discreteflow_af-af --patience 2 --ELBO_samples 1 --B_train 5 --B_val 64 --initial_kl_zero 20 --kl_rampup_time 15
python main.py --indep_bernoulli --dataset jsb_chorales --run_name jsb_chorales_discreteflow_af-scf --patience 2 --ELBO_samples 1 --B_train 5 --B_val 64 --initial_kl_zero 20 --kl_rampup_time 15 --hiddenflow_scf_layers
python main.py --indep_bernoulli --dataset jsb_chorales --run_name jsb_chorales_discreteflow_iaf-scf --patience 2 --ELBO_samples 1 --B_train 5 --B_val 64 --initial_kl_zero 20 --kl_rampup_time 15 --hiddenflow_scf_layers --prior_type IAF
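Evaluation for the music models presumably mirrors the character-level case, reusing the training flags plus --load_dir and --evaluate. A hypothetical invocation for the JSB baseline (the <save> placeholder as above):

python main.py --indep_bernoulli --dataset jsb_chorales --run_name jsb_chorales_baselinelstm --model_type baseline --dropout_p 0.1 --optim sgd --lr 20 --patience 4 --load_dir output/jsb_chorales_baselinelstm/saves/<save> --evaluate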

More Repositories

1. annotated-transformer (Jupyter Notebook, 5,683 stars) - An annotated implementation of the Transformer paper.
2. seq2seq-attn (Lua, 1,257 stars) - Sequence-to-sequence model with LSTM encoder/decoders and attention
3. im2markup (Lua, 1,203 stars) - Neural model for converting Image-to-Markup (by Yuntian Deng yuntiandeng.com)
4. pytorch-struct (Jupyter Notebook, 1,107 stars) - Fast, general, and tested differentiable structured prediction in PyTorch
5. sent-conv-torch (Lua, 448 stars) - Text classification using a convolutional neural network.
6. namedtensor (Jupyter Notebook, 443 stars) - Named Tensor implementation for Torch
7. var-attn (Python, 326 stars) - Latent Alignment and Variational Attention
8. sent-summary (300 stars)
9. neural-template-gen (Python, 262 stars)
10. struct-attn (Lua, 237 stars) - Code for Structured Attention Networks https://arxiv.org/abs/1702.00887
11. NeuralSteganography (Python, 183 stars) - STEGASURAS: STEGanography via Arithmetic coding and Strong neURAl modelS
12. urnng (Python, 176 stars)
13. botnet-detection (Python, 169 stars) - Topological botnet detection datasets and graph neural network applications
14. data2text (Lua, 158 stars)
15. sa-vae (Python, 154 stars)
16. compound-pcfg (Python, 127 stars)
17. cascaded-generation (Python, 127 stars) - Cascaded Text Generation with Markov Transformers
18. boxscore-data (HTML, 111 stars)
19. decomp-attn (Lua, 95 stars) - Decomposable Attention Model for Sentence Pair Classification (from https://arxiv.org/abs/1606.01933)
20. encoder-agnostic-adaptation (Python, 79 stars) - Encoder-Agnostic Adaptation for Conditional Language Generation
21. genbmm (Jupyter Notebook, 79 stars) - CUDA kernels for generalized matrix-multiplication in PyTorch
22. DeepLatentNLP (61 stars)
23. nmt-android (Lua, 59 stars) - Neural Machine Translation on Android
24. BSO (Lua, 54 stars)
25. hmm-lm (Python, 42 stars)
26. seq2seq-talk (TeX, 39 stars)
27. Talk-Latent (TeX, 31 stars)
28. regulatory-prediction (Python, 28 stars) - Code and Data to accompany "Dilated Convolutions for Modeling Long-Distance Genomic Dependencies", presented at the ICML 2017 Workshop on Computational Biology
29. harvardnlp.github.io (JavaScript, 26 stars)
30. strux (Python, 18 stars)
31. lie-access-memory (Lua, 17 stars)
32. annotated-attention (Jupyter Notebook, 15 stars)
33. DataModules (Python, 8 stars) - A state-less module system for torch-like languages
34. rush-nlp (JavaScript, 8 stars)
35. seq2seq-attn-web (CSS, 8 stars)
36. tutorial-deep-latent (TeX, 7 stars)
37. MemN2N (Lua, 6 stars) - Torch implementation of End-to-End Memory Networks (https://arxiv.org/abs/1503.08895)
38. image-extraction (Jupyter Notebook, 4 stars) - Extract images from PDFs
39. paper-explorer (JavaScript, 3 stars)
40. readcomp (Python, 3 stars) - Entity Tracking Improves Cloze-style Reading Comprehension
41. banded (Cuda, 2 stars) - Sparse banded diagonal matrices for pytorch
42. torax (Python, 2 stars)
43. cs6741 (HTML, 2 stars)
44. simple-recs (Python, 1 star)
45. poser (Python, 1 star)
46. iclr (1 star)
47. cs6741-materials (1 star)