annotated-transformer
An annotated implementation of the Transformer paper.
seq2seq-attn
Sequence-to-sequence model with LSTM encoder/decoders and attention
im2markup
Neural model for converting Image-to-Markup (by Yuntian Deng yuntiandeng.com)
pytorch-struct
Fast, general, and tested differentiable structured prediction in PyTorch
sent-conv-torch
Text classification using a convolutional neural network.
namedtensor
Named Tensor implementation for Torch
var-attn
Latent Alignment and Variational Attention
sent-summary
neural-template-gen
struct-attn
Code for Structured Attention Networks https://arxiv.org/abs/1702.00887
NeuralSteganography
STEGASURAS: STEGanography via Arithmetic coding and Strong neURAl modelS
urnng
botnet-detection
Topological botnet detection datasets and graph neural network applications
data2text
sa-vae
compound-pcfg
cascaded-generation
Cascaded Text Generation with Markov Transformers
TextFlow
boxscore-data
decomp-attn
Decomposable Attention Model for Sentence Pair Classification (from https://arxiv.org/abs/1606.01933)
encoder-agnostic-adaptation
Encoder-Agnostic Adaptation for Conditional Language Generation
genbmm
CUDA kernels for generalized matrix-multiplication in PyTorch (see the sketch after this list)
DeepLatentNLP
nmt-android
Neural Machine Translation on Android
BSO
hmm-lm
seq2seq-talk
Talk-Latent
regulatory-prediction
Code and Data to accompany "Dilated Convolutions for Modeling Long-Distance Genomic Dependencies", presented at the ICML 2017 Workshop on Computational Biology
harvardnlp.github.io
strux
lie-access-memory
annotated-attention
DataModules
A state-less module system for torch-like languages
rush-nlp
seq2seq-attn-web
tutorial-deep-latent
MemN2N
Torch implementation of End-to-End Memory Networks (https://arxiv.org/abs/1503.08895)
image-extraction
Extract images from PDFs
paper-explorer
banded
Sparse banded diagonal matrices for pytorch
torax
cs6741
simple-recs
poser
iclr
cs6741-materials
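Among the libraries above, genbmm is the lowest-level building block: its CUDA kernels compute batched matrix products over semirings other than the usual (+, ×), the kind of operation that differentiable structured prediction code such as pytorch-struct relies on. The snippet below is only a minimal plain-PyTorch reference for the log-semiring case, written for illustration; it does not use genbmm's own API, and the function name and shapes are assumptions.

```python
# Plain-PyTorch reference for log-space batched matrix multiplication
# (one instance of a "generalized" matmul). Illustrative only; a fused
# CUDA kernel avoids the (batch, I, K, J) intermediate built here.
import torch

def log_bmm_reference(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """out[n, i, j] = logsumexp_k ( a[n, i, k] + b[n, k, j] )

    a: (batch, I, K) log-potentials
    b: (batch, K, J) log-potentials
    """
    return torch.logsumexp(a.unsqueeze(-1) + b.unsqueeze(-3), dim=-2)

if __name__ == "__main__":
    a = torch.randn(2, 3, 4)
    b = torch.randn(2, 4, 5)
    out = log_bmm_reference(a, b)
    # Sanity check against an ordinary bmm done in probability space.
    expected = torch.log(torch.bmm(a.exp(), b.exp()))
    print(torch.allclose(out, expected, atol=1e-5))  # True
```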