• Stars: 5,476
• Rank: 7,417 (top 0.2%)
• Language: Jupyter Notebook
• License: MIT License
• Created: over 6 years ago
• Updated: 5 months ago

Repository Details

An annotated implementation of the Transformer paper.

Code for The Annotated Transformer blog post:

http://nlp.seas.harvard.edu/annotated-transformer/

Open In Colab

Package Dependencies

Use requirements.txt to install library dependencies with pip:

pip install -r requirements.txt

Notebook Setup

The Annotated Transformer is created using jupytext.

Regular notebooks pose problems for source control: cell outputs end up in the repo history, and diffs between commits are difficult to examine. With jupytext, a python script (.py file) is automatically kept in sync with the notebook file by the jupytext plugin.

The committed python script contains all the cell content and can be used to generate the notebook file. It is a regular python source file: markdown sections are included using a standard comment convention, and outputs are not saved. The notebook itself is treated as a build artifact and is not committed to the git repository.
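For a sense of what this looks like, here is a minimal sketch in jupytext's percent format (an illustration only; the actual script in this repo may use a different jupytext format and different cell content):

# %% [markdown]
# # A markdown section
# Markdown cells are stored as specially marked comments like these.

# %%
# A regular code cell; its outputs are never written back to the .py file.
import torch

print(torch.rand(2, 2))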

Prior to using this repo, make sure jupytext is installed by following the installation instructions in the jupytext documentation.

To produce the .ipynb notebook file from the python source, run the following (under the hood, the notebook build target simply runs jupytext --to ipynb the_annotated_transformer.py):

make notebook

To produce the html version of the notebook, run:

make html

make html is just a shortcut for generating the notebook with jupytext --to ipynb the_annotated_transformer.py and then producing html with jupyter nbconvert --to html the_annotated_transformer.ipynb.
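For reference, the same two steps can be driven from Python using the jupytext and nbconvert APIs (a sketch assuming both packages are installed; the repo's Makefile shells out to the commands above instead):

import jupytext
from nbconvert import HTMLExporter

# make notebook: regenerate the .ipynb from the committed python script
nb = jupytext.read("the_annotated_transformer.py")
jupytext.write(nb, "the_annotated_transformer.ipynb")

# make html: render the generated notebook to a standalone html page
body, _resources = HTMLExporter().from_filename("the_annotated_transformer.ipynb")
with open("the_annotated_transformer.html", "w") as f:
    f.write(body)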

Formatting and Linting

To keep the code formatting clean, the annotated transformer repo has a GitHub Action that checks that the code conforms to the PEP8 coding standard.

To make this easier, there are two Makefile build targets: one runs automatic code formatting with black, and the other checks for violations with flake8.

Be sure to install black and flake8.

You can then run:

make black

(or manually run black --line-length 79 the_annotated_transformer.py) to format the code automatically using black, and:

make flake

(or manually run flake8 --show-source the_annotated_transformer.py) to check for PEP8 violations.

When submitting a PR, it's recommended to run these two commands and fix any flake8 errors that arise; otherwise the GitHub Actions CI will report an error.
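A hypothetical pre-PR helper that chains the two targets (illustrative only; this script is not part of the repo, which relies on the Makefile and CI instead):

import subprocess
import sys

SOURCE = "the_annotated_transformer.py"

# Equivalent of make black: auto-format the source file in place.
subprocess.run(["black", "--line-length", "79", SOURCE], check=True)

# Equivalent of make flake: report any remaining PEP8 violations.
result = subprocess.run(["flake8", "--show-source", SOURCE])
sys.exit(result.returncode)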

More Repositories

1. seq2seq-attn - Sequence-to-sequence model with LSTM encoder/decoders and attention (Lua, 1,252 stars)
2. im2markup - Neural model for converting Image-to-Markup, by Yuntian Deng (yuntiandeng.com) (Lua, 1,194 stars)
3. pytorch-struct - Fast, general, and tested differentiable structured prediction in PyTorch (Jupyter Notebook, 1,101 stars)
4. sent-conv-torch - Text classification using a convolutional neural network (Lua, 447 stars)
5. namedtensor - Named Tensor implementation for Torch (Jupyter Notebook, 439 stars)
6. var-attn - Latent Alignment and Variational Attention (Python, 326 stars)
7. sent-summary (299 stars)
8. neural-template-gen (Python, 261 stars)
9. struct-attn - Code for Structured Attention Networks, https://arxiv.org/abs/1702.00887 (Lua, 235 stars)
10. NeuralSteganography - STEGASURAS: STEGanography via Arithmetic coding and Strong neURAl modelS (Python, 182 stars)
11. urnng (Python, 176 stars)
12. botnet-detection - Topological botnet detection datasets and graph neural network applications (Python, 165 stars)
13. data2text (Lua, 158 stars)
14. sa-vae (Python, 155 stars)
15. compound-pcfg (Python, 126 stars)
16. cascaded-generation - Cascaded Text Generation with Markov Transformers (Python, 126 stars)
17. TextFlow (Python, 115 stars)
18. boxscore-data (HTML, 109 stars)
19. decomp-attn - Decomposable Attention Model for Sentence Pair Classification, from https://arxiv.org/abs/1606.01933 (Lua, 95 stars)
20. encoder-agnostic-adaptation - Encoder-Agnostic Adaptation for Conditional Language Generation (Python, 79 stars)
21. genbmm - CUDA kernels for generalized matrix-multiplication in PyTorch (Jupyter Notebook, 78 stars)
22. DeepLatentNLP (60 stars)
23. nmt-android - Neural Machine Translation on Android (Lua, 59 stars)
24. BSO (Lua, 54 stars)
25. hmm-lm (Python, 42 stars)
26. seq2seq-talk (TeX, 38 stars)
27. Talk-Latent (TeX, 31 stars)
28. regulatory-prediction - Code and data to accompany "Dilated Convolutions for Modeling Long-Distance Genomic Dependencies", presented at the ICML 2017 Workshop on Computational Biology (Python, 28 stars)
29. harvardnlp.github.io (JavaScript, 26 stars)
30. strux (Python, 18 stars)
31. lie-access-memory (Lua, 17 stars)
32. annotated-attention (Jupyter Notebook, 15 stars)
33. DataModules - A state-less module system for torch-like languages (Python, 8 stars)
34. seq2seq-attn-web (CSS, 8 stars)
35. rush-nlp (JavaScript, 7 stars)
36. tutorial-deep-latent (TeX, 7 stars)
37. MemN2N - Torch implementation of End-to-End Memory Networks (https://arxiv.org/abs/1503.08895) (Lua, 6 stars)
38. image-extraction - Extract images from PDFs (Jupyter Notebook, 4 stars)
39. paper-explorer (JavaScript, 3 stars)
40. readcomp - Entity Tracking Improves Cloze-style Reading Comprehension (Python, 3 stars)
41. banded - Sparse banded diagonal matrices for pytorch (Cuda, 2 stars)
42. torax (Python, 2 stars)
43. cs6741 (HTML, 2 stars)
44. simple-recs (Python, 1 star)
45. poser (Python, 1 star)
46. iclr (1 star)
47. cs6741-materials (1 star)