Stanford CS 224n Natural Language Processing with Deep Learning
Self-study on Stanford CS 224n, Winter 2020. Special thanks to Stanford and Professor Chris Manning for making these great resources freely available to the public. Without access to the autograder, there is no guarantee that the solutions are correct.
Lecture Videos, CS 224n, Winter 2019
Lecture slides, CS 224n, Winter 2019
Lecture notes, CS 224n, Winter 2019
✔️ Assignment 1
Constructed count-based embeddings from a co-occurrence matrix and used Gensim word2vec to study predictions and language biases.
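For reference, the count-based approach can be sketched in plain NumPy: build a window-based co-occurrence matrix, then reduce it with truncated SVD. This is a minimal illustration, not the assignment's scaffold; the function names and toy corpus are made up.

```python
import numpy as np

def co_occurrence_matrix(corpus, window=1):
    """Count co-occurrences of words within a symmetric window."""
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    M = np.zeros((len(vocab), len(vocab)))
    for sent in corpus:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    M[idx[w], idx[sent[j]]] += 1
    return vocab, M

def svd_embeddings(M, k=2):
    """Keep the top-k singular directions as dense word vectors."""
    U, s, Vt = np.linalg.svd(M)
    return U[:, :k] * s[:k]

vocab, M = co_occurrence_matrix([["all", "that", "glitters", "is", "not", "gold"]])
emb = svd_embeddings(M, k=2)   # one 2-d vector per vocabulary word
```

Note that the matrix is symmetric by construction, since co-occurrence within a window is a symmetric relation.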
✔️ Assignment 2
Implemented and trained word2vec in NumPy.
Written: Understanding word2vec
Coding: Implementing word2vec
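The core of both parts is the naive-softmax loss for a (center, outside) word pair and its gradients. A minimal NumPy sketch is below; variable names follow the written handout's notation loosely, and this is not the assignment's actual scaffold.

```python
import numpy as np

def naive_softmax_loss_and_grad(v_c, o, U):
    """Naive-softmax loss for center vector v_c (d,), outside word index o,
    and outside-vector matrix U (V, d). Returns loss and both gradients."""
    scores = U @ v_c
    scores -= scores.max()                    # shift for numerical stability
    y_hat = np.exp(scores) / np.exp(scores).sum()
    loss = -np.log(y_hat[o])                  # cross-entropy against one-hot y
    delta = y_hat.copy()
    delta[o] -= 1.0                           # y_hat - y
    grad_vc = U.T @ delta                     # dJ/dv_c
    grad_U = np.outer(delta, v_c)             # dJ/dU
    return loss, grad_vc, grad_U
```

A finite-difference check against the analytic gradient is a quick way to catch sign or transpose bugs before training.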
✔️ Assignment 3
✔️ Assignment 4
Coding: Neural Machine Translation with RNN
Trained overnight on a local Windows 10 machine with an RTX 2080 Ti; early stopping triggered at around 11 hours. Test BLEU score: 35.89.
Written: Analyzing NMT Systems
✔️ Assignment 5 (public version)
Trained with batch_size=64 (vs. the default 32) and max_epoch=60 (vs. the default 30) on a local RTX 2080 Ti, with GPU memory usage at 10/11 GB. Training reached the maximum number of epochs after 34 hours, with training loss in the low 70s and validation perplexity at 59, averaging around 2,000 words per second. Test BLEU score: 27.96.
Coding: Neural Machine Translation with RNN
Written: Neural Machine Translation with RNN
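As a side note on the numbers reported above: validation perplexity is just the exponential of the average per-word cross-entropy loss, so the two metrics are directly related. A tiny helper (not part of the assignment code) makes the conversion explicit:

```python
import math

def perplexity(total_loss, num_words):
    """Perplexity from cross-entropy loss summed over words (in nats)."""
    return math.exp(total_loss / num_words)
```

For example, a validation perplexity of 59 corresponds to an average loss of `math.log(59)` ≈ 4.08 nats per word.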
LICENSE
All slides, notes, assignments, and provided code scaffolds are owned by Stanford University.