Natural language processing (NLP) is one of the most important technologies of the information age, and understanding complex language utterances is a crucial part of artificial intelligence. Applications of NLP are everywhere because people communicate almost everything in language: web search, advertising, email, customer service, language translation, radiology reports, and more. A wide variety of underlying tasks and machine learning models power these applications. Recently, deep learning approaches have achieved very high performance across many different NLP tasks; these approaches can often be trained as a single end-to-end model and do not require traditional, task-specific feature engineering.

In this winter-quarter course, students will learn to implement, train, debug, visualize, and invent their own neural network models. The course provides a thorough introduction to cutting-edge research in deep learning applied to NLP. On the model side, it covers word vector representations, window-based neural networks, recurrent neural networks, long short-term memory (LSTM) models, recursive neural networks, and convolutional neural networks, as well as some recent models that incorporate a memory component. Through lectures and programming assignments, students will learn the engineering tricks necessary to make neural networks work on practical problems.
Table of Contents
Lectures
- Lecture 1: Introduction to NLP and Deep Learning (Video, Slides + Review)
- Lecture 2: Word Vector Representations - word2vec (Video, Slides + Readings)
- Lecture 3: GloVe - Global Vectors for Word Representation (Video, Slides + Readings)
- Lecture 4: Word Window Classification and Neural Networks (Video, Slides + Readings)
- Lecture 5: Backpropagation and Project Advice (Video, Slides + Readings)
- Lecture 6: Dependency Parsing (Video, Slides + Readings)
- Lecture 7: Introduction to TensorFlow (Video, Slides + Readings)
- Lecture 8: Recurrent Neural Networks and Language Models (Video, Slides + Readings)
- Lecture 9: Machine Translation and Advanced Recurrent LSTMs and GRUs (Video, Slides + Readings)
- Lecture 10: Neural Machine Translation and Models with Attention (Video, Slides + Readings)
- Lecture 11: Gated Recurrent Units and Further Topics in NMT (Video, Slides + Readings)
- Lecture 12: End-to-End Models for Speech Processing (Video, Slides)
- Lecture 13: Convolutional Neural Networks (Video, Slides + Readings)
- Lecture 14: Tree Recursive Neural Networks and Constituency Parsing (Video, Slides + Readings)
- Lecture 15: Coreference Resolution (Video, Slides + Readings)
- Lecture 16: Dynamic Neural Networks for Question Answering (Video, Slides + Readings)
- Lecture 17: Issues in NLP and Possible Architectures for NLP (Video, Slides + Readings)
- Lecture 18: Tackling the Limits of Deep Learning for NLP (Video, Slides + Readings)