LLM Training Puzzles
- by Sasha Rush - srush_nlp
This is a collection of 8 challenging puzzles about training large language models (or really any NN) on many, many GPUs. Very few people actually get a chance to train on thousands of computers, but it is an interesting challenge and one that is critically important for modern AI. The goal of these puzzles is to get hands-on experience with the key primitives and to understand the goals of memory efficiency and compute pipelining.
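To give a flavor of what "key primitives" means here, below is a minimal sketch of a toy all-reduce, the kind of collective operation the puzzles are built around. The function name and shapes are illustrative only and are not the notebook's actual API.

```python
# Toy all-reduce (illustrative only, not the puzzles' API): every simulated
# worker contributes its local gradient and receives back the same averaged
# gradient, which is the basic building block of data-parallel training.

from typing import List

def all_reduce_mean(grads: List[List[float]]) -> List[List[float]]:
    """Average per-worker gradients and give every worker the same copy."""
    num_workers = len(grads)
    summed = [sum(vals) for vals in zip(*grads)]          # elementwise sum across workers
    averaged = [v / num_workers for v in summed]          # mean gradient
    return [list(averaged) for _ in range(num_workers)]   # each worker gets a full copy

if __name__ == "__main__":
    per_worker_grads = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # 3 simulated GPUs
    print(all_reduce_mean(per_worker_grads))                  # [[3.0, 4.0]] three times
```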
I recommend running in Colab. Click here and copy the notebook to get started.
If you are into this kind of thing, this is the 6th in a series of these puzzles.