# PicoGPT

Accompanying blog post: GPT in 60 Lines of NumPy
You've seen openai/gpt-2.
You've seen karpathy/minGPT.
You've even seen karpathy/nanoGPT!
But have you seen picoGPT??!?
picoGPT is an unnecessarily tiny and minimal implementation of GPT-2 in plain NumPy. The entire forward pass is 40 lines of code.
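If you're curious what those 40 lines are made of, here's a rough sketch of the building blocks a plain-NumPy forward pass needs (illustrative only, not the repo's exact code; `gpt2.py` is the real thing):

```python
import numpy as np

def gelu(x):
    # GPT-2's activation function (tanh approximation of GELU)
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def layer_norm(x, g, b, eps=1e-5):
    # normalize over the feature dimension, then scale by g and shift by b
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return g * (x - mean) / np.sqrt(var + eps) + b

def attention(q, k, v, mask):
    # causal scaled dot-product attention: [n_seq, d] -> [n_seq, d]
    return softmax(q @ k.T / np.sqrt(q.shape[-1]) + mask) @ v
```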
picoGPT features:

- Fast? ❌ Nah, picoGPT is megaSLOW 🐌
- Training code? ❌ Error, 4️⃣0️⃣4️⃣ not found
- Batch inference? ❌ picoGPT is civilized, single file line, one at a time only
- top-p sampling? ❌ top-k? ❌ temperature? ❌ categorical sampling?! ❌ greedy? ✅ (a minimal greedy-decoding sketch follows this list)
- Readable? `gpt2.py` ✅ `gpt2_pico.py` ❌
- Smol??? ✅ ✅ ✅ ✅ ✅ ✅ YESS!!! TEENIE TINY in fact 🤏
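Greedy decoding, the one sampling strategy that did make the cut, is just repeatedly taking the argmax over the last position's logits. A minimal sketch, with a stand-in `forward_fn` (not the repo's actual interface):

```python
import numpy as np

def generate_greedy(input_ids, forward_fn, n_tokens_to_generate):
    # forward_fn is a stand-in: token ids -> logits of shape [n_seq, n_vocab]
    ids = list(input_ids)
    for _ in range(n_tokens_to_generate):
        logits = forward_fn(ids)
        ids.append(int(np.argmax(logits[-1])))  # greedy: always pick the most likely next token
    return ids[len(input_ids):]  # return only the newly generated ids
```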
A quick breakdown of each of the files:

- `encoder.py` contains the code for OpenAI's BPE tokenizer, taken straight from their gpt-2 repo.
- `utils.py` contains the code to download and load the GPT-2 model weights, tokenizer, and hyper-parameters.
- `gpt2.py` contains the actual GPT model and generation code, which we can run as a Python script.
- `gpt2_pico.py` is the same as `gpt2.py`, but in even fewer lines of code. Why? Because why not 😎👍.
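Roughly, the pieces fit together like this (the exact function names below are assumptions for illustration; check `utils.py` and `gpt2.py` for the real interfaces):

```python
# Illustrative only: the loader and generate names are assumptions about the repo's API.
from utils import load_encoder_hparams_and_params  # assumed loader in utils.py
from gpt2 import generate                          # assumed generation function in gpt2.py

# download (if needed) and load the tokenizer, hyper-parameters, and weights
encoder, hparams, params = load_encoder_hparams_and_params("124M", "models")

# tokenize the prompt, generate new token ids, and decode them back to text
ids = encoder.encode("Alan Turing theorized that computers would one day become")
out = generate(ids, params, hparams["n_head"], n_tokens_to_generate=40)
print(encoder.decode(out))
```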
## Dependencies

```bash
pip install -r requirements.txt
```

Tested on Python 3.9.10.
## Usage

```bash
python gpt2.py "Alan Turing theorized that computers would one day become"
```

Which generates:

```
the most powerful machines on the planet.

The computer is a machine that can perform complex calculations, and it can perform these calculations in a way that is very similar to the human brain.
```

You can also control the number of tokens to generate, the model size (one of `["124M", "355M", "774M", "1558M"]`), and the directory to save the models:
```bash
python gpt2.py \
    "Alan Turing theorized that computers would one day become" \
    --n_tokens_to_generate 40 \
    --model_size "124M" \
    --models_dir "models"
```