Repository Details
This repository contains Python code implementing a GPT (Generative Pre-trained Transformer) model with causal self-attention. The code is based on Andrej Karpathy's tutorial on building a transformer from scratch.
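The self-attention logic at the heart of such a model can be sketched as follows. This is a minimal single-head illustration in NumPy (Karpathy's tutorial itself uses PyTorch); the function and weight names here are hypothetical, not taken from this repository.

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a (T, C) token sequence.
    Wq, Wk, Wv are hypothetical (C, H) projection matrices."""
    T = x.shape[0]
    q, k, v = x @ Wq, x @ Wk, x @ Wv                 # queries, keys, values: (T, H)
    scores = q @ k.T / np.sqrt(k.shape[-1])          # scaled dot-product, (T, T)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)         # block attention to future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ v                               # weighted sum of values, (T, H)

rng = np.random.default_rng(0)
T, C, H = 4, 8, 8
x = rng.standard_normal((T, C))
out = causal_self_attention(
    x,
    rng.standard_normal((C, H)),
    rng.standard_normal((C, H)),
    rng.standard_normal((C, H)),
)
print(out.shape)  # (4, 8)
```

The causal (upper-triangular) mask is what makes this suitable for autoregressive generation: each position can only attend to itself and earlier positions.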