# Generative Deep Learning - 2nd Edition Codebase
The official code repository for the second edition of the O'Reilly book Generative Deep Learning: Teaching Machines to Paint, Write, Compose and Play.
## Book Chapters
Below is an outline of the book chapters:
### Part I: Introduction to Generative Deep Learning

1. Generative Modeling
2. Deep Learning

### Part II: Methods

3. Variational Autoencoders
4. Generative Adversarial Networks
5. Autoregressive Models
6. Normalizing Flows
7. Energy-Based Models
8. Diffusion Models

### Part III: Applications

9. Transformers
10. Advanced GANs
11. Music Generation
12. World Models
13. Multimodal Models
14. Conclusion
## Star History
## Getting Started
### Kaggle API
To download some of the datasets for the book, you will need a Kaggle account and an API token. To set up the Kaggle API:

1. Sign up for a Kaggle account.
2. Go to the 'Account' tab of your user profile.
3. Select 'Create API Token'. This will trigger the download of `kaggle.json`, a file containing your API credentials.
### The .env file

Create a file called `.env` in the root directory, containing the following values (replacing the Kaggle username and API key with the values from the JSON file):
```
JUPYTER_PORT=8888
TENSORBOARD_PORT=6006
KAGGLE_USERNAME=<your_kaggle_username>
KAGGLE_KEY=<your_kaggle_key>
```
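If you prefer not to copy the credentials by hand, a small shell sketch like the one below can generate the `.env` file from a downloaded `kaggle.json`. The temp directory and stand-in JSON lines are for demonstration only; substitute the path to your real `kaggle.json` and run from the repo root.

```shell
# Demonstration sketch: build .env from kaggle.json.
# The temp dir and stand-in JSON are placeholders --
# point the script at your real kaggle.json instead.
cd "$(mktemp -d)"
echo '{"username":"alice","key":"abc123"}' > kaggle.json   # stand-in for the real download

# Pull the two credential fields out of the JSON with Python's stdlib
KAGGLE_USERNAME=$(python3 -c "import json; print(json.load(open('kaggle.json'))['username'])")
KAGGLE_KEY=$(python3 -c "import json; print(json.load(open('kaggle.json'))['key'])")

cat > .env <<EOF
JUPYTER_PORT=8888
TENSORBOARD_PORT=6006
KAGGLE_USERNAME=$KAGGLE_USERNAME
KAGGLE_KEY=$KAGGLE_KEY
EOF

cat .env
```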
### Get set up with Docker
This codebase is designed to be run with Docker.
If you've never used Docker before, don't worry! I have included a guide to Docker in the Docker README file in this repository. This includes a full run-through of why Docker is awesome and a brief guide to the `Dockerfile` and `docker-compose.yml` for this project.
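For orientation, a compose file for a setup like this typically maps the ports from `.env` into the container and mounts the notebooks folder. The sketch below is hypothetical (service name, image build context, and mount path are all assumptions); the repository's own `docker-compose.yml` is the authoritative version:

```yaml
# Hypothetical sketch only -- see the real docker-compose.yml in the repo.
services:
  app:
    build: .
    env_file: .env
    ports:
      - "${JUPYTER_PORT}:8888"      # Jupyter
      - "${TENSORBOARD_PORT}:6006"  # TensorBoard
    volumes:
      - ./notebooks:/app/notebooks  # mount path is an assumption
```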
### Building the Docker image
If you do not have a GPU, run the following command:

```
docker compose build
```

If you do have a GPU that you wish to use, run the following command:

```
docker compose -f docker-compose.gpu.yml build
```
### Running the container
If you do not have a GPU, run the following command:

```
docker compose up
```

If you do have a GPU that you wish to use, run the following command:

```
docker compose -f docker-compose.gpu.yml up
```
Jupyter will be available in your local browser, on the port specified in your `.env` file - for example:

```
http://localhost:8888
```
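If you have changed `JUPYTER_PORT`, a quick shell sketch can reconstruct the URL from the `.env` file. The lines creating a temp directory and a stand-in `.env` are for demonstration only; drop them and run from the repo root to use your real file.

```shell
# Demonstration: read JUPYTER_PORT from .env and print the Jupyter URL.
cd "$(mktemp -d)"                 # demo only -- run from the repo root instead
echo "JUPYTER_PORT=8888" > .env   # stand-in for your real .env
PORT=$(grep '^JUPYTER_PORT=' .env | cut -d= -f2)
echo "http://localhost:${PORT}"   # -> http://localhost:8888
```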
The notebooks that accompany the book are available in the `/notebooks` folder, organized by chapter and example.
## Downloading data
The codebase comes with a built-in data downloader helper script.

Run the data downloader as follows (from outside the container), choosing one of the named datasets below:

```
bash scripts/download.sh [faces, bricks, recipes, flowers, wines, cellosuites, chorales]
```
## TensorBoard

TensorBoard is really useful for monitoring experiments and seeing how your model training is progressing.

To launch TensorBoard, run the following script (from outside the container):
```
bash scripts/tensorboard.sh <CHAPTER> <EXAMPLE>
```

where `<CHAPTER>` is the required chapter (e.g. `03_vae`) and `<EXAMPLE>` is the required example (e.g. `02_vae_fashion`).
TensorBoard will be available in your local browser on the port specified in your `.env` file - for example:

```
http://localhost:6006
```
## Using a cloud virtual machine
To set up a virtual machine with GPU in Google Cloud Platform, follow the instructions in the Google Cloud README file in this repository.
## Other resources
Some of the examples in this book are adapted from the excellent open-source implementations available through the Keras website. I highly recommend checking out this resource, as new models and examples are constantly being added.