Get started with JAX! 💻 ⚡
The goal of this repo is to make it easier to get started with JAX, Flax, and Haiku!
The JAX ecosystem is becoming an increasingly popular alternative to PyTorch and TensorFlow.
Note: I'm only going to recommend content here that I've personally analyzed and found useful. If you want a comprehensive list, check out the awesome-jax repo.
Table of Contents
My Machine Learning with JAX Tutorials
Tip on how to use the notebooks: just open the notebook directly in Google Colab (you'll see a button at the top of the Jupyter file that directs you to Colab). This way you can avoid having to set up the Python env! (This was especially convenient for me since I'm on Windows, which is still not supported.)
Tutorial #1: From Zero to Hero
In this video, we start from the basics and then gradually dig into the nitty-gritty details of jit, grad, vmap, and various other idiosyncrasies of JAX.
YouTube Video (Tutorial #1)
Accompanying Jupyter Notebook
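If you'd like a quick taste before opening the notebook, here is a minimal sketch (my own toy example, not code from the notebook) of the three core transforms the video covers: jit, grad, and vmap.

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    return jnp.sum((w * x - 1.0) ** 2)            # toy scalar loss

grad_loss = jax.grad(loss)                        # gradient w.r.t. the first argument (w)
fast_grad = jax.jit(grad_loss)                    # JIT-compile it with XLA
batched_loss = jax.vmap(loss, in_axes=(None, 0))  # vectorize the loss over a batch of x

w = jnp.array([1.0, 2.0])
xs = jnp.ones((8, 2))
print(fast_grad(w, xs[0]))   # array of shape (2,)
print(batched_loss(w, xs))   # array of shape (8,) - one loss per batch element
```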
Tutorial #2: From Hero to HeroPro+
In this video, we learn about all the additional components needed to train ML models (such as NNs) on multiple machines! We'll train a simple MLP model, and we'll even train an ML model on 8 TPU cores!
YouTube Video (Tutorial #2)
Accompanying Jupyter Notebook
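The notebook goes into much more detail, but the core data-parallel pattern looks roughly like the sketch below (a toy linear model of my own, not the tutorial's MLP; it runs on however many devices jax.local_device_count() reports):

```python
import functools
import jax
import jax.numpy as jnp

def loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)             # toy linear-regression loss

@functools.partial(jax.pmap, axis_name='batch')   # replicate across devices (e.g. 8 TPU cores)
def update(w, x, y):
    grads = jax.grad(loss)(w, x, y)
    grads = jax.lax.pmean(grads, axis_name='batch')  # average gradients across devices
    return w - 0.01 * grads                          # plain SGD step

n_dev = jax.local_device_count()
w = jnp.zeros((3,))
ws = jnp.stack([w] * n_dev)      # one copy of the params per device
xs = jnp.ones((n_dev, 32, 3))    # the batch is sharded across the leading axis
ys = jnp.ones((n_dev, 32))
ws = update(ws, xs, ys)          # runs in parallel on all available devices
```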
Tutorial #3: Building a Neural Network from Scratch
Watch me code a Neural Network from scratch!
In this video, I build an MLP and train it as a classifier on MNIST using PyTorch's data loader (although it's trivial to use a more complex dataset) - all this in "pure" JAX (no Flax/Haiku/Optax).
I then do an additional analysis:
- Visualize the MLP's learned weights
- Visualize embeddings of a batch of images using t-SNE
- Analyze whether we have too many dead ReLU neurons in our network
YouTube Video (Tutorial #3)
Accompanying Jupyter Notebook (Note: I'll refactor it soon, but I'm linking the original for now)
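The gist of the "pure JAX" approach: parameters are just a pytree of arrays, randomness comes from explicit PRNG keys, and the forward pass is a plain Python function. A rough sketch (layer sizes and names are made up, not copied from the notebook):

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(784, 512, 256, 10)):
    """Return a list of (weights, biases) tuples - an ordinary pytree."""
    params = []
    for in_dim, out_dim in zip(sizes[:-1], sizes[1:]):
        key, w_key = jax.random.split(key)
        params.append((jax.random.normal(w_key, (out_dim, in_dim)) * 0.01,
                       jnp.zeros(out_dim)))
    return params

def forward(params, x):
    for w, b in params[:-1]:
        x = jax.nn.relu(w @ x + b)
    w, b = params[-1]
    return w @ x + b                                    # logits

batched_forward = jax.vmap(forward, in_axes=(None, 0))  # vectorize over the batch dim
params = init_mlp(jax.random.PRNGKey(0))
logits = batched_forward(params, jnp.ones((4, 784)))    # shape (4, 10)
```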
Tutorial #4: Machine Learning with Flax - From Zero to Hero
In this video, I cover everything you need to know to get started with Flax!
We cover init, apply, TrainState, etc., as well as other idiosyncrasies like the usage of the mutable and rngs keywords.
YouTube Video (Tutorial #4)
Accompanying Jupyter Notebook
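For reference, the basic Flax workflow the video walks through looks roughly like this (a hedged sketch with made-up layer sizes; the TrainState part additionally assumes Optax is installed):

```python
import jax
import jax.numpy as jnp
import flax.linen as nn
import optax
from flax.training import train_state

class MLP(nn.Module):
    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(128)(x))
        return nn.Dense(10)(x)

model = MLP()
variables = model.init(jax.random.PRNGKey(0), jnp.ones((1, 784)))  # init -> {'params': ...} pytree
logits = model.apply(variables, jnp.ones((4, 784)))                # apply -> forward pass

# TrainState bundles the apply function, the trainable params, and an Optax optimizer.
state = train_state.TrainState.create(
    apply_fn=model.apply,
    params=variables['params'],
    tx=optax.adam(1e-3))
```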
Tutorial #5 (coming up): Machine Learning with Haiku - From Zero to Hero
todo
Other useful content
Aside from the official docs, here are some resources that helped me.
Videos
- Introduction to JAX (gives a very high-level overview)
- JAX: Accelerated Machine Learning Research | SciPy 2020 | VanderPlas (many more details)
- NeurIPS 2020: JAX Ecosystem Meetup (DeepMind team about the ecosystem of libs around JAX)
- Introduction to JAX for Machine Learning and More (nice, hands-on workshop)
- Day 1 Talks: JAX, Flax & Transformers | HuggingFace (all 4 talks are good)
- Day 2 Talks: JAX, Flax & Transformers | HuggingFace (only the first 2 talks are relevant)
Blogs
- Using JAX to accelerate our research | DeepMind (similar info to the NeurIPS 2020 video)
- You don't know JAX | Colin Raffel
Acknowledgements
Citation
If you find this content useful, please cite the following:
@misc{Gordic2021GetStartedWithJAX,
author = {Gordić, Aleksa},
title = {Get started with JAX},
year = {2021},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/gordicaleksa/get-started-with-JAX}},
}
Connect With Me
If you'd love to have some more AI-related content in your life, consider:
- Subscribing to my YouTube channel The AI Epiphany 🔔
- Following me on LinkedIn and Twitter 💡
- Following me on Medium 📚 ❤️
- Joining the Discord community! 👪