The Fastai Extensions Repository
A centralized repository to improve the discoverability of unofficial fastai extensions. All extensions are designed for fastai v2 unless stated otherwise.
Do not hesitate to send a PR or open an issue to add elements to this list.
Domain specific
- TimeseriesAI (repo-tcapelle / repo-oguiza / repo-fast-track / discussion) a library to help you apply Deep Learning to your time series / sequential datasets, in particular Time Series Classification (TSC) and Time Series Regression (TSR) problems
- Fast AI Audio (repo / discussion) allows you to quickly and easily build machine learning models for a wide variety of audio applications
- MetaAI (repo-V1 / discussion) meta-learning algorithms to train a model on a variety of learning tasks, such that it can solve new learning tasks using only a small number of training samples
- faimed3d (repo / docs / discussion) for processing volumetric medical data such as CT or MRI images with multiple sequences and building 3D models for classification/segmentation.
Models
- TabNet (repo / discussion) attention-based network for tabular data
- FastHug (repo / discussion) use fastai-v2 with HuggingFace's pretrained transformers
- Fastai2 Tabular Hybrid (repo / discussion) hybrid approaches to supporting more datatypes with fastai2 tabular
- TabularGP (repo / discussion) Gaussian process for tabular data
- Fastseq (repo / discussion) implements the N-Beats time series forecasting model
- Mish (repo / discussion) the Mish deep learning activation function; a minimal sketch follows this list
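
Mish itself is straightforward to reproduce: it is defined as x · tanh(softplus(x)). Below is a naive PyTorch sketch of the activation for illustration only (the repo ships optimized implementations); the layer sizes in the usage line are arbitrary.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Mish(nn.Module):
    "Naive Mish activation: x * tanh(softplus(x))."
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))

# drop-in use, e.g. replacing ReLU in a small classification head
head = nn.Sequential(nn.Linear(512, 256), Mish(), nn.Linear(256, 10))
```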
Callbacks
- Manifold mixup and Output mixup (repo / discussion) applies mixup to inner layers for improved benefits and applicability to arbitrary input types (a minimal callback sketch follows this list)
- BatchLossFilter (repo-V1 / discussion) speeds up learning by focusing on the harder samples
- Cutout, RICAP and CutMix (repo-V1 / discussion) image data augmentation techniques
- Blend (repo-V1 / discussion) image data augmentation that generalizes MixUp, Cutout, CutMix, RICAP and allows for data augmentation rate scheduling
- MixMatch (repo-V1 / discussion) state-of-the-art semi-supervised learning
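
All of the callbacks above plug into training through fastai's callback system. As a point of reference, here is a minimal sketch of an input-level mixup callback for fastai v2; it is not taken from any of the repos above, it assumes a single float input tensor, and the `dls`/`model` in the commented usage line are assumed to exist.

```python
import torch
from torch.distributions.beta import Beta
from fastai.callback.core import Callback

class SimpleMixUp(Callback):
    "Minimal input-level mixup sketch: blend each batch with a shuffled copy of itself."
    run_valid = False                                   # only run on training batches
    def __init__(self, alpha=0.4): self.distrib = Beta(alpha, alpha)

    def before_batch(self):
        x = self.xb[0]
        lam = self.distrib.sample((x.size(0),)).to(x.device)
        lam = lam.view(-1, *([1] * (x.dim() - 1)))      # broadcast over non-batch dims
        shuffle = torch.randperm(x.size(0), device=x.device)
        self.learn.xb = (x * lam + x[shuffle] * (1 - lam),)
        # a complete implementation would also mix the targets or the loss accordingly

# usage (hypothetical dls/model): learn = Learner(dls, model, cbs=SimpleMixUp(alpha=0.4))
```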
Interpretation
- FastShap (old fork / discussion) uses the SHAP interpretability library with fastai (now merged into fastinference)
- The Colorful Dimension (repo-V1 / discussion) charts made by plotting the activation histograms epoch by epoch, coloring each pixel according to the log of its intensity
- The Twin Peaks Chart (repo-V1 / discussion) a tool to evaluate the health of your classification model in real time
- Tensorboard Callback (repo-V1 / discussion) logs model and training information for display in TensorBoard
- FastAI-LIME (repo-V1) interpreting fastai CNN models using LIME
- Feature importance (repo-V1 / discussion) computing feature importance for tabular learners using the permutation method
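
The permutation method used by the feature importance extension is easy to sketch outside fastai: shuffle one feature at a time in the validation data and measure how much a metric degrades. The `predict_fn` and `metric` callables below are assumed interfaces for illustration, not part of the repo above.

```python
import numpy as np

def permutation_importance(predict_fn, X_valid, y_valid, metric, n_repeats=5, seed=0):
    """Sketch of permutation feature importance.
    predict_fn : callable mapping a 2-D feature array to predictions (hypothetical)
    metric     : callable metric(y_true, y_pred), higher is better (hypothetical)"""
    rng = np.random.default_rng(seed)
    baseline = metric(y_valid, predict_fn(X_valid))
    importances = {}
    for j in range(X_valid.shape[1]):
        drops = []
        for _ in range(n_repeats):
            X_perm = X_valid.copy()
            rng.shuffle(X_perm[:, j])                    # break the feature/target link
            drops.append(baseline - metric(y_valid, predict_fn(X_perm)))
        importances[j] = float(np.mean(drops))           # larger drop = more important
    return importances
```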
Inference
- Fastinference (repo / discussion) a collection of inference modules for fastai including inference speedup and interpretability
- Fastinference-onnx (repo) an ONNX-only version of fastai
- fastinference-pytorch (repo) a PyTorch-only version of fastai
Hyperparameters
- Batch size finder (repo / discussion) an implementation of the batch size finder proposed by OpenAI
- wd finder (repo-V1) an extension of the learning rate finder to find a proper weight decay by grid search (a related sketch follows this list)
- Curriculum Learning Dropout (repo-V1 / discussion) dropout scheduler
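
For reference, a weight decay search similar in spirit to the wd finder can be sketched as a plain grid search with fastai. The `dls` DataLoaders, the `make_model` factory, the grid values and the `accuracy` metric (which assumes classification) are all assumptions here; the repo itself builds on the learning rate finder rather than full training runs.

```python
from fastai.vision.all import *   # any fastai application exposes Learner the same way

def wd_grid_search(dls, make_model, wds=(1e-4, 1e-3, 1e-2, 1e-1), epochs=3):
    "Naive sketch: train briefly at each weight decay and keep the best validation loss."
    results = {}
    for wd in wds:
        learn = Learner(dls, make_model(), metrics=accuracy, wd=wd)
        learn.fit_one_cycle(epochs)
        results[wd] = learn.validate()[0]         # first element is the validation loss
    best_wd = min(results, key=results.get)
    return best_wd, results
```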
Optimizers
- Ranger (repo / discussion) a synergistic optimizer combining RAdam (Rectified Adam), LookAhead and Gradient Centralization
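
Note that fastai also ships its own `ranger` convenience function (RAdam + Lookahead, without the gradient centralization added in the repo above). A minimal usage sketch, assuming an existing `dls` DataLoaders and an image classification task:

```python
from fastai.vision.all import *
from fastai.optimizer import ranger     # fastai's built-in RAdam + Lookahead combination

# Hypothetical DataLoaders `dls`; any fastai Learner accepts opt_func the same way.
learn = cnn_learner(dls, resnet34, metrics=accuracy, opt_func=ranger)
learn.fit_flat_cos(5, lr=1e-3)          # flat-then-cosine schedule commonly paired with Ranger
```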
Notebook
- DDip (repo / discussion) IPython extension to enable PyTorch's Distributed Data Parallel in fastai notebooks
Deployment
- Fastai serving (repo / discussion) a Docker image for serving fastai models, mimicking the API of TensorFlow Serving
- Fastai2 Starlette (repo) a starting point to deploy models with Starlette
- FastAPI-Fastai2 (repo) a template to deploy models with FastAPI
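
The deployment templates above share the same core pattern: load an exported Learner and call `predict` inside an HTTP endpoint. Here is a minimal FastAPI sketch for an image classifier; the `export.pkl` path and the `/predict` route are assumptions, not taken from the repos above.

```python
from io import BytesIO

from fastai.vision.all import load_learner, PILImage
from fastapi import FastAPI, File, UploadFile

app = FastAPI()
learn = load_learner("export.pkl")      # model previously saved with learn.export()

@app.post("/predict")
async def predict(file: UploadFile = File(...)):
    img = PILImage.create(BytesIO(await file.read()))   # decode the uploaded image bytes
    pred, pred_idx, probs = learn.predict(img)
    return {"label": str(pred), "confidence": float(probs[pred_idx])}
```

Run it with, for example, `uvicorn main:app` and POST an image file to `/predict`.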