# Bayesian machine learning notebooks
This repository is a collection of notebooks about Bayesian machine learning. The links below display the notebooks via nbviewer to ensure proper rendering of formulas. Dependencies are specified in `requirements.txt` files in the subdirectories.
- Bayesian regression with linear basis function models. Introduction to Bayesian linear regression. Implementation with plain NumPy and scikit-learn. See also the PyMC3 implementation. A minimal sketch follows below.
- Gaussian processes. Introduction to Gaussian processes for regression. Implementation with plain NumPy/SciPy as well as with scikit-learn and GPy. A minimal sketch follows below.
- Gaussian processes for classification. Introduction to Gaussian processes for classification. Implementation with plain NumPy/SciPy as well as with scikit-learn. A minimal sketch follows below.
- Sparse Gaussian processes. Introduction to sparse Gaussian processes using a variational approach. Example implementation with JAX.
- Bayesian optimization. Introduction to Bayesian optimization. Implementation with plain NumPy/SciPy as well as with the libraries scikit-optimize and GPyOpt. Hyperparameter tuning as an application example. A minimal sketch of the expected improvement acquisition function follows below.
- Variational inference in Bayesian neural networks. Demonstrates how to implement a Bayesian neural network and run variational inference over its weights. Example implementation with Keras. A minimal sketch follows below.
- Reliable uncertainty estimates for neural network predictions. Uses noise contrastive priors for Bayesian neural networks to obtain more reliable uncertainty estimates for out-of-distribution (OOD) data. Implemented with TensorFlow 2 and TensorFlow Probability.
- Latent variable models, part 1: Gaussian mixture models and the EM algorithm. Introduction to the expectation-maximization (EM) algorithm and its application to Gaussian mixture models. Implementation with plain NumPy/SciPy and scikit-learn. See also the PyMC3 implementation. A minimal sketch follows below.
- Latent variable models, part 2: Stochastic variational inference and variational autoencoders. Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation with TensorFlow 2.x. A minimal sketch follows below.
- Deep feature consistent variational autoencoder. Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example implementation with Keras. A minimal sketch of the perceptual loss follows below.
- Conditional generation via Bayesian optimization in latent space. Describes an approach for conditionally generating outputs with desired properties by running Bayesian optimization in a latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.