  • Stars: 220
  • Rank: 180,422 (Top 4%)
  • Language: Python
  • License: Apache License 2.0
  • Created: over 3 years ago
  • Updated: 11 months ago

Repository Details

Bayes-Newton: a Gaussian process library in JAX, with a unifying view of approximate Bayesian inference as variants of Newton's method.

Bayes-Newton

Bayes-Newton is a library for approximate inference in Gaussian processes (GPs) in JAX (with objax), built and maintained by Will Wilkinson.

Bayes-Newton provides a unifying view of approximate Bayesian inference, and allows many models (e.g. GPs, sparse GPs, Markov GPs, sparse Markov GPs) to be combined with the inference method of your choice (VI, EP, Laplace, Linearisation). For a full list of the methods implemented, scroll down to the bottom of this page.
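
For example, switching the inference method amounts to choosing a different model class for the same kernel and likelihood. A minimal sketch, given some inputs x and data y as in the Simple Example below; the EP class name here is inferred from the model list and may differ:

import bayesnewton

kern = bayesnewton.kernels.Matern52()
lik = bayesnewton.likelihoods.Gaussian()

# Markov GP with variational inference (VI)
model_vi = bayesnewton.models.MarkovVariationalGP(kernel=kern, likelihood=lik, X=x, Y=y)

# the same Markov GP, now with expectation propagation (EP)
model_ep = bayesnewton.models.MarkovExpectationPropagationGP(kernel=kern, likelihood=lik, X=x, Y=y)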

The methodology is outlined in the following article: Wilkinson, Särkkä, Solin: Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees, Journal of Machine Learning Research, 24(83):1-50, 2023 (see the citation entry below).

Installation

Latest (stable) release from PyPI

pip install bayesnewton

For development, you might want to use the latest source from GitHub: in a checkout of the develop branch of the Bayes-Newton GitHub repository, run

pip install -e .

Step-by-step: Getting started with the examples

To run the demos or experiments in this repository, or to build on top of it, first create and activate a virtual environment:

python3 -m venv venv
source venv/bin/activate

Then install all required dependencies for the examples:

python -m pip install -r requirements.txt
python -m pip install -e .
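
If the installation succeeded, importing the package should now work; a quick sanity check:

python -c "import bayesnewton"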

Running the tests additionally requires pytest and specific versions of TensorFlow, TensorFlow Probability, and GPflow to test against:

python -m pip install pytest
python -m pip install tensorflow==2.10 tensorflow-probability==0.18.0 gpflow==2.5.2

Run tests

cd tests; pytest

Simple Example
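
The snippet below assumes training inputs x and observations y are already available as arrays. For a self-contained run, synthetic data can be generated first; a minimal sketch using NumPy (the sine signal and noise level are arbitrary choices, not part of the library):

import numpy as np

np.random.seed(0)
x = np.linspace(0.0, 10.0, num=100)  # training inputs
y = np.sin(x) + 0.2 * np.random.randn(100)  # noisy observations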

Given some inputs x and some data y, you can construct a Bayes-Newton model as follows:

import bayesnewton

kern = bayesnewton.kernels.Matern52()
lik = bayesnewton.likelihoods.Gaussian()
model = bayesnewton.models.MarkovVariationalGP(kernel=kern, likelihood=lik, X=x, Y=y)

The training loop (inference and hyperparameter learning) is then set up using objax's built-in functionality:

import objax

lr_adam = 0.1
lr_newton = 1
inf_args = {}  # extra keyword arguments for the inference method, if any
opt_hypers = objax.optimizer.Adam(model.vars())
energy = objax.GradValues(model.energy, model.vars())

@objax.Function.with_vars(model.vars() + opt_hypers.vars())
def train_op():
    model.inference(lr=lr_newton, **inf_args)  # perform inference and update variational params
    dE, E = energy(**inf_args)  # compute energy and its gradients w.r.t. hypers
    opt_hypers(lr_adam, dE)  # update the hyperparameters
    return E

As we are using JAX, we can JIT compile the training loop:

train_op = objax.Jit(train_op)

and then run the training loop:

iters = 20
for i in range(1, iters + 1):
    loss = train_op()
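
After training, the posterior can be evaluated at test locations. A minimal sketch, assuming the predict method used in the repository's demos (x_test is a hypothetical array of test inputs):

x_test = np.linspace(-2.0, 12.0, num=200)  # hypothetical test inputs
posterior_mean, posterior_var = model.predict(X=x_test)  # marginal posterior mean and variance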

Full demos are available in the demos directory of the repository.

Citing Bayes-Newton

@article{wilkinson2023bayes,
  title={{B}ayes--{N}ewton Methods for Approximate {B}ayesian Inference with {PSD} Guarantees},
  author={Wilkinson, William J and S{\"a}rkk{\"a}, Simo and Solin, Arno},
  journal={Journal of Machine Learning Research},
  volume={24},
  number={83},
  pages={1--50},
  year={2023}
}

Implemented Models

For a full list of all the available models, see the model class list.

Variational GPs

  • Variational GP (Opper, Archambeau: The Variational Gaussian Approximation Revisited, Neural Computation 2009; Khan, Lin: Conjugate-Computation Variational Inference - Converting Inference in Non-Conjugate Models into Inference in Conjugate Models, AISTATS 2017)
  • Sparse Variational GP (Hensman, Matthews, Ghahramani: Scalable Variational Gaussian Process Classification, AISTATS 2015; Adam, Chang, Khan, Solin: Dual Parameterization of Sparse Variational Gaussian Processes, NeurIPS 2021); see the sketch after this list
  • Markov Variational GP (Chang, Wilkinson, Khan, Solin: Fast Variational Learning in State Space Gaussian Process Models, MLSP 2020)
  • Sparse Markov Variational GP (Adam, Eleftheriadis, Durrande, Artemev, Hensman: Doubly Sparse Variational Gaussian Processes, AISTATS 2020; Wilkinson, Solin, Adam: Sparse Algorithms for Markovian Gaussian Processes, AISTATS 2021)
  • Spatio-Temporal Variational GP (Hamelijnck, Wilkinson, Loppi, Solin, Damoulas: Spatio-Temporal Variational Gaussian Processes, NeurIPS 2021)
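
As a sketch of how the sparse variants above are constructed: the sparse variational GP additionally takes a set of inducing inputs Z. The signature below is an assumption based on the library's demos, with x and y as in the Simple Example:

import numpy as np
import bayesnewton

z = np.linspace(0.0, 10.0, num=20)  # inducing inputs, far fewer than the training points
kern = bayesnewton.kernels.Matern52()
lik = bayesnewton.likelihoods.Gaussian()
model = bayesnewton.models.SparseVariationalGP(kernel=kern, likelihood=lik, X=x, Y=y, Z=z)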

Expectation Propagation GPs

  • Expectation Propagation GP (Minka: A Family of Algorithms for Approximate Bayesian Inference, Ph.D. thesis 2000)
  • (Stochastic) Sparse Expectation Propagation GP (Csató, Opper: Sparse On-line Gaussian Processes, Neural Computation 2002; Bui, Yan, Turner: A Unifying Framework for Gaussian Process Pseudo Point Approximations Using Power Expectation Propagation, JMLR 2017)
  • Markov Expectation Propagation GP (Wilkinson, Chang, Riis Andersen, Solin: State Space Expectation Propagation, ICML 2020)
  • Sparse Markov Expectation Propagation GP (Wilkinson, Solin, Adam: Sparse Algorithms for Markovian Gaussian Processes, AISTATS 2021)

Laplace/Newton GPs

  • Laplace GP (Rasmussen, Williams: Gaussian Processes for Machine Learning, 2006)
  • Sparse Laplace GP (Wilkinson, Särkkä, Solin: Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees)
  • Markov Laplace GP (Wilkinson, Särkkä, Solin: Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees)
  • Sparse Markov Laplace GP

Linearisation GPs

  • Posterior Linearisation GP (García-Fernández, Tronarp, Särkkä: Gaussian Process Classification Using Posterior Linearization, IEEE Signal Processing 2019; Steinberg, Bonilla: Extended and Unscented Gaussian Processes, NeurIPS 2014)
  • Sparse Posterior Linearisation GP
  • Markov Posterior Linearisation GP (García-Fernández, Svensson, Särkkä: Iterated Posterior Linearization Smoother, IEEE Automatic Control 2016; Wilkinson, Chang, Riis Andersen, Solin: State Space Expectation Propagation, ICML 2020)
  • Sparse Markov Posterior Linearisation GP (Wilkinson, Solin, Adam: Sparse Algorithms for Markovian Gaussian Processes, AISTATS 2021)
  • Taylor Expansion / Analytical Linearisation GP (Steinberg, Bonilla: Extended and Unscented Gaussian Processes, NeurIPS 2014)
  • Markov Taylor GP / Extended Kalman Smoother (Bell: The Iterated Kalman Smoother as a Gauss-Newton method, SIAM Journal on Optimization 1994)
  • Sparse Taylor GP
  • Sparse Markov Taylor GP / Sparse Extended Kalman Smoother (Wilkinson, Solin, Adam: Sparse Algorithms for Markovian Gaussian Processes, AISTATS 2021)

Gauss-Newton GPs

(Wilkinson, Särkkä, Solin: Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees)

  • Gauss-Newton
  • Variational Gauss-Newton
  • PEP Gauss-Newton
  • 2nd-order PL Gauss-Newton

Quasi-Newton GPs

(Wilkinson, Särkkä, Solin: Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees)

  • Quasi-Newton
  • Variational Quasi-Newton
  • PEP Quasi-Newton
  • PL Quasi-Newton

GPs with PSD Constraints via Riemannian Gradients

  • VI Riemann Grad (Lin, Schmidt, Khan: Handling the Positive-Definite Constraint in the Bayesian Learning Rule, ICML 2020)
  • Newton/Laplace Riemann Grad (Lin, Schmidt, Khan: Handling the Positive-Definite Constraint in the Bayesian Learning Rule, ICML 2020)
  • PEP Riemann Grad (Wilkinson, Särkkä, Solin: Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees)

Others

  • Infinite Horizon GP (Solin, Hensman, Turner: Infinite-Horizon Gaussian Processes, NeurIPS 2018)
  • Parallel Markov GP (with VI, EP, PL, ...) (Särkkä, García-Fernández: Temporal parallelization of Bayesian smoothers; Corenflos, Zhao, Särkkä: Gaussian Process Regression in Logarithmic Time; Hamelijnck, Wilkinson, Loppi, Solin, Damoulas: Spatio-Temporal Variational Gaussian Processes, NeurIPS 2021); see the sketch after this list
  • 2nd-order Posterior Linearisation GP (sparse, Markov, ...) (Wilkinson, Särkkä, Solin: Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees)
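
The parallel Markov GP above replaces the sequential Kalman filtering/smoothing pass with a temporally parallelised one. A minimal sketch, assuming the Markov model classes expose a parallel keyword argument (this flag name is an assumption, not confirmed on this page):

# hypothetical flag; enables temporal parallelisation of the smoother
model = bayesnewton.models.MarkovVariationalGP(kernel=kern, likelihood=lik, X=x, Y=y, parallel=True)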

License

This software is provided under the Apache License 2.0. See the accompanying LICENSE file for details.

More Repositories

1. SDE (MATLAB, 172 stars): Example codes for the book Applied Stochastic Differential Equations
2. kalman-jax (Jupyter Notebook, 94 stars): Approximate inference for Markov Gaussian processes using iterated Kalman smoothing, in JAX
3. GP-MVS (Python, 60 stars): Multi-View Stereo by Temporal Nonparametric Fusion
4. vio_benchmark (Python, 58 stars): Tools for benchmarking different Visual-Inertial Odometry solutions
5. generative-inverse-heat-dissipation (Python, 55 stars): Code release for the paper Generative Modeling With Inverse Heat Dissipation
6. spatio-temporal-GPs (Python, 46 stars): Code for NeurIPS 2021 paper 'Spatio-Temporal Variational Gaussian Processes'
7. android-viotester (Java, 45 stars): Visual-Inertial Odometry (VIO) benchmark app for Android
8. IHGP (MATLAB, 30 stars): Infinite-horizon Gaussian processes
9. boundary-gp (Jupyter Notebook, 23 stars): Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features
10. mobile-cv-suite (Shell, 21 stars): A computer vision & real-time ML research library for mobile phones; the building blocks required to test new algorithms on mobile & embedded devices
11. SLAM-module (C++, 20 stars): SLAM module
12. PeriodicBNN (Jupyter Notebook, 17 stars): Code for 'Periodic Activation Functions Induce Stationarity' (NeurIPS 2021)
13. uncertainty-nerf-gs (Python, 12 stars): Code release for the paper "Sources of Uncertainty in 3D Scene Reconstruction"
14. hilbert-gp (MATLAB, 11 stars): Codes for Hilbert space reduced-rank GP regression
15. scalable-inference-in-sdes (Jupyter Notebook, 11 stars): Methods and experiments for assumed density SDE approximations
16. nonstationary-audio-gp (MATLAB, 11 stars): End-to-End Probabilistic Inference for Nonstationary Audio Analysis
17. sequential-gp (Jupyter Notebook, 10 stars): Code for 'Memory-based dual Gaussian processes for sequential learning' (ICML 2023)
18. stationary-activations (Jupyter Notebook, 10 stars): Codes for 'Stationary Activations for Uncertainty Calibration in Deep Learning' (NeurIPS 2020)
19. t-SVGP (Python, 7 stars): Codes for 'Dual Parameterization of Sparse Variational Gaussian Processes' (NeurIPS 2021)
20. sfr (Jupyter Notebook, 4 stars): PyTorch implementation of Sparse Function-space Representation of Neural Networks
21. iterative-smoothing-bridge (Python, 4 stars): Code for 'Transport with Support: Data-Conditional Diffusion Bridges'
22. zed-capture (C++, 3 stars): Tool for capturing IMU sensor and video data through zed-open-capture
23. calibrated-dnn (Python, 3 stars): Code for the paper 'Fixing Overconfidence in Dynamic Neural Networks'
24. accelerated-arrays (C++, 2 stars): Lightweight accelerated tensor / array programming library for smartphone GPUs
25. view-aware-inference (Python, 2 stars): Gaussian Process Priors for View-Aware Inference
26. u-blox-capture (Python, 1 star)
27. apml (1 star): Advances in Probabilistic Machine Learning Seminar
28. improved-hyperparameter-learning (Jupyter Notebook, 1 star): Codes for 'Improving Hyperparameter Learning under Approximate Inference in Gaussian Process Models' (ICML 2023)
29. sfr-experiments (Jupyter Notebook, 1 star): Code accompanying ICLR 2024 paper "Function-space Parameterization of Neural Networks for Sequential Learning"