• Stars: 213
• Rank: 185,410 (top 4%)
• Language: Python
• License: MIT License
• Created: over 5 years ago
• Updated: 5 months ago

minimc

Just a little MCMC

This is a test library providing reference implementations of MCMC algorithms and ideas. Much of the library is based on Michael Betancourt's wonderful A Conceptual Introduction to Hamiltonian Monte Carlo.

The highlight of the library right now is the ~15-line Hamiltonian Monte Carlo implementation (which relies on an 8-line leapfrog integrator). Both are commented and documented, and aim to be instructive to read.
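
For a feel of what the integrator does, here is a minimal leapfrog sketch in the same spirit. This is an illustration written for this page, not necessarily the library's exact code; dVdq stands for the gradient of the potential (the negative log probability).

def leapfrog(q, p, dVdq, path_len, step_size):
    # Half step in momentum, then alternating full steps in position
    # and momentum, then a final half step in momentum.
    p = p - step_size * dVdq(q) / 2
    for _ in range(int(path_len / step_size) - 1):
        q = q + step_size * p
        p = p - step_size * dVdq(q)
    q = q + step_size * p
    p = p - step_size * dVdq(q) / 2
    return q, -p  # flip momentum so the proposal is reversible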

Currently Implemented

  • Step size tuning (a rough sketch follows this list)
  • Leapfrog integrator
  • Hamiltonian Monte Carlo
  • Some log probabilities (normal, multivariate normal, mixtures, funnel)
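
For intuition about step size tuning, here is a deliberately crude sketch, not the scheme minimc actually uses: grow the step size when the acceptance probability is above a target, shrink it when below.

def tune_step_size(step_size, p_accept, target=0.8):
    # Multiplicative adaptation toward a target acceptance probability.
    if p_accept > target:
        return step_size * 1.1  # accepting too often: take bigger steps
    return step_size * 0.9      # rejecting too often: take smaller steps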

Roadmap

Installation

I would suggest cloning this repository and playing with the source code, but it can also be installed with pip:

pip install git+git://github.com/colcarroll/minimc.git

Examples

The API of minimc is mimicked in minimc.minimc_slow, which returns trajectories instead of just samples. This makes for nicer images and experiments, but it is a bit slower.

import autograd.numpy as np
from minimc import neg_log_normal, hamiltonian_monte_carlo
from minimc.autograd_interface import AutogradPotential

neg_log_p = AutogradPotential(neg_log_normal(0, 0.1))
samples = hamiltonian_monte_carlo(2_000, neg_log_p, initial_position=0.)

100%|████████████████████████████████████████| 2500/2500 [00:04<00:00, 615.91it/s]
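
Since the target is a normal with mean 0 and standard deviation 0.1, a quick sanity check on the draws (assuming samples converts to a 1-D array):

samples = np.asarray(samples)
print(samples.mean(), samples.std())  # should be close to 0 and 0.1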

from minimc.minimc_slow import hamiltonian_monte_carlo as hmc_slow

samples, positions, momentums, accepted, p_accepts = hmc_slow(50, neg_log_p,
                                                              initial_position=0.,
                                                              step_size=0.01)

100%|████████████████████████████████████████| 50/50 [00:00<00:00, 52.72it/s]
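
The extra return values are what make the slow version good for pictures. A minimal plotting sketch, assuming matplotlib is installed and each entry of positions holds the leapfrog trajectory of one proposal:

import matplotlib.pyplot as plt

for trajectory in positions:
    plt.plot(trajectory, color="C0", alpha=0.3)  # one curve per proposal
plt.xlabel("leapfrog step")
plt.ylabel("position")
plt.show()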

from minimc import neg_log_mvnormal

mu = np.zeros(2)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
neg_log_p = AutogradPotential(neg_log_mvnormal(mu, cov))

samples = hamiltonian_monte_carlo(1000, neg_log_p, np.zeros(2))

100%|████████████████████████████████████████| 1500/1500 [00:02<00:00, 623.13it/s]
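
The empirical covariance of the draws should roughly recover cov (assuming samples has shape (n_samples, 2)):

samples = np.asarray(samples)
print(np.cov(samples, rowvar=False))  # roughly [[1.0, 0.8], [0.8, 1.0]]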

samples, positions, momentums, accepted, p_accepts = hmc_slow(10, neg_log_p,
                                                              np.zeros(2),
                                                              path_len=4,
                                                              step_size=0.01)

100%|████████████████████████████████████████| 10/10 [00:01<00:00, 9.06it/s]

from minimc import mixture

neg_log_probs = [neg_log_normal(1.0, 0.5), neg_log_normal(-1.0, 0.5)]
probs = np.array([0.2, 0.8])
neg_log_p = AutogradPotential(mixture(neg_log_probs, probs))
samples = hamiltonian_monte_carlo(2000, neg_log_p, 0.0)

neg_log_probs = [
    neg_log_normal(-1.0, 0.3),
    neg_log_normal(0., 0.2),
    neg_log_normal(1.0, 0.3),
]
probs = np.array([0.1, 0.5, 0.4])
neg_log_p = AutogradPotential(mixture(neg_log_probs, probs))
samples = hamiltonian_monte_carlo(2_000, neg_log_p, 0.)

100%|████████████████████████████████████████| 2000/2000 [00:09<00:00, 261.17it/s]
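
A histogram of the draws should show all three modes (a sketch, again assuming matplotlib is installed):

import matplotlib.pyplot as plt

plt.hist(np.asarray(samples), bins=100, density=True)  # expect three bumps
plt.xlabel("position")
plt.show()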

samples, positions, momentums, accepted, p_accepts = hmc_slow(100, neg_log_p,
                                                              0.0,
                                                              step_size=0.01)

100%|████████████████████████████████████████| 100/100 [00:07<00:00, 14.04it/s]

mu1 = np.ones(2)
cov1 = 0.5 * np.array([[1.0, 0.7],
                       [0.7, 1.0]])
mu2 = -np.ones(2)
cov2 = 0.2 * np.array([[1.0, -0.6],
                       [-0.6, 1.0]])

mu3 = np.array([-1.0, 2.0])
cov3 = 0.3 * np.eye(2)

neg_log_p = AutogradPotential(mixture(
    [
        neg_log_mvnormal(mu1, cov1),
        neg_log_mvnormal(mu2, cov2),
        neg_log_mvnormal(mu3, cov3),
    ],
    [0.3, 0.3, 0.4],
))

samples = hamiltonian_monte_carlo(2000, neg_log_p, np.zeros(2))

100%|████████████████████████████████████████| 2500/2500 [00:11<00:00, 212.83it/s]

samples, positions, momentums, accepted, p_accepts = hmc_slow(20, neg_log_p,
                                                              np.zeros(2),
                                                              path_len=3,
                                                              step_size=0.01)

100%|████████████████████████████████████████| 20/20 [00:08<00:00, 2.31it/s]

More Repositories

1. ridge_map: Ridge plots of ridges (Python, 512 stars)
2. imcmc: Image Markov Chain Monte Carlo (Python, 237 stars)
3. strava_calendar: Visualizations from Strava data in matplotlib (Python, 85 stars)
4. sampled: Decorator for PyMC3 (Python, 49 stars)
5. flask_react_example (JavaScript, 31 stars)
6. ppl-api: A comparison of PPL APIs (Jupyter Notebook, 24 stars)
7. quantile_dotplot: Python implementation of the plot from Kay, Kola, Hullman, Munson, "When (ish) is My Bus?" (2016) (Python, 18 stars)
8. pydata_nyc2017: Slides and materials for the workshop "Two views on regression with PyMC3 and scikit-learn" (Jupyter Notebook, 18 stars)
9. couplings: Unbiased MCMC with couplings (Jupyter Notebook, 17 stars)
10. callisto: A command line utility to create Jupyter kernels from virtual environments (Python, 16 stars)
11. hamiltonian_monte_carlo_talk: Essay on Hamiltonian Monte Carlo in PyMC3 (Jupyter Notebook, 14 stars)
12. flask_angular_example: A minimal example of a data-bound sklearn model (JavaScript, 14 stars)
13. carpo: Run and time Jupyter notebooks (Python, 12 stars)
14. working_ml: Examples of applied machine learning (Jupyter Notebook, 12 stars)
15. mcmc-adapt: A poster for SciPy 2021 (HTML, 10 stars)
16. skample: Sample data from sketches (JavaScript, 7 stars)
17. redistricting-pymc3-pycon-2018: Code and notebooks for "Fighting Gerrymandering with PyMC3" from PyCon 2018 (Jupyter Notebook, 6 stars)
18. ngMathJax: Live rendering with AngularJS and MathJax (HTML, 5 stars)
19. yourplotlib: PyData NYC 2019 talk on building a maintainable plotting library (Jupyter Notebook, 5 stars)
20. compart: An experimental library for computational art (Jupyter Notebook, 4 stars)
21. tidytex: Keep LaTeX folders clean and compile automatically (Python, 2 stars)
22. march_madness_viewer: View March Madness 2015 picks (JavaScript, 2 stars)
23. newsreader: Test Reddit scraper (Python, 2 stars)
24. l2hmc_pymc3: Standalone implementation of the sampler from Levy, Hoffman, and Sohl-Dickstein's paper, in PyMC3 (Python, 2 stars)
25. flymc3: Flask + PyMC3 (HTML, 2 stars)
26. adventofcode: My Advent of Code excursions (Julia, 2 stars)
27. hmc_tuning_talk: "Pragmatic Probabilistic Programming: Parameter Adaptation in PyMC3" talk from the 2019 Probabilistic Programming Summit (JavaScript, 2 stars)
28. driving_fatalities: Case study of state fatality rates using partial pooling with PyMC3 (Jupyter Notebook, 2 stars)
29. email_fetcher: Fetches your emails (Python, 1 star)
30. run_mapper: R code for mapping .gpx files (R, 1 star)
31. march_madness: 2015 Kaggle competition (Python, 1 star)
32. arviz_pydata_nyc: Talk on ArviZ at PyData NYC (Jupyter Notebook, 1 star)
33. golearn: Experiments in machine learning with Go (Go, 1 star)
34. probprog_poster: ArviZ poster from probprog 2018 (https://probprog.cc), joint with Austin Rochford (TeX, 1 star)
35. dotfiles: My personal dotfiles (Shell, 1 star)
36. prob_prog_experiments: Prototypes in probabilistic programming (Python, 1 star)
37. lin_reg_essay: Essay from talk slides on linear regression (HTML, 1 star)
38. sde_experiment: An experiment in simulating time series data (JavaScript, 1 star)
39. seen_it_news: Code for a hipster Twitter bot (Python, 1 star)
40. intro_ml_talk: Overview of ML (Jupyter Notebook, 1 star)
41. pete: He's just happy to be here (Python, 1 star)
42. bayesian_reg_talk: Slides from a talk on Bayesian regression (JavaScript, 1 star)
43. tourney_viewer: Viewer for Kaggle March Madness predictions (JavaScript, 1 star)
44. blogwork: Scripts from my blog (Python, 1 star)
45. parametric_graphs: Parametric graphs in d3.js (HTML, 1 star)
46. ColCarroll.github.io (CSS, 1 star)
47. jshmc: Interactive HMC (JavaScript, 1 star)