pybasicbayes

This library provides objects that model probability distributions and the related operations that are common in generative Bayesian modeling and Bayesian inference, including Gibbs sampling and variational mean field algorithms. The file abstractions.py describes the queries a distribution must support to be used in each algorithm, as well as an API for models, which compose the distribution objects.
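
For a rough sense of that API, a distribution in this style answers queries like the ones sketched below. This is an illustrative stand-in, not the real base classes: the method names here are placeholders, and the authoritative definitions live in abstractions.py.

class DistributionSketch(object):
    """Illustrative stand-in for the queries abstractions.py formalizes."""

    def rvs(self, size=None):
        """Generation query: draw samples from the distribution."""
        raise NotImplementedError

    def log_likelihood(self, x):
        """Scoring query: log density of data x under current parameters."""
        raise NotImplementedError

    def resample(self, data=()):
        """Gibbs query: sample new parameters from the conditional posterior."""
        raise NotImplementedError

    def meanfieldupdate(self, data, weights):
        """Mean field query: update this factor given softly-assigned data."""
        raise NotImplementedError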

Example

The file models.py shows how to construct mixture models building on the distribution objects in this library. For example, to generate data from a Gaussian mixture model, we might set some hyperparameters, construct a Mixture object, and then ask it to randomly generate some data from the prior:

import numpy as np
from pybasicbayes import models, distributions

# hyperparameters
alpha_0 = 5.0
obs_hypparams = dict(mu_0=np.zeros(2), sigma_0=np.eye(2), kappa_0=0.05, nu_0=5)

# create the model
priormodel = models.Mixture(
    alpha_0=alpha_0,
    components=[distributions.Gaussian(**obs_hypparams) for itr in range(30)])

# generate some data
data = priormodel.rvs(400)

# delete the model
del priormodel
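
The result is a plain NumPy array with one observation per row; for this two-dimensional model, a quick sanity check (an illustrative assertion, not part of the demo) is:

assert data.shape == (400, 2)  # 400 draws from a mixture of 2-D Gaussians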

Since we throw away the prior model at the end, we're left with just the data, which look like this:

from matplotlib import pyplot as plt
plt.figure()
plt.plot(data[:, 0], data[:, 1], 'kx')
plt.title('data')

[figure: randomly generated mixture model data]

Imagine we had loaded these data from some measurements file and wanted to fit a mixture model to them. We can create a new Mixture and run inference to get a representation of the posterior distribution over mixture models, conditioned on observing these data:

posteriormodel = models.Mixture(
    alpha_0=alpha_0,
    components=[distributions.Gaussian(**obs_hypparams) for itr in range(30)])

posteriormodel.add_data(data)

Since pybasicbayes implements both Gibbs sampling and variational mean field inference, we can use the two together in a hybrid algorithm, using Gibbs sampling to wander around the posterior and mean field to lock onto a mode:

import copy
from pybasicbayes.util.text import progprint_xrange

allscores = [] # variational lower bounds on the marginal data log likelihood
allmodels = []
for superitr in range(5):
    # Gibbs sampling to wander around the posterior
    print('Gibbs Sampling')
    for itr in progprint_xrange(100):
        posteriormodel.resample_model()

    # mean field to lock onto a mode
    print('Mean Field')
    scores = [posteriormodel.meanfield_coordinate_descent_step()
                for itr in progprint_xrange(100)]

    allscores.append(scores)
    allmodels.append(copy.deepcopy(posteriormodel))

import operator
models_and_scores = sorted(
    [(m, s[-1]) for m, s in zip(allmodels, allscores)],
    key=operator.itemgetter(1), reverse=True)

Now we can plot the score trajectories:

plt.figure()
for scores in allscores:
    plt.plot(scores)
plt.title('model vlb scores vs iteration')

[figure: model vlb scores vs iteration]

And we can show a point estimate from the best model by calling the convenient Mixture.plot():

models_and_scores[0][0].plot()
plt.title('best model')

[figure: best fit model and data]

Since these are Bayesian methods, we have much more than a point estimate to plot: we have fit entire distributions, so we can query whatever posterior summaries we need, such as marginals or credible intervals.
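
For example, one such query (a sketch under the assumption that Mixture exposes a log_likelihood method taking a data array; check models.py for the exact signature) is to score held-out data under each saved model and average, approximating the posterior predictive:

import numpy as np

heldout = data[-50:]  # pretend these rows were held out from fitting
avg_ll = np.mean([m.log_likelihood(heldout) for m in allmodels])
print('average held-out log likelihood: %g' % avg_ll)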

See the file demo.py for the complete code for this example.

Authors

Matt Johnson, Alex Wiltschko, Yarden Katz, Nick Foti, and Scott Linderman.
