

Funsor

Funsor is a tensor-like library for functions and distributions.

See Functional tensors for probabilistic programming for a system description.

Installing

Funsor supports Python 3.7+.

Install using pip:

pip install funsor

Install from source:

git clone git@github.com:pyro-ppl/funsor.git
cd funsor
git checkout master
pip install .

Using funsor

Funsor can be used through a number of interfaces:

  • Funsors can be used directly for probabilistic computations, using PyTorch optimizers in a standard training loop; a minimal sketch follows this list. Start with these examples: discrete_hmm, eeg_slds, kalman_filter, pcfg, sensor, slds, and vae.
  • Funsors can be used to implement custom inference algorithms within Pyro, using custom elbo implementations in standard pyro.infer.SVI training. See these examples: mixed_hmm and bart forecasting.
  • funsor.pyro provides a number of Pyro-compatible (and PyTorch-compatible) distribution classes that use funsors under the hood, as well as utilities to convert between funsors and distributions.
  • funsor.minipyro provides a limited alternate backend for the Pyro probabilistic programming language, and can perform some ELBO computations exactly.
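
As a rough illustration of the first interface, here is a minimal sketch that writes an unnormalized Gaussian log-density directly as a funsor expression and fits its location with an ordinary PyTorch optimizer. This is an illustrative sketch rather than one of the bundled examples; it assumes a recent Funsor release where funsor.set_backend("torch"), the Bint domain syntax, top-level Tensor imports, and the .data attribute on funsor tensors are available (import paths have changed across versions).

from collections import OrderedDict

import torch

import funsor
import funsor.ops as ops
from funsor import Bint, Tensor

funsor.set_backend("torch")

# Toy data wrapped as a funsor with a single named batch dimension "n".
data = Tensor(torch.randn(100) + 2.0, OrderedDict(n=Bint[100]))

# An ordinary PyTorch parameter, trained with a standard optimizer loop.
loc = torch.tensor(0.0, requires_grad=True)
optimizer = torch.optim.Adam([loc], lr=0.1)

for step in range(100):
    optimizer.zero_grad()
    diff = data - funsor.to_funsor(loc)              # broadcasts over the named input "n"
    loss = (diff * diff).reduce(ops.add, "n") * 0.5  # negative unnormalized log-likelihood
    loss.data.backward()                             # .data is the backing torch scalar
    optimizer.step()

Because the eager interpretation executes real torch operations under the hood, gradients flow through funsor expressions and any PyTorch optimizer can be used, as in the examples listed above.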

Design

See design doc.

The goal of this library is to generalize Pyro's delayed inference algorithms from discrete to continuous variables, and to create machinery to enable partially delayed sampling compatible with universality. To achieve this goal, this library makes three orthogonal design choices:

  1. Open terms are objects. Funsors generalize the tensor interface to also cover arbitrary functions of multiple variables ("inputs"), where variables may be integers, real numbers, or real tensors. Function evaluation / substitution is the basic operation, generalizing tensor indexing. This allows probability distributions to be first-class Funsors and to make use of existing tensor machinery; for example, we can generalize tensor contraction to computing analytic integrals in conjugate probabilistic models.

  2. Support nonstandard interpretation. Funsors support user-defined interpretations, including eager, lazy, mixed eager+lazy, memoized (like opt_einsum's sharing), and approximate interpretations such as Monte Carlo approximations of integration operations (e.g. .sum() over a funsor dimension).

  3. Named dimensions. To avoid the difficulties of broadcasting and advanced indexing in positionally-indexed tensor libraries, all Funsor dimensions are named. Because substitution is the most basic operation of Funsors, indexing uses the .__call__() method and can be interpreted as substitution (with well-understood semantics); see the sketch after this list. Funsors are viewed as algebraic expressions with one algebraic free variable per dimension, where each dimension is either covariant (an output) or contravariant (an input).
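
To make the named-dimension and substitution semantics concrete, here is a small sketch (again assuming a recent Funsor release with the torch backend and with Bint, Tensor, and Variable exposed at the top level) that builds a funsor with two named inputs, substitutes into one, renames another, and reduces over named dimensions:

from collections import OrderedDict

import torch

import funsor
import funsor.ops as ops
from funsor import Bint, Tensor, Variable

funsor.set_backend("torch")

# A funsor backed by a torch tensor, with two named integer inputs i and j.
x = Tensor(torch.randn(3, 4), OrderedDict(i=Bint[3], j=Bint[4]))
print(x.inputs)  # an OrderedDict mapping input names to their domains

# Indexing is substitution via __call__: fixing i leaves a funsor over j.
row = x(i=2)

# Substituting a fresh Variable renames a dimension instead of eliminating it.
renamed = x(i=Variable("k", Bint[3]))

# Reduction over named dimensions generalizes sum / logsumexp over an axis.
col_sums = x.reduce(ops.add, "i")                           # input j remains
log_norm = x.reduce(ops.logaddexp, frozenset({"i", "j"}))   # scalar funsor

In particular, a distribution is itself a funsor of its parameters and a free 'value' input, which is what the delayed-sampling sketch below relies on.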

Using funsor, we can easily implement Pyro-style delayed sampling, roughly:

import funsor
import funsor.ops as ops
from funsor import Funsor

lazy = True  # whether latent variables are delayed rather than sampled eagerly
trace_log_prob = funsor.to_funsor(0.)

def pyro_sample(name, dist, obs=None):
    global trace_log_prob
    assert isinstance(dist, Funsor)
    if obs is not None:
        # condition on observed data
        value = obs
    elif lazy:
        # delayed sampling (like Pyro's parallel enumeration),
        # using the domain of the distribution's free 'value' input
        value = funsor.Variable(name, dist.inputs['value'])
    else:
        # eager Monte Carlo sampling
        value = dist.sample('value')[0]['value']

    # save log_prob in trace
    trace_log_prob += dist(value)

    return value

# ...later during inference...
loss = -trace_log_prob.reduce(ops.logaddexp)  # collapses delayed variables

See funsor/minipyro.py for a complete implementation.

Related projects

Citation

If you use Funsor, please consider citing:

@article{obermeyer2019functional,
  author = {Obermeyer, Fritz and Bingham, Eli and Jankowiak, Martin and
            Phan, Du and Chen, Jonathan P},
  title = {{Functional Tensors for Probabilistic Programming}},
  journal = {arXiv preprint arXiv:1910.10775},
  year = {2019}
}