• Stars: 219
• Rank: 181,133 (Top 4%)
• Language: Python
• License: MIT License
• Created: almost 10 years ago
• Updated: over 1 year ago

Repository Details

Python framework for inference in Hawkes processes.

PyHawkes implements a variety of Bayesian inference algorithms for discovering latent network structure given point process observations. Suppose you observe timestamps of Twitter messages, but you don't get to see how those users are connected to one another. You might infer that there is an unobserved connection from one user to another if the first user's activity tends to precede the second user's. This intuition is formalized by combining excitatory point processes (aka Hawkes processes) with random network models and performing Bayesian inference to discover the latent network.
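The generative story above can be sketched in a few lines of NumPy. This is an illustrative, self-contained toy (not PyHawkes's internal code): the baseline rates `lambda0`, adjacency `A`, weights `W`, and impulse response `h` below are all made-up values chosen for the sketch. Each node's rate is its background rate plus the excitation that past events on parent nodes inject through the impulse response.

```python
import numpy as np

rng = np.random.default_rng(0)

K, T, dt_max = 3, 100, 20
lambda0 = 0.1 * np.ones(K)                 # background rates (illustrative)
A = np.array([[0, 1, 1],
              [0, 0, 0],
              [0, 0, 0]])                  # node 0 excites nodes 1 and 2
W = 0.5 * A                                # excitation weights, zero off-network
h = np.exp(-np.arange(dt_max) / 5.0)
h /= h.sum()                               # impulse response sums to one

S = np.zeros((T, K), dtype=int)            # event counts per time bin and node
R = np.zeros((T, K))                       # rates per time bin and node
for t in range(T):
    R[t] = lambda0
    for s in range(max(0, t - dt_max), t):
        # Events at bin s on node k' add W[k', k] * h[t-s-1] to node k's rate
        R[t] = R[t] + h[t - s - 1] * (S[s] @ W)
    S[t] = rng.poisson(R[t])               # discrete-time (Poisson) event counts
```

Since node 0 has no incoming edges in this toy network, its rate stays at the background level; nodes 1 and 2 spike more often shortly after node 0 fires, which is exactly the temporal signature the inference algorithms exploit.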

Examples

We provide a number of classes for building and fitting such models. Let's walk through a simple example where we construct a discrete time model with three nodes, as in examples/discrete_demo. The nodes are connected via an excitatory network such that each event increases the likelihood of subsequent events on downstream nodes.

# Import the discrete-time model with a spike-and-slab network prior
from pyhawkes.models import DiscreteTimeNetworkHawkesModelSpikeAndSlab

# Create a simple random network with K nodes and sparsity level p.
# Each event induces impulse responses of length dt_max on connected nodes.
K = 3
p = 0.25
dt_max = 20
network_hypers = {"p": p, "allow_self_connections": False}
true_model = DiscreteTimeNetworkHawkesModelSpikeAndSlab(
    K=K, dt_max=dt_max, network_hypers=network_hypers)

# Generate T time bins of events from the model.
# S is the T x K event-count matrix, R is the T x K rate matrix.
S, R = true_model.generate(T=100)
true_model.plot()

You should see something like the "True Model" figure below: each event on node one adds an impulse response to the rates of nodes two and three.

[Figure: True Model]

Now create a test model and try to infer the network given only the event counts.

# Create the test model, add the event count data, and plot
test_model = DiscreteTimeNetworkHawkesModelSpikeAndSlab(
    K=K, dt_max=dt_max, network_hypers=network_hypers)
test_model.add_data(S)
fig, handles = test_model.plot(color="#e41a1c")

# Run a Gibbs sampler, tracking the log probability of each sample
N_samples = 100
lps = []
for itr in range(N_samples):
    test_model.resample_model()
    lps.append(test_model.log_probability())

    # Update the plots in place
    test_model.plot(handles=handles)

If you enable interactive plotting, you should see something like the "Inferred Model" figure below.

[Figure: Inferred Model]
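A simple sanity check on the sampler is to watch the log-probability trace stabilize before trusting posterior summaries. Here is a minimal sketch: since it must run without PyHawkes, a synthetic trace stands in for the `lps` list collected in the Gibbs loop above (the curve shape and burn-in fraction are illustrative assumptions).

```python
import numpy as np

# Synthetic stand-in for the `lps` list from the Gibbs loop above:
# a noisy trace that rises and then plateaus, as a mixing chain typically does.
rng = np.random.default_rng(1)
lps = list(-500.0 + 400.0 * (1 - np.exp(-np.arange(100) / 10.0))
           + rng.normal(0.0, 2.0, size=100))

# Discard a burn-in prefix and summarize the remainder
burn_in = len(lps) // 2
post = np.array(lps[burn_in:])
print(f"mean log prob after burn-in: {post.mean():.1f} +/- {post.std():.1f}")
```

If the post-burn-in trace still drifts upward, the chain has not converged and more samples (or a longer burn-in) are needed.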

In addition to Gibbs sampling, we have implemented maximum a posteriori (MAP) estimation, mean field variational Bayesian inference, and stochastic variational inference. To see how those methods can be used, look in examples/inference.

Installation

For a basic (but lower-performance) installation, run

pip install pyhawkes

To install from source, run

git clone [email protected]:slinderman/pyhawkes.git
cd pyhawkes
pip install -e .

This will be rather slow, however, since the default version does not do any multi-threading. For advanced installation instructions to support multithreading, see MULTITHREADING.md.

This codebase is considerably cleaner than the old CUDA version, and is still quite fast with the Cython+OMP extensions and joblib for parallel sampling of the adjacency matrix.

More Information

Complete details of this work can be found in:

Linderman, Scott W., and Adams, Ryan P. Discovering Latent Network Structure in Point Process Data. International Conference on Machine Learning (ICML), 2014.

and

Linderman, Scott W., and Adams, Ryan P. Scalable Bayesian Inference for Excitatory Point Process Networks. arXiv preprint arXiv:1507.03228, 2015.
