• Stars: 121
• Rank: 293,924 (top 6%)
• Language: Jupyter Notebook
• License: MIT License
• Created: almost 8 years ago
• Updated: over 3 years ago



🕝 Time-warped principal components analysis (twPCA)

⚠️ Please use our newer code, Piecewise Linear Time Warping:

Our new work removes the assumption of low-dimensional dynamics, and uses a new optimization framework to avoid local minima in the warping function fitting routine. The new code package is also better optimized for speed, contains cross-validation routines, and has tools for working with spike data in continuous time.

[DEPRECATED] Time warped principal components analysis (TWPCA)

Ben Poole 🍺, Alex H. Williams 🎙️, Niru Maheswaranathan


Overview

Installation

Again, this package is deprecated and should only be used as legacy software. If you still want to install it, you can do so manually:

git clone https://github.com/ganguli-lab/twpca
cd twpca
pip install -e .

Description

Analysis of multi-trial neural data often relies on a strict alignment of neural activity to stimulus or behavioral events. However, activity on a single trial may be shifted and skewed in time due to differences in attentional state, biophysical kinetics, and other unobserved latent variables. This temporal variability can inflate the apparent dimensionality of data and obscure our ability to recover inherently simple, low-dimensional structure.

Here we present a novel method, time-warped PCA (twPCA), that simultaneously identifies temporal warps of individual trials and low-dimensional structure across neurons and time. Furthermore, we identify the temporal warping in a data-driven, unsupervised manner, removing the need for explicit knowledge of external variables responsible for temporal variability.

For more information, check out our abstract or poster.

We also encourage you to look into our new package, affinewarp, which was built with similar applications in mind.

Code

We provide code for twPCA in Python (note: we use TensorFlow as a backend for computation).

To apply twPCA to your own dataset, first install the code (pip install twpca), then load your dataset and shape it into a 3D numpy array with dimensions (number of trials, number of timepoints per trial, number of neurons). For example, a dataset with 100 trials, each lasting 50 samples and containing 25 neurons, should have shape (100, 50, 25).
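As a minimal sketch (with made-up shapes and random data standing in for real recordings), assembling such an array from a list of per-trial matrices might look like:

```python
import numpy as np

# Hypothetical dimensions: 100 trials, 50 timepoints per trial, 25 neurons.
n_trials, n_timepoints, n_neurons = 100, 50, 25

# Suppose each trial is a (timepoints, neurons) matrix; stacking them along
# a new leading axis yields the (trials, timepoints, neurons) array twPCA expects.
trials = [np.random.randn(n_timepoints, n_neurons) for _ in range(n_trials)]
data = np.stack(trials, axis=0)

print(data.shape)  # (100, 50, 25)
```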

Then, you can apply twPCA to your data by running from twpca import TWPCA; model = TWPCA(data, n_components).fit(), where n_components is the number of low-rank factors you wish to fit and data is a 3D numpy array as described above. A more thorough example is given below:

from twpca import TWPCA
from twpca.datasets import jittered_neuron

# generates a dataset consisting of a single feature that is jittered on every trial.
# This helper function returns the raw feature, as well as the aligned (ground truth)
# data and the observed (jittered) data.
feature, aligned_data, raw_data = jittered_neuron()

# applies TWPCA to your dataset with the given number of components (this follows the
# scikit-learn fit/transform API)
n_components = 1
model = TWPCA(raw_data, n_components).fit()

# the model object now contains the low-rank factors
time_factors = model.params['time']         # compare this to the ground truth feature
neuron_factors = model.params['neuron']     # in this single-neuron example, this will be a scalar

# you can use the model object to align data (compare this to the aligned_data from above)
estimated_aligned_data = model.transform()

We have also provided a more thorough demo notebook demonstrating the application of twPCA to a synthetic dataset.

Further detail

Motivation

Performing dimensionality reduction on misaligned time series produces illusory complexity. For example, the figure below shows that a dataset consisting of a single feature jittered across trials (red data) appears high-dimensional: its spectrum of singular values decays slowly.
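This effect is easy to reproduce with plain NumPy. The sketch below (illustrative only, not code from this package) jitters a single Gaussian-bump feature across trials: the aligned trial matrix is rank 1, while the jittered matrix spreads variance across many singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, T = 100, 50

# A single canonical feature: a Gaussian bump in time.
t = np.arange(T)
feature = np.exp(-0.5 * ((t - T // 2) / 3.0) ** 2)

# Aligned data: every trial is the same feature, so the matrix is rank 1.
aligned = np.tile(feature, (n_trials, 1))

# Jittered data: the same feature shifted by a random amount on each trial.
shifts = rng.integers(-10, 11, size=n_trials)
jittered = np.stack([np.roll(feature, s) for s in shifts])

s_aligned = np.linalg.svd(aligned, compute_uv=False)
s_jittered = np.linalg.svd(jittered, compute_uv=False)

# The aligned matrix is effectively rank 1; the jittered matrix has a
# slowly decaying spectrum, i.e. illusory complexity.
print(s_aligned[1] / s_aligned[0])    # ~0
print(s_jittered[1] / s_jittered[0])  # far from 0
```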

[Figure: singular value spectrum of the jittered (red) dataset decays slowly compared to the aligned data]

The twPCA model

To address this problem for a sequence of multi-dimensional time series, we simultaneously fit a latent factor model (e.g., a matrix decomposition) and time-warping functions that align the latent factors to each measured time series. Each trial is modeled as a low-rank matrix in which the neuron factors are fixed (gray box below), while the time factors vary from trial to trial by warping a canonical temporal factor differently on each trial.
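Concretely, under this generative model each trial is a warped canonical temporal factor multiplied by shared neuron factors. The toy sketch below (assumed names, not package code) builds such a dataset with a single component, using a simple shift warp; twPCA itself fits more general monotonic warping functions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, T, n_neurons = 20, 50, 5

t = np.arange(T)
canonical = np.exp(-0.5 * ((t - T // 2) / 4.0) ** 2)  # canonical time factor, shape (T,)
neuron_factors = rng.random(n_neurons)                # shared across trials, shape (n_neurons,)

data = np.empty((n_trials, T, n_neurons))
for k in range(n_trials):
    # A simple warp: evaluate the canonical factor on a per-trial shifted grid.
    warped = np.interp(t, t + rng.integers(-5, 6), canonical)
    # Each trial is the outer product of its warped time factor and the
    # shared neuron factors, hence a rank-1 matrix.
    data[k] = np.outer(warped, neuron_factors)

print(data.shape)  # (20, 50, 5)
```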

[Figure: schematic of the twPCA model, with shared neuron factors (gray box) and per-trial warped time factors]
