  • Stars: 258
  • Rank: 158,189 (Top 4%)
  • Language: C++
  • License: Apache License 2.0
  • Created: over 5 years ago
  • Updated: 12 months ago


Repository Details

Differentiable computations of the signature and logsignature transforms, on both CPU and GPU. (ICLR 2021)

Signatory

Differentiable computations of the signature and logsignature transforms, on both CPU and GPU.

What is the signature transform?

The signature transform is roughly analogous to the Fourier transform, in that it operates on a stream of data (often a time series). Whilst the Fourier transform extracts information about frequency, the signature transform extracts information about order and area. Furthermore (and unlike the Fourier transform), order and area represent all possible nonlinear effects: the signature transform is a universal nonlinearity, meaning that every continuous function of the input stream may be approximated arbitrarily well by a linear function of its signature. If you're doing machine learning then you probably understand why this is such a desirable property!
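
For concreteness, here is the textbook definition (standard in the rough path literature, not quoted from this README): for a path $X = (X^1, \dots, X^d) \colon [0, T] \to \mathbb{R}^d$, the depth-$N$ signature collects all iterated integrals of the path up to order $N$:

$$\mathrm{Sig}^N(X) = \left( \int_{0 < t_1 < \dots < t_k < T} \mathrm{d}X^{i_1}_{t_1} \cdots \mathrm{d}X^{i_k}_{t_k} \right)_{1 \le i_1, \dots, i_k \le d,\ 1 \le k \le N}$$

The order-1 terms are just the total increment of each channel, and the order-2 terms are signed areas between pairs of channels, which is where the "order and area" intuition comes from.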

Besides this, the signature transform has many other nice properties -- robustness to missing or irregularly sampled data; optional translation invariance; optional sampling invariance. Furthermore it can be used to encode certain physical quantities, and may be used for data compression.

Check out this for a primer on the use of the signature transform in machine learning, just as a feature transformation, and this for a more in-depth look at integrating the signature transform into neural networks.

Installation

pip install signatory==<SIGNATORY_VERSION>.<TORCH_VERSION> --no-cache-dir --force-reinstall

where <SIGNATORY_VERSION> is the version of Signatory you would like to download (the most recent version is 1.2.7) and <TORCH_VERSION> is the version of PyTorch you are using.
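
If you are unsure which PyTorch version you have installed, a quick way to check (plain PyTorch, nothing Signatory-specific) is:

python -c "import torch; print(torch.__version__)"

and then substitute the printed version for <TORCH_VERSION> above.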

Available for Python 3.7--3.9 on Linux and Windows. Requires PyTorch 1.8.0--1.11.0.

(If you need it, then previous versions of Signatory included support for older versions of Python, PyTorch, and MacOS, see here.)

After installation, just import signatory inside Python.

Take care not to run pip install signatory, as this will likely download the wrong version.
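
If you want to confirm which combined version actually got installed, standard pip can tell you (this is generic pip, not a Signatory-specific command):

pip show signatory

The reported version should correspond to the Signatory and PyTorch versions you requested.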

Example:

For example, if you are using PyTorch 1.11.0 and want Signatory 1.2.7, then you should run:

pip install signatory==1.2.7.1.11.0 --no-cache-dir --force-reinstall

Why you need to specify all of this:

Yes, this looks a bit odd. This is needed to work around limitations of PyTorch and pip.

The --no-cache-dir --force-reinstall flags are because pip doesn't expect to need to care about versions quite as much as this, so it will sometimes erroneously use inappropriate caches if not told otherwise.

Installation from source is also possible; please consult the documentation. This also includes information on how to run the tests and benchmarks.

If you have any problems with installation then check the FAQ. If that doesn't help then feel free to open an issue.

Documentation

The documentation is available here.

Example

Usage is straightforward. As a simple example,

import signatory
import torch
batch, stream, channels = 1, 10, 2
depth = 4
path = torch.rand(batch, stream, channels)
signature = signatory.signature(path, depth)
# signature is a PyTorch tensor
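
Since the whole point of the library is differentiability, gradients flow through the transform like any other PyTorch operation. Here is a minimal sketch continuing the example above; the expected output shape (channels + channels**2 + ... + channels**depth = 30 signature channels) follows from the standard definition of the truncated signature rather than being quoted from this README:

import signatory
import torch

batch, stream, channels = 1, 10, 2
depth = 4
path = torch.rand(batch, stream, channels, requires_grad=True)
signature = signatory.signature(path, depth)
print(signature.shape)      # expected: torch.Size([1, 30]), one row of signature terms per batch element
signature.sum().backward()  # backpropagate through the signature transform
print(path.grad.shape)      # torch.Size([1, 10, 2]), gradient with respect to the input path

The same code runs on GPU by moving path to a CUDA device first.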

For further examples, see the documentation.

Citation

If you found this library useful in your research, please consider citing the paper.

@inproceedings{kidger2021signatory,
  title={{S}ignatory: differentiable computations of the signature and logsignature transforms, on both {CPU} and {GPU}},
  author={Kidger, Patrick and Lyons, Terry},
  booktitle={International Conference on Learning Representations},
  year={2021},
  note={\url{https://github.com/patrick-kidger/signatory}}
}

More Repositories

1. equinox: Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ (Python, 2,029 stars)
2. torchtyping: Type annotations and dynamic checking for a tensor's shape, dtype, names, etc. (Python, 1,380 stars)
3. diffrax: Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/ (Python, 1,369 stars)
4. jaxtyping: Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays. https://docs.kidger.site/jaxtyping/ (Python, 1,117 stars)
5. NeuralCDE: Code for "Neural Controlled Differential Equations for Irregular Time Series" (NeurIPS 2020 Spotlight). (Python, 610 stars)
6. torchcde: Differentiable controlled differential equation solvers for PyTorch with GPU support and memory-efficient adjoint backpropagation. (Python, 411 stars)
7. lineax: Linear solvers in JAX and Equinox. https://docs.kidger.site/lineax (Python, 344 stars)
8. mkposters: Make posters from Markdown files. (Python, 324 stars)
9. sympy2jax: Turn SymPy expressions into trainable JAX expressions. (Python, 313 stars)
10. optimistix: Nonlinear optimisation (root-finding, least squares, ...) in JAX+Equinox. https://docs.kidger.site/optimistix/ (Python, 299 stars)
11. torchcubicspline: Interpolating natural cubic splines. Includes batching, GPU support, support for missing values, evaluating derivatives of the spline, and backpropagation. (Python, 214 stars)
12. sympytorch: Turning SymPy expressions into PyTorch modules. (Python, 139 stars)
13. quax: Multiple dispatch over abstract array types in JAX. (Python, 100 stars)
14. Deep-Signature-Transforms: Code for "Deep Signature Transforms" (NeurIPS 2019). (Jupyter Notebook, 87 stars)
15. FasterNeuralDiffEq: Code for "'Hey, that's not an ODE:' Faster ODE Adjoints via Seminorms" (ICML 2021). (Python, 86 stars)
16. typst_pyimage: Typst extension, adding support for generating figures using inline Python code. (Python, 72 stars)
17. generalised_shapelets: Code for "Generalised Interpretable Shapelets for Irregular Time Series". (Jupyter Notebook, 52 stars)
18. PatModules.jl: A better import/module system for Julia. (Julia, 18 stars)
19. exvoker: A CLI tool. Extracts regexes from stdout (e.g. URLs) and invokes commands on them (e.g. open the webpage). (Rust, 9 stars)
20. action_update_python_project: GitHub Action to check the version, run tests, git tag, create a GitHub Release, and deploy to PyPI. (8 stars)
21. pytkdocs_tweaks: Some custom tweaks to the results produced by pytkdocs. (Python, 5 stars)
22. Learning-Interpolation: Applying machine learning to help numerically solve the Camassa-Holm equation. (Jupyter Notebook, 4 stars)
23. matching: Round-robin matching algorithm. (Python, 3 stars)
24. candle: Simple PyTorch helpers. (I think we've probably all written one of these for ourselves!) (Python, 3 stars)
25. tools: Helpful abstract tools (functions, classes, ...) for coding in Python. (Python, 3 stars)
26. pdfscraper: Saves a webpage and all linked PDFs. (Python, 3 stars)
27. ktools: Tools for working with Keras. (Python, 2 stars)
28. loccounter: Counts lines of Python code. (Python, 2 stars)
29. Dissertation: Master's dissertation, "Polynomial Approximation of Holomorphic Functions". (2 stars)
30. py2annotate: An extension to Sphinx autodoc to augment Sphinx documentation with type annotations, when using Python 2 style type annotations. (Python, 2 stars)
31. adventuregame: The very start of a game I was toying with before I got distracted by the PhD... (Python, 2 stars)
32. MPE-CDT-Project: A simple machine learning project for weather observations. (Jupyter Notebook, 2 stars)
33. tfext: Some extra stuff for use with TensorFlow. (Python, 2 stars)
34. mkdocs_include_exclude_files: Modify which files MkDocs includes or excludes. (Python, 1 star)
35. patrick-kidger: (1 star)
36. rl-test: (Python, 1 star)