• Stars: 208
• Rank: 189,015 (top 4%)
• Language: Python
• License: Other
• Created: about 5 years ago
• Updated: 4 months ago


Repository Details

Keras implementation of Legendre Memory Units

KerasLMU: Recurrent neural networks using Legendre Memory Units

Paper

This is a Keras-based implementation of the Legendre Memory Unit (LMU). The LMU is a novel memory cell for recurrent neural networks that dynamically maintains information across long windows of time using relatively few resources. It has been shown to perform as well as standard LSTM or other RNN-based models on a variety of tasks, generally with fewer internal parameters (see the paper for details). On the Permuted Sequential MNIST (psMNIST) task in particular, it has been demonstrated to outperform the state-of-the-art results. See the note below for instructions on accessing the pre-trained psMNIST model.

The LMU is mathematically derived to orthogonalize its continuous-time history. It does so by solving d coupled ordinary differential equations (ODEs), whose phase space maps linearly onto sliding windows of time via the Legendre polynomials up to degree d − 1 (the example for d = 12 is shown below).

Legendre polynomials
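
For illustration (a NumPy-only sketch), the shifted Legendre polynomials that map the memory state back onto the sliding window can be evaluated with `numpy.polynomial.Legendre`; the degree d = 12 matches the figure above:

```python
import numpy as np

d = 12
r = np.linspace(0, 1, 101)  # normalized position within the window
# Shifted Legendre polynomials P_i(2r - 1) for degrees 0 .. d-1
P = np.stack([np.polynomial.Legendre.basis(i)(2 * r - 1) for i in range(d)])
```

Each row `P[i]` traces one of the d curves in the figure; a sliding window of the input signal is approximately reconstructed as a weighted sum of these polynomials, with the memory state supplying the weights.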

A single LMU cell expresses the following computational graph, which takes in an input signal, x, and couples an optimal linear memory, m, with a nonlinear hidden state, h. By default, this coupling is trained via backpropagation, while the dynamics of the memory remain fixed.

Computational graph

The discretized A and B matrices are initialized according to the LMU's mathematical derivation with respect to some chosen window length, ΞΈ. Backpropagation can be used to learn this time-scale, or to fine-tune A and B, if necessary.
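
As an illustrative sketch of that initialization (not the package's internal code; `scipy` is assumed available), the continuous-time A and B matrices from the LMU derivation can be constructed for a given order d and window length ΞΈ, then discretized with zero-order hold:

```python
import numpy as np
from scipy.linalg import expm

def lmu_matrices(d, theta, dt):
    """Build the LMU's continuous-time A/B matrices for window length theta,
    then discretize them with zero-order hold over a step of size dt."""
    i = np.arange(d)
    # Continuous-time dynamics from the LMU derivation
    A = np.where(i[:, None] < i[None, :], -1.0,
                 (-1.0) ** (i[:, None] - i[None, :] + 1))
    A *= (2 * i[:, None] + 1) / theta
    B = ((2 * i + 1) * (-1.0) ** i / theta)[:, None]
    # Zero-order hold: Ad = exp(A dt), Bd = A^-1 (Ad - I) B
    Ad = expm(A * dt)
    Bd = np.linalg.solve(A, (Ad - np.eye(d)) @ B)
    return Ad, Bd

Ad, Bd = lmu_matrices(d=12, theta=100.0, dt=1.0)
```

Because ΞΈ only enters as a scale factor on A and B, learning the time-scale amounts to rescaling these dynamics.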

Both the kernels, W, and the encoders, e, are learned. Intuitively, the kernels learn to compute nonlinear functions across the memory, while the encoders learn to project the relevant information into the memory (see paper for details).
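
As a concrete, NumPy-only sketch of the computational graph above (the encoders and kernels here are random stand-ins for learned parameters, and a simple Euler step replaces the zero-order-hold discretization used in practice), one step of an LMU cell might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_h, n_x = 12, 20, 1      # memory order, hidden units, input dimension
theta, dt = 100.0, 1.0       # window length and time step (illustrative)

# Continuous-time A/B matrices from the LMU derivation
i = np.arange(d)
A = np.where(i[:, None] < i[None, :], -1.0,
             (-1.0) ** (i[:, None] - i[None, :] + 1))
A *= (2 * i[:, None] + 1) / theta
B = ((2 * i + 1) * (-1.0) ** i / theta)[:, None]

# Euler discretization (keeps this sketch NumPy-only)
Ad = np.eye(d) + dt * A
Bd = dt * B

# Random stand-ins for the learned encoders (e_*) and kernels (W_*)
e_x, e_h, e_m = (rng.normal(size=(1, s)) for s in (n_x, n_h, d))
W_x, W_h, W_m = (rng.normal(size=(n_h, s)) for s in (n_x, n_h, d))

def lmu_step(x, h, m):
    """One step: encode into the memory, then update the hidden state."""
    u = e_x @ x + e_h @ h + e_m @ m            # scalar written to the memory
    m = Ad @ m + Bd @ u                        # linear memory (fixed dynamics)
    h = np.tanh(W_x @ x + W_h @ h + W_m @ m)   # nonlinear hidden state (learned)
    return h, m

h, m = np.zeros((n_h, 1)), np.zeros((d, 1))
for x in rng.normal(size=(50, n_x, 1)):        # feed a random 50-step sequence
    h, m = lmu_step(x, h, m)
```

The memory update is purely linear, which is what lets the fixed A and B dynamics represent the window optimally; all of the nonlinearity lives in the hidden state.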

Note

The paper branch in the lmu GitHub repository includes a pre-trained Keras/TensorFlow model, located at models/psMNIST-standard.hdf5, which achieves a psMNIST result of 97.15%. Note that the network uses fewer internal state variables and neurons than there are pixels in the input sequence. To reproduce the results from the paper, run the notebooks in the experiments directory of the paper branch.

Nengo Examples

Citation

@inproceedings{voelker2019lmu,
  title={Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks},
  author={Aaron R. Voelker and Ivana Kaji\'c and Chris Eliasmith},
  booktitle={Advances in Neural Information Processing Systems},
  pages={15544--15553},
  year={2019}
}
