  • Stars: 214
  • Rank: 184,678 (Top 4%)
  • Language: Jupyter Notebook
  • Created: over 4 years ago
  • Updated: 4 months ago


Repository Details

Synaptic Flow

Getting Started

First, clone this repo, then install all dependencies:

pip install -r requirements.txt

The code was tested with Python 3.6.0.

Code Base

Below is a description of the major sections of the code base. Run python main.py --help for a complete description of flags and hyperparameters.

Datasets

This code base supports the following datasets: MNIST, CIFAR-10, CIFAR-100, Tiny ImageNet, and ImageNet. All datasets except ImageNet download automatically; ImageNet must be set up locally in the Data folder.
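
As a rough illustration (assuming the loaders are built on torchvision, which is the usual way automatic downloads are handled and an assumption here, not something stated in this README), requesting a dataset for the first time fetches it into the data directory:

# Illustrative sketch only: torchvision-style automatic download of CIFAR-10.
# ImageNet has no download flag, which is why it must be set up manually.
import torch
from torchvision import datasets, transforms

train_set = datasets.CIFAR10(root='Data', train=True, download=True,
                             transform=transforms.ToTensor())
train_loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)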

Models

There are four model classes, each defining a variety of model architectures:

  • Default models support basic dense and convolutional models.
  • Lottery ticket models support VGG/ResNet architectures based on OpenLTH.
  • Tiny ImageNet models support VGG/ResNet architectures based on this GitHub repository.
  • ImageNet models support VGG/ResNet architectures from torchvision (see the sketch below).
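
For instance, the ImageNet model class can draw its architectures directly from torchvision; a minimal sketch of that idea (not the repo's exact wrapper code):

# Minimal sketch: ImageNet VGG/ResNet architectures as provided by torchvision.
from torchvision import models

vgg16 = models.vgg16(num_classes=1000)
resnet50 = models.resnet50(num_classes=1000)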

Layers

Custom dense, convolutional, batchnorm, and residual layers implementing masked parameters can be found in the Layers folder.
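
As a rough sketch of the idea (not the repo's exact code), a masked dense layer keeps a 0/1 mask alongside its weight and applies it on every forward pass, so pruned connections stay at zero:

# Rough sketch of a dense layer with masked parameters: the mask is stored as
# a buffer and multiplied into the weight on every forward pass.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Linear):
    def __init__(self, in_features, out_features, bias=True):
        super().__init__(in_features, out_features, bias)
        self.register_buffer('weight_mask', torch.ones_like(self.weight))

    def forward(self, input):
        return F.linear(input, self.weight * self.weight_mask, self.bias)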

Pruners

All pruning algorithms are implemented in the Pruners folder.
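
For orientation, the synaptic flow score from the paper can be sketched as a single data-free scoring pass (a simplified sketch, not the repo's exact Pruners code): parameters are temporarily replaced by their absolute values, one all-ones input is propagated through the network, and each parameter is scored by |∂R/∂θ ⊙ θ|.

# Simplified, data-free sketch of the synaptic flow score; the Pruners folder
# contains the full iterative algorithm and the baseline pruners.
import torch

@torch.no_grad()
def _linearize(model):
    # replace every parameter and buffer by its absolute value, remembering signs
    signs = {}
    for name, tensor in model.state_dict().items():
        signs[name] = torch.sign(tensor)
        tensor.abs_()
    return signs

@torch.no_grad()
def _restore(model, signs):
    for name, tensor in model.state_dict().items():
        tensor.mul_(signs[name])

def synflow_scores(model, input_shape):
    signs = _linearize(model)
    ones = torch.ones([1] + list(input_shape))   # all-ones input, no data needed
    R = model(ones).sum()                        # scalar "synaptic flow" objective
    R.backward()
    scores = {name: (param.grad * param).detach().abs()
              for name, param in model.named_parameters() if param.grad is not None}
    model.zero_grad()
    _restore(model, signs)
    return scores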

Experiments

Below is a list and description of the experiment files found in the Experiment folder:

  • singleshot.py: used to make figures 1, 2, and 6.
  • multishot.py: used to make figure 5a.
  • unit-conservation.py: used to make figure 3.
  • layer-conservation.py: used to make figure 4.
  • lottery-layer-conservation.py: used to make figure 5b.
  • synaptic-flow-ratio.py: used to make figure 7.

Results

All data used to generate the figures in our paper can be found in the Results/data folder. Run the notebook figures.ipynb to generate the figures.
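
If you prefer to run the notebook non-interactively, it can also be executed from the command line with nbconvert (assuming Jupyter is available in your environment):

jupyter nbconvert --to notebook --execute figures.ipynb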

Error

Due to an error in multishot.py (which has since been fixed), IMP did not reset the parameters to their original values between iterations. The benchmarks in the paper are not affected, as they are run with singleshot.py.
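
For reference, the fix amounts to rewinding the surviving weights to their original values at the start of each IMP iteration; a minimal sketch of that step (assuming mask buffers are named '*_mask', which is an illustrative convention rather than the repo's documented one):

# Sketch of the IMP rewinding step: snapshot the initial parameters once,
# then reload them while leaving the pruning masks untouched.
import copy

def snapshot_initial_state(model):
    # take this snapshot once, before any training or pruning
    return copy.deepcopy(model.state_dict())

def rewind_to_init(model, initial_state):
    # reload the original weights but skip the masks, preserving current sparsity
    weights_only = {k: v for k, v in initial_state.items() if not k.endswith('_mask')}
    model.load_state_dict(weights_only, strict=False)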

Citation

If you use this code for your research, please cite our paper, "Pruning neural networks without any data by iteratively conserving synaptic flow".
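
For convenience, a BibTeX entry along these lines should work (the paper appeared at NeurIPS 2020; the fields are worth verifying against the published version):

@inproceedings{tanaka2020pruning,
  title     = {Pruning neural networks without any data by iteratively conserving synaptic flow},
  author    = {Tanaka, Hidenori and Kunin, Daniel and Yamins, Daniel L. K. and Ganguli, Surya},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2020}
}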

More Repositories

 1. twpca: Time-warped principal components analysis (twPCA). Jupyter Notebook, 121 stars.
 2. pathint: Code to accompany our paper "Continual Learning Through Synaptic Intelligence", ICML 2017. Jupyter Notebook, 96 stars.
 3. deepchaos: Experiments for the paper "Exponential expressivity in deep neural networks through transient chaos". Jupyter Notebook, 65 stars.
 4. RetinalResources: Code for "A Unified Theory of Early Visual Representations from Retina to Cortex through Anatomically Constrained Deep CNNs", ICLR 2019. Jupyter Notebook, 48 stars.
 5. degrees-of-freedom: Python, 35 stars.
 6. proxalgs: Proximal algorithms and operators in Python. Python, 25 stars.
 7. minFunc: Unconstrained optimization tools. MATLAB, 14 stars.
 8. RetinalCellTypes: Jupyter Notebook, 10 stars.
 9. Complex_Synapse: Complex Synapse Project: notes, MATLAB code, poster, slides. TeX, 8 stars.
10. nems: Neural encoding models. Python, 7 stars.
11. textureSynth: A fork of the Portilla and Simoncelli texture synthesis code. MATLAB, 7 stars.
12. rica: Reconstruction ICA (http://papers.nips.cc/paper/4467-ica-with-reconstruction-cost-for-efficient-overcomplete-feature-learning). Python, 4 stars.
13. tensorAMP: Code to reproduce simulations from "Statistical mechanics of low-rank tensor decomposition", NIPS 2018. 4 stars.
14. steerable_pyramid: MATLAB tools for multi-scale image processing, from Eero Simoncelli's lab (http://www.cns.nyu.edu/~eero/steerpyr/). MATLAB, 3 stars.
15. pyGLM: Python GLM. Python, 3 stars.
16. deep-retina-reduction: Model reduction of deep retina models. 2 stars.
17. website: Ganguli Lab website. JavaScript, 2 stars.
18. projecting-manifolds: Code for the paper "Random projections of random manifolds". Python, 2 stars.
19. Energy_Accuracy_Tradeoff_Cellular_Sensing: Supplemental videos and code associated with the paper https://arxiv.org/abs/2002.10567. 1 star.