End-to-end training of sparse deep neural networks with little-to-no performance loss.

Rigging the Lottery: Making All Tickets Winners

80% Sparse ResNet-50

Paper: https://arxiv.org/abs/1911.11134

15-minute presentation [pml4dc] [icml]

ML Reproducibility Challenge 2020 report

Colabs for Calculating FLOPs of Sparse Models

  • MobileNet-v1
  • ResNet-50

Best Sparse Models

Parameters are stored as floats, so each parameter takes 4 bytes. The uniform sparsity distribution keeps the first layer dense and therefore yields slightly larger sizes and parameter counts. ERK applies to all layers, except for the 99% sparse model, where we set the first layer to be dense, since otherwise we observe much worse performance.
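As a back-of-the-envelope check on these sizes, here is a sketch assuming MicroNet-style counting: 4 bytes per nonzero float32 weight, plus a 1-bit mask per weight position for sparse models (the repository's exact accounting may differ slightly):

def sparse_model_mb(total_params, sparsity):
    # 4 bytes per nonzero float32 weight.
    weight_bytes = total_params * (1 - sparsity) * 4
    # 1 bit per weight position for the sparsity mask; dense models need none.
    mask_bytes = total_params / 8 if sparsity > 0 else 0
    return (weight_bytes + mask_bytes) / 1e6

resnet50_params = 25.5e6  # approximate
print(sparse_model_mb(resnet50_params, 0.0))  # ~102 MB (dense)
print(sparse_model_mb(resnet50_params, 0.8))  # ~23.6 MB
print(sparse_model_mb(resnet50_params, 0.9))  # ~13.4 MB

These closely match the Model Size column in the tables below.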

Extended Training Results

The performance of RigL increases significantly with extended training. In this section we extend the training of sparse models by 5x. Note that sparse models require far fewer FLOPs per training iteration, so most of the extended runs still cost fewer FLOPs than the baseline dense training.

Observing this improvement, we wanted to understand where the performance of sparse networks saturates. The longest run we trained was 100x the length of the original 100-epoch ImageNet training. It costs 5.8x the FLOPs of the original dense training, and the resulting 99% sparse ResNet-50 achieves an impressive 68.15% top-1 accuracy (vs. 61.86% with 5x training).
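The 5.8x figure follows from the table below, assuming total training cost scales linearly with training length: at 5x length the 99% sparse ERK model costs 0.29x the dense training FLOPs, i.e. roughly 0.058x per unit of length. The intermediate ERK 0.99 rows are consistent with 10x and 40x runs:

per_1x_cost = 0.29 / 5  # ~0.058x of the dense training FLOPs per 1x of length
for multiple in (5, 10, 40, 100):
    print(multiple, round(per_1x_cost * multiple, 2))
# -> 0.29, 0.58, 2.32, 5.8 (matching the four ERK 0.99 rows below)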

Sparse-model FLOPs are reported relative to the dense baseline in the first row.

| Sparsity Distribution | Sparsity | Training FLOPs | Inference FLOPs | Model Size (MB) | Top-1 Acc | Ckpt |
|---|---|---|---|---|---|---|
| - (Dense) | 0 | 3.2e18 | 8.2e9 | 102.122 | 76.8 | - |
| ERK | 0.8 | 2.09x | 0.42x | 23.683 | 77.17 | link |
| Uniform | 0.8 | 1.14x | 0.23x | 23.685 | 76.71 | link |
| ERK | 0.9 | 1.23x | 0.24x | 13.499 | 76.42 | link |
| Uniform | 0.9 | 0.66x | 0.13x | 13.532 | 75.73 | link |
| ERK | 0.95 | 0.63x | 0.12x | 8.399 | 74.63 | link |
| Uniform | 0.95 | 0.42x | 0.08x | 8.433 | 73.22 | link |
| ERK | 0.965 | 0.45x | 0.09x | 6.904 | 72.77 | link |
| Uniform | 0.965 | 0.34x | 0.07x | 6.904 | 71.31 | link |
| ERK | 0.99 | 0.29x | 0.05x | 4.354 | 61.86 | link |
| ERK | 0.99 | 0.58x | 0.05x | 4.354 | 63.89 | link |
| ERK | 0.99 | 2.32x | 0.05x | 4.354 | 66.94 | link |
| ERK | 0.99 | 5.8x | 0.05x | 4.354 | 68.15 | link |

We also ran extended training with MobileNet-v1. Again, even training 100x longer, we were not able to saturate the performance: training longer consistently achieved better results.

| Sparsity Distribution | Sparsity | Training FLOPs | Inference FLOPs | Model Size (MB) | Top-1 Acc | Ckpt |
|---|---|---|---|---|---|---|
| - (Dense) | 0 | 4.5e17 | 1.14e9 | 16.864 | 72.1 | - |
| ERK | 0.89 | 1.39x | 0.21x | 2.392 | 69.31 | link |
| ERK | 0.89 | 2.79x | 0.21x | 2.392 | 70.63 | link |
| Uniform | 0.89 | 1.25x | 0.09x | 2.392 | 69.28 | link |
| Uniform | 0.89 | 6.25x | 0.09x | 2.392 | 70.25 | link |
| Uniform | 0.89 | 12.5x | 0.09x | 2.392 | 70.59 | link |

1x Training Results

| Sparsity Distribution | Sparsity | Training FLOPs | Inference FLOPs | Model Size (MB) | Top-1 Acc | Ckpt |
|---|---|---|---|---|---|---|
| ERK | 0.8 | 0.42x | 0.42x | 23.683 | 75.12 | link |
| Uniform | 0.8 | 0.23x | 0.23x | 23.685 | 74.60 | link |
| ERK | 0.9 | 0.24x | 0.24x | 13.499 | 73.07 | link |
| Uniform | 0.9 | 0.13x | 0.13x | 13.532 | 72.02 | link |

Results w/o label smoothing

| Sparsity Distribution | Sparsity | Training FLOPs | Inference FLOPs | Model Size (MB) | Top-1 Acc | Ckpt |
|---|---|---|---|---|---|---|
| ERK | 0.8 | 0.42x | 0.42x | 23.683 | 75.02 | link |
| ERK | 0.8 | 2.09x | 0.42x | 23.683 | 76.17 | link |
| ERK | 0.9 | 0.24x | 0.24x | 13.499 | 73.4 | link |
| ERK | 0.9 | 1.23x | 0.24x | 13.499 | 75.9 | link |
| ERK | 0.95 | 0.13x | 0.12x | 8.399 | 70.39 | link |
| ERK | 0.95 | 0.63x | 0.12x | 8.399 | 74.36 | link |

Evaluating checkpoints

Download the checkpoints and run evaluation on the ERK checkpoints with the following command:

python imagenet_train_eval.py --mode=eval_once --output_dir=path/to/ckpt/folder \
    --eval_once_ckpt_prefix=model.ckpt-3200000 --use_folder_stub=False \
    --training_method=rigl --mask_init_method=erdos_renyi_kernel \
    --first_layer_sparsity=-1

When evaluating checkpoints trained with the uniform sparsity distribution, use --mask_init_method=random and --first_layer_sparsity=0. Set --model_architecture=mobilenet_v1 when evaluating MobileNet checkpoints.
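For example, evaluating a uniform-sparsity checkpoint could look like the following (the output directory and checkpoint prefix are placeholders, reused from the ERK example above):

python imagenet_train_eval.py --mode=eval_once --output_dir=path/to/ckpt/folder \
    --eval_once_ckpt_prefix=model.ckpt-3200000 --use_folder_stub=False \
    --training_method=rigl --mask_init_method=random \
    --first_layer_sparsity=0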

Sparse Training Algorithms

In this repository we implement the following dynamic sparsity strategies:

  1. SET: Implements Sparse Evolutionary Training (SET), which replaces low-magnitude connections randomly with new ones.

  2. SNFS: Implements momentum-based training (Sparse Networks From Scratch) without sparsity re-distribution.

  3. RigL: Our method, RigL, removes a fraction of connections based on weight magnitudes and activates new ones using instantaneous gradient information (see the sketch after this list).

And the following one-shot pruning algorithm:

  1. SNIP: Single-shot Network Pruning based on connection sensitivity prunes the least salient connections before training.
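For intuition, below is a minimal NumPy sketch of a single RigL mask update for one layer. It assumes grads holds the dense gradient with respect to all weights, including currently masked ones; the repository's TensorFlow implementation additionally handles update schedules and per-layer drop fractions.

import numpy as np

def rigl_update(weights, mask, grads, drop_frac=0.3):
    # Number of connections to drop and regrow in this update.
    k = int(drop_frac * mask.sum())
    active = np.flatnonzero(mask)
    inactive = np.flatnonzero(mask == 0)
    # Drop the k active connections with the smallest weight magnitude.
    drop = active[np.argsort(np.abs(weights.flat[active]))[:k]]
    # Grow the k inactive connections with the largest gradient magnitude.
    grow = inactive[np.argsort(-np.abs(grads.flat[inactive]))[:k]]
    mask.flat[drop] = 0
    mask.flat[grow] = 1
    weights.flat[grow] = 0.0  # newly activated connections start at zero
    return weights, mask

Because the drop and grow sets have the same size, each layer's sparsity stays constant throughout training.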

We have code for the following settings:

  • ImageNet-2012: TPU-compatible code with ResNet-50 and MobileNet-v1/v2.
  • CIFAR-10 with WideResNets.
  • MNIST with a 2-layer fully connected network.

Setup

First clone this repo.

git clone https://github.com/google-research/rigl.git
cd rigl

We use the NeurIPS 2019 MicroNet Challenge code for counting the operations and size of our networks. Let's clone the google_research repo and add the current folder to the Python path.

git clone https://github.com/google-research/google-research.git
mv google-research/ google_research/
export PYTHONPATH=$PYTHONPATH:$PWD

Now we can run some tests. The following script creates a virtual environment, installs the necessary libraries, and finally runs a few tests.

bash run.sh

We need to activate the virtual environment before running an experiment. With that, we are ready to run some trivial MNIST experiments.

source env/bin/activate

python rigl/mnist/mnist_train_eval.py

You can load and verify the performance of the ResNet-50 checkpoints as follows.

python rigl/imagenet_resnet/imagenet_train_eval.py --mode=eval_once \
    --training_method=baseline --eval_batch_size=100 \
    --output_dir=/path/to/folder \
    --eval_once_ckpt_prefix=s80_model.ckpt-1280000 --use_folder_stub=False

We use the official TPU code for loading ImageNet data. First clone the tensorflow/tpu repo and then add the models/ folder to the Python path.

git clone https://github.com/tensorflow/tpu.git
export PYTHONPATH=$PYTHONPATH:$PWD/tpu/models/

Other Implementations

Citation

@incollection{rigl,
 author = {Evci, Utku and Gale, Trevor and Menick, Jacob and Castro, Pablo Samuel and Elsen, Erich},
 booktitle = {Proceedings of Machine Learning and Systems 2020},
 pages = {471--481},
 title = {Rigging the Lottery: Making All Tickets Winners},
 year = {2020}
}

Disclaimer

This is not an official Google product.
