  • Stars: 555
  • Rank: 80,213 (Top 2%)
  • Language: Python
  • License: MIT License
  • Created: over 5 years ago
  • Updated: 7 months ago

Repository Details

BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient.

BackPACK: Packing more into backprop

BackPACK is built on top of PyTorch. It efficiently computes quantities other than the gradient.

Provided quantities include:

  • Individual gradients from a mini-batch
  • Estimates of the gradient variance or second moment
  • Approximate second-order information (diagonal and Kronecker approximations)

Motivation: Computing most of these quantities is not inherently expensive; it often requires only a small modification of the existing backward pass, in which backpropagated information can be reused. It is, however, difficult to do in the current software environment.
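
For instance, individual gradients and a gradient-variance estimate can be read off the model parameters after a BackPACK-extended backward pass. The snippet below is a minimal sketch of that workflow (extend the model and loss function, then request extensions inside the backpack context); the toy model and the random tensors X and y are placeholders, and attribute names should be checked against the installed version:

import torch
from torch import nn
from backpack import backpack, extend
from backpack.extensions import BatchGrad, Variance

X, y = torch.randn(8, 10), torch.randint(0, 2, (8,))

# extend() marks modules so BackPACK can compute extra quantities during backward
model = extend(nn.Sequential(nn.Linear(10, 4), nn.ReLU(), nn.Linear(4, 2)))
lossfunc = extend(nn.CrossEntropyLoss())

loss = lossfunc(model(X), y)
with backpack(BatchGrad(), Variance()):
    loss.backward()

for param in model.parameters():
    print(param.grad.shape)        # standard gradient, unchanged
    print(param.grad_batch.shape)  # individual gradients, one per sample in the mini-batch
    print(param.variance.shape)    # element-wise variance of the individual gradients

The usual .grad field is still populated, so the extra quantities come on top of a normal backward pass rather than replacing it.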

Installation

pip install backpack-for-pytorch

Examples
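
A hedged sketch of the second-order quantities mentioned above: the diagonal of the generalized Gauss-Newton and its Kronecker-factored (KFAC) approximation are requested the same way, by passing the corresponding extensions to the backpack context. The toy data and model are again placeholders:

import torch
from torch import nn
from backpack import backpack, extend
from backpack.extensions import DiagGGNExact, KFAC

X, y = torch.randn(8, 10), torch.randint(0, 2, (8,))

model = extend(nn.Linear(10, 2))
lossfunc = extend(nn.CrossEntropyLoss())

loss = lossfunc(model(X), y)
with backpack(DiagGGNExact(), KFAC()):
    loss.backward()

for param in model.parameters():
    print(param.diag_ggn_exact.shape)      # diagonal of the generalized Gauss-Newton
    print([f.shape for f in param.kfac])   # Kronecker factors of the KFAC approximation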

Contributing

BackPACK is actively being developed. We appreciate any help. If you are considering contributing, do not hesitate to contact us. An overview of the development procedure is provided in the developer README.

How to cite

If you are using BackPACK, consider citing the paper

@inproceedings{dangel2020backpack,
    title     = {Back{PACK}: Packing more into Backprop},
    author    = {Felix Dangel and Frederik Kunstner and Philipp Hennig},
    booktitle = {International Conference on Learning Representations},
    year      = {2020},
    url       = {https://openreview.net/forum?id=BJlrF24twB}
}
BackPACK is not endorsed by or affiliated with Facebook, Inc. PyTorch, the PyTorch logo and any related marks are trademarks of Facebook, Inc.

More Repositories

1. cockpit (Python, 474 stars) - Cockpit: A Practical Debugging Tool for Training Deep Neural Networks
2. unfoldNd (Python, 83 stars) - (N=1,2,3)-dimensional unfold (im2col) and fold (col2im) in PyTorch
3. hbp (Python, 20 stars) - Hessian backpropagation (HBP): PyTorch extension of backpropagation for block-diagonal curvature matrix approximations
4. phd-thesis (TeX, 20 stars) - Source code for my PhD thesis: Backpropagation Beyond the Gradient
5. singd (Python, 19 stars) - [ICML 2024] SINGD: KFAC-like Structured Inverse-Free Natural Gradient Descent (http://arxiv.org/abs/2312.05705)
6. curvlinops (Python, 17 stars) - scipy linear operators for the Hessian, Fisher/GGN, and more in PyTorch
7. vivit (Python, 17 stars) - [TMLR 2022] Curvature access through the generalized Gauss-Newton's low-rank structure: Eigenvalues, eigenvectors, directional derivatives & Newton steps
8. einconv (Python, 12 stars) - Convolutions and more as einsum for PyTorch
9. sirfshampoo (Python, 10 stars) - [ICML 2024] SIRFShampoo: Structured inverse- and root-free Shampoo in PyTorch (https://arxiv.org/abs/2402.03496)
10. phd-thesis-template (TeX, 9 stars) - LaTeX template for my PhD thesis at the University of Tuebingen
11. backobs (Python, 2 stars) - Use DeepOBS with BackPACK
12. org-export-setup (TeX, 2 stars) - My org-export settings
13. python-utilities (Python, 2 stars) - Python utility functions I often use
14. wandb_preempt (Python, 1 star) - Code and tutorial on integrating wandb sweeps with Slurm pre-emption
15. vivit-experiments (Python, 1 star) - Experiments for the TMLR 2023 paper "ViViT: Curvature Access Through the Generalized Gauss-Newton's Low-rank Structure"