  • Stars: 687
  • Rank: 65,332 (Top 2%)
  • Language: Python
  • License: MIT License
  • Created: almost 5 years ago
  • Updated: over 1 year ago


Repository Details

PyTorch, TensorFlow, JAX and NumPy - all of them natively using the same code

EagerPy: Writing Code That Works Natively with PyTorch, TensorFlow, JAX, and NumPy

EagerPy is a Python framework that lets you write code that automatically works natively with PyTorch, TensorFlow, JAX, and NumPy. EagerPy is also great when you work with just one framework but prefer a clean and consistent API that is fully chainable, provides extensive type annotations, and lets you write beautiful code.

🔥 Design goals

  • Native Performance: EagerPy operations get directly translated into the corresponding native operations.
  • Fully Chainable: All functionality is available as methods on the tensor objects and as EagerPy functions.
  • Type Checking: Catch bugs before running your code thanks to EagerPy's extensive type annotations.
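A minimal sketch of what these goals look like in code, assuming only the NumPy backend is installed (the method calls and ep.clip also appear in the example section below):

import numpy as np
import eagerpy as ep

t = ep.astensor(np.array([3.0, -4.0]))

# fully chainable: every operation returns an EagerPy tensor again
print(t.square().sum().sqrt())  # 5.0

# the same functionality is also available as EagerPy functions
print(ep.clip(t, 0.0, 1.0))     # values clipped to the range [0, 1]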

📖 Documentation

Learn more about EagerPy in the documentation.

🚀 Quickstart

pip install eagerpy

EagerPy requires Python 3.6 or newer; all other essential dependencies are installed automatically. To use it with PyTorch, TensorFlow, JAX, or NumPy, the respective framework needs to be installed separately. These frameworks are intentionally not declared as dependencies, because not everyone wants to install all of them and because some of them ship different builds for different architectures and CUDA versions.
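For example (a minimal sketch, not part of the original README), wrapping a NumPy array only requires NumPy; PyTorch, TensorFlow, or JAX are needed only if you want to wrap their native tensors:

import numpy as np
import eagerpy as ep

x = ep.astensor(np.zeros(3))  # works without PyTorch, TensorFlow, or JAX installed
print(type(x))                # the EagerPy tensor class that wraps NumPy arrays
print(type(x.raw))            # the underlying numpy.ndarray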

🎉 Example

# PyTorch
import torch
x = torch.tensor([1., 2., 3., 4., 5., 6.])

# TensorFlow
import tensorflow as tf
x = tf.constant([1., 2., 3., 4., 5., 6.])

# JAX
import jax.numpy as np
x = np.array([1., 2., 3., 4., 5., 6.])

# NumPy
import numpy as np
x = np.array([1., 2., 3., 4., 5., 6.])

# No matter which framework you use, you can use the same code
import eagerpy as ep

# Just wrap a native tensor using EagerPy
x = ep.astensor(x)

# All of EagerPy's functionality is available as methods
x = x.reshape((2, 3))
x.flatten(start=1).square().sum(axis=-1).sqrt()
# or just: x.flatten(1).norms.l2()

# and as functions (yes, gradients are also supported!)
loss, grad = ep.value_and_grad(loss_fn, x)
ep.clip(x + eps * grad, 0, 1)

# You can even write functions that work transparently with
# PyTorch tensors, TensorFlow tensors, JAX arrays, NumPy arrays

def my_universal_function(a, b, c):
    # Convert all inputs to EagerPy tensors
    a, b, c = ep.astensors(a, b, c)

    # perform some computations
    result = (a + b * c).square()

    # and return a native tensor
    return result.raw
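For completeness, here is a hedged, self-contained sketch (not part of the original README) that gives the loss_fn and eps placeholders above concrete, illustrative definitions and runs the gradient example end to end; it assumes the PyTorch backend, since plain NumPy cannot compute gradients:

import torch
import eagerpy as ep

def loss_fn(x):
    # illustrative loss: takes an EagerPy tensor and returns a scalar EagerPy tensor
    return x.square().sum()

x = ep.astensor(torch.tensor([1.0, 2.0, 3.0]))
eps = 0.1

loss, grad = ep.value_and_grad(loss_fn, x)
print(loss.raw)                            # tensor(14.)
print(grad.raw)                            # tensor([2., 4., 6.]), i.e. 2 * x
print(ep.clip(x + eps * grad, 0, 1).raw)   # a native torch.Tensor with values in [0, 1]

Similarly, calling my_universal_function with NumPy arrays returns a NumPy array, with PyTorch tensors a torch.Tensor, and so on, because result.raw unwraps back to the native type of the inputs.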

🗺 Use cases

Foolbox Native, the latest version of the popular adversarial attacks library Foolbox, has been rewritten from scratch using EagerPy instead of NumPy to achieve native performance on models developed in PyTorch, TensorFlow, and JAX, all with one code base.

EagerPy is also used by other frameworks to reduce code duplication (e.g. GUDHI) or to compare the performance of different frameworks.

📄 Citation

If you use EagerPy, please cite our paper using this BibTeX entry:

@article{rauber2020eagerpy,
  title={{EagerPy}: Writing Code That Works Natively with {PyTorch}, {TensorFlow}, {JAX}, and {NumPy}},
  author={Rauber, Jonas and Bethge, Matthias and Brendel, Wieland},
  journal={arXiv preprint arXiv:2008.04175},
  year={2020},
  url={https://eagerpy.jonasrauber.de},
}

🐍 Compatibility

We currently test with the following versions:

  • PyTorch 1.4.0
  • TensorFlow 2.1.0
  • JAX 0.1.57
  • NumPy 1.18.1

More Repositories

1. linear-region-attack: A powerful white-box adversarial attack that exploits knowledge about the geometry of neural networks to find minimal adversarial perturbations without doing gradient descent (Python, 11 stars)
2. randn-matlab-python: Reproducing Random Numbers in Matlab and Python / NumPy (Python, 10 stars)
3. uniformly-sampling-nd-ball: Efficiently sampling vectors from the n-sphere and n-ball (Python, 10 stars)
4. arxiv-bear-app: My personal reference management system for arXiv papers using Bear.app (9 stars)
5. foolbox-native-tutorial (Jupyter Notebook, 9 stars)
6. OSXNotifier.jl: Julia package to send notifications to the OS X Notification Center (Julia, 8 stars)
7. foolbox-native: Foolbox Native brings native performance to Foolbox (Python, 7 stars)
8. tensorflow-imagenet: Jupyter notebook templates for training or fine-tuning on ImageNet using TensorFlow (Jupyter Notebook, 5 stars)
9. foolbox-tensorflow-keras-applications: The pretrained TensorFlow Keras models with a Foolbox Zoo compatible interface (Python, 4 stars)
10. c2s-docker: Dockerfile for the c2s toolbox https://github.com/lucastheis/c2s (4 stars)
11. analysis-by-synthesis: Analysis by Synthesis, reimplemented (Python, 4 stars)
12. attax: Adversarial attacks using JAX (Python, 3 stars)
13. lockfile: A minimalistic and modern implementation of a simple file-based lock mechanism for Python (Python, 3 stars)
14. DiscreteEntropyEstimators.jl: Discrete entropy estimators implemented in Julia (Julia, 2 stars)
15. norm: A tiny command-line utility to compute the norm of the difference between two images (Nim, 2 stars)
16. clipping-aware-rescaling: Calculates eta such that norm(clip(x + eta * delta, a, b) - x) == eps (Python, 2 stars)
17. cifar10-fast-reimplemented: A simplified reimplementation of the original fast CIFAR10 training code (Python, 1 star)
18. c2s-ipython-docker: Dockerfile for the c2s toolbox with IPython support https://github.com/lucastheis/c2s (1 star)
19. plotspikes.py: A minimal Python function for beautiful spike plotting (Python, 1 star)