• Stars: 115
• Rank: 304,028 (Top 7%)
• Language: Jupyter Notebook
• License: MIT License
• Created: over 4 years ago
• Updated: over 1 year ago

Repository Details

Differentiable Data Augmentation Library

This library is the core of Faster AutoAugment and its descendants. It is research-oriented, and its API may change in the near future.

Requirements and Installation

Requirements

Python>=3.8
PyTorch>=1.5.0
torchvision>=0.6
kornia>=0.2

Installation

pip install -U git+https://github.com/moskomule/dda

APIs

dda.functional

Basic operations that are differentiable w.r.t. the magnitude parameter mag. When mag=0, no augmentation is applied; when mag=1 (and mag=-1 if it exists), the severest augmentation is applied. As introduced in Faster AutoAugment, some operations use the straight-through estimator to be differentiable w.r.t. their magnitude parameters.

from typing import Optional
import torch

def operation(img: torch.Tensor,
              mag: Optional[torch.Tensor]) -> torch.Tensor:
    ...
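
Under this contract, a magnitude tensor with requires_grad=True receives gradients through the operation. A minimal sketch (the operation name rotate is an assumption made for illustration; check dda.functional for the actual names):

import torch
from dda.functional import rotate  # hypothetical name, for illustration only

img = torch.rand(4, 3, 32, 32)               # a batch of images in [0, 1]
mag = torch.tensor(0.5, requires_grad=True)  # learnable magnitude
out = rotate(img, mag)                       # differentiable w.r.t. mag
out.mean().backward()                        # gradients reach mag
print(mag.grad)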

dda.pil contains similar APIs implemented with PIL (not differentiable).

dda.operations

from typing import Optional, Tuple

import torch.nn as nn

class Operation(nn.Module):

    def __init__(self,
                 initial_magnitude: Optional[float] = None,
                 initial_probability: float = 0.5,
                 magnitude_range: Optional[Tuple[float, float]] = None,
                 probability_range: Optional[Tuple[float, float]] = None,
                 temperature: float = 0.1,
                 flip_magnitude: bool = False,
                 magnitude_scale: float = 1,
                 debug: bool = False):
        ...

If magnitude_range=None (or probability_range=None), then magnitude (or probability, respectively) is not a Parameter but a Buffer, i.e., it is fixed rather than learnable.
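
In plain PyTorch terms, the distinction works as follows (an illustrative sketch, not dda's actual code):

import torch
import torch.nn as nn

class Toy(nn.Module):
    def __init__(self, magnitude_range=None):
        super().__init__()
        mag = torch.tensor(0.5)
        if magnitude_range is None:
            # a buffer moves with the module (e.g., .to(device)) but is not optimized
            self.register_buffer("magnitude", mag)
        else:
            # a parameter appears in .parameters() and is updated by the optimizer
            self.magnitude = nn.Parameter(mag)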

magnitude moves in magnitude_scale * magnitude_range. For example, dda.operations.Rotation has magnitude_range=[0, 1] and magnitude_scale=30, so that magnitude corresponds to a rotation between 0 and 30 degrees (e.g., magnitude=0.5 means 15 degrees).

To differentiate w.r.t. the probability parameter, RelaxedBernoulli is used.
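
For intuition, this relaxation is available as a built-in PyTorch distribution; a minimal sketch of a differentiable on/off gate (not dda's internal code):

import torch
from torch.distributions import RelaxedBernoulli

probability = torch.tensor(0.5, requires_grad=True)
# rsample() is reparameterized, so gradients flow back to probability
gate = RelaxedBernoulli(temperature=torch.tensor(0.1),
                        probs=probability).rsample()
gate.backward()
print(probability.grad)  # non-None: probability is learnable through the gate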

Examples
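
A minimal sketch of learning augmentation parameters end to end, assuming an operation is applied as an ordinary nn.Module (the constructor arguments here follow the Operation signature above but are assumptions; see dda.operations for the actual signatures):

import torch
from dda.operations import Rotation

# Rotation: magnitude_range=[0, 1], magnitude_scale=30 (0 to 30 degrees)
op = Rotation(initial_magnitude=0.5,
              magnitude_range=(0, 1),
              probability_range=(0.2, 0.8))
optimizer = torch.optim.SGD(op.parameters(), lr=0.1)

img = torch.rand(4, 3, 32, 32)      # a batch of images in [0, 1]
out = op(img)                       # stochastic, differentiable augmentation
loss = out.mean()                   # placeholder for a real training loss
loss.backward()                     # gradients reach magnitude and probability
optimizer.step()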

Citation

dda (except for RandAugment) was developed as the core library of the following research projects.

If you use dda in your academic research, please cite hataya2020a.

@inproceedings{hataya2020a,
    title={{Faster AutoAugment: Learning Augmentation Strategies using Backpropagation}},
    author={Ryuichiro Hataya and Jan Zdenek and Kazuki Yoshizoe and Hideki Nakayama},
    year={2020},
    booktitle={ECCV}
}

...

More Repositories

1. senet.pytorch: PyTorch implementation of SENet (Python, 2,128 stars)
2. ewc.pytorch: An implementation of EWC with PyTorch (Jupyter Notebook, 181 stars)
3. sam.pytorch: A PyTorch implementation of Sharpness-Aware Minimization for Efficiently Improving Generalization (Python, 110 stars)
4. homura: A library for fast prototyping DL research (Python, 104 stars)
5. shampoo.pytorch: An implementation of Shampoo (Python, 72 stars)
6. cca.pytorch: CCAs for looking into DNNs (Python, 69 stars)
7. pytorch.rl.learning: For learning reinforcement learning using PyTorch (Python, 65 stars)
8. anatome: Ἀνατομή is a PyTorch library to analyze representations of neural networks (Jupyter Notebook, 55 stars)
9. l0.pytorch: An implementation of L0 regularization with PyTorch (Jupyter Notebook, 52 stars)
10. ssl-suite: SSL using PyTorch (Python, 50 stars)
11. mixup.pytorch: An implementation of mixup (Python, 39 stars)
12. pytorch.snapshot.ensembles: PyTorch implementation of "SNAPSHOT ENSEMBLES: TRAIN 1, GET M FOR FREE" [WIP] (Python, 36 stars)
13. eve.pytorch (Python, 32 stars)
14. simple_transformers: Simple transformer implementations that I can understand (Python, 20 stars)
15. chika: A simple and easy config tool for hierarchical configurations (Python, 20 stars)
16. distillation.pytorch: Implementation of several knowledge distillation techniques on PyTorch (Python, 15 stars)
17. hypergrad: Simple and extensible hypergradient for PyTorch (Python, 15 stars)
18. pytorch.detection.learning: Learning object detection using PyTorch (Jupyter Notebook, 12 stars)
19. introvae.pytorch (Python, 11 stars)
20. softdisc: Differentiable Discrete Algorithms for PyTorch (Python, 9 stars)
21. memory_efficient_attention.pytorch: A human-readable PyTorch implementation of "Self-attention Does Not Need O(n^2) Memory" (Rabe & Staats '21) (Python, 7 stars)
22. gsync: Simple PyDrive wrapper and command line tool (Python, 6 stars)
23. softsort.pytorch: PyTorch implementation of Cuturi M., Teboul O., Vert JP: Differentiable Sorting using Optimal Transport: The Sinkhorn CDF and Quantile Operator (Python, 6 stars)
24. neuralcompressor.pytorch: PyTorch implementation of Compressing Word Embeddings via Deep Compositional Code Learning (Python, 5 stars)
25. dataset-contamination: Datasets of "Will Large-scale Generative Models Corrupt Future Datasets?" (4 stars)
26. mlp_mixer.pytorch: PyTorch implementation of "MLP-Mixer: An all-MLP Architecture for Vision" (Python, 4 stars)
27. maguro: A simple job scheduler for GPUs (Python, 4 stars)
28. pyproject_template (Python, 3 stars)
29. abel.pytorch: PyTorch LR scheduler of ABEL (Python, 3 stars)
30. jax_devcontainer: devcontainer for JAX (Jupyter Notebook, 3 stars)
31. miniargs: A wrapper of argparse whose APIs I can remember (Python, 3 stars)
32. moskomule.github.io (HTML, 3 stars)
33. .dotfiles: dotfiles (Shell, 2 stars)
34. convnext.pytorch: ConvNeXt with hub (2 stars)
35. madoka: A simple wrapper of matplotlib for figures in papers (Jupyter Notebook, 2 stars)
36. attentions.pytorch: Attentions for computer vision (Python, 2 stars)
37. greedy-learning: To catch up with the recent progress of layerwise learning (Python, 2 stars)
38. honen: A wrapper of matplotlib for myself (Jupyter Notebook, 2 stars)
39. hyperhomura: Gradient-based Hyperparameter Optimization for homura (2 stars)
40. functorch_utils: Utilities for functorch (1 star)
41. cuda-server (Shell, 1 star)
42. LaTeX-better-practice: To avoid re-searching and re-inventing wheels (1 star)
43. yax: JAX things (Dockerfile, 1 star)
44. black-box-optimization (Julia, 1 star)
45. moskomule (1 star)
46. mine.pytorch (Python, 1 star)
47. mae.pytorch: A PyTorch implementation of Masked Autoencoders (WIP) (Python, 1 star)