  • Stars: 383
  • Rank: 111,362 (Top 3%)
  • Language: Python
  • License: MIT License
  • Created: over 5 years ago
  • Updated: 12 months ago


Repository Details

A collection of incremental learning paper implementations including PODNet (ECCV20) and Ghost (CVPR-W21).

Incremental Learners for Continual Learning

Repository storing some of my public work done during my PhD thesis (2019-).

You will find here both well-known implementations (iCaRL, etc.) and all of my own papers. The list of the latter is available on my Google Scholar.

My work on continual segmentation can be found here, and my continual data loaders here.

Structures

Every model must inherit inclearn.models.base.IncrementalLearner.
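
Below is a minimal sketch of what such a learner could look like. The hook names (_before_task, _train_task, _after_task, _eval_task) and the constructor signature are assumptions made for illustration; check inclearn/models/base.py for the exact interface the training loop expects.

from inclearn.models.base import IncrementalLearner


class MyLearner(IncrementalLearner):
    """Hypothetical learner; the method names below are assumed, not guaranteed."""

    def __init__(self, args):
        super().__init__()
        self._device = args.get("device")  # args is assumed to be the parsed config dict
        self._network = None               # backbone + classifier would live here

    def _before_task(self, train_loader, val_loader):
        # E.g. grow the classifier to make room for the new task's classes.
        pass

    def _train_task(self, train_loader, val_loader):
        # Training loop for the current incremental step.
        pass

    def _after_task(self, inc_dataset):
        # E.g. update the exemplar memory or freeze the old model for distillation.
        pass

    def _eval_task(self, test_loader):
        # Return predictions and targets so the runner can compute accuracy.
        pass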

PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning

Paper | ECCV | YouTube

(Figures: PODNet model overview and results plot.)
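
As a rough illustration of the title's "pooled outputs distillation", here is a simplified PyTorch sketch of a POD-spatial-style term: intermediate activations of the frozen old model and of the new model are pooled along each spatial axis, and the L2 distance between the pooled, normalized statistics is penalized. This is only a sketch of the idea, not the repository's exact implementation, which differs in details (e.g. the additional flat distillation on the final embeddings and the exact normalization).

import torch
import torch.nn.functional as F


def pod_spatial_loss(old_feature_maps, new_feature_maps):
    """Sketch only. Both arguments are lists of activation tensors of shape (B, C, H, W)."""
    loss = 0.0
    for old, new in zip(old_feature_maps, new_feature_maps):
        # Pool along height and along width, then concatenate both views.
        old_pooled = torch.cat([old.sum(dim=2), old.sum(dim=3)], dim=-1).flatten(1)
        new_pooled = torch.cat([new.sum(dim=2), new.sum(dim=3)], dim=-1).flatten(1)

        old_pooled = F.normalize(old_pooled, p=2, dim=-1)
        new_pooled = F.normalize(new_pooled, p=2, dim=-1)

        # Mean Euclidean distance over the batch for this layer.
        loss = loss + torch.norm(new_pooled - old_pooled, p=2, dim=-1).mean()
    return loss / len(old_feature_maps)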

If you use this paper/code in your research, please consider citing us:

@inproceedings{douillard2020podnet,
    title={PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning},
    author={Douillard, Arthur and Cord, Matthieu and Ollion, Charles and Robert, Thomas and Valle, Eduardo},
    booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
    year={2020}
}

To run experiments on CIFAR100 with three different class orders, in the challenging setting of 50 steps (an initial task of 50 classes followed by 50 increments of 1 class each):

python3 -m inclearn --options options/podnet/podnet_cnn_cifar100.yaml options/data/cifar100_3orders.yaml \
    --initial-increment 50 --increment 1 --fixed-memory \
    --device <GPU_ID> --label podnet_cnn_cifar100_50steps \
    --data-path <PATH/TO/DATA>

Likewise, for ImageNet100:

python3 -m inclearn --options options/podnet/podnet_cnn_imagenet100.yaml options/data/imagenet100_1order.yaml \
    --initial-increment 50 --increment 1 --fixed-memory \
    --device <GPU_ID> --label podnet_cnn_imagenet100_50steps \
    --data-path <PATH/TO/DATA>

And for ImageNet1000, with an initial task of 500 classes followed by 10 increments of 50 classes each:

python3 -m inclearn --options options/podnet/podnet_cnn_imagenet100.yaml options/data/imagenet1000_1order.yaml \
    --initial-increment 500 --increment 50 --fixed-memory --memory-size 20000 \
    --device <GPU_ID> --label podnet_cnn_imagenet1000_10steps \
    --data-path <PATH/TO/DATA>

Furthermore, several option files are available to reproduce the ablations showcased in the paper; please see the directory ./options/podnet/ablations/.

Insight From the Future for Continual Learning

Paper | CVPR Workshop

(Figure: Ghost model overview.)

If you use this paper/code in your research, please consider citing us:

@inproceedings{douillard2020ghost,
    title={Insight From the Future for Continual Learning},
    author={Arthur Douillard and Eduardo Valle and Charles Ollion and Thomas Robert and Matthieu Cord},
    booktitle={arXiv preprint library},
    year={2020}
}

The code is still very dirty; I'll clean it up later. Forgive me.

More Repositories

1. CVPR2021_PLOP - Official code of CVPR 2021's PLOP: Learning without Forgetting for Continual Semantic Segmentation. Python, 140 stars.
2. dytox - Dynamic Token Expansion with Continual Transformers, accepted at CVPR 2022. Python, 133 stars.
3. deepcourse - Learn Deep Learning for Computer Vision in three steps: theory from basics to SotA, code in PyTorch, and spaced repetition with Anki. Jupyter Notebook, 133 stars.
4. keras-snapshot_ensembles - Implementation in Keras of Snapshot Ensembles: Train 1, Get M for Free (https://arxiv.org/abs/1704.00109). Python, 25 stars.
5. keras-mobilenet - Implementation in Keras of MobileNet (https://arxiv.org/abs/1704.04861). Python, 23 stars.
6. keras-effnet - Implementation in Keras of EffNet (https://arxiv.org/abs/1801.06434). Python, 21 stars.
7. keras-shufflenet - Implementation in Keras of ShuffleNet (https://arxiv.org/abs/1707.01083). Python, 19 stars.
8. nalu.pytorch - Implementation of NALU & NAC (https://arxiv.org/abs/1808.00508, DeepMind) in PyTorch. Jupyter Notebook, 17 stars.
9. mada.pytorch - Unfinished work: implementation of Multi-Adversarial Domain Adaptation (https://arxiv.org/abs/1809.02176) in PyTorch. Python, 16 stars.
10. turing_pattern_generator - A generator of Turing patterns from an image. Jupyter Notebook, 11 stars.
11. continual-learning-terminology - 10 stars.
12. water_simulation - Water simulation with OpenGL. C, 10 stars.
13. tensorflow-faceid - FaceID-like authentication in TensorFlow using a Siamese network with contrastive loss. Python, 10 stars.
14. awesome-deeplearning-papers - A collection of Deep Learning papers I read, sorted by category. Python, 9 stars.
15. keras-squeeze_and_excitation_network - Implementation in Keras of Squeeze-and-Excitation (https://arxiv.org/abs/1709.01507). Python, 6 stars.
16. Continual_Learning_Leaderboards - Leaderboards of Continual Learning for various benchmarks. 4 stars.
17. optimizers.pytorch - A collection of optimizers, from famous to exotic, implemented in PyTorch. Python, 4 stars.
18. teledetection - Implementation in C of a custom k-means for cloud detection in satellite images. C, 4 stars.
19. FastRadixTree - Orthographic corrector in C++ using a trie. C++, 2 stars.
20. distributed_memory_mpi - Distributed memory with MPI in Python; also features map/reduce/filter. Python, 1 star.
21. quiz - JavaScript, 1 star.
22. phd_thesis - TeX, 1 star.
23. Soundrain - SoundCloud music downloader. Python, 1 star.
24. MoviesPopularity - An app that rates movies according to their comments; incremental learning is also done. Scala, 1 star.
25. Reflex-Tap - A small practice website about a fictional mobile app (English & Korean versions). HTML, 1 star.
26. Smart-Saleman - Basic solution to the salesman problem. Python, 1 star.
27. arthurdouillard.github.io - My current blog, auto-updated from the template https://github.com/arthurdouillard/hugo-website. HTML, 1 star.
28. Blind-Mouse-in-a-Maze - Blind Mouse in a Maze, an interview question. C++, 1 star.
29. elix_anki_scrapper - Scraper of Elix (French Sign Language) for Anki. Jupyter Notebook, 1 star.
30. coursera-R-programming - Assignments of the `R Programming` course on Coursera. R, 1 star.
31. algo_with_mpi - Some basic algorithms with MPI for Python (mpi4py). Python, 1 star.