
PyCIL: A Python Toolbox for Class-Incremental Learning



Contents: Introduction · Methods Reproduced · Reproduced Results · How To Use · License · Acknowledgments · Contact



Welcome to PyCIL, perhaps the most comprehensive toolbox for class-incremental learning in terms of implemented methods. This is the code repository for "PyCIL: A Python Toolbox for Class-Incremental Learning" [paper], implemented in PyTorch. If you use any content of this repo in your work, please cite the following bib entries:

@article{zhou2023pycil,
    author = {Da-Wei Zhou and Fu-Yun Wang and Han-Jia Ye and De-Chuan Zhan},
    title = {PyCIL: a Python toolbox for class-incremental learning},
    journal = {SCIENCE CHINA Information Sciences},
    year = {2023},
    volume = {66},
    number = {9},
    pages = {197101-},
    doi = {https://doi.org/10.1007/s11432-022-3600-y}
  }

@article{zhou2023class,
    author = {Zhou, Da-Wei and Wang, Qi-Wei and Qi, Zhi-Hong and Ye, Han-Jia and Zhan, De-Chuan and Liu, Ziwei},
    title = {Deep Class-Incremental Learning: A Survey},
    journal = {arXiv preprint arXiv:2302.03648},
    year = {2023}
 }


Introduction

Traditional machine learning systems are deployed under the closed-world assumption, where all training data is available before the offline training process begins. However, real-world applications often face incoming new classes, and a model should incorporate them continually. This learning paradigm is called Class-Incremental Learning (CIL). We propose a Python toolbox that implements several key algorithms for class-incremental learning to ease the burden on researchers in the machine learning community. The toolbox contains implementations of a number of founding works of CIL, such as EWC and iCaRL, and also provides current state-of-the-art algorithms that can be used for conducting novel fundamental research. This toolbox, named PyCIL for Python Class-Incremental Learning, is open source under the MIT license.
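
To make the setting concrete, here is a minimal sketch of the CIL protocol (an illustration under simple assumptions, not PyCIL's actual training loop; train_on and evaluate_on are hypothetical stand-ins):

    # Classes arrive in stages; after each stage, the model is evaluated on
    # all classes seen so far.
    def train_on(classes):
        # hypothetical stand-in for one incremental training stage
        print(f"training on new classes {classes[0]}-{classes[-1]}")

    def evaluate_on(classes):
        # hypothetical stand-in for evaluation over every class seen so far
        print(f"evaluating on {len(classes)} seen classes")

    num_classes, init_cls, increment = 100, 50, 10
    boundaries = [0, init_cls] + list(range(init_cls + increment, num_classes + 1, increment))
    seen = []
    for lo, hi in zip(boundaries, boundaries[1:]):
        new_classes = list(range(lo, hi))
        train_on(new_classes)
        seen += new_classes
        evaluate_on(seen)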

For more information about incremental learning, you can refer to these reading materials:

  • A brief introduction (in Chinese) to CIL is available here.
  • A PyTorch Tutorial to Class-Incremental Learning (with explicit code and detailed explanations) is available here.

Methods Reproduced

  • FineTune: Baseline method which simply updates parameters on new tasks.
  • EWC: Overcoming catastrophic forgetting in neural networks. PNAS2017 [paper]
  • LwF: Learning without Forgetting. ECCV2016 [paper]
  • Replay: Baseline method with exemplar replay.
  • GEM: Gradient Episodic Memory for Continual Learning. NIPS2017 [paper]
  • iCaRL: Incremental Classifier and Representation Learning. CVPR2017 [paper]
  • BiC: Large Scale Incremental Learning. CVPR2019 [paper]
  • WA: Maintaining Discrimination and Fairness in Class Incremental Learning. CVPR2020 [paper]
  • PODNet: PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning. ECCV2020 [paper]
  • DER: DER: Dynamically Expandable Representation for Class Incremental Learning. CVPR2021 [paper]
  • PASS: Prototype Augmentation and Self-Supervision for Incremental Learning. CVPR2021 [paper]
  • RMM: RMM: Reinforced Memory Management for Class-Incremental Learning. NeurIPS2021 [paper]
  • IL2A: Class-Incremental Learning via Dual Augmentation. NeurIPS2021 [paper]
  • SSRE: Self-Sustaining Representation Expansion for Non-Exemplar Class-Incremental Learning. CVPR2022 [paper]
  • FeTrIL: Feature Translation for Exemplar-Free Class-Incremental Learning. WACV2023 [paper]
  • Coil: Co-Transport for Class-Incremental Learning. ACM MM2021 [paper]
  • FOSTER: Feature Boosting and Compression for Class-incremental Learning. ECCV 2022 [paper]
  • MEMO: A Model or 603 Exemplars: Towards Memory-Efficient Class-Incremental Learning. ICLR 2023 Spotlight [paper]
  • BEEF: BEEF: Bi-Compatible Class-Incremental Learning via Energy-Based Expansion and Fusion. ICLR 2023 [paper]
  • SimpleCIL: Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need. arXiv 2023 [paper]

Authors are welcome to contact us about reproducing their methods in our repo. Feel free to merge your algorithm into PyCIL if you are using our codebase!

Reproduced Results

Reproduced accuracy curves are provided for CIFAR-100, ImageNet-100, and ImageNet-100 (Top-5 Accuracy); see the plots in the repository README.

More experimental details and results can be found in our survey.

How To Use

Clone

Clone this GitHub repository:

git clone https://github.com/G-U-N/PyCIL.git
cd PyCIL

Dependencies

  1. torch 1.8.1
  2. torchvision 0.6.0
  3. tqdm
  4. numpy
  5. scipy
  6. quadprog
  7. POT
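
Assuming a standard pip environment, something along these lines installs them (a sketch; pin the versions listed above as needed):

pip install torch torchvision tqdm numpy scipy quadprog POT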

Run experiment

  1. Edit the exps/[MODEL NAME].json file for global settings.
  2. Edit the hyperparameters in the corresponding [MODEL NAME].py file (e.g., models/icarl.py).
  3. Run:
python main.py --config=./exps/[MODEL NAME].json

where [MODEL NAME] should be chosen from finetune, ewc, lwf, replay, gem, icarl, bic, wa, podnet, der, etc.
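
For example, to reproduce iCaRL with its config file:

python main.py --config=./exps/icarl.json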

  4. Hyper-parameters

When using PyCIL, you can edit the global parameters and algorithm-specific hyper-parameters in the corresponding json file.

These parameters include:

  • memory-size: The total number of exemplars kept during the incremental learning process. If there are $K$ classes at the current stage, the model preserves $\left\lfloor \frac{\text{memory-size}}{K} \right\rfloor$ exemplars per class. For example, with a memory size of 2000 and $K = 10$ classes, 200 exemplars are kept per class.
  • init-cls: The number of classes in the first incremental stage. Since CIL settings differ in how many classes the first stage contains, our framework lets you choose the size of the initial stage.
  • increment: The number of classes in each incremental stage $i$ for $i > 1$. By default, every incremental stage contains the same number of classes.
  • convnet-type: The backbone network for the incremental model. According to the benchmark setting, ResNet32 is utilized for CIFAR100, and ResNet18 is used for ImageNet.
  • seed: The random seed adopted for shuffling the class order. According to the benchmark setting, it is set to 1993 by default.
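
For illustration, a minimal config covering the parameters above might look like the sketch below (a hedged sketch: the exact key names and additional required fields in the shipped exps/*.json files may differ):

    {
        "dataset": "cifar100",
        "memory_size": 2000,
        "init_cls": 10,
        "increment": 10,
        "model_name": "icarl",
        "convnet_type": "resnet32",
        "seed": 1993
    }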

Other parameters related to model optimization, e.g., batch size, number of epochs, learning rate, learning rate decay, weight decay, milestones, and temperature, can be modified in the corresponding Python file.

Datasets

We have implemented the pre-processing for CIFAR100, imagenet100, and imagenet1000. When training on CIFAR100, the framework will download it automatically. When training on imagenet100/1000, you should specify the folder of your dataset in utils/data.py.

    def download_data(self):
        # Remove this assertion once the dataset folders below are set.
        assert 0, "You should specify the folder of your dataset"
        train_dir = '[DATA-PATH]/train/'
        test_dir = '[DATA-PATH]/val/'
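
Assuming the usual ImageNet-style layout (train/ and val/ directories, each with one subfolder per class), a quick sanity check of your data folder might look like this (data_root is a hypothetical path):

    import os

    data_root = "/path/to/imagenet100"  # hypothetical dataset location
    for split in ("train", "val"):
        split_dir = os.path.join(data_root, split)
        classes = [d for d in os.listdir(split_dir)
                   if os.path.isdir(os.path.join(split_dir, d))]
        print(f"{split}: {len(classes)} class folders found")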

Here is the file list of ImageNet100 (also known as ImageNet-Sub).

Awesome Papers using PyCIL

Our Papers

  • Learning without Forgetting for Vision-Language Models (arXiv 2023) [paper]

  • Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need (arXiv 2023) [paper] [code]

  • Deep Class-Incremental Learning: A Survey (arXiv 2023) [paper] [code]

  • BEEF: Bi-Compatible Class-Incremental Learning via Energy-Based Expansion and Fusion (ICLR 2023) [paper] [code]

  • A model or 603 exemplars: Towards memory-efficient class-incremental learning (ICLR 2023) [paper] [code]

  • Few-shot class-incremental learning by sampling multi-phase tasks (TPAMI 2022) [paper] [code]

  • Foster: Feature Boosting and Compression for Class-incremental Learning (ECCV 2022) [paper] [code]

  • Forward compatible few-shot class-incremental learning (CVPR 2022) [paper] [code]

  • Co-Transport for Class-Incremental Learning (ACM MM 2021) [paper] [code]

Other Awesome Works

  • Towards Continual Egocentric Activity Recognition: A Multi-modal Egocentric Activity Dataset for Continual Learning (arXiv 2023) [paper]

  • S-Prompts Learning with Pre-trained Transformers: An Occam's Razor for Domain Incremental Learning (NeurIPS 2022) [paper] [code]

License

Please check the MIT license included in this repository.

Acknowledgments

We thank the following repos for providing helpful components and functions used in our work.

The training flow and data configurations are based on Continual-Learning-Reproduce. The original information of the repo is available in the base branch.

Contact

If you have any questions, please feel free to propose new features by opening an issue, or contact the authors: Da-Wei Zhou ([email protected]) and Fu-Yun Wang ([email protected]). Enjoy the code!

