• Stars: 663
• Rank: 67,991 (Top 2%)
• Language: Python
• License: MIT License
• Created: almost 6 years ago
• Updated: over 1 year ago


Repository Details

TensorFlow and PyTorch implementation of "Meta-Transfer Learning for Few-Shot Learning" (CVPR 2019)

Meta-Transfer Learning for Few-Shot Learning

LICENSE Python TensorFlow PyTorch Citations

This repository contains the TensorFlow and PyTorch implementation for the CVPR 2019 Paper "Meta-Transfer Learning for Few-Shot Learning" by Qianru Sun,* Yaoyao Liu,* Tat-Seng Chua, and Bernt Schiele (*=equal contribution).

If you have any questions on this repository or the related paper, feel free to create an issue or send me an email.

Summary

Introduction

Meta-learning has been proposed as a framework to address the challenging few-shot learning setting. The key idea is to leverage a large number of similar few-shot tasks in order to learn how to adapt a base-learner to a new task for which only a few labeled samples are available. As deep neural networks (DNNs) tend to overfit when trained on only a few samples, meta-learning typically uses shallow neural networks (SNNs), thus limiting its effectiveness. In this paper we propose a novel few-shot learning method called meta-transfer learning (MTL), which learns to adapt a deep NN to few-shot learning tasks. Specifically, "meta" refers to training on multiple tasks, and "transfer" is achieved by learning scaling and shifting functions of DNN weights for each task. We conduct experiments using (5-class, 1-shot) and (5-class, 5-shot) recognition tasks on two challenging few-shot learning benchmarks: miniImageNet and Fewshot-CIFAR100.

Figure: Meta-Transfer Learning. (a) Parameter-level fine-tuning (FT) is a conventional meta-training operation, e.g. in MAML. Its update works on all neuron parameters, W and b. (b) Our neuron-level scaling and shifting (SS) operations in meta-transfer learning. They reduce the number of learned parameters and avoid overfitting. In addition, they keep the large-scale trained parameters (in yellow) frozen, preventing "catastrophic forgetting".
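As a concrete illustration of the SS idea, here is a minimal sketch (not the repository's code; the layer shapes and names are hypothetical): a learned per-neuron scale (Phi_S1) multiplies the frozen pre-trained weights, and a learned shift (Phi_S2) is added to the frozen bias, so only these small SS parameters are task-specific.

```python
import numpy as np

def ss_linear(x, W, b, scale, shift):
    """Apply frozen weights W, b modulated by learned per-neuron
    scaling (Phi_S1) and shifting (Phi_S2) parameters.

    Only `scale` and `shift` are meta-learned; W and b stay frozen,
    which keeps the number of task-specific parameters small and
    protects the pre-trained weights from catastrophic forgetting.
    """
    return x @ (W * scale) + (b + shift)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))   # batch of 4 inputs, 8 features
W = rng.standard_normal((8, 3))   # frozen pre-trained weights
b = np.zeros(3)                   # frozen bias

# SS initialised to identity: scale=1, shift=0 reproduces the frozen layer
scale = np.ones((1, 3))           # one scalar per output neuron
shift = np.zeros(3)
assert np.allclose(ss_linear(x, W, b, scale, shift), x @ W + b)
```

Note how the identity initialisation lets meta-training start from the behaviour of the frozen pre-trained layer and only gradually modulate it.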

Getting Started

Please see the README.md files in the corresponding folders:

Datasets

Directly download processed images: [Download Page]

๐’Ž๐’Š๐’๐’ŠImageNet

The ๐‘š๐‘–๐‘›๐‘–ImageNet dataset was proposed by Vinyals et al. for few-shot learning evaluation. Its complexity is high due to the use of ImageNet images but requires fewer resources and infrastructure than running on the full ImageNet dataset. In total, there are 100 classes with 600 samples of 84ร—84 color images per class. These 100 classes are divided into 64, 16, and 20 classes respectively for sampling tasks for meta-training, meta-validation, and meta-test. To generate this dataset from ImageNet, you may use the repository ๐‘š๐‘–๐‘›๐‘–ImageNet tools.

Fewshot-CIFAR100

Fewshot-CIFAR100 (FC100) is based on the popular object classification dataset CIFAR-100. The splits were proposed by TADAM. It offers a more challenging scenario, with lower image resolution and meta-training/test splits that are separated according to object super-classes. It contains 100 object classes, and each class has 600 samples of 32×32 color images. The 100 classes belong to 20 super-classes. The meta-training data come from 60 classes belonging to 12 super-classes; the meta-validation and meta-test sets each contain 20 classes belonging to 4 super-classes.
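Separating splits by super-class, as described above, can be sketched as follows (a toy illustration, not the repository's code; the class-to-super-class mapping here is synthetic, while real FC100 uses the CIFAR-100 coarse labels):

```python
def split_by_superclass(class_to_super, train_supers, val_supers, test_supers):
    """Assign classes to meta-splits by their super-class, so that no
    super-class is shared between splits (the FC100 design)."""
    splits = {"train": [], "val": [], "test": []}
    for cls, sc in class_to_super.items():
        if sc in train_supers:
            splits["train"].append(cls)
        elif sc in val_supers:
            splits["val"].append(cls)
        elif sc in test_supers:
            splits["test"].append(cls)
    return splits

# toy mapping: 100 classes, 5 classes per super-class (20 super-classes)
class_to_super = {c: c // 5 for c in range(100)}
splits = split_by_superclass(class_to_super,
                             train_supers=set(range(12)),     # 12 -> 60 classes
                             val_supers=set(range(12, 16)),   # 4  -> 20 classes
                             test_supers=set(range(16, 20)))  # 4  -> 20 classes
print(len(splits["train"]), len(splits["val"]), len(splits["test"]))  # 60 20 20
```

Because entire super-classes move between splits, meta-test classes are semantically farther from meta-training classes than a random class split would be, which is what makes FC100 harder.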

๐’•๐’Š๐’†๐’“๐’†๐’…ImageNet

The ๐‘ก๐‘–๐‘’๐‘Ÿ๐‘’๐‘‘ImageNet dataset is a larger subset of ILSVRC-12 with 608 classes (779,165 images) grouped into 34 higher-level nodes in the ImageNet human-curated hierarchy. To generate this dataset from ImageNet, you may use the repository ๐‘ก๐‘–๐‘’๐‘Ÿ๐‘’๐‘‘ImageNet dataset: ๐‘ก๐‘–๐‘’๐‘Ÿ๐‘’๐‘‘ImageNet tools.

Performance

(%) ๐‘š๐‘–๐‘›๐‘– 1-shot ๐‘š๐‘–๐‘›๐‘– 5-shot FC100 1-shot FC100 5-shot
MTL Paper 60.2 ยฑ 1.8 74.3 ยฑ 0.9 43.6 ยฑ 1.8 55.4 ยฑ 0.9
TensorFlow 60.8 ยฑ 1.8 74.3 ยฑ 0.9 44.3 ยฑ 1.8 56.8 ยฑ 1.0
  • The performance of the PyTorch version is still being verified.
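The ± values above follow the usual few-shot evaluation protocol, presumably 95% confidence intervals of the mean accuracy over many test episodes; a minimal sketch of that computation (the 1.96 z-value and the tiny episode list are assumptions for illustration):

```python
import math

def mean_ci95(accuracies):
    """Mean accuracy and 95% confidence interval over test episodes."""
    n = len(accuracies)
    mean = sum(accuracies) / n
    var = sum((a - mean) ** 2 for a in accuracies) / (n - 1)  # sample variance
    ci95 = 1.96 * math.sqrt(var / n)  # normal-approximation interval
    return mean, ci95

# toy example: per-episode accuracies from a 1-shot evaluation run
episode_acc = [0.60, 0.55, 0.65, 0.58, 0.62]
mean, ci = mean_ci95(episode_acc)
print(f"{100 * mean:.1f} ± {100 * ci:.1f} %")  # prints "60.0 ± 3.3 %"
```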

Citation

Please cite our paper if it is helpful to your work:

@inproceedings{SunLCS2019MTL,
  author    = {Qianru Sun and
               Yaoyao Liu and
               Tat{-}Seng Chua and
               Bernt Schiele},
  title     = {Meta-Transfer Learning for Few-Shot Learning},
  booktitle = {{IEEE} Conference on Computer Vision and Pattern Recognition, {CVPR}
               2019, Long Beach, CA, USA, June 16-20, 2019},
  pages     = {403--412},
  publisher = {Computer Vision Foundation / {IEEE}},
  year      = {2019}
}

Acknowledgements

Our implementations use the source code from the following repositories and users:

More Repositories

1. class-incremental-learning — PyTorch implementation of AANets (CVPR 2021) and Mnemonics Training (CVPR 2020 Oral). Python, 425 stars
2. mini-imagenet-tools — Tools for generating the mini-ImageNet dataset and processing batches. Python, 380 stars
3. few-shot-classification-leaderboard — Leaderboards for few-shot image classification on miniImageNet, tieredImageNet, FC100, and CIFAR-FS. HTML, 332 stars
4. minimal-light — A simple and elegant Jekyll theme for an academic personal homepage. SCSS, 193 stars
5. tiered-imagenet-tools — Tools for generating the tieredImageNet dataset and processing batches. Python, 67 stars
6. e3bm — PyTorch implementation of "An Ensemble of Epoch-wise Empirical Bayes for Few-shot Learning" (ECCV 2020). Python, 48 stars
7. CL-DETR — PyTorch implementation of "Continual Detection Transformer for Incremental Object Detection" (CVPR 2023). 25 stars
8. social-relation-tensorflow — TensorFlow implementation of "A Domain Based Approach to Social Relation Recognition" (CVPR 2017). Python, 24 stars
9. yaoyao-liu.github.io — My homepage's source code. CSS, 11 stars
10. online-hyperparameter-optimization — PyTorch implementation of "Online Hyperparameter Optimization for Class-Incremental Learning" (AAAI 2023 Oral). Python, 10 stars
11. URL-Redirect-zh — URL forwarding via GitHub. HTML, 10 stars
12. strata-academic — A simple and elegant Jekyll theme for an academic personal homepage. CSS, 9 stars
13. POD-AANets — Code for PODNet w/ AANets. Python, 9 stars
14. meta-transfer-learning-pytorch — 6 stars
15. fsl-html-source — HTML, 3 stars
16. jekyll-jemdoc — Light text markup for creating websites, the Jekyll version. CSS, 3 stars
17. minimal-light-theme-mpi-inf — A simple and elegant Jekyll theme for an MPI Informatics personal homepage. CSS, 2 stars
18. URL-Redirect — URL redirect via GitHub. HTML, 1 star
19. face-image-generation — 1 star
20. README-Syntax — README document syntax guide, i.e. GitHub Flavored Markdown syntax introduction. 1 star
21. homepage — CSS, 1 star
22. minimal-light-project-pages — SCSS, 1 star