  • Stars: 118
  • Rank: 299,923 (Top 6%)
  • Language: Python
  • License: MIT License
  • Created: about 5 years ago
  • Updated: over 4 years ago


Repository Details

Code for Paper "Incremental Few-Shot Learning with Attention Attractor Networks"

inc-few-shot-attractor-public

This repository contains code for the following paper: Incremental Few-Shot Learning with Attention Attractor Networks. Mengye Ren, Renjie Liao, Ethan Fetaya, Richard S. Zemel. NeurIPS 2019. [arxiv]

Dependencies

  • cv2
  • numpy
  • pandas
  • python 2.7 / 3.5+
  • tensorflow 1.11
  • tqdm

Our code is tested on Ubuntu 14.04 and 16.04.
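
These packages are all available from PyPI; a typical environment setup might look like the line below (opencv-python provides the cv2 module, and the exact TensorFlow pin is our suggestion based on the version listed above, not part of the original instructions):

pip install numpy pandas tqdm opencv-python tensorflow==1.11.0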

Setup

First, designate a folder to be your data root:

export DATA_ROOT={DATA_ROOT}
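
For example (the path below is only an illustration; pick any directory with enough free space):

export DATA_ROOT=$HOME/datasets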

Then, set up the datasets following the instructions in the subsections.

miniImageNet

[Google Drive] (5GB)

# Download "mini-imagenet.tar" and place it in "$DATA_ROOT/mini-imagenet".
mkdir -p $DATA_ROOT/mini-imagenet
cd $DATA_ROOT/mini-imagenet
mv ~/Downloads/mini-imagenet.tar .
tar -xvf mini-imagenet.tar
rm -f mini-imagenet.tar

tieredImageNet

[Google Drive] (15GB)

# Download and place "tiered-imagenet.tar" in "$DATA_ROOT/tiered-imagenet".
mkdir -p $DATA_ROOT/tiered-imagenet
cd $DATA_ROOT/tiered-imagenet
mv ~/Downloads/tiered-imagenet.tar .
tar -xvf tiered-imagenet.tar
rm -f tiered-imagenet.tar

Note: Please make sure that the following hardware requirements are met before running tieredImageNet experiments.

  • Disk: 30 GB
  • RAM: 32 GB

Config files

Run make to generate the protobuf files.

git clone https://github.com/renmengye/inc-few-shot-attractor-public.git
cd inc-few-shot-attractor-public
make
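
The make step presumably invokes the protobuf compiler on the repository's .proto config definitions; if it fails, check that protoc and the Python protobuf runtime are available (this troubleshooting note is our assumption, not part of the original instructions):

# Assumption: the Makefile expects protoc to be on the PATH.
protoc --version
# Python runtime for the generated protobuf modules.
pip install protobuf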

Core Experiments

Pretraining

./run.sh {GPUID} python run_exp.py --config {CONFIG_FILE}     \
                  --dataset {DATASET}                         \
                  --data_folder {DATASET_FOLDER}              \
                  --results {SAVE_FOLDER}                     \
                  --tag {EXPERIMENT_NAME}
  • Possible DATASET options are mini-imagenet, tiered-imagenet.
  • Possible CONFIG options are any prototxt file in the ./configs/pretrain folder.
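
For example, a pretraining run on miniImageNet using GPU 0 could look like the following (the config filename and paths are illustrative; substitute any prototxt from ./configs/pretrain and your own folders):

./run.sh 0 python run_exp.py --config configs/pretrain/mini-imagenet-resnet.prototxt \
                  --dataset mini-imagenet                      \
                  --data_folder $DATA_ROOT/mini-imagenet       \
                  --results ./results                          \
                  --tag pretrain-mini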

Meta-learning

./run.sh {GPUID} python run_exp.py --config {CONFIG_FILE}     \
                  --dataset {DATASET}                         \
                  --data_folder {DATASET_FOLDER}              \
                  --pretrain {PRETRAIN_CKPT_FOLDER}           \
                  --nshot {NUMBER_OF_SHOTS}                   \
                  --nclasses_b {NUMBER_OF_FEWSHOT_WAYS}       \
                  --results {SAVE_FOLDER}                     \
                  --tag {EXPERIMENT_NAME}                     \
                  [--eval]                                    \
                  [--retest]
  • Possible DATASET options are mini-imagenet, tiered-imagenet.
  • Possible CONFIG options are any prototxt file in the ./configs/attractors folder, e.g. *-{mlp|lr}-attn-s{1|5}.prototxt denotes a 1- or 5-shot model using an MLP or LR as the fast-weights model.
  • You need to pass the PRETRAIN_CKPT_FOLDER option pointing to the pretrained model checkpoint.
  • Add the --retest flag to restore a fully trained model and re-run evaluation.
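
For example, a 1-shot run on miniImageNet starting from the pretraining checkpoint above could look like this (the config filename follows the *-{mlp|lr}-attn-s{1|5} pattern described above but is illustrative, as are the paths):

./run.sh 0 python run_exp.py --config configs/attractors/mini-imagenet-lr-attn-s1.prototxt \
                  --dataset mini-imagenet                      \
                  --data_folder $DATA_ROOT/mini-imagenet       \
                  --pretrain ./results/pretrain-mini           \
                  --nshot 1                                    \
                  --nclasses_b 5                               \
                  --results ./results                          \
                  --tag attractor-mini-s1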

Baselines

  • Baseline configs are in ./configs/lwof and ./configs/imprint.
  • For the ProtoNet baseline, run run_proto_exp.py with the same flags as in the previous section (see the template below).
  • Configs for ablation studies can be found in ./configs/ablation.
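
Assuming "the same flags" means the full meta-learning interface above, a ProtoNet baseline run would follow this template (the appropriate config file comes from whichever ProtoNet config the repository ships; its folder is not listed on this page):

./run.sh {GPUID} python run_proto_exp.py --config {CONFIG_FILE} \
                  --dataset {DATASET}                          \
                  --data_folder {DATASET_FOLDER}               \
                  --pretrain {PRETRAIN_CKPT_FOLDER}            \
                  --nshot {NUMBER_OF_SHOTS}                    \
                  --nclasses_b {NUMBER_OF_FEWSHOT_WAYS}        \
                  --results {SAVE_FOLDER}                      \
                  --tag {EXPERIMENT_NAME}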

Citation

If you use our code, please consider citing the following:

  • Mengye Ren, Renjie Liao, Ethan Fetaya and Richard S. Zemel. Incremental Few-Shot Learning with Attention Attractor Networks. In Advances in Neural Information Processing Systems (NeurIPS), 2019.
@inproceedings{ren19incfewshot,
  author   = {Mengye Ren and
              Renjie Liao and
              Ethan Fetaya and
              Richard S. Zemel},
  title    = {Incremental Few-Shot Learning with Attention Attractor Networks},
  booktitle= {Advances in Neural Information Processing Systems (NeurIPS)},
  year     = {2019},
}

More Repositories

1. few-shot-ssl-public: Meta Learning for Semi-Supervised Few-Shot Classification (Python, 553 stars)
2. revnet-public: Code for "The Reversible Residual Network: Backpropagation Without Storing Activations" (Python, 351 stars)
3. tensorflow-forward-ad: Forward-mode Automatic Differentiation for TensorFlow (Python, 140 stars)
4. rec-attend-public: Code that implements paper "End-to-End Instance Segmentation with Recurrent Attention" (Python, 109 stars)
5. imageqa-public: Code for paper "Exploring Models and Data for Image Question Answering" (Python, 83 stars)
6. base62-csharp: Base62 Encoding C# implementation (C#, 47 stars)
7. deep-dashboard: Deep Dashboard: Machine Learning Training Visualizer (JavaScript, 44 stars)
8. meta-optim-public: Understanding Short-Horizon Bias in Stochastic Meta-Optimization (Python, 37 stars)
9. oc-fewshot-public: Code associated with paper "Wandering Within a World: Online Contextualized Few-Shot Learning" (Python, 24 stars)
10. imageqa-qgen: A question generator described in paper "Exploring Models and Data for Image Question Answering" (Python, 24 stars)
11. np-conv2d: 2D Convolution using NumPy (Python, 17 stars)
12. cityscapes-api: API for Cityscapes Dataset (Python, 11 stars)
13. CoursePlanner: Planner tool for college course selection and timetable scheduling (C#, 10 stars)
14. pysched: Pipeline-based scheduler made in Python (Python, 9 stars)
15. online-unsup-proto-net (Python, 7 stars)
16. div-norm: Implementation of divisive normalization in TensorFlow (Python, 7 stars)
17. resnet: Modified from the original TensorFlow version (Python, 3 stars)
18. csc467: CSC467 Compiler Project (C, 2 stars)
19. neural-lm: Neural Language Model Implementation (C++, 2 stars)
20. tfplus: Deep learning utility library based on TensorFlow (Python, 2 stars)
21. deep-tracker (Python, 2 stars)
22. AutoTetris: An automatic solution to the classic game Tetris (Java, 2 stars)
23. bazel-docker: Build Docker container with Bazel (Python, 1 star)
24. grade-school-math-relational: Abstract relation annotations of the GSM-8k dataset (1 star)
25. imageqa_icml2015_poster (TeX, 1 star)