CortexNet

This repo contains the PyTorch implementation of CortexNet, a predictive model.
Check the project website for further information.

Project structure

The project consists of the following folders and files:

  • data/: contains the Bash scripts and the Python class definition used for video data loading;
  • image-pretraining/: hosts the code for pre-training TempoNet's discriminative branch;
  • model/: stores several network architectures, including PredNet, an additive feedback Model01, and a modulatory feedback Model02 (CortexNet), with a toy sketch contrasting the two feedback styles shown after this list;
  • notebook/: collection of Jupyter Notebooks for data exploration and results visualisation;
  • utils/: scripts for
    • (current or former) training error plotting,
    • experiments diff,
    • multi-node synchronisation,
    • generative predictions visualisation,
    • network architecture graphing;
  • results@: link to the location where experimental results will be saved within 3-digit folders;
  • new_experiment.sh*: creates a new experiment folder, updates last@, prints a memo about last used settings;
  • last@: symbolic link pointing to a new results sub-directory created by new_experiment.sh;
  • main.py: training script for CortexNet in MatchNet or TempoNet configuration.
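
The additive versus modulatory distinction can be illustrated with a minimal PyTorch sketch. This is a toy module written for this README, not the repository's actual Model01/Model02 code; the sigmoid gating and the tensor sizes are assumptions chosen purely for illustration.

import torch
import torch.nn as nn

class ToyFeedbackBlock(nn.Module):
    # Illustrative block contrasting additive and modulatory feedback.
    def __init__(self, channels, modulatory=True):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.modulatory = modulatory

    def forward(self, x, feedback=None):
        h = self.conv(x)
        if feedback is not None:
            if self.modulatory:
                # Modulatory feedback: the top-down signal gates the bottom-up activations.
                h = h * torch.sigmoid(feedback)
            else:
                # Additive feedback: the top-down signal is simply summed in.
                h = h + feedback
        return torch.relu(h)

# Random tensors with matching shapes, purely for a shape check.
x = torch.randn(1, 16, 32, 32)
fb = torch.randn(1, 16, 32, 32)
print(ToyFeedbackBlock(16, modulatory=True)(x, fb).shape)   # torch.Size([1, 16, 32, 32])
print(ToyFeedbackBlock(16, modulatory=False)(x, fb).shape)  # torch.Size([1, 16, 32, 32])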

Dependencies

  • sk-video: video loading
    pip install sk-video
  • tqdm: progress bar
    conda config --add channels conda-forge
    conda update --all
    conda install tqdm
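
As a quick check that the video reader is importable, here is a minimal sketch; clip.mp4 is a placeholder path, not a file shipped with this repo.

import skvideo.io

# Reads an entire video into a NumPy array of shape (frames, height, width, channels).
video = skvideo.io.vread("clip.mp4")  # replace with a real video file
print(video.shape)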

IDE

This project has been realised with PyCharm by JetBrains and the Vim editor. Grip has also been fundamental for crafting decent documentation locally.

Initialise environment

Once you've determined where you'd like to save your experimental results (let's call this directory <my saving location>), run the following commands from the project's root directory:

ln -s <my saving location> results  # replace <my saving location>
mkdir results/000 && touch results/000/train.log  # init. placeholder
ln -s results/000 last  # create pointer to the most recent result
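
You can sanity-check the layout with a plain listing; the exact target of results will reflect your chosen saving location.

ls -l results last  # both should be symbolic links, and results/000/train.log should exist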

Setup new experiment

Ready to run your first experiment? Type the following:

./new_experiment.sh

GPU selection

Let's say your machine has N GPUs. You can choose to use any of them by specifying the index n = 0, ..., N-1: simply prepend CUDA_VISIBLE_DEVICES=n to the python ... commands in the following sections.
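
For example, to launch the MatchNet training command from the next section on the first GPU (the index 0 here is arbitrary):

CUDA_VISIBLE_DEVICES=0 python -u main.py --mode MatchNet <CLI arguments> | tee last/train.log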

Train MatchNet

  • Download e-VDS35 (e.g. e-VDS35-May17.tar) from here.
  • Use data/resize_and_split.sh to prepare your (video) data for training. It resizes the videos found in folders of folders (i.e. a directory of classes) and may split them into training and validation sets. It may also skip short videos and trim longer ones. Check data/README.md for more details.
  • Run the main.py script to start training. Use -h to print the command line interface (CLI) arguments help.
python -u main.py --mode MatchNet <CLI arguments> | tee last/train.log

Train TempoNet

  • Download e-VDS35 (e.g. e-VDS35-May17.tar) from here.
  • Pre-train the forward branch (see image-pretraining/) on an image data set (e.g. 33-image-set.tar from here);
  • Use data/resize_and_sample.sh to prepare your (video) data for training. It resizes the videos found in folders of folders (i.e. a directory of classes) and samples them; the videos are then distributed across the training and validation sets. It may also skip short videos and trim longer ones. Check data/README.md for more details.
  • Run the main.py script to start training. Use -h to print the CLI arguments help.
python -u main.py --mode TempoNet --pre-trained <path> <CLI args> | tee last/train.log

GPU selection

To run on a specific GPU, say n, type CUDA_VISIBLE_DEVICES=n just before python ....
