  • Stars: 118
  • Rank: 299,923 (Top 6%)
  • Language: Python
  • Created over 2 years ago
  • Updated about 2 years ago


Repository Details

[CVPR2022] Remember Intentions: Retrospective-Memory-based Trajectory Prediction

Official PyTorch code for CVPR'22 paper "Remember Intentions: Retrospective-Memory-based Trajectory Prediction".

[Paper] [Zhihu]

[Figure: system design]

Abstract: To realize trajectory prediction, most previous methods adopt the parameter-based approach, which encodes all the seen past-future instance pairs into model parameters. However, in this way, the model parameters come from all seen instances, which means a huge number of irrelevant seen instances might also be involved in predicting the current situation, degrading the performance. To provide a more explicit link between the current situation and the seen instances, we imitate the mechanism of retrospective memory in neuropsychology and propose MemoNet, an instance-based approach that predicts the movement intentions of agents by looking for similar scenarios in the training data. In MemoNet, we design a pair of memory banks to explicitly store representative instances from the training set, acting as the prefrontal cortex in the neural system, and a trainable memory addresser to adaptively search the memory bank for instances similar to the current situation, acting like the basal ganglia. During prediction, MemoNet recalls previous memories by using the memory addresser to index related instances in the memory bank. We further propose a two-step trajectory prediction system, where the first step leverages MemoNet to predict the destination and the second step fulfills the whole trajectory according to the predicted destination. Experiments show that the proposed MemoNet improves the FDE by 20.3%/10.2%/28.3% over the previous best methods on the SDD/ETH-UCY/NBA datasets. Experiments also show that MemoNet can trace back to specific instances during prediction, promoting interpretability.
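As an illustration of the recall mechanism described above, here is a minimal, hypothetical sketch of how a memory addresser could index a memory bank. The function name, tensor shapes, and cosine-similarity scoring are assumptions for exposition, not the repository's actual implementation:

import torch

def recall_intentions(past_feature, memory_past, memory_fut, k=20):
    """Toy recall step: find stored instances whose pasts resemble the current one.

    past_feature : (D,) encoding of the current agent's observed trajectory
    memory_past  : (N, D) stored past features (memory-bank keys)
    memory_fut   : (N, D) stored intention/future features (memory-bank values)
    """
    # Similarity between the current situation and every stored instance.
    scores = torch.cosine_similarity(past_feature.unsqueeze(0), memory_past, dim=1)
    topk = torch.topk(scores, k=k).indices
    # Return the k intention features used to decode candidate destinations.
    return memory_fut[topk]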

We give an example of trajectories predicted by our model and the corresponding ground truth as follows:

[Figure: predicted trajectories and corresponding ground truth]

Below is an example of prediction interpretability, where the first column shows the current agent and the last three columns show the memory instances retrieved for the current agent.

[Figure: prediction interpretability examples]

[2022/09] Update: ETH's code & model are available!

You can find the code and the instructions in the ETH folder.

Installation

Environment

  • Tested on: Linux with an NVIDIA RTX 3090 GPU
  • Python == 3.7.9
  • PyTorch == 1.7.1+cu110

Dependencies

Install the dependencies from the requirements.txt:

pip install -r requirements.txt

Pretrained Models

We provide a complete set of pre-trained models, including:

  • intention encoder-decoder
  • learnable addresser
  • generated memory bank
  • fulfillment encoder-decoder

You can download the pretrained models/data from here.
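As a quick sanity check after downloading, the memory-bank files under training/saved_memory/ can be loaded with PyTorch. A minimal sketch, assuming the files are plain serialized tensors (their exact contents and shapes are not guaranteed here):

import torch

# Paths follow the file structure listed below; stored object types/shapes are assumptions.
memory_past = torch.load('./training/saved_memory/sdd_social_filter_past.pt', map_location='cpu')
memory_fut = torch.load('./training/saved_memory/sdd_social_filter_fut.pt', map_location='cpu')
print(type(memory_past), getattr(memory_past, 'shape', None))
print(type(memory_fut), getattr(memory_fut, 'shape', None))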

File Structure

After the preparation work, the whole project should have the following structure:

./MemoNet
├── ReadMe.md
├── data                            # datasets
│   ├── test_all_4096_0_100.pickle
│   └── train_all_512_0_100.pickle
├── models                          # core models
│   ├── layer_utils.py
│   ├── model_AIO.py
│   └── ...
├── requirements.txt
├── run.sh
├── sddloader.py                    # sdd dataloader
├── test_MemoNet.py                 # testing code
├── train_MemoNet.py                # training code
├── trainer                         # core operations to train the model
│   ├── evaluations.py
│   ├── test_final_trajectory.py
│   └── trainer_AIO.py
└── training                        # saved models/memory banks
    ├── saved_memory
    │   ├── sdd_social_filter_fut.pt
    │   ├── sdd_social_filter_past.pt
    │   └── sdd_social_part_traj.pt
    ├── training_ae
    │   └── model_encdec
    ├── training_selector
    │   ├── model_selector
    │   └── model_selector_warm_up
    └── training_trajectory
        └── model_encdec_trajectory

Training

Important configurations.

  • --mode: specifies the current training mode (stage)
  • --model_ae: path to the pretrained model
  • --info: name of the directory in which to store the models
  • --gpu: GPU device(s) used to run the code

Training commands.

bash run.sh
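run.sh wraps the training command(s) using the configurations above. If you prefer to call the training script directly, an invocation would look roughly like the following; the angle-bracketed values are placeholders, not verified settings from this repository:

# <training_stage>, <run_name>, and <path_to_pretrained_model> are illustrative placeholders.
python train_MemoNet.py --mode <training_stage> --info <run_name> --gpu 0
# Later stages presumably load an earlier checkpoint through --model_ae:
python train_MemoNet.py --mode <training_stage> --model_ae <path_to_pretrained_model> --info <run_name> --gpu 0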

Reproduce

To reproduce the reported results, run:

python test_MemoNet.py --reproduce True --info reproduce --gpu 0

And the code will output:

./training/training_trajectory/model_encdec_trajectory
Test FDE_48s: 12.659514427185059 ------ Test ADE: 8.563031196594238
----------------------------------------------------------------------------------------------------
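For reference, ADE and FDE here are the average and final-step displacement errors, typically taken over the best of the K sampled predictions. A minimal sketch of the metric; the tensor shapes and best-of-K convention are assumptions about the exact evaluation code:

import torch

def min_ade_fde(pred, gt):
    """pred: (K, T, 2) sampled future trajectories; gt: (T, 2) ground truth."""
    # Per-step Euclidean error for each of the K samples.
    dist = torch.norm(pred - gt.unsqueeze(0), dim=-1)  # (K, T)
    ade = dist.mean(dim=1).min()   # best-of-K average displacement error
    fde = dist[:, -1].min()        # best-of-K final displacement error
    return ade.item(), fde.item()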

Acknowledgement

Thanks for the framework provided by Marchetz/MANTRA-CVPR20, the source code of the published CVPR 2020 work MANTRA (GitHub repo: MANTRA code). We borrow the framework and interface from that code.

We also thank the authors of PECNet for the pre-processed data (paper, code).

Citation

If you use this code, please cite our paper:

@InProceedings{MemoNet_2022_CVPR,
    author    = {Xu, Chenxin and Mao, Weibo and Zhang, Wenjun and Chen, Siheng},
    title     = {Remember Intentions: Retrospective-Memory-based Trajectory Prediction},
    booktitle = {The IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    year      = {2022}
}

More Repositories

1. MING - MING (明医): a Chinese medical consultation large language model (Python, 812 stars)
2. RegAD - [ECCV2022 Oral] Registration based Few-Shot Anomaly Detection (Python, 268 stars)
3. FACT (Python, 155 stars)
4. Where2comm (Python, 147 stars)
5. MedKLIP - The official code for MedKLIP: Medical Knowledge Enhanced Language-Image Pre-Training in Radiology. We propose to leverage medical-specific knowledge to enhance language-image pre-training, significantly advancing the ability of pre-trained models to handle unseen diseases on zero-shot classification and grounding tasks. (Python, 134 stars)
6. LED - [CVPR2023] Leapfrog Diffusion Model for Stochastic Trajectory Prediction (Jupyter Notebook, 130 stars)
7. MVFA-AD - [CVPR2024 Highlight] Adapting Visual-Language Models for Generalizable Anomaly Detection in Medical Images (Python, 129 stars)
8. EqMotion - [CVPR2023] EqMotion: Equivariant Multi-agent Motion Prediction with Invariant Interaction Reasoning (Python, 112 stars)
9. BCL - [ICML2022] Contrastive Learning with Boosted Memorization (Python, 110 stars)
10. GroupNet - [CVPR22] GroupNet: Multiscale Hypergraph Neural Networks for Trajectory Prediction with Relational Reasoning (Python, 108 stars)
11. TBP-Former (78 stars)
12. CoCa3D (Python, 75 stars)
13. GenMedicalEval (69 stars)
14. CoBEVFlow - [NeurIPS 2023] Asynchrony-Robust Collaborative Perception via Bird's Eye View Flow (Python, 65 stars)
15. FedDisco (Python, 60 stars)
16. RECORDS-LTPLL - [ICLR 2023] PyTorch implementation for "Long-Tailed Partial Label Learning via Dynamic Rebalancing" (Python, 55 stars)
17. ECGAD - [MICCAI2023 Early Accept] Multi-scale Cross-restoration Framework for Electrocardiogram Anomaly Detection (Python, 48 stars)
18. FedDG-GA - [CVPR 2023] Federated Domain Generalization with Generalization Adjustment (Python, 37 stars)
19. SyncNet - [ECCV2022] Latency-Aware Collaborative Perception (Python, 33 stars)
20. SPGSN - Source code of "Skeleton-Parted Graph Scattering Networks for 3D Human Motion Prediction", ECCV 2022 (Python, 29 stars)
21. AuxFormer - [ICCV2023] Auxiliary Tasks Benefit 3D Skeleton-based Human Motion Prediction (Python, 25 stars)
22. pFedGraph (Python, 23 stars)
23. BE-SSL - Code for our paper "Boundary-Enhanced Self-Supervised Learning for Brain Structure Segmentation" (Python, 23 stars)
24. JRTransformer - [ICCV2023] Joint-Relation Transformer for Multi-Person Motion Prediction (Python, 22 stars)
25. Geometric-Harmonization - [NeurIPS 2023 Spotlight] Combating Representation Learning Disparity with Geometric Harmonization (Python, 19 stars)
26. Collaborative-Uncertainty (Python, 19 stars)
27. GPFL-GRACE - [MICCAI 2023] GRACE: Enhancing Federated Learning for Medical Imaging with Generalized and Personalized Gradient Correction (Python, 15 stars)
28. LoRKD (Python, 12 stars)
29. K-Diag (Python, 10 stars)
30. MoLA (Python, 10 stars)
31. CoFormer (Python, 10 stars)
32. FedGELA - [NeurIPS 2023] Federated Learning with Bilateral Curation for Partially Class-Disjoint Data (Python, 10 stars)
33. FedLESAM - [ICML 2024 Spotlight] Implementation of "Locally Estimated Global Perturbations are Better than Local Perturbations for Federated Sharpness-aware Minimization" (Python, 9 stars)
34. FedSkip - Official code for "FedSkip: Combatting Statistical Heterogeneity with Federated Skip Aggregation" (Python, 7 stars)
35. OC_LT - Official code base for "Long-Tailed Diffusion Models With Oriented Calibration", ICLR 2024 (Python, 6 stars)
36. DISAM - Implementation of "Domain-Inspired Sharpness-Aware Minimization Under Domain Shifts", accepted at ICLR 2024 (Python, 6 stars)
37. CaT - [ICCV2021] CaT: Weakly Supervised Object Detection with Category Transfer (5 stars)
38. ECISQA - [NeurIPS 2023] Emergent Communication in Interactive Sketch Question Answering (Jupyter Notebook, 5 stars)
39. FreeAlign (Python, 5 stars)
40. FedMR - [TMLR 2023] Federated Learning under Partially Class-Disjoint Data via Manifold Reshaping (Python, 4 stars)
41. GSC (Python, 4 stars)
42. SSM - [TMM 2022] Self-Supervised Masking for Unsupervised Anomaly Detection and Localization (Python, 4 stars)
43. ITES (Python, 1 star)
44. NMMP (Python, 1 star)
45. mediabrain-sjtu.github.io (TeX, 1 star)