[ICCV 2023] DETRs with Collaborative Hybrid Assignments Training



This repo is the official implementation of "DETRs with Collaborative Hybrid Assignments Training" by Zhuofan Zong, Guanglu Song, and Yu Liu.

News

  • [07/20/2023] Code for Co-DINO is released: 55.4 AP with ResNet-50 and 60.7 AP with Swin-L.
  • [07/14/2023] Co-DETR is accepted to ICCV 2023!
  • [07/12/2023] We finetune Co-DETR on LVIS and achieve the best results without TTA: 71.9 box AP and 59.7 mask AP on LVIS minival, 67.9 box AP and 56.0 mask AP on LVIS val. For instance segmentation, we report the performance of the auxiliary mask branch.
  • [07/03/2023] Co-DETR with ViT-L (304M parameters) sets a new record of 66.0 AP on COCO test-dev, surpassing the previous best model InternImage-G (~3000M parameters). It is the first model to achieve 66.0 AP on COCO test-dev.
  • [07/03/2023] Code for Co-Deformable-DETR is released.
  • [11/19/2022] We achieved 64.4 AP on COCO minival and 64.5 AP on COCO test-dev with only ImageNet-1K as pre-training data. Code will be available soon.

Introduction


In this paper, we present a novel collaborative hybrid assignments training scheme, namely Co-DETR, to learn more efficient and effective DETR-based detectors from versatile label assignment manners.

  1. Encoder optimization: The proposed training scheme can easily enhance the encoder's learning ability in end-to-end detectors by training multiple parallel auxiliary heads supervised by one-to-many label assignments.
  2. Decoder optimization: We construct extra customized positive queries by extracting positive coordinates from these auxiliary heads, improving the attention learning of the decoder.
  3. State-of-the-art performance: Co-DETR with ViT-L (304M parameters) is the first model to achieve 66.0 AP on COCO test-dev.
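
The one-to-one vs. one-to-many distinction behind points 1 and 2 can be sketched with a toy example. This is illustrative only: the IoU values are made up, and the brute-force matcher below stands in for the repo's actual Hungarian and ATSS/Faster R-CNN assigners.

```python
from itertools import permutations

def one_to_one_assign(iou):
    """One-to-one assignment (Hungarian-style): each ground-truth box is
    matched to exactly one query, maximizing total IoU. Brute force over
    permutations, which is fine at toy sizes."""
    n_gt, n_query = len(iou), len(iou[0])
    best_score, best_match = -1.0, None
    for perm in permutations(range(n_query), n_gt):
        score = sum(iou[g][q] for g, q in enumerate(perm))
        if score > best_score:
            best_score, best_match = score, perm
    return {g: q for g, q in enumerate(best_match)}

def one_to_many_assign(iou, thr=0.5):
    """One-to-many assignment: every query whose IoU with a ground-truth box
    reaches `thr` becomes a positive sample for it, so each box can supervise
    several queries at once (the denser signal the auxiliary heads add)."""
    return {g: [q for q, v in enumerate(row) if v >= thr]
            for g, row in enumerate(iou)}

# Toy IoU matrix: 2 ground-truth boxes (rows) x 4 queries (columns).
iou = [[0.9, 0.7, 0.1, 0.0],
       [0.2, 0.8, 0.6, 0.1]]

o2o = one_to_one_assign(iou)   # one positive query per box: {0: 0, 1: 1}
o2m = one_to_many_assign(iou)  # several positives per box: {0: [0, 1], 1: [1, 2]}
```

One-to-one matching leaves queries 2 and 3 with no positive supervision at all, while the one-to-many view also supervises query 2; per the paper, the auxiliary heads exploit this denser signal during training and are discarded at inference.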


Model Zoo

Performance of Co-DETR with ResNet-50

| Model | Backbone | Epochs | Aug | Dataset | box AP | Config | Download |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Co-DINO | R50 | 12 | DETR | COCO | 52.1 | config | model |
| Co-DINO | R50 | 12 | LSJ | COCO | 52.1 | config | model |
| Co-DINO-9enc | R50 | 12 | LSJ | COCO | 52.6 | config | model |
| Co-DINO | R50 | 36 | LSJ | COCO | 54.8 | config | model |
| Co-DINO-9enc | R50 | 36 | LSJ | COCO | 55.4 | config | model |

Performance of Co-DETR with Swin-L

| Model | Backbone | Epochs | Aug | Dataset | box AP | Config | Download |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Co-DINO | Swin-L | 12 | DETR | COCO | 58.9 | config | model |
| Co-DINO | Swin-L | 24 | DETR | COCO | 59.8 | config | model |
| Co-DINO | Swin-L | 36 | DETR | COCO | 60.0 | config | model |
| Co-DINO | Swin-L | 12 | LSJ | COCO | 59.3 | config | model |
| Co-DINO | Swin-L | 24 | LSJ | COCO | 60.4 | config | model |
| Co-DINO | Swin-L | 36 | LSJ | COCO | 60.7 | config | model |
| Co-DINO | Swin-L | 36 | LSJ | LVIS | 56.9 | config | model |

Results on Deformable-DETR

| Model | Backbone | Epochs | Queries | box AP | Config | Download |
| --- | --- | --- | --- | --- | --- | --- |
| Co-Deformable-DETR | R50 | 12 | 300 | 49.5 | config | model \| log |
| Co-Deformable-DETR | Swin-T | 12 | 300 | 51.7 | config | model \| log |
| Co-Deformable-DETR | Swin-T | 36 | 300 | 54.1 | config | model \| log |
| Co-Deformable-DETR | Swin-S | 12 | 300 | 53.4 | config | model \| log |
| Co-Deformable-DETR | Swin-S | 36 | 300 | 55.3 | config | model \| log |
| Co-Deformable-DETR | Swin-B | 12 | 300 | 55.5 | config | model \| log |
| Co-Deformable-DETR | Swin-B | 36 | 300 | 57.5 | config | model \| log |
| Co-Deformable-DETR | Swin-L | 12 | 300 | 56.9 | config | model \| log |
| Co-Deformable-DETR | Swin-L | 36 | 900 | 58.5 | config | model \| log |

Running

Install

We implement Co-DETR using MMDetection V2.25.3 and MMCV V1.5.0. The source code of MMDetection is included in this repo, so you only need to build MMCV following the official instructions. We tested our models with Python 3.7.11, PyTorch 1.11.0, and CUDA 11.3. Other versions may not be compatible.
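
Before building MMCV, it can help to confirm your interpreter and framework match the tested versions above. A minimal, stdlib-only sketch; the `matches` helper is ours, not part of the repo, and the script only reports versions:

```python
# Sanity-check the environment against the tested versions
# (Python 3.7.11, PyTorch 1.11.0, CUDA 11.3).
import platform

def matches(actual, expected):
    """True if `actual` starts with the version components in `expected`
    (local build suffixes like `+cu113` are ignored)."""
    parts = actual.split("+")[0].split(".")
    want = expected.split(".")
    return parts[:len(want)] == want

print("Python ", platform.python_version(), "(tested: 3.7.11)")
try:
    import torch
    status = "OK" if matches(torch.__version__, "1.11.0") else "MISMATCH"
    print("PyTorch", torch.__version__, "(tested: 1.11.0)", status)
    print("CUDA   ", torch.version.cuda, "(tested: 11.3)")
except ImportError:
    print("PyTorch not installed yet")
```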

Data

The COCO dataset should be organized as:

data/
├── train2017/
├── val2017/
└── annotations/
    ├── instances_train2017.json
    └── instances_val2017.json
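
A quick way to verify this layout before launching training is a small stdlib script; `check_coco_layout` is a hypothetical helper for illustration, not part of the repo:

```python
# Sanity-check the COCO directory layout described above.
from pathlib import Path

def check_coco_layout(root="data"):
    """Return the expected paths that are missing under `root`."""
    root = Path(root)
    expected = [
        root / "train2017",
        root / "val2017",
        root / "annotations" / "instances_train2017.json",
        root / "annotations" / "instances_val2017.json",
    ]
    return [str(p) for p in expected if not p.exists()]

missing = check_coco_layout("data")
print("layout OK" if not missing else "missing: " + ", ".join(missing))
```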

Training

Train Co-Deformable-DETR + ResNet-50 with 8 GPUs:

sh tools/dist_train.sh projects/configs/co_deformable_detr/co_deformable_detr_r50_1x_coco.py 8 path_to_exp

Train using slurm:

sh tools/slurm_train.sh partition job_name projects/configs/co_deformable_detr/co_deformable_detr_r50_1x_coco.py path_to_exp

Testing

Test Co-Deformable-DETR + ResNet-50 with 8 GPUs, and evaluate:

sh tools/dist_test.sh projects/configs/co_deformable_detr/co_deformable_detr_r50_1x_coco.py path_to_checkpoint 8 --eval bbox

Test using slurm:

sh tools/slurm_test.sh partition job_name projects/configs/co_deformable_detr/co_deformable_detr_r50_1x_coco.py path_to_checkpoint --eval bbox

Cite Co-DETR

If you find this repository useful, please use the following BibTeX entry for citation.

@misc{codetr2022,
      title={DETRs with Collaborative Hybrid Assignments Training},
      author={Zhuofan Zong and Guanglu Song and Yu Liu},
      year={2022},
      eprint={2211.12860},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

License

This project is released under the MIT license. Please see the LICENSE file for more information.