
Neural Map Prior for Autonomous Driving (CVPR 2023)

arXiv Paper | CVF Paper | Webpage | 5-min Video | Poster | Blog

Xuan Xiong, Yicheng Liu, Tianyuan Yuan, Yue Wang, Yilun Wang, Hang Zhao*

Table of Contents

  • Introduction
  • Notes
  • Model Zoo
  • Installation
  • Getting Started
  • Architecture
  • Contact
  • Acknowledgements
  • Citation
  • License

Introduction

A neural representation of HD maps to improve local map inference performance for autonomous driving.

This repo is the official implementation of "Neural Map Prior for Autonomous Driving" (CVPR 2023). Our main contributions are:

  • A novel mapping paradigm: integrates the maintenance of global maps and the inference of online local maps.
  • Efficient fusion modules: current-to-prior attention and gated recurrent unit modules facilitate the efficient fusion of global and local map features.
  • Easy integration with HD map learning methods: Neural Map Prior can be applied to various map segmentation and detection methods, serving as a drop-in replacement for the online map inference model in your own algorithms. Moreover, our approach demonstrates significant improvements in challenging scenarios, such as bad weather and longer perception ranges.
  • Sparse map tiles: a memory-efficient approach for storing neural representations of city-scale HD maps (a minimal sketch of the fusion and tile storage ideas follows this list).
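
The snippet below is a minimal sketch of the fusion and sparse-tile ideas, assuming illustrative module names, tensor shapes, and tile indices that are not the repository's actual API. It keeps the GRU-style update but omits the current-to-prior attention step for brevity.

```python
# Toy sketch (assumed names/shapes, not the repository's API) of the two ideas
# above: a sparse global map stored as per-tile feature tensors, and a
# convolutional-GRU update that fuses the stored prior with the current online
# BEV feature. The paper's current-to-prior attention step is omitted here.
import torch
import torch.nn as nn


class ConvGRUFusion(nn.Module):
    """Fuse the prior map feature (hidden state) with the current BEV feature."""

    def __init__(self, channels: int):
        super().__init__()
        self.gates = nn.Conv2d(2 * channels, 2 * channels, 3, padding=1)  # update/reset gates
        self.candidate = nn.Conv2d(2 * channels, channels, 3, padding=1)  # candidate state

    def forward(self, current: torch.Tensor, prior: torch.Tensor) -> torch.Tensor:
        z, r = torch.sigmoid(self.gates(torch.cat([current, prior], dim=1))).chunk(2, dim=1)
        cand = torch.tanh(self.candidate(torch.cat([current, r * prior], dim=1)))
        return (1 - z) * prior + z * cand  # updated global map feature for this tile


class SparseTileMap:
    """City-scale map stored sparsely: a tile is allocated only when first visited."""

    def __init__(self, channels: int, tile_hw=(200, 200)):
        self.channels, self.tile_hw = channels, tile_hw
        self.tiles = {}  # tile_id (tuple of ints) -> feature tensor

    def read(self, tile_id):
        # Unvisited tiles fall back to an all-zero prior.
        return self.tiles.get(tile_id, torch.zeros(1, self.channels, *self.tile_hw))

    def write(self, tile_id, feature):
        self.tiles[tile_id] = feature.detach()  # keep the global map out of the autograd graph


# Usage: fuse the online BEV feature of one sample with its tile's prior, then
# write the result back so later traversals of the same area benefit from it.
fusion = ConvGRUFusion(channels=256)
global_map = SparseTileMap(channels=256)
current_bev = torch.randn(1, 256, 200, 200)   # output of any online mapping backbone
tile_id = (12, 34)                            # hypothetical tile index derived from ego pose
updated = fusion(current_bev, global_map.read(tile_id))
global_map.write(tile_id, updated)
```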

Notes

  • The most challenging part of training and testing is distributing the appropriate map tile to each GPU and keeping samples that fall in the same map tile updated on the same GPU. This involves two steps:
    1. Determine the map tile for each GPU. This can be done with some functions in project/neural_map_prior/map_tiles/lane_render.py.
    2. Rewrite the sampler so that each GPU only draws the samples in the map tile assigned to it. This can be done in tools/data_sampler.py (a minimal sketch follows this list).
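
Below is a hedged sketch of step 2: a sampler that keeps every GPU inside its own tiles. The `sample_tile_ids` argument (tile id per dataset index) and the round-robin tile-to-rank assignment are assumptions for illustration; this is not the implementation in tools/data_sampler.py, and in the repository the tile assignment comes from project/neural_map_prior/map_tiles/lane_render.py.

```python
# Illustrative sampler: restrict each GPU (process rank) to the dataset indices
# whose samples fall inside the tiles owned by that rank.
import torch
import torch.distributed as dist
from torch.utils.data import Sampler


class TileAwareSampler(Sampler):
    """Yield only the dataset indices whose map tile is owned by the current rank."""

    def __init__(self, sample_tile_ids, rank=None, num_replicas=None, shuffle=True, seed=0):
        # Fall back to the torch.distributed process group when rank/world size are not given.
        self.rank = dist.get_rank() if rank is None else rank
        self.num_replicas = dist.get_world_size() if num_replicas is None else num_replicas
        # Simple round-robin tile-to-rank assignment; the real assignment would be
        # derived from the city-scale tile layout.
        tiles = sorted(set(sample_tile_ids))
        owned = {t for i, t in enumerate(tiles) if i % self.num_replicas == self.rank}
        self.indices = [i for i, t in enumerate(sample_tile_ids) if t in owned]
        self.shuffle, self.seed, self.epoch = shuffle, seed, 0

    def __iter__(self):
        indices = list(self.indices)
        if self.shuffle:
            # Shuffle only within this rank's tiles so updates to a tile stay on one GPU.
            g = torch.Generator()
            g.manual_seed(self.seed + self.epoch)
            indices = [indices[i] for i in torch.randperm(len(indices), generator=g).tolist()]
        return iter(indices)

    def __len__(self):
        return len(self.indices)

    def set_epoch(self, epoch):
        self.epoch = epoch
```

As with PyTorch's DistributedSampler, such a sampler would be passed to each process's DataLoader, with set_epoch called at the start of every epoch so the per-rank shuffling changes between epochs.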

Model Zoo

We experiment with the BEVFormer, Lift-Splat-Shoot, HDMapNet, and VectorMapNet architectures on nuScenes.

HD semantic map [BEVFormer] (on nuScenes validation)

| Model           | Modality | Divider | Crossing | Boundary | All (mIoU) | Checkpoint |
|-----------------|----------|---------|----------|----------|------------|------------|
| BEVFormer       | Camera   | 49.20   | 28.67    | 50.43    | 42.76      | model      |
| BEVFormer + NMP | Camera   | 54.20   | 34.52    | 56.94    | 48.55      | model      |

Installation

Please check installation for setup instructions and data_preparation for preparing the nuScenes dataset.

Getting Started

Please check getting_started for training, evaluation, and visualization of neural_map_prior.

Architecture

[Architecture visualization figure]

Contact

Any questions or suggestions are welcome!

Acknowledgements

We would like to thank all who contributed to the open-source projects listed below. Our project would not have been possible without the inspiration and work of these outstanding researchers and developers.

The design of project/neural_map_prior as a module is inspired by the implementation of DETR3D.

Citation

If you find neural_map_prior useful in your research or applications, please consider citing:

@inproceedings{xiong2023neuralmapprior,
  author    = {Xiong, Xuan and Liu, Yicheng and Yuan, Tianyuan and Wang, Yue and Wang, Yilun and Zhao, Hang},
  title     = {Neural Map Prior for Autonomous Driving},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2023}
}

License

This project is licensed under the Apache 2.0 license.