SparseDrive: End-to-End Autonomous Driving via Sparse Scene Representation

https://github.com/swc-17/SparseDrive/assets/64842878/867276dc-7c19-4e01-9a8e-81c4ed844745

News

Introduction

SparseDrive is a Sparse-Centric paradigm for end-to-end autonomous driving.

<center> <img style="border-radius: 0.3125em; box-shadow: 0 2px 4px 0 rgba(34,36,38,.12),0 2px 10px 0 rgba(34,36,38,.08);" src="resources/overview.png" width="1000"> <br> <div style="color:orange; border-bottom: 1px solid #d9d9d9; display: inline-block; color: #999; padding: 2px;">Overview of SparseDrive. SparseDrive first encodes multi-view images into feature maps, then learns a sparse scene representation through symmetric sparse perception, and finally performs motion prediction and planning in a parallel manner. An instance memory queue is devised for temporal modeling.</div> </center> <center> <img style="border-radius: 0.3125em; box-shadow: 0 2px 4px 0 rgba(34,36,38,.12),0 2px 10px 0 rgba(34,36,38,.08);" src="resources/sparse_perception.png" width="1000"> <br> <div style="color:orange; border-bottom: 1px solid #d9d9d9; display: inline-block; color: #999; padding: 2px;">Model architecture of symmetric sparse perception, which unifies detection, tracking and online mapping in a symmetric structure.</div> </center> <center> <img style="border-radius: 0.3125em; box-shadow: 0 2px 4px 0 rgba(34,36,38,.12),0 2px 10px 0 rgba(34,36,38,.08);" src="resources/motion_planner.png" width="1000"> <br> <div style="color:orange; border-bottom: 1px solid #d9d9d9; display: inline-block; color: #999; padding: 2px;">Model structure of the parallel motion planner, which performs motion prediction and planning simultaneously and outputs a safe planning trajectory.</div> </center>
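The data flow in the overview above can be sketched as a toy pipeline. Every function below is a hypothetical stand-in (trivial numpy operations, not the paper's learned networks); only the stage ordering and the instance memory queue mirror the description:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_images(images):
    """Stand-in for the image backbone: one feature vector per camera view."""
    return np.stack([img.reshape(-1)[:64] for img in images])

def sparse_perception(feats, memory_queue):
    """Stand-in for symmetric sparse perception: produce instance features
    for agents and map elements, conditioned on the temporal memory queue."""
    agents = feats.mean(axis=0) + (memory_queue[-1] if memory_queue else 0)
    map_elems = feats.max(axis=0)
    return agents, map_elems

def motion_and_planning(agents, map_elems, horizon=6):
    """Stand-in for the parallel motion planner: agent motion prediction and
    ego planning are produced in the same step (here, trivial placeholders)."""
    motion = np.tile(agents[:2], (horizon, 1))                      # (T, 2)
    plan = np.cumsum(np.tile(map_elems[:2] * 0.01, (horizon, 1)), axis=0)
    return motion, plan

# One temporal step of the pipeline.
images = [rng.random((8, 8, 3)) for _ in range(6)]  # 6 surround-view cameras
memory = []                                         # instance memory queue
feats = encode_images(images)
agents, map_elems = sparse_perception(feats, memory)
motion, plan = motion_and_planning(agents, map_elems)
memory.append(agents)                               # carry instances forward
print(plan.shape)  # planning trajectory over the horizon: (6, 2)
```

The point of the sketch is the structure: perception emits sparse instance features rather than dense BEV maps, and motion prediction does not have to finish before planning starts.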

Results in paper

| Method | NDS | AMOTA | minADE (m) | L2 (m) Avg | Col. (%) Avg | Training Time (h) | FPS |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| UniAD | 0.498 | 0.359 | 0.71 | 0.73 | 0.61 | 144 | 1.8 |
| SparseDrive-S | 0.525 | 0.386 | 0.62 | 0.61 | 0.08 | 20 | 9.0 |
| SparseDrive-B | 0.588 | 0.501 | 0.60 | 0.58 | 0.06 | 30 | 7.3 |
| Method | L2 (m) 1s | L2 (m) 2s | L2 (m) 3s | L2 (m) Avg | Col. (%) 1s | Col. (%) 2s | Col. (%) 3s | Col. (%) Avg | FPS |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| UniAD | 0.45 | 0.70 | 1.04 | 0.73 | 0.62 | 0.58 | 0.63 | 0.61 | 1.8 |
| VAD | 0.41 | 0.70 | 1.05 | 0.72 | 0.03 | 0.19 | 0.43 | 0.21 | 4.5 |
| SparseDrive-S | 0.29 | 0.58 | 0.96 | 0.61 | 0.01 | 0.05 | 0.18 | 0.08 | 9.0 |
| SparseDrive-B | 0.29 | 0.55 | 0.91 | 0.58 | 0.01 | 0.02 | 0.13 | 0.06 | 7.3 |

Results of released checkpoint

We found that some collision cases were not taken into account by our previous code, so we re-implemented the collision-rate evaluation metric in the released code and provide updated results.
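A collision-rate metric of this kind typically counts the fraction of planned timesteps at which the ego bounding box overlaps any obstacle box. Below is a minimal sketch under that assumption; the axis-aligned overlap test and all names are illustrative simplifications, not the released implementation (which evaluates boxes with heading on nuScenes ground truth):

```python
def boxes_overlap(center_a, size_a, center_b, size_b):
    """Axis-aligned 2D (BEV) overlap test. A real metric would rotate each
    box by its heading before checking intersection."""
    return all(abs(ca - cb) * 2 < (sa + sb)
               for ca, cb, sa, sb in zip(center_a, center_b, size_a, size_b))

def collision_rate(ego_traj, obstacle_trajs, ego_size=(4.1, 1.7),
                   obs_size=(4.1, 1.7)):
    """Fraction of timesteps where the planned ego box hits any obstacle."""
    hits = 0
    for t, ego_xy in enumerate(ego_traj):
        if any(boxes_overlap(ego_xy, ego_size, obs[t], obs_size)
               for obs in obstacle_trajs):
            hits += 1
    return hits / len(ego_traj)

ego = [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0)]       # planned ego positions
other = [[(10.0, 0.0), (6.0, 0.0), (4.5, 0.0)]]  # one oncoming vehicle
print(collision_rate(ego, other))                # collides at t=1 and t=2
```

Missing a case in `boxes_overlap` (e.g. ignoring certain obstacle classes or timesteps) directly deflates the reported collision rate, which is why the metric re-implementation changes the numbers below.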

Main results

| Model | config | ckpt | log | det: NDS | mapping: mAP | track: AMOTA | track: AMOTP | motion: EPA_car | motion: minADE_car | motion: minFDE_car | motion: MissRate_car | planning: CR | planning: L2 |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| Stage1 | cfg | ckpt | log | 0.5260 | 0.5689 | 0.385 | 1.260 | | | | | | |
| Stage2 | cfg | ckpt | log | 0.5257 | 0.5656 | 0.372 | 1.248 | 0.492 | 0.61 | 0.95 | 0.133 | 0.097% | 0.61 |

Detailed results for planning

| Method | L2 (m) 1s | L2 (m) 2s | L2 (m) 3s | L2 (m) Avg | Col. (%) 1s | Col. (%) 2s | Col. (%) 3s | Col. (%) Avg |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| UniAD | 0.45 | 0.70 | 1.04 | 0.73 | 0.66 | 0.66 | 0.72 | 0.68 |
| UniAD-wo-post-optim | 0.32 | 0.58 | 0.94 | 0.61 | 0.17 | 0.27 | 0.42 | 0.29 |
| VAD | 0.41 | 0.70 | 1.05 | 0.72 | 0.03 | 0.21 | 0.49 | 0.24 |
| SparseDrive-S | 0.30 | 0.58 | 0.95 | 0.61 | 0.01 | 0.05 | 0.23 | 0.10 |

Quick Start

Citation

If you find SparseDrive useful in your research or applications, please consider giving us a star 🌟 and citing it with the following BibTeX entry:

```bibtex
@article{sun2024sparsedrive,
  title={SparseDrive: End-to-End Autonomous Driving via Sparse Scene Representation},
  author={Sun, Wenchao and Lin, Xuewu and Shi, Yining and Zhang, Chuang and Wu, Haoran and Zheng, Sifa},
  journal={arXiv preprint arXiv:2405.19620},
  year={2024}
}
```

Acknowledgement