<div align="center">

# [NeurIPS 2024] TopoLogic: An Interpretable Pipeline for Lane Topology Reasoning on Driving Scenes

[arXiv (2405.14747)](https://arxiv.org/abs/2405.14747) · [OpenLane-V2](https://github.com/OpenDriveLab/OpenLane-V2)

[method overview figure]

</div>

## TL;DR

This repository contains the source code of **TopoLogic: An Interpretable Pipeline for Lane Topology Reasoning on Driving Scenes**.

TopoLogic is the first method to take an interpretable approach to lane topology reasoning. It fuses the geometric distance between lane-line endpoints, mapped through a designed function, with the similarity of lane queries in a high-dimensional semantic space to reason about lane topology. Experiments on the large-scale autonomous driving benchmark OpenLane-V2 demonstrate that TopoLogic significantly outperforms existing methods on topology reasoning in complex scenarios.
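For intuition, here is a minimal PyTorch sketch of that fusion pattern. It is an illustration under stated assumptions, not the repo's implementation: the `exp(-d / tau)` distance-to-score mapping and the equal-weight average stand in for the paper's designed mapping function and fusion scheme, and all names are hypothetical.

```python
# Minimal sketch of TopoLogic-style topology score fusion (illustrative, not the authors' code).
# Assumes: `lanes` is an (N, P, 2) batch of polylines, `queries` is (N, C) decoder embeddings.
import torch
import torch.nn.functional as F

def lane_topology_scores(lanes: torch.Tensor, queries: torch.Tensor, tau: float = 2.0) -> torch.Tensor:
    """Return an (N, N) lane-lane connectivity score from geometric and semantic cues."""
    ends, starts = lanes[:, -1], lanes[:, 0]   # endpoint of lane i, start point of lane j
    d = torch.cdist(ends, starts)              # (N, N) pairwise endpoint distances
    geo = torch.exp(-d / tau)                  # stand-in mapping: small gap -> score near 1

    q = F.normalize(queries, dim=-1)           # unit-normalize queries for cosine similarity
    sem = (q @ q.t()).clamp(min=0)             # (N, N) semantic affinity in [0, 1]

    return 0.5 * (geo + sem)                   # equal-weight fusion of the two cues
```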

## Updates

## Table of Contents

- [Model Zoo](#model-zoo)
- [Main Results](#main-results)
- [Prerequisites](#prerequisites)
- [Train and Evaluate](#train-and-evaluate)
- [Citation](#citation)
- [Related resources](#related-resources)

## Model Zoo

| Method | Backbone | Epoch | Dataset | OLS | Version | Config | Download |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| TopoLogic | ResNet-50 | 24 | subset-A | 44.1 | OpenLane-V2-v2.1.0 | config | ckpt / log |

## Main Results

The results below are based on the v1.0.0 OpenLane-V2 devkit and metrics.
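For reference, the overall OLS metric aggregates the four sub-metrics as defined by the OpenLane-V2 devkit, averaging the detection scores with square-rooted topology scores (all on a 0–1 scale):

$$
\mathrm{OLS} = \frac{1}{4}\left[\mathrm{DET}_{l} + \mathrm{DET}_{t} + \sqrt{\mathrm{TOP}_{ll}} + \sqrt{\mathrm{TOP}_{lt}}\right]
$$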

### Results on OpenLane-V2 subset-A val

We provide results on the OpenLane-V2 subset-A val set.

| Method | Backbone | Epoch | SDMap | DET<sub>l</sub> | TOP<sub>ll</sub> | DET<sub>t</sub> | TOP<sub>lt</sub> | OLS |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| STSU | ResNet-50 | 24 | × | 12.7 | 0.5 | 43.0 | 15.1 | 25.4 |
| VectorMapNet | ResNet-50 | 24 | × | 11.1 | 0.4 | 41.7 | 6.2 | 20.8 |
| MapTR | ResNet-50 | 24 | × | 8.3 | 0.2 | 43.5 | 5.8 | 20.0 |
| MapTR* | ResNet-50 | 24 | × | 17.7 | 1.1 | 43.5 | 10.4 | 26.0 |
| TopoNet | ResNet-50 | 24 | × | 28.6 | 4.1 | 48.6 | 20.3 | 35.6 |
| TopoLogic | ResNet-50 | 24 | × | 29.9 | 18.6 | 47.2 | 21.5 | 41.6 |
| SMERF | ResNet-50 | 24 | ✓ | 33.4 | 7.5 | 48.6 | 23.4 | 39.4 |
| TopoLogic | ResNet-50 | 24 | ✓ | 34.4 | 23.4 | 48.3 | 24.4 | 45.1 |

The result of TopoLogic is from this repo.

### Results on OpenLane-V2 subset-B val

| Method | Backbone | Epoch | DET<sub>l</sub> | TOP<sub>ll</sub> | DET<sub>t</sub> | TOP<sub>lt</sub> | OLS |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| TopoLogic | ResNet-50 | 24 | 25.9 | 15.1 | 54.7 | 15.1 | 39.6 |

The results below are based on the updated v2.1.0 OpenLane-V2 devkit and metrics. The TopoLogic result is from this repo.

| Method | Backbone | Epoch | DET<sub>l</sub> | TOP<sub>ll</sub> | DET<sub>t</sub> | TOP<sub>lt</sub> | OLS |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| TopoLogic | ResNet-50 | 24 | 29.9 | 23.9 | 47.2 | 25.4 | 44.1 |

## Prerequisites

### Installation

We recommend using conda to run the code.

```bash
conda create -n topologic python=3.8 -y
conda activate topologic

# (optional) Skip this step if a compatible CUDA toolkit is already installed on your machine.
conda install cudatoolkit=11.1.1 -c conda-forge

pip install torch==1.9.1+cu111 torchvision==0.10.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
```

Install other required packages.

```bash
pip install -r requirements.txt
```

### Prepare Dataset

Follow the OpenLane-V2 repo to download the data and run the preprocessing code.

## Train and Evaluate

### Train

We recommend using 8 GPUs for training. If you train with a different number of GPUs, add the `--autoscale-lr` option to scale the learning rate accordingly and maintain performance. Training logs are saved to `work_dirs/[work_dir_name]`.

```bash
cd TopoLogic
mkdir work_dirs

./tools/dist_train.sh 8 [work_dir_name] [--autoscale-lr]
```
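For example, a training launch with 8 GPUs and learning-rate autoscaling might look like this (the work-dir name is only a placeholder):

```bash
./tools/dist_train.sh 8 topologic_r50_subset_a --autoscale-lr
```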

### Evaluate

You can pass the `--show` flag to visualize the results.

```bash
./tools/dist_test.sh 8 [work_dir_name] [--show]
```
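For example, evaluating the run above with visualization enabled (again, the work-dir name is a placeholder):

```bash
./tools/dist_test.sh 8 topologic_r50_subset_a --show
```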

## Citation

If this work is helpful for your research, please consider citing the following BibTeX entry.

```bibtex
@misc{fu2024topologic,
      title={TopoLogic: An Interpretable Pipeline for Lane Topology Reasoning on Driving Scenes},
      author={Yanping Fu and Wenbin Liao and Xinyuan Liu and Hang Xu and Yike Ma and Feng Dai and Yucheng Zhang},
      year={2024},
      eprint={2405.14747},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```



## Similar work

```bibtex
@INPROCEEDINGS{10588515,
  author={Jia, Peijin and Wen, Tuopu and Luo, Ziang and Fu, Zheng and Liao, Jiaqi and Chen, Huixian and Jiang, Kun and Yang, Mengmeng and Yang, Diange},
  booktitle={2024 IEEE Intelligent Vehicles Symposium (IV)},
  title={LaneDAG: Automatic HD Map Topology Generator Based on Geometry and Attention Fusion Mechanism},
  year={2024},
  volume={},
  number={},
  pages={1015-1021},
  keywords={Point cloud compression;Visualization;Statistical analysis;Navigation;Intelligent vehicles;Roads;Feature extraction},
  doi={10.1109/IV55156.2024.10588515}}
```

## Related resources

We acknowledge the open-source contributors of the following projects, which made this work possible: