# PowerBEV
This is the official PyTorch implementation of the paper:
PowerBEV: A Powerful yet Lightweight Framework for Instance Prediction in Bird's-Eye View
Peizheng Li, Shuxiao Ding, Xieyuanli Chen, Niklas Hanselmann, Marius Cordts, Jürgen Gall
## News
- PowerBEV has been accepted by the 32nd International Joint Conference on Artificial Intelligence.
- PowerBEV has been included in ROAD++: The Second Workshop & Challenge on Event Detection for Situation Awareness in Autonomous Driving @ ICCV 2023.
## Setup
Create the conda environment by running:
```bash
conda env create -f environment.yml
```
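Then activate the environment. The environment name below is an assumption; check the `name:` field in `environment.yml` for the actual value.
```bash
# Activate the newly created environment.
# "powerbev" is assumed here -- use the name defined in environment.yml.
conda activate powerbev
```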
## Dataset
- Download the full NuScenes dataset (v1.0), which includes the Mini dataset (metadata and sensor file blobs) and the Trainval dataset (metadata and file blobs part 1-10).
- Extract the tar files to the default `nuscenes/` directory or to `YOUR_NUSCENES_DATAROOT`. The files should be organized in the following structure:
```
nuscenes/
├── trainval/
│   ├── maps/
│   ├── samples/
│   ├── sweeps/
│   └── v1.0-trainval/
└── mini/
    ├── maps/
    ├── samples/
    ├── sweeps/
    └── v1.0-mini/
```
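To sanity-check the extracted layout, a minimal sketch using the official nuscenes-devkit (not part of this repository's instructions, and assumed to be available in the environment) can load the mini split and count its scenes:
```python
# Minimal sanity check of the NuScenes layout with the official nuscenes-devkit.
# The dataroot assumes the mini split extracted as shown above; use
# version='v1.0-trainval' and the trainval/ folder for the full dataset.
from nuscenes.nuscenes import NuScenes

nusc = NuScenes(version='v1.0-mini', dataroot='nuscenes/mini', verbose=True)
print(f'Loaded {len(nusc.scene)} scenes with {len(nusc.sample)} samples')
```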
## Pre-trained models
The config file can be found in `powerbev/configs`.
| Config | Weights | Dataset | Past Context | Future Horizon | BEV Size | IoU | VPQ |
|---|---|---|---|---|---|---|---|
| powerbev.yml | PowerBEV_long.ckpt | NuScenes | 1.0s | 2.0s | 100m x 100m (50cm res.) | 39.3 | 33.8 |
| powerbev.yml | PowerBEV_short.ckpt | NuScenes | 1.0s | 2.0s | 30m x 30m (15cm res.) | 62.5 | 55.5 |
Note: All metrics above are obtained by training on top of the pre-trained static weights (static long / static short).
## Training
To train the model from scratch on NuScenes, run:
```bash
python train.py --config powerbev/configs/powerbev.yml
```
To train the model from the pre-trained static checkpoint on NuScenes, download the pre-trained static weights (static long / static short) to `YOUR_PRETRAINED_STATIC_WEIGHTS_PATH` and run:
```bash
python train.py --config powerbev/configs/powerbev.yml \
    PRETRAINED.LOAD_WEIGHTS True \
    PRETRAINED.PATH $YOUR_PRETRAINED_STATIC_WEIGHTS_PATH
```
Note: These commands train the model on 4 GPUs, each with a batch size of 2.
To adjust the configuration from the command line, run:
```bash
python train.py --config powerbev/configs/powerbev.yml \
    DATASET.DATAROOT $YOUR_NUSCENES_DATAROOT \
    LOG_DIR $YOUR_OUTPUT_PATH \
    GPUS [0] \
    BATCHSIZE $YOUR_DESIRED_BATCHSIZE
```
These settings can also be changed directly by modifying `powerbev.yml`. See `config.py` for more information.
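For reference, a hypothetical excerpt of `powerbev.yml` with the same settings changed in place might look like the sketch below; the key nesting is only inferred from the dotted override names above, so treat `config.py` as the authoritative source.
```yaml
# Hypothetical excerpt -- nesting inferred from the command-line override names,
# not copied from the actual powerbev.yml.
LOG_DIR: ./output            # where logs and checkpoints are written
GPUS: [0]                    # ids of the GPUs to train on
BATCHSIZE: 2                 # batch size
DATASET:
  DATAROOT: /path/to/nuscenes
PRETRAINED:
  LOAD_WEIGHTS: True
  PATH: /path/to/static_weights.ckpt
```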
## Prediction
### Evaluation
Download the trained weights (long / short) to `YOUR_PRETRAINED_WEIGHTS_PATH` and run:
```bash
python test.py --config powerbev/configs/powerbev.yml \
    PRETRAINED.LOAD_WEIGHTS True \
    PRETRAINED.PATH $YOUR_PRETRAINED_WEIGHTS_PATH
```
### Visualisation
Download the trained weights (long / short) to `YOUR_PRETRAINED_WEIGHTS_PATH` and run:
```bash
python visualise.py --config powerbev/configs/powerbev.yml \
    PRETRAINED.LOAD_WEIGHTS True \
    PRETRAINED.PATH $YOUR_PRETRAINED_WEIGHTS_PATH \
    BATCHSIZE 1
```
This will render predictions from the network and save them to a `visualization_outputs` folder.
Note: To also visualise the ground truth, append `VISUALIZATION.VIS_GT True` to the end of the command, as in the example below.
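For example, the full visualisation command with ground-truth rendering enabled is:
```bash
python visualise.py --config powerbev/configs/powerbev.yml \
    PRETRAINED.LOAD_WEIGHTS True \
    PRETRAINED.PATH $YOUR_PRETRAINED_WEIGHTS_PATH \
    BATCHSIZE 1 \
    VISUALIZATION.VIS_GT True
```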
## License
PowerBEV is released under the MIT license. Please see the LICENSE file for more information.
## Citation
```bibtex
@article{li2023powerbev,
  title   = {PowerBEV: A Powerful Yet Lightweight Framework for Instance Prediction in Bird's-Eye View},
  author  = {Li, Peizheng and Ding, Shuxiao and Chen, Xieyuanli and Hanselmann, Niklas and Cordts, Marius and Gall, Juergen},
  journal = {arXiv preprint arXiv:2306.10761},
  year    = {2023}
}

@inproceedings{ijcai2023p120,
  title     = {PowerBEV: A Powerful Yet Lightweight Framework for Instance Prediction in Bird's-Eye View},
  author    = {Li, Peizheng and Ding, Shuxiao and Chen, Xieyuanli and Hanselmann, Niklas and Cordts, Marius and Gall, Juergen},
  booktitle = {Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, {IJCAI-23}},
  publisher = {International Joint Conferences on Artificial Intelligence Organization},
  editor    = {Edith Elkind},
  pages     = {1080--1088},
  year      = {2023},
  month     = {8},
  note      = {Main Track},
  doi       = {10.24963/ijcai.2023/120},
  url       = {https://doi.org/10.24963/ijcai.2023/120}
}
```