APRO3D-Net: Attention-based Proposals Refinement for 3D Object Detection

This implementation of APRO3D-Net is based on OpenPCDet. Our paper can be found here.

<p align="center"> <img src="docs/swith_net.png" width="75%"> </p>

The overall architecture of APRO3D-Net. The voxelized point cloud is fed to a 3D backbone for feature extraction. The backbone's output is converted to a BEV representation, on which an RPN generates ROIs. Several ROI Feature Encoders (RFE) turn feature maps produced by the backbone into ROI features: each RFE first pools points from its input feature map, then encodes the positions of the pooled points, and finally refines the current ROI feature using the pooled features and their position encoding via an Attention Module. The refined ROI feature is mapped to a confidence score and a box refinement vector by two MLP-based detection heads. Blue cuboids and green parallelograms denote feature maps computed by 3D and 2D convolutions, respectively.
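For orientation, below is a minimal, hypothetical sketch of the attention-based ROI refinement idea described above, written as a standalone PyTorch module. It is not the authors' implementation; the module name `RoiFeatureEncoder`, the layer sizes, and the tensor shapes are illustrative assumptions only.

```python
import torch
import torch.nn as nn


class RoiFeatureEncoder(nn.Module):
    """Illustrative RFE block: features pooled inside an ROI, plus their
    position encoding, refine the ROI feature via multi-head attention."""

    def __init__(self, point_dim=64, roi_dim=128, num_heads=4):
        super().__init__()
        # encode the (x, y, z) offsets of pooled points w.r.t. the ROI center
        self.pos_encoder = nn.Sequential(
            nn.Linear(3, point_dim), nn.ReLU(), nn.Linear(point_dim, roi_dim)
        )
        self.point_proj = nn.Linear(point_dim, roi_dim)
        # the ROI feature is the query; pooled points are keys and values
        self.attn = nn.MultiheadAttention(roi_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(roi_dim)

    def forward(self, roi_feat, point_feat, point_xyz):
        """
        roi_feat:   (B, N, roi_dim)        current ROI features
        point_feat: (B, N, K, point_dim)   features pooled inside each ROI
        point_xyz:  (B, N, K, 3)           pooled point offsets to ROI centers
        """
        B, N, K, _ = point_feat.shape
        kv = self.point_proj(point_feat) + self.pos_encoder(point_xyz)  # (B, N, K, roi_dim)
        kv = kv.view(B * N, K, -1)
        q = roi_feat.view(B * N, 1, -1)
        refined, _ = self.attn(q, kv, kv)                               # (B*N, 1, roi_dim)
        return self.norm(roi_feat + refined.view(B, N, -1))


if __name__ == "__main__":
    rfe = RoiFeatureEncoder()
    roi = torch.randn(2, 100, 128)      # 100 ROIs per sample
    pts = torch.randn(2, 100, 27, 64)   # 27 pooled points per ROI
    xyz = torch.randn(2, 100, 27, 3)
    print(rfe(roi, pts, xyz).shape)     # torch.Size([2, 100, 128])
```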

Performance on KITTI val set

| Model | Car (AP@R11) | Pedestrian (AP@R11) | Cyclist (AP@R11) | Car (AP@R40) | Pedestrian (AP@R40) | Cyclist (AP@R40) | Download |
| --- | --- | --- | --- | --- | --- | --- | --- |
| APRO3D-Net (kitti) | 83.51 | 57.45 | 72.97 | 84.84 | 57.00 | 73.35 | kitti-model |

Performance on NuScenes val set

| Model | Car | Ped | Bus | Barrier | Traf. Cone | Truck | Trailer | Motor | Cons. Veh. | Bicycle | mAP | Download |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| APRO3D-Net (nuscenes) | 77.75 | 74.02 | 64.86 | 52.61 | 46.34 | 43.99 | 34.90 | 39.36 | 13.44 | 23.00 | 47.03 | nuscenes-model |

Installation

To use this repo, please follow OpenPCDet's instructions for installation and dataset preparation.

Demo

The demo requires open3d:

```shell
pip install open3d
```
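To quickly check that open3d renders correctly on your machine, the following small script (not part of this repo; the arrays are made up) draws a random point cloud together with one oriented bounding box, which is essentially what the visualization scripts do for predicted detections.

```python
import numpy as np
import open3d as o3d

# random points standing in for a LiDAR sweep
xyz = np.random.uniform(-10, 10, size=(2000, 3))
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(xyz)

# one oriented bounding box standing in for a predicted 3D detection
center = np.array([0.0, 0.0, 0.0])
rotation = o3d.geometry.get_rotation_matrix_from_xyz((0.0, 0.0, np.pi / 6))
extent = np.array([4.0, 1.8, 1.6])  # car-sized box: length, width, height
box = o3d.geometry.OrientedBoundingBox(center, rotation, extent)
box.color = (1.0, 0.0, 0.0)

o3d.visualization.draw_geometries([pcd, box])
```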

KITTI Dataset

To visualize predictions on the KITTI dataset, run:

```shell
python visualize_kitti.py --cfg_file cfgs/kitti_models/swh_kitti.yaml \
       --ckpt_file <path_to_directory_containing_ckpt>/roi100_checkpoint_epoch_91.pth --log_file_dir .
```

Example results

<p align="center"> <img src="docs/kitti_quali_small.png" width="75%"> </p>
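If you want to inspect the raw LiDAR input outside the demo script, KITTI velodyne scans are plain float32 binaries with four values per point (x, y, z, reflectance). A minimal sketch, with a placeholder file name:

```python
import numpy as np

# placeholder path to a single KITTI velodyne scan
points = np.fromfile("000000.bin", dtype=np.float32).reshape(-1, 4)
xyz, reflectance = points[:, :3], points[:, 3]
print(xyz.shape, reflectance.min(), reflectance.max())
```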

NuScenes Dataset

To visualize predictions on the NuScenes dataset, first set the dataset config to the split you want to visualize, e.g.:

```yaml
VERSION: 'v1.0-test'
DATA_SPLIT: {
    'train': train,
    'test': test
}
INFO_PATH: {
    'train': [nuscenes_infos_10sweeps_train.pkl],
    'test': [nuscenes_infos_10sweeps_test.pkl],
}
```

then run:

```shell
python visualize_nuscenes.py --split mini \
        --result_file <path_to_directory_containing_prediction>/results_nusc_swh_second_rfe_mini.json \
        --scene_idx 0 --render_cam_back --render_point_cloud
```
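The `--result_file` passed above is assumed to follow the standard nuScenes detection submission format (a "results" dict keyed by sample token). If that holds, a quick sanity check such as the sketch below (the file name is a placeholder) shows how many boxes were predicted per sample.

```python
import json

# placeholder path; point it at your own results file
with open("results_nusc_swh_second_rfe_mini.json") as f:
    submission = json.load(f)

results = submission["results"]  # sample_token -> list of predicted boxes
print(f"{len(results)} samples with predictions")

sample_token, boxes = next(iter(results.items()))
print(f"sample {sample_token}: {len(boxes)} boxes")
for box in boxes[:3]:
    # fields defined by the nuScenes detection format
    print(box["detection_name"], box["detection_score"], box["translation"])
```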

Example results

<p align="center"> <img src="docs/nuscenes_quali_small.png" width="75%"> </p>

Test

To test a pretrained model, execute the following command in the tools directory:

```shell
python test.py --cfg_file ${CONFIG_FILE} --ckpt ${CKPT}

# e.g.,
python test.py --cfg_file tools/cfgs/kitti_models/swh_kitti.yaml \
        --ckpt <path_to_directory_containing_ckpt>/roi100_checkpoint_epoch_91.pth
```