[WACV'23] PIDS: Joint Point Interaction-Dimension Search for 3D Point Cloud

This is the official implementation of our WACV'23 paper, "PIDS: Joint Point Interaction-Dimension Search for 3D Point Cloud" [https://openaccess.thecvf.com/content/WACV2023/html/Zhang_PIDS_Joint_Point_Interaction-Dimension_Search_for_3D_Point_Cloud_WACV_2023_paper.html]. The framework is built on PyTorch.

Highlights

Point Operator

Dense Sparse Predictor

Starter Guide

conda env create -n pids -f pids_env.yaml
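
After creating the environment, activate it with conda activate pids. As a quick sanity check (a minimal sketch, assuming pids_env.yaml installs PyTorch with CUDA support), the following snippet confirms that PyTorch is importable and can see a GPU:

import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())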

Architecture Search Process

Architecture search code is released for easy use and development. Please refer to tutorials/architecture_search_guidelines.ipynb for detailed instructions and hyperparameter guidance.

Training the searched architectures from scratch

The searched architectures are provided in pids_search_space/arch_genotype.py, each defined as a composition of block args that describes the architecture. If you run the search yourself, you can modify pids_search_space/arch_genotype.py to incorporate your own searched architectures. Please refer to final_evaluation.ipynb for detailed training scripts and hyperparameter definitions.
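
For illustration only, a block-args composition could resemble the sketch below; the field names (repeats, k_points, expansion, out_channels) are hypothetical and may not match the exact schema used in pids_search_space/arch_genotype.py:

# Hypothetical sketch of a block-args genotype; the real schema in
# pids_search_space/arch_genotype.py may use different field names.
example_genotype = [
    {"repeats": 1, "k_points": 15, "expansion": 1, "out_channels": 32},
    {"repeats": 2, "k_points": 15, "expansion": 4, "out_channels": 64},
    {"repeats": 3, "k_points": 15, "expansion": 6, "out_channels": 128},
]

def describe(genotype):
    # Print one summary line per stage of the composition.
    for i, block in enumerate(genotype):
        print(f"stage {i}: x{block['repeats']}, {block['k_points']} kernel points, "
              f"expansion {block['expansion']}, {block['out_channels']} channels")

describe(example_genotype)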

Obtaining validation/test accuracy

Evaluation on 3D point clouds is not as straightforward as image classification. To obtain the validation/test accuracy, a voting mechanism is required that runs the point cloud through the model multiple times and averages the predictions.
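
A minimal sketch of such a voting loop is shown below, assuming a PyTorch segmentation model and a hypothetical sample_fn() that returns a freshly subsampled/augmented copy of the same point cloud (the actual procedure is implemented in testing/val_models.py):

import torch

@torch.no_grad()
def vote(model, sample_fn, num_votes=10):
    # Average softmax probabilities over several stochastic passes,
    # then take the argmax as the voted per-point prediction.
    model.eval()
    prob_sum = 0.0
    for _ in range(num_votes):
        logits = model(sample_fn())                  # (num_points, num_classes)
        prob_sum = prob_sum + torch.softmax(logits, dim=-1)
    return (prob_sum / num_votes).argmax(dim=-1)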

Evaluation

Our paper presents the following results. More detailed comparisons can be found in the original paper.

Please refer to testing/val_models.py for the testing process. The general usage should be:

python testing/val_models.py --result_path [CKPT_PATH]

Results

SemanticKITTI (08-val)

| Method | Parameter (M) | Multiply-Accumulates (G) | Latency (ms) | mIOU (%) | Notes |
|---|---|---|---|---|---|
| KPConv | 14.8 | 60.9 | 221 (164 + 57) | 59.2 | |
| PIDS (Second-order) | 0.97 | 4.7 | 160 (103 + 57) | 60.1 | |
| PIDS (NAS) | 0.57 | 4.4 | 169 (112 + 57) | 62.4 | Checkpoint |
| PIDS (NAS, 2x) | 1.36 | 11.0 | 206 (149 + 57) | 64.1 | Checkpoint |

S3DIS (Area-05)

Refer to training/train_S3DIS.py for implementation details.

| Method | mIOU | ceil. | floor | wall | beam | col. | wind. | door | chair | table | book. | sofa | board | clut. |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| KPConv | 65.4 | 92.6 | 97.3 | 81.4 | 0.0 | 16.5 | 54.5 | 69.5 | 90.1 | 80.2 | 74.6 | 66.4 | 63.7 | 58.1 |
| PIDS | 67.2 | 93.6 | 98.3 | 81.6 | 0.0 | 32.2 | 51.5 | 73.2 | 90.7 | 82.5 | 73.3 | 64.7 | 71.6 | 60.0 |

The checkpoint is available here.

ModelNet40

| Method | Parameter (M) | Overall Accuracy (%) | Notes |
|---|---|---|---|
| KPConv | 14.9 | 92.9 | |
| PIDS (Second-order) | 1.25 | 92.6 | |
| PIDS (NAS) | 0.56 | 93.1 | Checkpoint |
| PIDS (NAS, 2x) | 1.21 | 93.4 | Checkpoint |

Acknowledgement

This project is supported in part by the following grants: NSF-2112562, NSF-1937435, ARO W911NF-19-2-0107, and CAREER-2048044. We also acknowledge the original KPConv-PyTorch project [https://github.com/HuguesTHOMAS/KPConv-PyTorch] for providing the backbone implementations for our work.

Citation

If you use this repository in your research, please feel free to cite our paper:

@inproceedings{zhang2023pids,
  title={PIDS: Joint Point Interaction-Dimension Search for 3D Point Cloud},
  author={Zhang, Tunhou and Ma, Mingyuan and Yan, Feng and Li, Hai and Chen, Yiran},
  booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  pages={1298--1307},
  year={2023}
}