Density-aware Chamfer Distance
This repository contains the official PyTorch implementation of our paper:
Density-aware Chamfer Distance as a Comprehensive Metric for Point Cloud Completion, NeurIPS 2021
Tong Wu, Liang Pan, Junzhe Zhang, Tai Wang, Ziwei Liu, Dahua Lin
We present a new point cloud similarity measure named Density-aware Chamfer Distance (DCD). It is derived from CD and benefits from several desirable properties: 1) it can detect disparity of density distributions and is thus a more intensive measure of similarity than CD; 2) it is stricter with detailed structures and significantly more computationally efficient than EMD; 3) its bounded value range encourages a more stable and reasonable evaluation over the whole test set. DCD can be used both as an evaluation metric and as a training loss. We mainly validate its performance on point cloud completion in our paper.
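For reference, the core definition stated informally (see the paper for the precise formulation and analysis) is:

$$
d_{\mathrm{DCD}}(S_1, S_2) = \frac{1}{2}\left(\frac{1}{|S_1|}\sum_{x \in S_1}\left(1 - \frac{1}{n_{\hat{y}}} e^{-\alpha \lVert x - \hat{y} \rVert_2}\right) + \frac{1}{|S_2|}\sum_{y \in S_2}\left(1 - \frac{1}{n_{\hat{x}}} e^{-\alpha \lVert y - \hat{x} \rVert_2}\right)\right)
$$

where $\hat{y}$ is the nearest neighbour of $x$ in $S_2$, $n_{\hat{y}}$ counts how many points in $S_1$ take $\hat{y}$ as their nearest neighbour (and symmetrically for $\hat{x}$ and $n_{\hat{x}}$), and $\alpha$ is a temperature scalar. Each term lies in $[0, 1]$, which gives the bounded value range mentioned above.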
This repository includes:
- Implementation of Density-aware Chamfer Distance (DCD).
- Implementation of our method for point cloud completion, together with the pre-trained model.
Installation
Requirements
Install
Install PyTorch 1.2.0 first, then install the other requirements by running the following command:
bash setup.sh
Dataset
We use the MVP Dataset. Please download the train set and test set, then modify the data path in `data/mvp_new.py` to your own data location. Please refer to their codebase for further instructions.
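As a purely illustrative sketch (the variable names and file names below are assumptions, not the actual identifiers used in `data/mvp_new.py`), the edit amounts to pointing the loader at your local copies of the downloaded MVP `.h5` files:

```python
# Hypothetical illustration only: the real attribute/argument names in
# data/mvp_new.py (and the exact .h5 file names) may differ. The edit simply
# redirects the dataset loader to wherever you stored the downloaded files.
TRAIN_DATA_PATH = "/path/to/mvp/MVP_Train_CP.h5"  # assumed train-set file
TEST_DATA_PATH = "/path/to/mvp/MVP_Test_CP.h5"    # assumed test-set file
```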
Usage
Density-aware Chamfer Distance
The function for DCD calculation is defined in `def calc_dcd()` in `utils/model_utils.py`.
Users of higher PyTorch versions may try `def calc_dcd()` in `utils_v2/model_utils.py`, which has been tested on PyTorch 1.6.0.
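Below is a minimal usage sketch, assuming `calc_dcd` accepts two `(B, N, 3)` CUDA tensors and returns a list whose first element is the per-sample DCD; the exact signature, default arguments, and return format may differ, so check `utils/model_utils.py` before relying on it:

```python
# Minimal usage sketch (assumptions noted in the comments; verify against
# utils/model_utils.py, which contains the authoritative definition).
import torch
from utils.model_utils import calc_dcd

pred = torch.rand(8, 2048, 3, device="cuda", requires_grad=True)  # completed clouds, (B, N, 3)
gt = torch.rand(8, 2048, 3, device="cuda")                        # ground-truth clouds, (B, N, 3)

res = calc_dcd(pred, gt)  # assumption: returns a list, first element = per-sample DCD
dcd = res[0]

# As an evaluation metric: average over the batch / test set.
print(dcd.mean().item())

# As a training loss: DCD is differentiable w.r.t. the prediction.
dcd.mean().backward()
```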
Model training and evaluation
- To train a model, run `python train.py ./cfgs/*.yaml`, for example: `python train.py ./cfgs/vrc_plus.yaml`
- To test a model, run `python train.py ./cfgs/*.yaml --test_only`, for example: `python train.py ./cfgs/vrc_plus_eval.yaml --test_only`
- Config for each algorithm can be found in `cfgs/`. `run_train.sh` and `run_test.sh` are provided for SLURM users.
We provide the following config files:
- `pcn.yaml`: PCN trained with CD loss.
- `vrc.yaml`: VRCNet trained with CD loss.
- `pcn_dcd.yaml`: PCN trained with DCD loss.
- `vrc_dcd.yaml`: VRCNet trained with DCD loss.
- `vrc_plus.yaml`: training with our method.
- `vrc_plus_eval.yaml`: evaluation of our method with guided down-sampling.
Attention: We empirically find that training with DP or DDP slightly hurts performance, so training on multiple GPUs is not well supported at the moment.
Pre-trained models
We provide the pre-trained model that reproduces the results in our paper. Download and extract it to the `./log/pretrained/` directory, then evaluate it with `cfgs/vrc_plus_eval.yaml`. The setting `prob_sample: True` turns on guided down-sampling.
We also provide the model for VRCNet trained with DCD loss here.
Citation
If you find our code or paper useful, please cite our paper:
@inproceedings{wu2021densityaware,
  title={Density-aware Chamfer Distance as a Comprehensive Metric for Point Cloud Completion},
  author={Wu, Tong and Pan, Liang and Zhang, Junzhe and Wang, Tai and Liu, Ziwei and Lin, Dahua},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2021}
}
Acknowledgement
The code is based on the VRCNet implementation. We include the following PyTorch 3rd-party libraries: ChamferDistancePytorch, emd, expansion_penalty, MDS, and Pointnet2.PyTorch. Thanks for these great projects.
Contact
Please contact @wutong16 for questions, comments, or bug reports.