Benchmarking Robustness of 3D Point Cloud Recognition against Common Corruptions
This repo contains the dataset and code for the paper Benchmarking Robustness of 3D Point Cloud Recognition against Common Corruptions by Jiachen Sun et al. This codebase is based on SimpleView, and we thank the authors for their great contributions.
ModelNet40-C
More visualizations can be found here.
- Download ModelNet40-C from Google Drive.
- Download ModelNet40-C using our provided script.
- Download ModelNet40-C from Zenodo.
ModelNet40-C Leaderboard
Architecture + Data Augmentation Leaderboard
Architecture | Data Augmentation | Corruption Error Rate (%) | Clean Error Rate (%) | Checkpoint |
---|---|---|---|---|
PCT | PointCutMix-R | 16.3 | 7.2 | checkpoint |
PCT | PointCutMix-K | 16.5 | 6.9 | checkpoint |
DGCNN | PointCutMix-R | 17.3 | 6.8 | checkpoint |
PCT | RSMix | 17.3 | 6.9 | checkpoint |
DGCNN | PointCutMix-K | 17.3 | 7.4 | checkpoint |
RSCNN | PointCutMix-R | 17.9 | 7.6 | checkpoint |
DGCNN | RSMix | 18.1 | 7.1 | checkpoint |
PCT | PGD Adv Train | 18.4 | 8.9 | checkpoint |
PointNet++ | PointCutMix-R | 19.1 | 7.1 | checkpoint |
PointNet++ | PointMixup | 19.3 | 7.1 | checkpoint |
PCT | PointMixup | 19.5 | 7.4 | checkpoint |
SimpleView | PointCutMix-R | 19.7 | 7.9 | checkpoint |
RSCNN | PointMixup | 19.8 | 7.2 | checkpoint |
PointNet++ | PointCutMix-K | 20.2 | 6.7 | checkpoint |
We allow users to directly download all pre-trained models with every data augmentation method here.
Architecture Leaderboard
Architecture | Corruption Error Rate (%) | Clean Error Rate (%) | Checkpoint |
---|---|---|---|
CurveNet | 22.7 | 6.6 | checkpoint |
PointNet++ | 23.6 | 7.0 | checkpoint |
PCT | 25.5 | 7.1 | checkpoint |
GDANet | 25.6 | 7.5 | checkpoint |
DGCNN | 25.9 | 7.4 | checkpoint |
RSCNN | 26.2 | 7.7 | checkpoint |
SimpleView | 27.2 | 6.1 | checkpoint |
PointNet | 28.3 | 9.3 | checkpoint |
PointMLP | 31.9 | 6.3 | checkpoint |
PointMLP-Elite | 32.4 | 7.2 | checkpoint |
More models' results are coming soon.
We allow users to directly download all pre-trained models with standard training here.
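For reference, the corruption error rate above aggregates a model's error over all corruption types and severity levels in ModelNet40-C, while the clean error rate is measured on the uncorrupted test set. The snippet below is a minimal sketch of that aggregation, assuming you already have per-corruption, per-severity error rates and that a plain (unnormalized) average is used; see the paper for the exact evaluation protocol.

```python
# Aggregate per-corruption, per-severity error rates into a single number.
# This sketch assumes a plain average over all corruption types and severity
# levels; the numbers below are made up for illustration.
import numpy as np

# errors[corruption][severity - 1] = error rate (%) on that corrupted test set.
errors = {
    "jitter": [4.1, 4.9, 6.0, 7.7, 10.2],
    "cutout": [5.3, 6.1, 7.0, 8.4, 10.9],
    # ... one entry per corruption type ...
}

corruption_error_rate = np.mean([np.mean(sev) for sev in errors.values()])
print(f"Corruption Error Rate: {corruption_error_rate:.1f}%")
```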
Getting Started
First clone the repository. We refer to the directory containing the code as `ModelNet40-C`.
git clone --recurse-submodules git@github.com:jiachens/ModelNet40-C.git
Requirements
The code is tested on Linux with Python 3.7.5, CUDA 10.0, CuDNN 7.6, and GCC 5.4. We recommend using these versions, especially for installing the PointNet++ custom CUDA modules.
[02-23-2022] The updated code is also tested on Python 3.7.5, CUDA 11.4, CuDNN 8.2, and GCC 7.5 with the latest `torch` and `torchvision` libraries, but we still suggest the original setup in case of unexpected instability.
Install Libraries
We recommend you first install Anaconda and create a virtual environment.
conda create --name modelnetc python=3.7.5
Activate the virtual environment and install the libraries. Make sure you are in `ModelNet40-C`.
conda activate modelnetc
pip install -r requirements.txt
conda install sed # for downloading data and pretrained models
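After installing the requirements, you can optionally sanity-check that the environment matches one of the tested setups above. This is just a convenience sketch; it only assumes `torch` was installed via `requirements.txt`.

```python
# Quick sanity check of the installed toolchain (optional).
import torch

print("torch:", torch.__version__)
print("CUDA (torch build):", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())
print("GPU available:", torch.cuda.is_available())
```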
For PointNet++, we need to install custom CUDA modules. Make sure you have access to a GPU during this step. You might need to set the appropriate `TORCH_CUDA_ARCH_LIST` environment variable depending on your GPU model. The following command should work for most cases: `export TORCH_CUDA_ARCH_LIST="6.0;6.1;6.2;7.0;7.5"`. However, if the installation fails, check whether `TORCH_CUDA_ARCH_LIST` is set correctly. More details can be found here.
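If you are unsure which compute capabilities your GPUs require, you can query them through PyTorch before exporting the variable. This is a small optional helper, not part of the repository; it only relies on standard `torch.cuda` calls.

```python
# Print a TORCH_CUDA_ARCH_LIST value matching the locally visible GPUs.
import torch

if torch.cuda.is_available():
    caps = {torch.cuda.get_device_capability(i) for i in range(torch.cuda.device_count())}
    arch_list = ";".join(sorted(f"{major}.{minor}" for major, minor in caps))
    print(f'export TORCH_CUDA_ARCH_LIST="{arch_list}"')
else:
    print("No CUDA device visible; the custom CUDA modules need a GPU to build against.")
```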
Third-party modules `pointnet2_pyt`, `PCT_Pytorch`, `emd`, and `PyGeM` can be installed with the following script.
./setup.sh
Download Datasets Including ModelNet40-C and Pre-trained Models
Make sure you are in `ModelNet40-C`. The `download.sh` script can be used to download all the data and the pretrained models; it also places them at the correct locations.
To download ModelNet40, execute the following command. This will download the ModelNet40 point cloud dataset released with PointNet++ as well as the validation splits used in our work.
./download.sh modelnet40
To generate the ModelNet40-C dataset, please run:
python data/process.py
python data/generate_c.py
NOTE that the generation requires a connected monitor, since the Open3D library does not support background rendering.
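For a rough idea of what a generated corruption looks like, the sketch below applies a simple Gaussian-jitter-style corruption at a chosen severity. It is illustrative only: the severity schedule is made up for the example, and the actual suite in data/generate_c.py covers many more corruption types (some of which rely on Open3D rendering, hence the monitor requirement).

```python
# Illustrative only: jitter a point cloud at one of five severity levels.
import numpy as np

def jitter_corruption(points: np.ndarray, severity: int) -> np.ndarray:
    """Add isotropic Gaussian noise to an (N, 3) point cloud.

    The sigma schedule below is a placeholder for illustration, not the
    schedule used in data/generate_c.py.
    """
    sigmas = [0.01, 0.02, 0.03, 0.04, 0.05]  # hypothetical severities 1-5
    sigma = sigmas[severity - 1]
    return points + np.random.normal(0.0, sigma, size=points.shape)

# Example: corrupt a random 1024-point cloud at severity 3.
cloud = np.random.rand(1024, 3).astype(np.float32)
corrupted = jitter_corruption(cloud, severity=3)
```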
We also allow users to download ModelNet40-C directly. Please fill out this Google form when downloading our dataset.
./download.sh modelnet40_c
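Once the data is in place, a single corruption split can be loaded with plain NumPy. The directory and file naming used below are our assumption for illustration; please check data/generate_c.py for the exact layout it produces.

```python
# Load one corruption type at one severity level from the generated dataset.
# The directory and file names here are assumptions for illustration; see
# data/generate_c.py for the actual layout produced by the generation script.
import os
import numpy as np

DATA_DIR = "data/modelnet40_c"        # hypothetical location of the generated files
corruption, severity = "cutout", 3    # example corruption name and severity level

points = np.load(os.path.join(DATA_DIR, f"data_{corruption}_{severity}.npy"))
labels = np.load(os.path.join(DATA_DIR, "label.npy"))
print(points.shape, labels.shape)     # e.g. a (num_samples, num_points, 3) array and its labels
```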
To download the pretrained models with the standard training recipe, execute the following command.
./download.sh cor_exp
To download the pretrained models using different data augmentation strategies, execute the following command.
./download.sh runs
New Features
[02-23-2022]
- We include PointMLP-Elite and GDANet in our benchmark
[02-18-2022]
- We include CurveNet and PointMLP in our benchmark
[01-28-2022]
- We include Point Cloud Transformer (PCT) in our benchmark
- `ModelNet40-C/configs` contains config files to enable different data augmentations and test-time adaptation methods.
- `ModelNet40-C/aug_utils.py` contains the data augmentation code used in our paper.
- `ModelNet40-C/third_party` contains the test-time adaptation methods used in our paper.
Code Organization in the Original SimpleView
- `ModelNet40-C/models`: Code for various models in PyTorch.
- `ModelNet40-C/configs`: Configuration files for various models.
- `ModelNet40-C/main.py`: Training and testing any model.
- `ModelNet40-C/configs.py`: Hyperparameters for different models and the dataloader.
- `ModelNet40-C/dataloader.py`: Code for different variants of the dataloader.
- `ModelNet40-C/*_utils.py`: Code for various utility functions.
Running Experiments
Training and Config files
To train or test any model, we use the `main.py` script. The format for running this script is as follows:
python main.py --exp-config <path to the config>
The config files are named as `<protocol>_<model_name><_extra>_run_<seed>.yaml` (`<protocol>` ∈ `[dgcnn, pointnet2, rscnn]`; `<model_name>` ∈ `[dgcnn, pointnet2, rscnn, pointnet, simpleview]`). For example, the config file to run an experiment for PointNet++ in the DGCNN protocol with seed 1 is `dgcnn_pointnet2_run_1.yaml`. To run a new experiment with a different seed, you need to change the `SEED` parameter in the config file. All of our experiments are based on seed 1.
We additionally provide training-time config files for PointCutMix (`configs/cutmix`), PointMixup (`configs/mixup`), RSMix (`configs/rsmix`), and PGD-based adversarial training (`configs/pgd`).
For example, to train PCT with PointCutMix-R, please use the following command:
python main.py --exp-config configs/cutmix/pct_r.yaml
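For intuition, PointCutMix-R builds each training sample by replacing a random subset of one point cloud's points with points from another cloud and mixing the labels in proportion. The snippet below is a simplified sketch of that idea, not the implementation in aug_utils.py used by these configs (which also provides the k-NN-based PointCutMix-K variant and operates on batches).

```python
# Simplified PointCutMix-R-style mixing of two training samples (sketch only;
# see aug_utils.py for the implementation actually used in this repo).
import numpy as np

def pointcutmix_r(points_a, label_a, points_b, label_b, beta=1.0):
    """Replace a random subset of points in sample A with points from sample B.

    points_a, points_b: (N, 3) arrays with the same number of points N.
    Returns the mixed cloud and (label_a, label_b, lam), where lam weights the
    two labels in a mixed classification loss such as
    lam * CE(pred, label_a) + (1 - lam) * CE(pred, label_b).
    """
    n = points_a.shape[0]
    lam = np.random.beta(beta, beta)           # fraction of points kept from A
    num_from_b = int(round((1.0 - lam) * n))
    idx = np.random.choice(n, num_from_b, replace=False)
    mixed = points_a.copy()
    mixed[idx] = points_b[np.random.choice(n, num_from_b, replace=False)]
    return mixed, (label_a, label_b, lam)
```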
Evaluate a pretrained model
We provide pretrained models. They can be downloaded using the `./download.sh cor_exp` and `./download.sh runs` commands and are stored in the `ModelNet40-C/runs` (data augmentation recipes) and `ModelNet40-C/cor_exp` (standard training) folders. To test a pretrained model, the command is of the following format:
python main.py --entry test --model-path <cor_exp/runs>/<cfg_name>/<model_name>.pth --exp-config configs/<cfg_name>.yaml
Additionally, we provide test-time config files in `configs/bn` and `configs/tent` for the BN and TENT adaptation methods used in our paper; they are used with the same command format.
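As a rough illustration of what the BN baseline does, the sketch below re-estimates BatchNorm statistics on (corrupted) test data without updating any weights. It is a generic sketch, not the implementation under third_party that the `configs/bn` (and `configs/tent`) configs use.

```python
# Generic sketch of BatchNorm-statistics adaptation at test time (the actual
# BN/TENT implementations used by configs/bn and configs/tent live in third_party).
import torch
import torch.nn as nn

def adapt_bn_stats(model, test_loader, device="cuda"):
    model.to(device).eval()
    # Put only the BatchNorm layers in training mode so their running
    # mean/variance buffers are re-estimated from the test distribution.
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.train()
            m.reset_running_stats()
    with torch.no_grad():
        for points, _ in test_loader:
            model(points.to(device))
    return model.eval()
```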
We list all the evaluation commands in the `eval_cor.sh`, `eval_og.sh`, and `eval_tent_cutmix.sh` scripts. Note that in `eval_cor.sh`, it is expected that PGD adversarial training with PointNet++, RSCNN, and SimpleView has no outputs, since these models do not fit the adversarial training framework, as mentioned in our paper.
Citation
Please cite our paper and SimpleView if you use our benchmark and analysis results. Thank you!
@article{sun2022benchmarking,
title={Benchmarking Robustness of 3D Point Cloud Recognition Against Common Corruptions},
author={Jiachen Sun and Qingzhao Zhang and Bhavya Kailkhura and Zhiding Yu and Chaowei Xiao and Z. Morley Mao},
journal={arXiv preprint arXiv:2201.12296},
year={2022}
}
@article{goyal2021revisiting,
title={Revisiting Point Cloud Shape Classification with a Simple and Effective Baseline},
author={Goyal, Ankit and Law, Hei and Liu, Bowei and Newell, Alejandro and Deng, Jia},
journal={International Conference on Machine Learning},
year={2021}
}