Spec-Gaussian: Anisotropic View-Dependent Appearance for 3D Gaussian Splatting

Project Page | Paper | Anisotropic Dataset

(Teaser figure)

This project builds on my previously released My-exp-Gaussian and aims to enhance 3D Gaussian Splatting's ability to model scenes with specular highlights. I hope this work can assist researchers who need to model specular highlights through splatting.

Note that the current Spec-Gaussian is significantly improved in quality over the first version on arXiv (2024.02). Please refer to the latest version on arXiv.

News

Dataset

In our paper, we use the NeRF-synthetic, NSVF-synthetic, our Spec-GS anisotropic, Mip-NeRF 360, and Tanks and Temples / Deep Blending datasets.

The data should be organized as follows (a minimal layout check is sketched after the tree):

data/
├── NeRF
│   ├── Chair/
│   ├── Drums/
│   ├── ...
├── NSVF
│   ├── Bike/
│   ├── Lifestyle/
│   ├── ...
├── Spec-GS
│   ├── ashtray/
│   ├── dishes/
│   ├── ...
├── Mip-360
│   ├── bicycle/
│   ├── bonsai/
│   ├── ...
├── tandt_db
│   ├── db/
│   │   ├── drjohnson/
│   │   ├── playroom/
│   ├── tandt/
│   │   ├── train/
│   │   ├── truck/
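
For convenience, a quick layout check can catch misplaced folders before training. The sketch below is hypothetical and not part of this repository; the top-level folder names come from the tree above, and the scene names are only the examples shown there.

import os

# Hypothetical layout check: dataset folder names are taken from the tree
# above; the per-dataset scene lists are only the examples shown there.
EXPECTED = {
    "NeRF": ["Chair", "Drums"],
    "NSVF": ["Bike", "Lifestyle"],
    "Spec-GS": ["ashtray", "dishes"],
    "Mip-360": ["bicycle", "bonsai"],
    "tandt_db": ["db/drjohnson", "db/playroom", "tandt/train", "tandt/truck"],
}

def check_layout(root="data"):
    for dataset, scenes in EXPECTED.items():
        for scene in scenes:
            path = os.path.join(root, dataset, scene)
            print(path, "ok" if os.path.isdir(path) else "MISSING")

check_layout()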

Pipeline

(Pipeline overview figure)

Run

Environment

git clone https://github.com/ingra14m/Spec-Gaussian --recursive
cd Spec-Gaussian

conda create -n spec-gaussian-env python=3.7
conda activate spec-gaussian-env

# install pytorch
pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 --extra-index-url https://download.pytorch.org/whl/cu116
pip install torch-scatter -f https://data.pyg.org/whl/torch-1.13.0+cu116.html

# install dependencies
pip install -r requirements.txt
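
Before training, a quick sanity check (a throwaway sketch, not part of the repo) confirms that the pinned PyTorch build sees CUDA and that torch-scatter imports cleanly:

import torch
import torch_scatter

# Verify the CUDA 11.6 build of torch 1.13.1 and the torch-scatter wheel.
print(torch.__version__)          # expected: 1.13.1+cu116
print(torch.version.cuda)         # expected: 11.6
print(torch.cuda.is_available())  # should be True on a GPU machine
print(torch_scatter.__version__)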

Train

We provide the scripts run_anchor.sh and run_wo_anchor.sh, which were used to generate the table in the paper.

In general, the version without anchor Gaussians achieves better rendering quality on synthetic bounded scenes, while the version with anchor Gaussians works better on real-world unbounded scenes. For researchers who want to try anchor Gaussians on bounded scenes, or the anchor-free version on unbounded scenes, we provide the following general training commands.

Train without anchor

python train.py -s your/path/to/the/dataset -m your/path/to/save --eval

## For synthetic bounded scenes
python train.py -s data/nerf_synthetic/drums -m outputs/nerf/drums --eval

## For real-world unbounded indoor scenes
python train.py -s data/mipnerf-360/bonsai -m outputs/mip360/bonsai --eval -r 2 --is_real --is_indoor --asg_degree 12

## For real-world unbounded outdoor scenes
python train.py -s data/mipnerf-360/bicycle -m outputs/mip360/bicycle --eval -r 4 --is_real --asg_degree 12

[Extra, for acceleration] Train with anchor

python train_anchor.py -s your/path/to/the/dataset -m your/path/to/save --eval

## For synthetic bounded scenes
python train_anchor.py -s data/nerf_synthetic/drums -m outputs/nerf/drums --eval --voxel_size 0.001 --update_init_factor 4 --iterations 30_000

## For mip360 scenes
python train_anchor.py -s data/mipnerf-360/bonsai -m outputs/mip360/bonsai --eval --voxel_size 0.001 --update_init_factor 16 --iterations 30_000 -r [2|4]
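
To sweep several scenes in one go, similar in spirit to run_anchor.sh and run_wo_anchor.sh, a small driver can loop over the commands above. This is a hypothetical sketch, not one of the provided scripts; the train.py flags are exactly those shown above.

import subprocess

# Hypothetical batch driver for the synthetic bounded scenes; the
# train.py flags (-s, -m, --eval) are the ones shown above.
SCENES = ["chair", "drums", "ficus", "hotdog", "lego", "materials", "mic", "ship"]

for scene in SCENES:
    cmd = ["python", "train.py",
           "-s", f"data/nerf_synthetic/{scene}",
           "-m", f"outputs/nerf/{scene}",
           "--eval"]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)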

Results

Synthetic Scenes

(Figure: results on synthetic scenes)

Real-world Scenes

(Figure: results on real-world scenes)

Ablation

(Figure: ASG ablation)

(Figure: coarse-to-fine ablation)

Align with Rip-NeRF

Tri-MipRF and Rip-NeRF use both the train and val sets as training data. We provide results on the NeRF-synthetic dataset under the same setting.

Scene      PSNR   SSIM    LPIPS
chair      37.33  0.9907  0.0088
drums      28.50  0.9669  0.0288
ficus      38.08  0.9922  0.0081
hotdog     39.86  0.9895  0.0148
lego       38.44  0.9876  0.0121
materials  32.64  0.9738  0.0285
mic        38.57  0.995   0.0045
ship       33.66  0.9248  0.0906
Average    35.89  0.9776  0.0245
Rip-NeRF   35.44  0.973   0.037
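
As a sanity check on the table, the Average row is simply the per-scene mean; a throwaway snippet (not from the repo) reproduces it from the numbers above:

# Per-scene metrics copied from the table; the Average row is their mean.
psnr  = [37.33, 28.50, 38.08, 39.86, 38.44, 32.64, 38.57, 33.66]
ssim  = [0.9907, 0.9669, 0.9922, 0.9895, 0.9876, 0.9738, 0.995, 0.9248]
lpips = [0.0088, 0.0288, 0.0081, 0.0148, 0.0121, 0.0285, 0.0045, 0.0906]

for name, vals in [("PSNR", psnr), ("SSIM", ssim), ("LPIPS", lpips)]:
    print(f"{name}: {sum(vals) / len(vals):.4f}")
# Prints roughly PSNR 35.89, SSIM 0.9776, LPIPS 0.0245, matching the table.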

Acknowledgments

This work was mainly supported by ByteDance MMLab. I'm very grateful to Chao Wan of Cornell University for the help during the rebuttal.

BibTeX

@article{yang2024spec,
  title={Spec-Gaussian: Anisotropic View-Dependent Appearance for 3D Gaussian Splatting},
  author={Yang, Ziyi and Gao, Xinyu and Sun, Yangtian and Huang, Yihua and Lyu, Xiaoyang and Zhou, Wen and Jiao, Shaohui and Qi, Xiaojuan and Jin, Xiaogang},
  journal={arXiv preprint arXiv:2402.15870},
  year={2024}
}

Thanks also to the authors of 3D Gaussian Splatting and Scaffold-GS for their excellent code; please consider citing their repositories as well.