# RobIR: Robust Inverse Rendering for High-Illumination Scenes
Project page | Paper | Data
## News
- [10/08/2024] The complete code has been released.
- [10/03/2024] Project page has been released.
- [09/26/2024] RobIR (formerly known as SIRe-IR) has been accepted by NeurIPS 2024. We will release the code soon.
## Dataset
In our paper, we use:
- the synthetic datasets from NeRF and our RobIR dataset.
- the real-world datasets (BlendedMVS and DTU) from NeuS.
We organize the datasets as follows:

```
├── data
│   ├── nerf
│   │   ├── hotdog
│   │   ├── lego
│   │   ├── ...
│   ├── robir_dataset
│   │   ├── truck
│   │   ├── chessboard
│   │   ├── ...
│   ├── blendedMVS
│   │   ├── bear
│   │   ├── clock
│   │   ├── ...
│   ├── dtu
│   │   ├── scan83
│   │   ├── scan118
│   │   ├── ...
```
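If you want to check that everything is in place before training, a minimal sanity check along these lines should do (paths taken from the layout above; the scene folders are just examples):

```bash
# Optional: each of these should list the contents of an existing scene folder
ls data/nerf/hotdog
ls data/robir_dataset/truck
ls data/blendedMVS/bear
ls data/dtu/scan83
```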
## Run
### Environment
- Set up the Python environment

```bash
git clone https://github.com/ingra14m/RobIR
cd RobIR

conda create -n robust-ir-env python=3.7
conda activate robust-ir-env

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 --extra-index-url https://download.pytorch.org/whl/cu116
pip install pyg-lib torch-scatter torch-sparse torch-cluster torch-spline-conv torch-geometric -f https://data.pyg.org/whl/torch-1.13.0+cu116.html
pip install -r requirements.txt
```
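To confirm that the CUDA-enabled PyTorch wheel was installed (and not a CPU-only build), an optional quick check:

```bash
# Should print "1.13.1+cu116 True" on a machine with a working CUDA 11.6 setup
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```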
### Stage 1: NeuS (Geometry Prior)

```bash
cd neus

python exp_runner.py --gin_file config/blender.gin               # for blender dataset
python exp_runner.py --gin_file config/blendedMVS/neus_bear.gin  # for blendedMVS dataset
python exp_runner.py --gin_file config/dtu/neus_dtu83_toy.gin    # for dtu dataset
```
The mesh and other useful settings are saved in `logs`.
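Stage 2 below reads this geometry prior through `--neus_pretrained_path`. Assuming the experiment is named as in the hotdog walkthrough, the resulting checkpoint directory can be inspected with:

```bash
# This path matches the --neus_pretrained_path used in Stage 2 for the hotdog scene;
# adjust it if your gin config uses a different experiment name
ls neus/logs/blender/hotdog-neus
```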
### Stage 2: BRDF Estimation

- We provide `confs_sg/hotdog.conf` for general blender scenes and `confs_sg/truck.conf` for the `truck` scene in our RobIR dataset.
- We also provide `confs_sg/dtu.conf` for general real-world scenes.
If you want to train other scenes, please adjust the config file as well as `neus_pretrained_path`, `data_split_dir`, and `expname` (see the wrapper sketch after the hotdog walkthrough below).
Here we take the blender scene `hotdog` as an example.
#### 2.1 Train Norm

```bash
PYTHONPATH=. python training/exp_runner.py --conf confs_sg/hotdog.conf --neus_pretrained_path neus/logs/blender/hotdog-neus --data_split_dir data/nerf/hotdog --expname hotdog --trainstage Norm
```
#### 2.2 Train Visibility and Indirect Illumination

```bash
PYTHONPATH=. python training/exp_runner.py --conf confs_sg/hotdog.conf --neus_pretrained_path neus/logs/blender/hotdog-neus --data_split_dir data/nerf/hotdog --expname hotdog --trainstage Vis
```
#### 2.3 Train PBR

```bash
PYTHONPATH=. python training/exp_runner.py --conf confs_sg/hotdog.conf --neus_pretrained_path neus/logs/blender/hotdog-neus --data_split_dir data/nerf/hotdog --expname hotdog --trainstage PBR
```
#### 2.4 Train RVE

```bash
PYTHONPATH=. python training/exp_runner.py --conf confs_sg/hotdog.conf --neus_pretrained_path neus/logs/blender/hotdog-neus --data_split_dir data/nerf/hotdog --expname hotdog --trainstage CESR
```
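To run all four stages for another scene in one go, a small wrapper like the sketch below can help. This is only a sketch, not part of the released code: the scene name (`lego` here), the NeuS checkpoint directory, and the data path are assumptions that must match your actual Stage 1 output and data layout.

```bash
#!/usr/bin/env bash
# Sketch: run the Norm -> Vis -> PBR -> CESR stages for one blender scene.
set -e

SCENE=lego                                 # hypothetical scene name; change as needed
CONF=confs_sg/hotdog.conf                  # hotdog.conf covers general blender scenes
NEUS_CKPT=neus/logs/blender/${SCENE}-neus  # must point to your Stage 1 output
DATA_DIR=data/nerf/${SCENE}                # must point to your data split

for STAGE in Norm Vis PBR CESR; do
    PYTHONPATH=. python training/exp_runner.py \
        --conf ${CONF} \
        --neus_pretrained_path ${NEUS_CKPT} \
        --data_split_dir ${DATA_DIR} \
        --expname ${SCENE} \
        --trainstage ${STAGE}
done
```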
## Results

### Albedo

<img src="assets/albedo.png" alt="image-20231020012659356" style="zoom:50%;" />

### Roughness

<img src="assets/roughness.png" alt="image-20231020012659356" style="zoom:50%;" />

### Envmap

<img src="assets/envmap.png" alt="image-20231020012659356" style="zoom:50%;" />

### Relighting

<img src="assets/relighting.png" alt="image-20231020012659356" style="zoom:50%;" />

### De-shadow

See more on the project page.
## Acknowledgments
This work was supported by the Key R&D Program of Zhejiang (No. 2024C01069). We thank Wenxin Sun for her help with the pipeline illustration. We also thank Yuan Liu and Wen Zhou for their constructive suggestions.
## BibTex

```bibtex
@article{yang2023sireir,
  title={SIRe-IR: Inverse Rendering for BRDF Reconstruction with Shadow and Illumination Removal in High-Illuminance Scenes},
  author={Yang, Ziyi and Chen, Yanzhen and Gao, Xinyu and Yuan, Yazhen and Wu, Yu and Zhou, Xiaowei and Jin, Xiaogang},
  journal={arXiv preprint arXiv:2310.13030},
  year={2023}
}
```
This work was built on InvRender and NeuS. Please consider citing these two awesome works.