<h1 align="center">REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices (ECCV 2024)</h1>

[Paper](https://arxiv.org/abs/2403.16481)

Chaojie Ji, Yufeng Li, Yiyi Liao

Keywords: Reflective surface · Real-time rendering · Mobile device

Abstract: This work tackles the challenging task of achieving real-time novel view synthesis for reflective surfaces across various scenes. Existing real-time rendering methods, especially those based on meshes, often perform poorly in modeling surfaces with rich view-dependent appearance. Our key idea lies in leveraging meshes for rendering acceleration while incorporating a novel approach to parameterize view-dependent information. We decompose the color into diffuse and specular components, and model the specular color in the reflected direction based on a neural environment map. Our experiments demonstrate that our method achieves reconstruction quality for highly reflective surfaces comparable to state-of-the-art offline methods, while also enabling real-time rendering on edge devices such as smartphones.

Our project page is available at https://xdimlab.github.io/REFRAME/.

<img src="assets/teaser.png"/>
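As a rough illustration of the decomposition described above, here is a minimal PyTorch sketch (not the repository's actual shader): the specular color is fetched by mirroring the view direction about the surface normal, and `query_envmap` is a hypothetical stand-in for the neural environment map.

```python
import torch
import torch.nn.functional as F

def shade(diffuse, normals, view_dirs, query_envmap):
    """Toy version of the diffuse + specular color decomposition.

    diffuse:      (N, 3) per-point diffuse RGB
    normals:      (N, 3) surface normals
    view_dirs:    (N, 3) directions from surface points toward the camera
    query_envmap: callable mapping (N, 3) reflected directions to (N, 3) RGB
    """
    normals = F.normalize(normals, dim=-1)
    view_dirs = F.normalize(view_dirs, dim=-1)
    # Mirror the view direction about the normal: r = 2(n·v)n - v
    cos = (normals * view_dirs).sum(-1, keepdim=True)
    reflected = 2.0 * cos * normals - view_dirs
    specular = query_envmap(reflected)
    return (diffuse + specular).clamp(0.0, 1.0)
```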

## :book: Table Of Contents

- Installation
- Initialization (Dataset and Initial mesh)
- Usage
- Results
- Citation
- Acknowledgement
- Contact

## :house: Installation

A suitable conda environment named REFRAME can be created and activated with:

```bash
# clone this repository
git clone https://github.com/MARVELOUSJI/REFRAME

# create a new conda environment and activate it
conda create -n REFRAME python=3.8
conda activate REFRAME

# install PyTorch
conda install pytorch==1.12.1 torchvision==0.13.1 torchaudio==0.12.1 cudatoolkit=11.3 -c pytorch

# install nvdiffrast
git clone https://github.com/NVlabs/nvdiffrast.git
cd nvdiffrast
python -m pip install .

# install tiny-cuda-nn
cd ../
sudo apt-get install build-essential git

# export the CUDA paths (change to match your CUDA version)
export PATH="/usr/local/cuda-11.3/bin:$PATH"
export LD_LIBRARY_PATH="/usr/local/cuda-11.3/lib64:$LD_LIBRARY_PATH"

git clone --recursive https://github.com/nvlabs/tiny-cuda-nn
cd tiny-cuda-nn
cmake . -B build
cmake --build build --config RelWithDebInfo -j
cd bindings/torch
python setup.py install

# install the remaining packages
cd ../../../REFRAME
pip install -r requirements.txt
```

For more details on tiny-cuda-nn and nvdiffrast, see their repositories: https://github.com/NVlabs/tiny-cuda-nn and https://github.com/NVlabs/nvdiffrast.
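If the installation succeeded, a quick import check should run without errors. This is a minimal sanity sketch added for convenience, not a script shipped with the repository:

```python
# Minimal post-install sanity check for the environment above.
import torch
import nvdiffrast.torch as dr  # noqa: F401  (the import itself is the test)
import tinycudann as tcnn      # noqa: F401

print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("nvdiffrast and tiny-cuda-nn imported successfully")
```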

## :framed_picture: Initialization (Dataset and Initial mesh)

1. NeRF Synthetic Dataset
2. Shiny Blender Dataset, organized as follows (the helmet scene shown as an example; a minimal reader sketch for the `transforms_*.json` files follows this list):

   ```
   +-- ShinyBlender
   |   +-- helmet
   |   |   +-- test
   |   |   +-- train
   |   |   +-- points_of_interest.ply
   |   |   +-- transforms_test.json
   |   |   +-- transforms_train.json
   |   +-- toaster
   ```
3. Real Captured Dataset
4. Self Captured Dataset
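The `transforms_*.json` files above follow the standard NeRF/Blender camera convention. As a rough sketch (this loader is illustrative and not part of the repository; verify the keys against your copy of the data), they can be read like this:

```python
import json
import numpy as np

def load_transforms(path):
    """Read a NeRF/Blender-style transforms_*.json file (illustrative sketch)."""
    with open(path) as f:
        meta = json.load(f)
    fov_x = meta["camera_angle_x"]  # horizontal field of view in radians
    image_paths, poses = [], []
    for frame in meta["frames"]:
        image_paths.append(frame["file_path"])  # image path, extension often omitted
        poses.append(np.array(frame["transform_matrix"], dtype=np.float32))
    return fov_x, image_paths, np.stack(poses)  # poses: (N, 4, 4) camera-to-world
```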

## :computer: Usage

```bash
# training with the default settings
python main.py --datadir your_dataset_path --initial_mesh your_initial_mesh --run_name experiment_name

# training without the environment learner (directly optimizing a feature map);
# this speeds up training at some cost in quality
python main.py --datadir your_dataset_path --initial_mesh your_initial_mesh --run_name experiment_name --wenvlearner 0

# testing and UV mapping (default settings)
python main.py --datadir your_dataset_path --initial_mesh your_trained_mesh --run_name experiment_name --shader_path trained_shader --test 1 --uvmap 1

# testing and UV mapping (when trained without the environment learner)
python main.py --datadir your_dataset_path --initial_mesh your_trained_mesh --run_name experiment_name --wenvlearner 0 --shader_path trained_shader --envmap_path trained_envmap --test 1 --uvmap 1
```

## :chart_with_upwards_trend: Results

<img src="assets/table.png"/>

Quantitative Comparison. Baseline comparisons of rendering quality on three different datasets. <span style="color:red; font-weight:bold;">Red</span> marks the best result, <span style="color:orange; font-weight:bold;">orange</span> the second best, and <span style="color:#FFD700; font-weight:bold;">yellow</span> the third best.

<img src="assets/visualization.png"/>

Qualitative Comparison. Our method achieves the best rendering quality in most scenes and models reflective appearance more faithfully than the baselines.

## :clipboard: Citation

If our work is useful for your research, please consider citing:

```bibtex
@article{ji2024reframe,
  title={REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices},
  author={Ji, Chaojie and Li, Yufeng and Liao, Yiyi},
  journal={arXiv preprint arXiv:2403.16481},
  year={2024}
}
```

## :sparkles: Acknowledgement

## :e-mail: Contact

If you have any questions, please feel free to reach out at jichaojie@zju.edu.cn.