<div> <!-- <img src='https://i.imgur.com/tFP6Q3p.gif' align="right" height="120px" width="180px" alt='house'> --> <img src='https://i.imgur.com/Tq07diD.gif' align="right" height="120px" width="66px" alt='sculpture'> <img src='https://i.imgur.com/3boKX8u.gif' align="right" height="120px" width="180px" alt='printer'> </div>

<br><br><br><br>

MatchNeRF

Official PyTorch implementation of MatchNeRF, a new generalizable NeRF approach that employs explicit correspondence matching as the geometry prior and can perform novel view synthesis on unseen scenarios with as few as two source views as input, without requiring any retraining or fine-tuning. <br>

Explicit Correspondence Matching for Generalizable Neural Radiance Fields
Yuedong Chen<sup>1</sup>, Haofei Xu<sup>2</sup>, Qianyi Wu<sup>1</sup>, Chuanxia Zheng<sup>3</sup>, Tat-Jen Cham<sup>4</sup>, Jianfei Cai<sup>1</sup>
<sup>1</sup>Monash University, <sup>2</sup>ETH Zurich, <sup>3</sup>University of Oxford, <sup>4</sup>Nanyang Technological University
arXiv 2023

Paper | Project Page | Code

<img src="docs/matchnerf.png"> <details> <summary>Recent Updates</summary> </details> <br>

Table of Contents

- Setup Environment
- Download Datasets
- Testing
- Training
- Rendering Video
- Use Your Own Data
- Miscellaneous

Setup Environment

This project was developed and tested on a CUDA 11 device. For other CUDA versions, manually update the requirements.txt file to match your setup before proceeding.

git clone --recursive https://github.com/donydchen/matchnerf.git
cd matchnerf
conda create --name matchnerf python=3.8
conda activate matchnerf
pip install -r requirements.txt
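As a quick sanity check (a suggestion rather than part of the official setup), you can confirm that the installed PyTorch build reports a CUDA runtime and can see your GPU:

```bash
# Print the PyTorch version, the CUDA version it was built against,
# and whether a GPU is visible; expect a CUDA 11.x build for this setup.
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
```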

Rendering video output requires ffmpeg to be installed on the system; you can verify this by running ffmpeg -version. If ffmpeg is not available, consider installing it with conda install ffmpeg.
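For convenience, the check and the fallback install can be combined into a single line (a minimal sketch, not part of the original instructions):

```bash
# Install ffmpeg via conda only if it is not already available on the PATH.
ffmpeg -version > /dev/null 2>&1 || conda install -y ffmpeg
```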

Download Datasets

DTU (for both training and testing)

data/DTU/
    |__ Cameras/
    |__ Depths/
    |__ Rectified/
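Before training or testing, it may help to confirm that the extracted data matches the layout above; a minimal sketch, assuming the dataset lives under data/DTU/, is:

```bash
# Check that the three expected DTU sub-directories are in place.
for d in Cameras Depths Rectified; do
    [ -d "data/DTU/$d" ] && echo "OK      data/DTU/$d" || echo "MISSING data/DTU/$d"
done
```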

Blender (for testing only)

Real Forward Facing (for testing only)

Testing

MVSNeRF Setting (3 Nearest Views)

Download the pretrained model matchnerf_3v.pth and save it to configs/pretrained_models/matchnerf_3v.pth, then run

python test.py --yaml=test --name=matchnerf_3v

If you encounter a CUDA out-of-memory error, reduce the ray sampling number, e.g., append --nerf.rand_rays_test=4096 to the command.
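For example, the memory-friendly version of the test command would be:

```bash
# Test with fewer rays per batch to stay within limited GPU memory.
python test.py --yaml=test --name=matchnerf_3v --nerf.rand_rays_test=4096
```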

Performance should exactly match the numbers below:

| Dataset | PSNR | SSIM | LPIPS |
| --- | --- | --- | --- |
| DTU | 26.91 | 0.934 | 0.159 |
| Real Forward Facing | 22.43 | 0.805 | 0.244 |
| Blender | 23.20 | 0.897 | 0.164 |

Training

Download the GMFlow pretrained weight (gmflow_sintel-0c07dcb3.pth) from the original GMFlow repo, and save it to configs/pretrained_models/gmflow_sintel-0c07dcb3.pth, then run

python train.py --yaml=train
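For reference, the full sequence might look as follows (assuming the weight has already been downloaded from the GMFlow repo; the source path below is only a placeholder):

```bash
# Place the GMFlow weight where the training config expects it, then start training.
mkdir -p configs/pretrained_models
mv /path/to/gmflow_sintel-0c07dcb3.pth configs/pretrained_models/gmflow_sintel-0c07dcb3.pth
python train.py --yaml=train
```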

Rendering Video

python test.py --yaml=test_video --name=matchnerf_3v_video
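If video rendering also runs into GPU memory limits, the same ray-count override as in the testing section should apply, since this command goes through the same test.py entry point (an assumption, adjust as needed):

```bash
# Video rendering with fewer rays per batch; assumed to accept the same override as the test run.
python test.py --yaml=test_video --name=matchnerf_3v_video --nerf.rand_rays_test=4096
```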

Results (without any per-scene fine-tuning) should be similar to those shown below:

<details> <summary>Visual Results</summary>

dtu_scan38_view24<br> DTU: scan38_view24

blender_materials_view36<br> Blender: materials_view36

llff_leaves_view13<br> Real Forward Facing: leaves_view13

</details>

Use Your Own Data

We provide the following demo with 3 input views for your reference.

# lower resolution but fast
python test.py --yaml=demo_own
# full version
python test.py --yaml=test_video_own

The generated video will look like the following:

colmap_printer<br> Demo: own data, printer
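The demo scene name suggests that the camera poses were obtained with COLMAP. If you want to prepare a similar capture of your own, one common preprocessing route (a hypothetical sketch, not a requirement stated by this project) is COLMAP's automatic reconstruction:

```bash
# Hypothetical preprocessing: estimate camera poses for your own images with COLMAP.
colmap automatic_reconstructor \
    --workspace_path ./my_scene \
    --image_path ./my_scene/images
```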

Miscellaneous

Citation

If you use this project for your research, please cite our paper.

@article{chen2023matchnerf,
    title={Explicit Correspondence Matching for Generalizable Neural Radiance Fields},
    author={Chen, Yuedong and Xu, Haofei and Wu, Qianyi and Zheng, Chuanxia and Cham, Tat-Jen and Cai, Jianfei},
    journal={arXiv preprint arXiv:2304.12294},
    year={2023}
}

Pull Request

You are more than welcome to contribute to this project by sending a pull request.

Acknowledgments

This implementation borrows many code snippets from GMFlow, MVSNeRF, BARF and GIRAFFE. Many thanks to all of the above-mentioned projects.