DiSR-NeRF: Diffusion-Guided View-Consistent Super-Resolution NeRF
Jie Long Lee, Chen Li, Gim Hee Lee
Abstract
We present DiSR-NeRF, a diffusion-guided framework for view-consistent super-resolution (SR) NeRF. Unlike prior works, we circumvent the requirement for high-resolution (HR) reference images by leveraging existing powerful 2D super-resolution models. Nonetheless, independently super-resolved 2D images are often inconsistent across different views. We thus propose Iterative 3D Synchronization (I3DS) to mitigate the inconsistency problem via the inherent multi-view consistency property of NeRF. Specifically, our I3DS alternates between upscaling low-resolution (LR) rendered images with diffusion models and updating the underlying 3D representation with standard NeRF training. We further introduce Renoised Score Distillation (RSD), a novel score-distillation objective for 2D image super-resolution. Our RSD combines features from ancestral sampling and Score Distillation Sampling (SDS) to generate sharp images that are also LR-consistent. Qualitative and quantitative results on both synthetic and real-world datasets demonstrate that DiSR-NeRF achieves better NeRF super-resolution results than existing works. Code and video results are available at the project website.
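The I3DS loop described above can be summarized in a short sketch. The Python snippet below only illustrates the alternation between 2D diffusion upscaling and NeRF training; the helper names (nerf.render, sr_diffusion.upscale, nerf.train_step) are hypothetical placeholders and do not correspond to the actual API of this codebase.

# Sketch of the I3DS alternation (illustrative only; all names are hypothetical).
def i3ds(nerf, sr_diffusion, train_cameras, num_cycles, steps_per_cycle):
    for cycle in range(num_cycles):
        # 1) Render LR images from the current NeRF at the training views.
        lr_renders = [nerf.render(cam) for cam in train_cameras]
        # 2) Upscale each view independently with a 2D SR diffusion model.
        #    These upscaled views are generally not consistent across viewpoints.
        sr_targets = [sr_diffusion.upscale(img) for img in lr_renders]
        # 3) Fit the NeRF to the upscaled targets with standard photometric training;
        #    NeRF's multi-view consistency synchronizes the per-view details in 3D.
        for _ in range(steps_per_cycle):
            nerf.train_step(train_cameras, sr_targets)
    return nerf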
Installation
git clone https://github.com/leejielong/DiSR-NeRF
cd DiSR-NeRF
conda create -n disrnerf
conda activate disrnerf
# Install packages
pip install -r requirements.txt
Training
Download the NeRF-Synthetic and LLFF datasets here, then arrange the data directory as follows:
configs
data
├── blender
│   ├── chair
│   └── drums
└── nerf_llff_data
    ├── fern
    └── flower
python launch.py --config configs/nerfdiffusr-sr.yaml --train
Testing
python launch.py --config configs/nerfdiffusr-sr.yaml --test
Citations
@misc{lee2024disrnerf,
title={DiSR-NeRF: Diffusion-Guided View-Consistent Super-Resolution NeRF},
author={Jie Long Lee and Chen Li and Gim Hee Lee},
year={2024},
eprint={2404.00874},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
Acknowledgement
This implementation is built upon threestudio. We thank the authors for their contribution.