# [CVPR 2022] GeoNeRF: Generalizing NeRF with Geometry Priors

Mohammad Mahdi Johari, Yann Lepoittevin, François Fleuret <br>
Project Page | Paper
This repository contains a PyTorch Lightning implementation of our paper, GeoNeRF: Generalizing NeRF with Geometry Priors.
## Installation
Tested on NVIDIA Tesla V100 and GeForce RTX 3090 GPUs with PyTorch 1.9 and PyTorch Lightning 1.3.7.

To install the dependencies, in addition to PyTorch, run:

```
pip install -r requirements.txt
```
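A minimal sketch of a full setup, assuming the versions from the tested configuration above (the exact PyTorch wheel depends on your CUDA toolkit, so adjust as needed):

```shell
# Install PyTorch first (version from the tested configuration; pick the wheel
# matching your CUDA setup), then the remaining dependencies.
pip install torch==1.9.0
pip install -r requirements.txt
```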
## Evaluation and Training
To reproduce our results, download the pretrained weights from here and put them in the `pretrained_weights` folder. Then, follow the instructions for each of the LLFF (Real Forward-Facing), NeRF (Realistic Synthetic), and DTU datasets.
### LLFF (Real Forward-Facing) Dataset
Download `nerf_llff_data.zip` from here and set its path as `llff_path` in the `config_llff.txt` file.
For evaluating our generalizable model (the `pretrained.ckpt` model in the `pretrained_weights` folder), set the scene properly (e.g. `fern`), set the number of source views to 9 (`nb_views = 9`) in the `config_llff.txt` file, and run:

```
python run_geo_nerf.py --config configs/config_llff.txt --eval
```
For fine-tuning on a specific scene, set `nb_views = 7` and run:

```
python run_geo_nerf.py --config configs/config_llff.txt
```
Once fine-tuning is finished, run the evaluation command with `nb_views = 9` to get the final rendered results.
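Concretely, the evaluation settings above amount to a handful of entries in `config_llff.txt`. The exact file format is an assumption here (simple `key = value` lines, as in typical NeRF codebases); a toy sketch of reading such a file:

```python
# Toy parser for a "key = value" config file such as config_llff.txt.
# The keys below (llff_path, scene, nb_views) come from the instructions above;
# the file layout and the example path are assumptions for illustration.
def parse_config(text):
    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        cfg[key.strip()] = value.strip()
    return cfg

example = """
llff_path = /data/nerf_llff_data
scene = fern
nb_views = 9
"""
cfg = parse_config(example)
print(cfg["scene"], cfg["nb_views"])  # fern 9
```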
### NeRF (Realistic Synthetic) Dataset
Download `nerf_synthetic.zip` from here and set its path as `nerf_path` in the `config_nerf.txt` file.
For evaluating our generalizable model (the `pretrained.ckpt` model in the `pretrained_weights` folder), set the scene properly (e.g. `lego`), set the number of source views to 9 (`nb_views = 9`) in the `config_nerf.txt` file, and run:

```
python run_geo_nerf.py --config configs/config_nerf.txt --eval
```
For fine-tuning on a specific scene, set `nb_views = 7` and run:

```
python run_geo_nerf.py --config configs/config_nerf.txt
```
Once fine-tuning is finished, run the evaluation command with `nb_views = 9` to get the final rendered results.
### DTU Dataset
Download the preprocessed DTU training data, replace its `Depths` directory with `Depth_raw` from the original MVSNet repository, and set `dtu_pre_path` to this dataset in the `config_dtu.txt` file.
Then, download the original `Rectified` images from the DTU website and set `dtu_path` in the `config_dtu.txt` file accordingly.
For evaluating our generalizable model (the `pretrained.ckpt` model in the `pretrained_weights` folder), set the scene properly (e.g. `scan21`), set the number of source views to 9 (`nb_views = 9`) in the `config_dtu.txt` file, and run:

```
python run_geo_nerf.py --config configs/config_dtu.txt --eval
```
For fine-tuning on a specific scene, keep the same `nb_views = 9` and run:

```
python run_geo_nerf.py --config configs/config_dtu.txt
```
Once fine-tuning is finished, run the evaluation command with `nb_views = 9` to get the final rendered results.
### RGBD-Compatible Model
By adding the `--use_depth` argument to the aforementioned commands, you can use our RGBD-compatible model on the DTU dataset and exploit the ground-truth, low-resolution depths to help the rendering process. The pretrained weights for this model are in `pretrained_w_depth.ckpt`.
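For example, evaluating the depth-guided model on DTU combines the flag with the DTU evaluation command above (the command is assembled from the flags documented in this README):

```shell
# Evaluate on DTU with ground-truth low-resolution depth guidance
python run_geo_nerf.py --config configs/config_dtu.txt --eval --use_depth
```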
## Training From Scratch
To train our model from scratch, first prepare the following datasets:
- The original `Rectified` images from DTU. Set the corresponding path as `dtu_path` in the `config_general.txt` file.
- The preprocessed DTU training data, with its `Depths` directory replaced by `Depth_raw`. Set the corresponding path as `dtu_pre_path` in the `config_general.txt` file.
- LLFF released scenes. Download `real_iconic_noface.zip` and remove the test scenes with the following commands:

  ```
  unzip real_iconic_noface.zip
  cd real_iconic_noface/
  rm -rf data2_fernvlsb data2_hugetrike data2_trexsanta data3_orchid data5_leafscene data5_lotr data5_redflower
  ```

  Then, set the corresponding path as `llff_path` in the `config_general.txt` file.
- Collected scenes from IBRNet (Subset1 and Subset2). Set the corresponding paths as `ibrnet1_path` and `ibrnet2_path` in the `config_general.txt` file.
Also, download `nerf_llff_data.zip` and `nerf_synthetic.zip` from here for validation and testing, and set their corresponding paths as `llff_test_path` and `nerf_path` in the `config_general.txt` file.
Once all the datasets are available, train the network from scratch with the following command:
```
python run_geo_nerf.py --config configs/config_general.txt
```
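Putting the path settings above together, `config_general.txt` ends up containing entries along these lines. The key names are the ones listed in this section; the `key = value` layout is an assumption, and the paths are placeholders for your own dataset locations:

```
dtu_path = /data/dtu/Rectified
dtu_pre_path = /data/dtu_training
llff_path = /data/real_iconic_noface
ibrnet1_path = /data/ibrnet_collected_1
ibrnet2_path = /data/ibrnet_collected_2
llff_test_path = /data/nerf_llff_data
nerf_path = /data/nerf_synthetic
```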
## Contact
You can contact the author by email: mohammad.johari At idiap.ch.
## Citing
If you find our work useful, please consider citing:
```
@inproceedings{johari-et-al-2022,
  author = {Johari, M. and Lepoittevin, Y. and Fleuret, F.},
  title = {GeoNeRF: Generalizing NeRF with Geometry Priors},
  booktitle = {Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2022}
}
```
## Acknowledgement
This work was supported by ams OSRAM.