CLNeRF

Official implementation of 'CLNeRF: Continual Learning Meets NeRF' (accepted to ICCV'23)

[Paper] [Video] [Dataset] [Web Demo (coming soon)]


We study the problem of continual learning in the context of NeRFs. We propose a new dataset, World Across Time (WAT), in which scene appearance and geometry change over time, i.e., across the time steps/tasks of continual learning. We also propose a simple yet effective method, CLNeRF, which combines generative replay with advanced NeRF architectures so that a single NeRF model can efficiently adapt to gradually revealed new data and render the scene at different times, with potential appearance and geometry changes, without storing historical images.
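To give a sense of the core idea, below is a minimal, hypothetical Python sketch of generative replay for continual NeRF training. The names (`train_step`, `render`, `tasks`, etc.) are placeholders for illustration only and are not the actual CLNeRF API; see the code in this repository for the real implementation.

import copy

def train_continual(model, tasks, train_step, render):
    """Continually train a NeRF-like `model` on a sequence of `tasks`.

    tasks: list of (poses, images) pairs, one per time step.
    train_step(model, poses, images): user-provided optimization routine.
    render(model, pose): user-provided rendering routine for a single pose.
    """
    replay_poses = []
    for t, (poses, images) in enumerate(tasks):
        replay_images = []
        if t > 0:
            # Generative replay: re-render past views with a frozen copy of the
            # previous model instead of storing their ground-truth images.
            frozen = copy.deepcopy(model)
            replay_images = [render(frozen, p) for p in replay_poses]
        # Train on the union of new data and replayed pseudo ground truth.
        train_step(model, list(poses) + replay_poses, list(images) + replay_images)
        # Camera poses are cheap to store, so keep them for future replay.
        replay_poses.extend(poses)
    return model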

To facilitate future research on continual NeRF, we provide the code to run different continual learning methods on different NeRF datasets (including WAT).

Please give us a star or cite our paper if you find this repository useful.

Contact

Please contact Zhipeng Cai (homepage: https://zhipengcai.github.io/, email: czptc2h@gmail.com) if you have questions or comments, or if you want to collaborate on this repository to make it better.

We are actively looking for strong research interns; contact Zhipeng if you are interested (multiple locations are possible, e.g., the US, Munich, or China).

@inproceedings{iccv23clnerf,
title={CLNeRF: Continual Learning Meets NeRF},
author={Zhipeng Cai and Matthias Müller},
year={2023},
booktitle={ICCV},
}

Installation

Hardware

Environment setup

Docker (optional)

First, make sure Docker is installed. Then clone the repository and change into it so that it is your current working directory:

git clone --recursive https://github.com/IntelLabs/CLNeRF.git
cd CLNeRF
docker pull joaquingajardo/clnerf:latest
docker run -d --name CLNeRF --gpus=all --shm-size=24g -w /workspace/CLNeRF -v ${PWD}:/workspace/CLNeRF -t joaquingajardo/clnerf:latest
docker exec -it CLNeRF bash
# Optionally, attach VS Code to the container you just created for easier debugging and development
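# (Optional) sanity-check that the GPU is visible inside the container, e.g. by running nvidia-smi in the container shell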

Dataset preparation (naming follows Fig. 4 of the main paper; WAT, SynthNeRF, and NeRF++ are currently supported)

bash prepare_datasets.sh

Run experiments

# run experiments on CLNeRF (WAT, SynthNeRF and NeRF++ datasets are currently supported)
bash run_CLNeRF.sh
# run experiments on MEIL-NeRF
bash run_MEIL.sh
# run experiments on ER (experience replay)
bash run_ER.sh
# run experiments on EWC 
bash run_EWC.sh
# run experiments on NT (naive training/finetuning on the sequential data)
bash run_NT.sh
# render video using CLNeRF model
scene=breville
task_number=5
task_curr=4
rep=10
scale=8.0 # change to the right scale according to the corresponding training script (scripts/NT/WAT/breville.sh)
ckpt_path=/export/work/zcai/WorkSpace/CLNeRF/CLNeRF/ckpts/NGPGv2_CL/colmap_ngpa_CLNerf/${scene}_10/epoch=19-v4.ckpt # change to your ckpt path
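# set render_fname to the desired output video name before running, e.g. render_fname=breville_clnerf (hypothetical example; it is not defined above)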
bash scripts/CLNeRF/WAT/render_video.sh $task_number $task_curr $scene $ckpt_path $rep $scale $render_fname

License

This repository is released under the Apache 2.0 License and is free for non-commercial use. Please contact Zhipeng for other use cases.