E-NeRF

E-NeRF computes a Neural Radiance Field from event-camera data. It is a joint work by

<p align="center"> <a href="https://vision.in.tum.de/members/klenk">Simon Klenk</a><sup>1</sup> &emsp; <a href="https://lukaskoestler.com">Lukas Koestler</a><sup>1</sup> &emsp; <a href="https://rpg.ifi.uzh.ch/people_scaramuzza.html">Davide Scaramuzza</a><sup>2</sup> &emsp; <a href="https://vision.in.tum.de/members/cremers">Daniel Cremers</a><sup>1</sup> </p> <p align="center"> <sub> <sup>1</sup>Computer Vision Group, Technical University of Munich, Germany <br> <sup>2</sup>Robotics and Perception Group, University of Zurich, Switzerland </sub> </p> <p align="center"> IEEE Robotics and Automation Letters (RA-L), 2023, and <br> International Conference on Intelligent Robots and Systems (IROS), 2023 </p>

(Teaser figure)

If you use this code or our paper's results, please cite our work (<a href="https://arxiv.org/abs/2208.11300">link to paper</a>):

@article{klenk2022nerf,
  title={E-NeRF: Neural Radiance Fields from a Moving Event Camera},
  author={Klenk, Simon and Koestler, Lukas and Scaramuzza, Davide and Cremers, Daniel},
  journal={IEEE Robotics and Automation Letters},
  year={2023}
} 

Data

The datasets were simulated using ESIM. They can be found at https://vision.in.tum.de/research/enerf.
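
Each simulated event is a tuple (timestamp, x, y, polarity). As a minimal loading sketch, assuming a plain-text layout with one event per line (the released files may use a different container or column order):

import numpy as np

def load_events(path):
    # One event per line: timestamp[s] x y polarity (assumed layout)
    ev = np.loadtxt(path)
    t = ev[:, 0]
    x, y = ev[:, 1].astype(int), ev[:, 2].astype(int)
    p = np.where(ev[:, 3] > 0, 1, -1)  # map {0,1} polarities to {-1,+1}
    return t, x, y, p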

Code

Our Python implementation is based on torch-ngp, a PyTorch implementation of instant-ngp as described in Instant Neural Graphics Primitives with a Multiresolution Hash Encoding.
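
For intuition, a toy version of the multiresolution hash encoding at the core of instant-ngp might look as follows. This is illustrative only: torch-ngp uses a fused CUDA implementation, and the real encoding trilinearly interpolates the eight surrounding grid corners, which is omitted here.

import torch

class ToyHashEncoding(torch.nn.Module):
    # Simplified multiresolution hash encoding (illustrative, not the repo's code)
    def __init__(self, levels=8, table_size=2**14, features=2, base_res=16, growth=1.5):
        super().__init__()
        self.table_size = table_size
        self.res = [int(base_res * growth ** l) for l in range(levels)]
        # One small trainable feature table per resolution level
        self.tables = torch.nn.Parameter(1e-4 * torch.randn(levels, table_size, features))

    def forward(self, x):  # x: (N, 3) points in [0, 1]^3
        feats = []
        for l, r in enumerate(self.res):
            idx = (x * r).long()  # integer grid coordinates at this level
            # Spatial hash with large primes, as in the instant-ngp paper
            h = (idx[:, 0] ^ (idx[:, 1] * 2654435761) ^ (idx[:, 2] * 805459861)) % self.table_size
            feats.append(self.tables[l, h])  # nearest-corner lookup
        return torch.cat(feats, dim=-1)  # (N, levels * features)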

Install

git clone --recursive https://github.com/knelk/enerf
cd enerf

Install with conda

conda env create -f environment.yml
conda activate enerf

Install with pip

python3 -m venv env
source env/bin/activate
pip install -r requirements.txt

Tested environments

Usage

We support three data formats: esim, tumvie, and eds. The data loading is flexible and easy to adapt, e.g. to feed COLMAP poses or similar (see nerf/provider.py and the scripts folder for data preprocessing, as well as the associated issues in torch-ngp); a minimal pose-conversion sketch follows below.
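
Feeding external poses mostly comes down to producing 4x4 camera-to-world matrices. A minimal sketch, assuming TUM-style trajectory lines of the form "timestamp tx ty tz qx qy qz qw" (the quatlist files used for rendering below follow a similar quaternion-list idea; the exact format is an assumption here):

import numpy as np
from scipy.spatial.transform import Rotation

def quatline_to_c2w(line):
    # "timestamp tx ty tz qx qy qz qw" -> 4x4 camera-to-world matrix
    vals = np.asarray(line.split(), dtype=float)
    t, q = vals[1:4], vals[4:8]
    c2w = np.eye(4)
    c2w[:3, :3] = Rotation.from_quat(q).as_matrix()  # scipy expects (x, y, z, w)
    c2w[:3, 3] = t
    return c2w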

Training

The files in ./configs are a good starting point for understanding the most important flags of our code. Important event-related flags include C_thres (-1 to use the normalized loss function), events (boolean), event_only (boolean), and accumulate_evs (boolean). The scene is assumed to lie within [-bound, bound] and to be centered at (0, 0, 0).
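
As an illustration only, torch-ngp's entry point is main_nerf.py, so a training call with the event flags above might look like the line below; treat the paths and exact flag syntax as assumptions, and check ./configs for the options actually used:

python main_nerf.py /path/to/scene --workspace exp/scene --events --event_only --C_thres -1 --bound 1.0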

Please refer to torch-ngp and its issues for more details regarding the installation, usage and scene assumptions.
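
For intuition on C_thres: an event camera fires an event whenever the log-brightness at a pixel changes by the contrast threshold C, so an event-based loss compares rendered log-brightness differences against accumulated event polarities. Below is a minimal PyTorch sketch of the two variants the flag switches between; all names are illustrative, not the repo's exact code:

import torch

def event_loss(logL_prev, logL_cur, ev_accum, C_thres=-1.0):
    # logL_prev / logL_cur: rendered log-brightness at two event timestamps
    # ev_accum: per-pixel sum of event polarities in between
    pred = logL_cur - logL_prev
    if C_thres > 0:
        # Known contrast threshold: match the metric brightness change
        return torch.mean((pred - C_thres * ev_accum) ** 2)
    # C unknown (C_thres = -1): compare normalized changes instead,
    # which makes the loss independent of the threshold's scale
    pred_n = pred / (pred.norm() + 1e-8)
    targ_n = ev_accum / (ev_accum.norm() + 1e-8)
    return torch.mean((pred_n - targ_n) ** 2)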

Data Preprocessing

Rendering

We provide a script for rendering. For example, to render the trained E-NeRF model at the keyframe (kfs) validation (val) poses, simply run python scripts/render.py --model_dir EXPDIR --infile EXPDIR/val_final_quatlist_kfs_ns.txt. If you want random poses (sampled around the training poses), do not provide an infile and set rand_poses=1, as in the example below.
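
A hypothetical invocation for random poses (assuming rand_poses is passed as a flag to the same script) could look like:

python scripts/render.py --model_dir EXPDIR --rand_poses 1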

Troubleshooting

Acknowledgement

@misc{torch-ngp,
  title={Torch-ngp: a PyTorch implementation of instant-ngp},
  author={Tang, Jiaxiang},
  year={2022},
  note={https://github.com/ashawkey/torch-ngp}
}
@inproceedings{klenk2021tum,
  title={TUM-VIE: The TUM stereo visual-inertial event dataset},
  author={Klenk, Simon and Chui, Jason and Demmel, Nikolaus and Cremers, Daniel},
  booktitle={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={8601--8608},
  year={2021},
  organization={IEEE}
}
@inproceedings{hidalgo2022event,
  title={Event-aided Direct Sparse Odometry},
  author={Hidalgo-Carri{\'o}, Javier and Gallego, Guillermo and Scaramuzza, Davide},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={5781--5790},
  year={2022}
}
@article{Rebecq19pami,
  title={High Speed and High Dynamic Range Video with an Event Camera},
  author={Rebecq, Henri and Ranftl, Ren{\'e} and Koltun, Vladlen and Scaramuzza, Davide},
  journal={{IEEE} Transactions on Pattern Analysis and Machine Intelligence (T-PAMI)},
  url={http://rpg.ifi.uzh.ch/docs/TPAMI19_Rebecq.pdf},
  year={2019}
}