Co-SLAM: Joint Coordinate and Sparse Parametric Encodings for Neural Real-Time SLAM
Paper | Project Page | Video
<p align="center"> <a href=""> <img src="./media/coslam_teaser.gif" alt="Logo" width="80%"> </a> </p>
Hengyi Wang, Jingwen Wang, Lourdes Agapito <br />
CVPR 2023
This repository contains the code for the paper Co-SLAM: Joint Coordinate and Sparse Parametric Encodings for Neural Real-Time SLAM, a neural SLAM method that performs real-time camera tracking and dense reconstruction based on a joint encoding.
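At its core, the joint encoding concatenates a smooth coordinate encoding (one-blob) with a sparse parametric hash-grid encoding. The snippet below is a minimal sketch of that idea using tinycudann; the hyperparameter values are illustrative assumptions, not the exact settings from the paper (see the config files for those).

# Minimal sketch of the joint encoding idea (illustrative, not the authors' exact code;
# all hyperparameters below are placeholder values)
import torch
import tinycudann as tcnn

one_blob = tcnn.Encoding(3, {"otype": "OneBlob", "n_bins": 16})   # smooth coordinate encoding
hash_grid = tcnn.Encoding(3, {"otype": "HashGrid", "n_levels": 16,
                              "n_features_per_level": 2, "log2_hashmap_size": 19,
                              "base_resolution": 16, "per_level_scale": 2.0})  # sparse parametric encoding

x = torch.rand(1024, 3, device="cuda")  # sample points normalized to [0, 1]^3
feature = torch.cat([one_blob(x), hash_grid(x)], dim=-1)  # joint feature fed to the scene MLPs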
Update
- Code for Co-SLAM [2023-5-12]
- Code for offline RGB-D reconstruction: click here. [2023-5-12]
- Code for evaluation strategy and performance analysis: click here. [2023-5-18]
- Tutorials on params & creating sequences using iPhone/iPad Pro: click here. [2023-5-26]
- Tutorials on creating sequences using RealSense
Installation
Please follow the instructions below to set up the repository and its dependencies.
git clone https://github.com/HengyiWang/Co-SLAM.git
cd Co-SLAM
Install the environment
# Create conda environment
conda create -n coslam python=3.7
conda activate coslam
# Install PyTorch first (please check your CUDA version)
pip install torch==1.10.1+cu113 torchvision==0.11.2+cu113 torchaudio==0.10.1 -f https://download.pytorch.org/whl/cu113/torch_stable.html
# Install all the dependencies via pip (note: pytorch3d and tinycudann take ~10 min to build)
pip install -r requirements.txt
# Build extension (marching cubes from neuralRGBD)
cd external/NumpyMarchingCubes
python setup.py install
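Optionally, sanity-check the installation afterwards; the import names below are our assumptions for the packages built above and may differ on your setup:

# Optional sanity check (import names are assumptions; adjust if your build differs)
python -c "import torch; print(torch.cuda.is_available())"
python -c "import tinycudann"
python -c "import marching_cubes"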
For tinycudann, if you cannot access the network when you use GPUs, you can also try building it from source as below:
# Build tinycudann
git clone --recursive https://github.com/nvlabs/tiny-cuda-nn
# If the latest version of tinycudann fails to build, try pinning this commit:
# git reset --hard 91ee479d275d322a65726435040fc20b56b9c991
cd tiny-cuda-nn/bindings/torch
python setup.py install
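If the build picks the wrong GPU architecture, tiny-cuda-nn lets you pin it via an environment variable before installing; the value 86 below is only an example (RTX 30xx cards):

# Optional: pin the CUDA architecture before building (86 is an example for RTX 30xx)
export TCNN_CUDA_ARCHITECTURES=86
python setup.py install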
Dataset
Replica
Download the sequences of the Replica Dataset generated by the authors of iMAP into the ./data/Replica folder.
bash scripts/download_replica.sh # Released by authors of NICE-SLAM
ScanNet
Please follow the procedure on the ScanNet website, and extract color & depth frames from the .sens file using the provided code.
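For reference, a typical extraction call with ScanNet's Python SensReader looks like this; the scene name and output path are placeholders you should adapt:

# Example .sens extraction with ScanNet's SensReader (scene name and output path are placeholders)
python reader.py --filename scene0000_00.sens --output_path ./data/scannet/scene0000_00 \
    --export_depth_images --export_color_images --export_poses --export_intrinsics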
Synthetic RGB-D dataset
Download the sequences of the synthetic RGB-D dataset generated by the authors of neuralRGBD into the ./data/neural_rgbd_data folder. We exclude the scenes with NaN poses generated by BundleFusion.
bash scripts/download_rgbd.sh
TUM RGB-D
Download 3 sequences of the TUM RGB-D dataset into the ./data/TUM folder.
bash scripts/download_tum.sh
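After downloading, your data directory should look roughly like this; the ScanNet folder name is our assumption, the other paths follow the commands above:

./data
├── Replica
├── neural_rgbd_data
├── scannet   # folder name is an assumption; match it to your config files
└── TUM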
Run
You can run Co-SLAM using the code below:
python coslam.py --config './configs/{Dataset}/{scene}.yaml'
You can run Co-SLAM with multi-processing using the code below:
python coslam_mp.py --config './configs/{Dataset}/{scene}.yaml'
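For example, to run on the Replica room0 sequence (assuming the config file exists at this path in the repo):

python coslam.py --config './configs/Replica/room0.yaml'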
Evaluation
We employ a slightly different evaluation strategy to measure the quality of the reconstruction; you can find the code here. Note that if you want to follow the evaluation protocol of NICE-SLAM, please refer to our supplementary material for the detailed parameter settings.
Acknowledgement
We adapt code from some awesome repositories, including NICE-SLAM, NeuralRGBD, and tiny-cuda-nn. Thanks for making the code available. We also thank Zihan Zhu of NICE-SLAM and Edgar Sucar of iMAP for their prompt responses to our inquiries regarding the details of their methods.
The research presented here has been supported by a sponsored research award from Cisco Research and the UCL Centre for Doctoral Training in Foundational AI under UKRI grant number EP/S021566/1. This project made use of time on Tier 2 HPC facility JADE2, funded by EPSRC (EP/T022205/1).
Citation
If you find our code or paper useful for your research, please consider citing:
@inproceedings{wang2023coslam,
  title={Co-SLAM: Joint Coordinate and Sparse Parametric Encodings for Neural Real-Time SLAM},
  author={Wang, Hengyi and Wang, Jingwen and Agapito, Lourdes},
  booktitle={CVPR},
  year={2023}
}