<!-- PROJECT LOGO -->
<p align="center">
  <h1 align="center">Gaussian Splatting SLAM</h1>
  <p align="center">
    <a href="https://muskie82.github.io/"><strong>*Hidenobu Matsuki</strong></a>
    ·
    <a href="https://rmurai.co.uk/"><strong>*Riku Murai</strong></a>
    ·
    <a href="https://www.imperial.ac.uk/people/p.kelly/"><strong>Paul H.J. Kelly</strong></a>
    ·
    <a href="https://www.doc.ic.ac.uk/~ajd/"><strong>Andrew J. Davison</strong></a>
  </p>
  <p align="center">(* Equal Contribution)</p>
  <h3 align="center">CVPR 2024 (Highlight)</h3>
  <h3 align="center"><a href="https://arxiv.org/abs/2312.06741">Paper</a> | <a href="https://youtu.be/x604ghp9R_Q?si=nYoWr8h2Xh-6L_KN">Video</a> | <a href="https://rmurai.co.uk/projects/GaussianSplattingSLAM/">Project Page</a></h3>
</p>
<p align="center">
  <a href=""><img src="./media/teaser.gif" alt="teaser" width="100%"></a>
  <a href=""><img src="./media/gui.jpg" alt="gui" width="100%"></a>
</p>
<p align="center">
This software implements the dense SLAM system presented in our CVPR'24 paper, <a href="https://arxiv.org/abs/2312.06741">Gaussian Splatting SLAM</a>. It is the first monocular SLAM method based solely on 3D Gaussian Splatting (left), and it also supports stereo and RGB-D inputs (middle/right).
</p>

Getting Started

Installation

git clone https://github.com/muskie82/MonoGS.git --recursive
cd MonoGS

Set up the environment.

conda env create -f environment.yml
conda activate MonoGS

Depending on your setup, please change the pytorch/cudatoolkit dependency versions in environment.yml, following the official PyTorch installation instructions.
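
As a quick sanity check that the installed PyTorch build matches your CUDA toolkit, a minimal sketch using only standard PyTorch calls:

import torch

print(torch.__version__)          # installed PyTorch version
print(torch.version.cuda)         # CUDA version PyTorch was built against
print(torch.cuda.is_available())  # True if a usable GPU is detected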

Our tested setups were:

Quick Demo

bash scripts/download_tum.sh
python slam.py --config configs/mono/tum/fr3_office.yaml

You will see a GUI window pop up.

Downloading Datasets

Running the following scripts will automatically download datasets to the ./datasets folder.

TUM-RGBD dataset

bash scripts/download_tum.sh

Replica dataset

bash scripts/download_replica.sh

EuRoC MAV dataset

bash scripts/download_euroc.sh

Run

Monocular

python slam.py --config configs/mono/tum/fr3_office.yaml

RGB-D

python slam.py --config configs/rgbd/tum/fr3_office.yaml
python slam.py --config configs/rgbd/replica/office0.yaml

Or run the single-process version:

python slam.py --config configs/rgbd/replica/office0_sp.yaml

Stereo (experimental)

python slam.py --config configs/stereo/euroc/mh02.yaml

Live demo with Realsense

First, you'll need to install pyrealsense2. Inside the conda environment, run:

pip install pyrealsense2

Connect the RealSense camera to the PC via a USB 3 port, then run:

python slam.py --config configs/live/realsense.yaml

We tested the method with an Intel RealSense D455. We recommend using a similar global-shutter camera for robust camera tracking. Please avoid aggressive camera motion, especially before the initial BA (bundle adjustment) is performed. Check out the first 15 seconds of our YouTube video to see how you should move the camera for initialisation. For the live demo, we recommend using the code in the dev.speed-up branch.
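
If the live demo fails to start, a minimal sketch like the following, using the standard pyrealsense2 device-enumeration API, can confirm that the camera is detected:

import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    # Print the name and serial number of each connected RealSense device
    print(dev.get_info(rs.camera_info.name),
          dev.get_info(rs.camera_info.serial_number))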

<p align="center"> <a href=""> <img src="./media/realsense.png" alt="realsense" width="50%"> </a> </p>

Evaluation

To evaluate our method, please add the --eval flag to the command line:

python slam.py --config configs/mono/tum/fr3_office.yaml --eval

This flag automatically runs the system in headless mode and logs the results, including the rendering metrics.
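
To benchmark several sequences in one go, a small driver script such as the sketch below works; it invokes only the documented slam.py command line, and the config paths are a hypothetical selection from those listed above:

import subprocess

# Configs to evaluate; add or remove entries as needed.
configs = [
    "configs/mono/tum/fr3_office.yaml",
    "configs/rgbd/tum/fr3_office.yaml",
    "configs/rgbd/replica/office0.yaml",
]
for cfg in configs:
    # Run each evaluation headlessly, stopping on the first failure
    subprocess.run(["python", "slam.py", "--config", cfg, "--eval"], check=True)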

Reproducibility

There might be minor differences between the released version and the results in the paper. Please bear in mind that the multi-process mode is not fully deterministic, since its performance depends on GPU utilisation. We ran all our experiments on an RTX 4090; performance may differ on a different GPU.

Acknowledgement

This work builds on many open-source projects. We extend our gratitude to their authors.

License

MonoGS is released under the license described in LICENSE.md. For a list of code dependencies that are not property of the authors of MonoGS, please check Dependencies.md.

Citation

If you find this code/work useful in your own research, please consider citing the following:

@inproceedings{Matsuki:Murai:etal:CVPR2024,
  title={{G}aussian {S}platting {SLAM}},
  author={Hidenobu Matsuki and Riku Murai and Paul H. J. Kelly and Andrew J. Davison},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2024}
}