<h2 align="center"><a href="https://arxiv.org/abs/2403.20309">InstantSplat: Sparse-view SfM-free Gaussian Splatting in Seconds</a></h2>


<div align="center">This repository is the official implementation of InstantSplat, a sparse-view, SfM-free framework for large-scale scene reconstruction using Gaussian Splatting. InstantSplat supports 3D-GS, 2D-GS, and Mip-Splatting.</div> <br>


Free-view Rendering

https://github.com/zhiwenfan/zhiwenfan.github.io/assets/34684115/748ae0de-8186-477a-bab3-3bed80362ad7

TODO List

Get Started

Installation

  1. Clone InstantSplat and download the pre-trained model.
git clone --recursive https://github.com/NVlabs/InstantSplat.git
cd InstantSplat
git submodule update --init --recursive
cd submodules/dust3r/
mkdir -p checkpoints/
wget https://download.europe.naverlabs.com/ComputerVision/DUSt3R/DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth -P checkpoints/
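# optional check (not part of the original steps): confirm the DUSt3R checkpoint was downloaded completely
ls -lh checkpoints/DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth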
cd ../../
  2. Create the environment (or use the pre-built Docker image); here we show an example using conda.
conda create -n instantsplat python=3.11 cmake=3.14.0
conda activate instantsplat
conda install pytorch torchvision pytorch-cuda=12.1 -c pytorch -c nvidia  # use the correct version of cuda for your system
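# optional sanity check (not in the original instructions): confirm PyTorch can see your GPU
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"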
pip install -r requirements.txt
pip install submodules/simple-knn
# modify the rasterizer
vim submodules/diff-gaussian-rasterization/cuda_rasterizer/auxiliary.h
# in auxiliary.h (line 154), change 'p_view.z <= 0.2f' to 'p_view.z <= 0.001f'
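# equivalent non-interactive edit (a convenience sketch, not from the original instructions)
sed -i 's/p_view.z <= 0.2f/p_view.z <= 0.001f/' submodules/diff-gaussian-rasterization/cuda_rasterizer/auxiliary.h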
pip install submodules/diff-gaussian-rasterization
  3. Optional but highly recommended: compile the CUDA kernels for RoPE (as in CroCo v2).
# DUSt3R relies on RoPE positional embeddings; compiling the CUDA kernels gives a faster runtime.
cd submodules/dust3r/croco/models/curope/
python setup.py build_ext --inplace

Alternative: use the pre-built Docker image (based on pytorch/pytorch:2.1.2-cuda11.8-cudnn8-devel):

docker pull dockerzhiwen/instantsplat_public:2.0

If Docker fails to produce reasonable results, repeat the Installation steps above inside the container.
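To start a container with GPU access, an invocation like the following should work (the flags, mount point, and entry shell are illustrative defaults, not taken from the repository; --gpus all requires the NVIDIA Container Toolkit):

docker run --gpus all -it --rm -v $(pwd):/workspace dockerzhiwen/instantsplat_public:2.0 /bin/bash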

Usage

  1. Data preparation (Our pre-processed data: link)
  cd <data_path>
  # then do whatever data preparation
  2. Commands
  # Train InstantSplat and render an output video (no GT reference; novel views rendered by interpolation):
  bash scripts/run_train_infer.sh

  # Train InstantSplat and evaluate against ground-truth reference views:
  bash scripts/run_train_eval.sh
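If your machine has more than one GPU, the standard CUDA_VISIBLE_DEVICES environment variable can pin a run to a single device (generic CUDA behavior, not something specific to these scripts):

  # example: run on GPU 0 only
  CUDA_VISIBLE_DEVICES=0 bash scripts/run_train_infer.sh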

Acknowledgement

This work builds on many amazing research works and open-source projects; thanks to all the authors for sharing!

Citation

If you find our work useful in your research, please consider giving a star :star: and citing the following paper :pencil:.

@misc{fan2024instantsplat,
      title={InstantSplat: Unbounded Sparse-view Pose-free Gaussian Splatting in 40 Seconds},
      author={Zhiwen Fan and Wenyan Cong and Kairun Wen and Kevin Wang and Jian Zhang and Xinghao Ding and Danfei Xu and Boris Ivanovic and Marco Pavone and Georgios Pavlakos and Zhangyang Wang and Yue Wang},
      year={2024},
      eprint={2403.20309},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}