EndoSurf: Neural Surface Reconstruction of Deformable Tissues with Stereo Endoscope Videos
Paper | Models
Code for the MICCAI 2023 oral paper "EndoSurf: Neural Surface Reconstruction of Deformable Tissues with Stereo Endoscope Videos" by Ruyi Zha, Xuelian Cheng, Hongdong Li, Mehrtash Harandi, and Zongyuan Ge.
EndoSurf is a neural-field-based method that reconstructs deforming surgical sites from stereo endoscope videos.
<img src="media/pipeline.jpg" alt="Pipeline" style="zoom:15%;" />Demo
| EndoSurf (Ours) | EndoNeRF (baseline) |
| --- | --- |
| <img src="media/demo_pull_endosurf.gif" style="zoom: 40%;" /> | <img src="media/demo_pull_endonerf.gif" style="zoom: 40%;" /> |
| <img src="media/demo_cut_endosurf.gif" style="zoom: 40%;" /> | <img src="media/demo_cut_endonerf.gif" style="zoom: 40%;" /> |
Setup
We recommend using Miniconda to set up the environment.
# Create conda environment
conda create -n endosurf python=3.9
conda activate endosurf
# Install packages
pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 --extra-index-url https://download.pytorch.org/whl/cu113
pip install -r requirements.txt
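Before starting a run, you can sanity-check that the CUDA 11.3 build of PyTorch was installed and that a GPU is visible. This one-liner is just a convenience check, not part of the repository:

```sh
# Print the PyTorch version and whether a CUDA device is available
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```

The expected output is `1.11.0+cu113 True`; if it prints `False`, the training commands below will not see the GPU selected by `CUDA_VISIBLE_DEVICES`.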
Datasets and checkpoints
- Follow these instructions to prepare the ENDONERF dataset.
- Follow these instructions to prepare the SCARED2019 dataset.
- Follow these instructions to download the checkpoints.
Training
Use `src/trainer/trainer_endosurf.py` for training. You can find all configurations in `configs/endosurf` and all training commands in `scripts.sh`.
# Train EndoSurf on pulling_soft_tissues
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg configs/endosurf/baseline/base_pull.yml --mode train
# Train EndoSurf on cutting_tissues_twice
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg configs/endosurf/baseline/base_cut.yml --mode train
# Train EndoSurf on scared2019_dataset_1_keyframe_1
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg configs/endosurf/baseline/base_d1k1.yml --mode train
# Train EndoSurf on scared2019_dataset_2_keyframe_1
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg configs/endosurf/baseline/base_d2k1.yml --mode train
# Train EndoSurf on scared2019_dataset_3_keyframe_1
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg configs/endosurf/baseline/base_d3k1.yml --mode train
# Train EndoSurf on scared2019_dataset_6_keyframe_1
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg configs/endosurf/baseline/base_d6k1.yml --mode train
# Train EndoSurf on scared2019_dataset_7_keyframe_1
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg configs/endosurf/baseline/base_d7k1.yml --mode train
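To train every case back to back rather than launching each command by hand, a plain shell loop over the config files listed above is enough. This is only a convenience sketch and is not part of `scripts.sh`:

```sh
# Sequentially train EndoSurf on all ENDONERF and SCARED2019 cases
for cfg in base_pull base_cut base_d1k1 base_d2k1 base_d3k1 base_d6k1 base_d7k1; do
    CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
        --cfg configs/endosurf/baseline/${cfg}.yml --mode train
done
```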
Test
After training, run `src/trainer/trainer_endosurf.py` in test mode to evaluate the reconstruction results on the test set. You can test 2D images with `--mode test_2d`, 3D meshes with `--mode test_3d`, or both with `--mode test`. For example, to test EndoSurf on the `pulling_soft_tissues` case:
# Evaluate all results
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
--cfg configs/endosurf/baseline/base_pull.yml --mode test
# Evaluate 2D images only
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
--cfg configs/endosurf/baseline/base_pull.yml --mode test_2d
# Evaluate 3D meshes only
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
--cfg configs/endosurf/baseline/base_pull.yml --mode test_3d
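Each case is evaluated independently, so if more than one GPU is available you can test several cases in parallel by pointing each process at its own device through `CUDA_VISIBLE_DEVICES`. A sketch assuming a two-GPU machine:

```sh
# Evaluate two cases at the same time, one per GPU
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
    --cfg configs/endosurf/baseline/base_pull.yml --mode test &
CUDA_VISIBLE_DEVICES=1 python src/trainer/trainer_endosurf.py \
    --cfg configs/endosurf/baseline/base_cut.yml --mode test &
wait
```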
You can also render 2D images and 3D meshes of all video frames (e.g., the GIFs shown in the demo above) with `--mode demo`; use `--mode demo_2d` or `--mode demo_3d` to render only the images or only the meshes.
# Render both images and meshes
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
--cfg configs/endosurf/baseline/base_pull.yml --mode demo
# Render 2D images only
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
--cfg configs/endosurf/baseline/base_pull.yml --mode demo_2d
# Render 3D meshes only
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py \
--cfg configs/endosurf/baseline/base_pull.yml --mode demo_3d
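Putting the steps together, a full run for a single case can be chained so that evaluation and rendering only start after training succeeds. The sketch below simply combines the commands above for `pulling_soft_tissues`:

```sh
# Train, evaluate, and render pulling_soft_tissues in one go
CFG=configs/endosurf/baseline/base_pull.yml
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg $CFG --mode train && \
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg $CFG --mode test && \
CUDA_VISIBLE_DEVICES=0 python src/trainer/trainer_endosurf.py --cfg $CFG --mode demo
```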
Reproducing results in the paper
To reproduce all results reported in the paper, first download the data information files (`*.pkl`) from here and replace the corresponding files in `data/data_info`. This is necessary because `preprocess.py` involves some random operations, e.g., point-cloud noise removal. Then download the pretrained models from here. All training/test/demo commands for our method, the baseline methods, and the ablation study can be found in `scripts.sh`.
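If the downloaded `*.pkl` files land in a separate folder, replacing the old ones is a single copy. `downloaded_data_info/` below is a hypothetical name for wherever you unpacked the download:

```sh
# Overwrite the data information files produced by preprocess.py with the released ones
# ("downloaded_data_info" is a placeholder for your actual download location)
cp downloaded_data_info/*.pkl data/data_info/
```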
Contact
For any queries, please contact ruyi.zha@anu.edu.au.