# D2GPLand [MICCAI'24][Oral]

<div align=center> <img src="assets/Figure1.png" height=230 width=750> </div>

Official implementation of the MICCAI 2024 Oral paper "Depth-Driven Geometric Prompt Learning for Laparoscopic Liver Landmark Detection".

**Authors:** Jialun Pei, Ruize Cui, Yaoqian Li, Weixin Si, Jing Qin, and Pheng-Ann Heng

**Contact:** wx.si@siat.ac.cn, peijialun@gmail.com
## 🔧 Environment preparation

The code is tested with Python 3.9.19, PyTorch 2.0.1, and CUDA 11.7; adjust the versions below to match your setup.
- Clone the repository:

```shell
git clone https://github.com/PJLallen/D2GPLand.git
cd D2GPLand
```
- Set up the anaconda environment:

```shell
# Create the D2GPLand anaconda environment from the YAML file
conda env create -f D2GPLand.yaml
# Activate the environment
conda activate D2GPLand
```
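To confirm the activated environment matches the tested versions, a quick check like the following can help (a minimal sketch, not part of the repository; it prints a notice instead of failing if PyTorch is not installed):

```python
import sys

# Report the interpreter version (tested with Python 3.9.19)
print(f"Python {sys.version_info.major}.{sys.version_info.minor}.{sys.version_info.micro}")

try:
    import torch  # tested with PyTorch 2.0.1 / CUDA 11.7
    print(f"PyTorch {torch.__version__}, CUDA available: {torch.cuda.is_available()}")
except ImportError:
    print("PyTorch not installed")
```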
## 📈 Dataset preparation

### 💥 Download the proposed L3D dataset

- L3D dataset: Google Drive
### Register datasets

Set the dataset paths as follows:

```python
DATASET_ROOT = 'D2GPLand/L3D/'
TRAIN_PATH = os.path.join(DATASET_ROOT, 'Train/')
TEST_PATH = os.path.join(DATASET_ROOT, 'Test/')
VAL_PATH = os.path.join(DATASET_ROOT, 'Val/')
```
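For illustration, the three split paths above can be built in one place and checked before training, so a misplaced dataset fails fast (a hypothetical helper, not part of the repository; point `DATASET_ROOT` at wherever you extracted L3D):

```python
import os

DATASET_ROOT = 'D2GPLand/L3D/'

# Build the three split directories the same way as the snippet above
SPLITS = {name: os.path.join(DATASET_ROOT, name + '/')
          for name in ('Train', 'Test', 'Val')}

# Report any split directory that is missing on disk
for name, path in SPLITS.items():
    status = 'ok' if os.path.isdir(path) else 'missing'
    print(f"{name}: {path} ({status})")
```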
## 🚀 Pre-trained weights

- D2GPLand with SAM-B and ResNet-34: Google Drive
## ⚙️ Usage

### Train
```shell
python train.py --data_path {PATH_TO_DATASET} \
--batch_size 4 --lr 1e-4 --decay_lr 1e-6 --epoch 60
```

Please replace `{PATH_TO_DATASET}` with your own dataset directory.
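The flags in the command above suggest an argument parser roughly like the following (a hedged sketch only; the actual `train.py` may define more options or different defaults):

```python
import argparse

def build_parser():
    # Mirror the training flags shown in the command above
    parser = argparse.ArgumentParser(description='Train D2GPLand')
    parser.add_argument('--data_path', type=str, required=True,
                        help='root directory of the L3D dataset')
    parser.add_argument('--batch_size', type=int, default=4)
    parser.add_argument('--lr', type=float, default=1e-4,
                        help='initial learning rate')
    parser.add_argument('--decay_lr', type=float, default=1e-6,
                        help='final learning rate after decay')
    parser.add_argument('--epoch', type=int, default=60,
                        help='number of training epochs')
    return parser

# Parse a sample command line to show the resulting configuration
args = build_parser().parse_args(['--data_path', 'D2GPLand/L3D/'])
print(args.data_path, args.batch_size, args.lr)
```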
### Eval
```shell
python test.py --model_path {PATH_TO_THE_MODEL_WEIGHTS} \
--prototype_path {PATH_TO_THE_PROTOTYPE_WEIGHTS} \
--data_path {PATH_TO_DATASET}
```

- `{PATH_TO_THE_MODEL_WEIGHTS}`: path to the pre-trained model weights
- `{PATH_TO_THE_PROTOTYPE_WEIGHTS}`: path to the pre-trained prototype weights
- `{PATH_TO_DATASET}`: path to the dataset directory
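Before launching evaluation, the three paths can be sanity-checked so a typo fails immediately rather than mid-run (a hypothetical helper, not part of the repository; the file names are placeholders):

```python
import os

def check_eval_paths(model_path, prototype_path, data_path):
    """Return a list of human-readable errors for any missing input."""
    errors = []
    # Both weight files must exist on disk
    for label, path in (('model weights', model_path),
                        ('prototype weights', prototype_path)):
        if not os.path.isfile(path):
            errors.append(f"{label} not found: {path}")
    # The dataset root must be a directory
    if not os.path.isdir(data_path):
        errors.append(f"dataset directory not found: {data_path}")
    return errors

# Example with placeholder paths; replace with your own before use
for problem in check_eval_paths('model.pth', 'prototype.pth', 'D2GPLand/L3D/'):
    print(problem)
```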
## Acknowledgement

This work is based on:

We thank the authors for their great work!
## 📚 Citation

If this work is helpful to your research, please cite:
```bibtex
@inproceedings{pei2024land,
  title={Depth-Driven Geometric Prompt Learning for Laparoscopic Liver Landmark Detection},
  author={Pei, Jialun and Cui, Ruize and Li, Yaoqian and Si, Weixin and Qin, Jing and Heng, Pheng-Ann},
  booktitle={MICCAI},
  year={2024},
  organization={Springer}
}
```