# [MICCAI2023] DHC
This repo is the official implementation of *DHC: Dual-debiased Heterogeneous Co-training Framework for Class-imbalanced Semi-supervised Medical Image Segmentation*, accepted at MICCAI 2023.
We highly recommend trying our new work: https://github.com/xmed-lab/GenericSSL, which considers more practical scenarios of semi-supervised segmentation; the paper was accepted at NeurIPS 2023!
## 1. Environment
This code has been tested with Python 3.6, PyTorch 1.8, torchvision 0.9.0, and CUDA 11.1 on Ubuntu 20.04.
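If you want to compare your local setup against this tested configuration, a quick check like the following (illustrative only, not part of the repository) prints the relevant versions:

```python
# Illustrative environment check; compare the output against the tested
# configuration (Python 3.6, PyTorch 1.8, torchvision 0.9.0, CUDA 11.1).
import sys

import torch
import torchvision

print("Python       :", sys.version.split()[0])
print("PyTorch      :", torch.__version__)
print("torchvision  :", torchvision.__version__)
print("CUDA (torch) :", torch.version.cuda)
print("GPU available:", torch.cuda.is_available())
```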
Before running the code, add the repository's `code` directory to your `PYTHONPATH`:

```bash
export PYTHONPATH=$(pwd)/code:$PYTHONPATH
```
## 2. Data Preparation
### 2.1 Synapse
The CT scans are available at https://www.synapse.org/#!Synapse:syn3193805/wiki/. Please sign up and download the dataset.
Put the data anywhere you like, then change the file paths in `config.py`.
Run `./code/data/preprocess.py` to:

- convert the `.nii.gz` files into `.npy` for faster loading (a minimal sketch of this step follows the list),
- generate the train/validation/test splits,
- generate the labeled/unlabeled splits.
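For reference, the conversion step boils down to loading each `.nii.gz` volume and saving the array as `.npy`. The sketch below is **not** the repository's `preprocess.py`; it is a minimal illustration that assumes `nibabel` is installed, and the paths and the `nii_to_npy` helper are hypothetical:

```python
# Minimal sketch of the .nii.gz -> .npy conversion (illustrative only;
# the actual preprocess.py also generates the data splits).
import os

import nibabel as nib
import numpy as np

def nii_to_npy(in_path, out_dir, case_id):
    """Load one .nii.gz volume and store it as <case_id>_image.npy."""
    volume = nib.load(in_path).get_fdata().astype(np.float32)
    os.makedirs(out_dir, exist_ok=True)
    np.save(os.path.join(out_dir, f"{case_id}_image.npy"), volume)

# Example (hypothetical paths):
# nii_to_npy("raw/img0001.nii.gz", "synapse_data/npy", "0001")
```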
The preprocessed Synapse dataset is available for download via this link.
After preprocessing, the `./synapse_data/` folder should be organized as follows:

```
./synapse_data/
├── npy
│   ├── <id>_image.npy
│   └── <id>_label.npy
└── splits
    ├── labeled_20p.txt
    ├── unlabeled_20p.txt
    ├── train.txt
    ├── eval.txt
    ├── test.txt
    └── ...
```
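Given this layout, a preprocessed case can be loaded directly with NumPy. The snippet below is only a sketch of how the files can be consumed (not part of the repository), assuming each split file lists one case id per line; `load_case` is a hypothetical helper:

```python
# Sketch of reading a split file and loading one preprocessed case
# (assumes one case id per line in the split files).
import os

import numpy as np

def load_case(data_root, case_id):
    image = np.load(os.path.join(data_root, "npy", f"{case_id}_image.npy"))
    label = np.load(os.path.join(data_root, "npy", f"{case_id}_label.npy"))
    return image, label

with open("./synapse_data/splits/labeled_20p.txt") as f:
    labeled_ids = [line.strip() for line in f if line.strip()]

image, label = load_case("./synapse_data", labeled_ids[0])
print(image.shape, label.shape)
```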
### 2.2 AMOS
The dataset can be downloaded from https://amos22.grand-challenge.org/Dataset/. Run `./code/data/preprocess_amos.py` to pre-process it.
The preprocessed AMOS22 dataset is available for download via this link.
## 3. Training & Testing & Evaluating
Run the following command for training, testing, and evaluating:

```bash
bash train3times_seeds_20p.sh -c 0 -t synapse -m dhc -e '' -l 3e-2 -w 0.1
```
`20p` denotes training with 20% labeled data; change it to `2p`, `5p`, ... for 2%, 5%, ... labeled data.
Parameters:

- `-c`: which GPU to train on
- `-t`: task, either `synapse` or `amos`
- `-m`: method; `dhc` is our proposed method. Other available methods include:
  - `cps`
  - `uamt`
  - `urpc`
  - `ssnet`
  - `dst`
  - `depl`
  - `adsh`
  - `crest`
  - `simis`
  - `acisis`
  - `cld`
- `-e`: name of the current experiment
- `-l`: learning rate
- `-w`: weight of the unsupervised loss
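If you prefer launching experiments from Python (for example to sweep tasks or label ratios), a small wrapper such as the one below can assemble the command line. This is a hypothetical convenience script, not part of the repository; it only reuses the flags and the `train3times_seeds_<ratio>.sh` naming described above:

```python
# Hypothetical wrapper around the provided training scripts; flags follow the
# README (-c GPU, -t task, -m method, -e experiment name, -l lr, -w unsup weight).
import subprocess

def run_experiment(ratio="20p", gpu="0", task="synapse", method="dhc",
                   exp_name="", lr="3e-2", unsup_weight="0.1"):
    cmd = [
        "bash", f"train3times_seeds_{ratio}.sh",
        "-c", gpu, "-t", task, "-m", method,
        "-e", exp_name, "-l", lr, "-w", unsup_weight,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # e.g. DHC on AMOS with 5% labeled data
    run_experiment(ratio="5p", task="amos", method="dhc")
```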
Weights of all the above models trained on 20% labeled Synapse can be downloaded from here.
Weights of all the above models trained on 5% labeled AMOS can be downloaded from here.
## 4. Results
### 4.1 Synapse
13 classes: Sp: spleen, RK: right kidney, LK: left kidney, Ga: gallbladder, Es: esophagus, Li: liver, St: stomach, Ao: aorta, IVC: inferior vena cava, PSV: portal & splenic veins, Pa: pancreas, RAG: right adrenal gland, LAG: left adrenal gland.
#### 4.1.1 Trained with 10% labeled data
#### 4.1.2 Trained with 20% labeled data
#### 4.1.3 Trained with 40% labeled data
### 4.2 AMOS
15 classes: spleen, right kidney, left kidney, gallbladder, esophagus, liver, stomach, aorta, inferior vena cava, pancreas, right adrenal gland, left adrenal gland, duodenum, bladder, prostate/uterus.
#### 4.2.1 Trained with 2% labeled data
#### 4.2.2 Trained with 5% labeled data
#### 4.2.3 Trained with 10% labeled data
## Cite
If this code is helpful for your study, please cite:
```bibtex
@inproceedings{wang2023dhc,
  title={DHC: Dual-debiased Heterogeneous Co-training Framework for Class-imbalanced Semi-supervised Medical Image Segmentation},
  author={Wang, Haonan and Li, Xiaomeng},
  booktitle={International Conference on Medical Image Computing and Computer-Assisted Intervention},
  pages={582--591},
  year={2023},
  organization={Springer}
}
```
## License
This repository is released under the MIT License.