S<sup>2</sup>ME
Introduction
S<sup>2</sup>ME: Spatial-Spectral Mutual Teaching and Ensemble Learning for Scribble-supervised Polyp Segmentation
An Wang, Mengya Xu, Yang Zhang, Mobarakol Islam, and Hongliang Ren
Medical Image Computing and Computer-Assisted Intervention (MICCAI) - 2023
Early Accepted (Top 14% of 2253 manuscripts)
To the best of our knowledge, we propose the first spatial-spectral dual-branch network for weakly-supervised medical image segmentation, which efficiently leverages cross-domain patterns through collaborative mutual teaching and ensemble learning. Our pixel-level entropy-guided fusion strategy improves the reliability of the aggregated pseudo labels, providing valuable supplementary supervision signals. Moreover, we optimize the segmentation model holistically with a hybrid of loss supervision from scribbles and pseudo labels and observe improved outcomes. With extensive in-domain and out-of-domain evaluation on four public datasets, our method shows superior accuracy, generalization, and robustness, indicating its clinical significance in alleviating data-related issues such as data shift and corruption that are commonly encountered in the medical field.
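The pixel-level entropy-guided fusion described above can be sketched roughly as follows. This is a minimal illustration only, not the repository's exact implementation; the function name and details (e.g. per-pixel selection of the lower-entropy branch) are assumptions for exposition — see the code in this repo for the actual method.

```python
import torch

def entropy_guided_fusion(logits_a, logits_b):
    """Fuse two branch predictions pixel-wise, preferring at each
    location the branch whose class distribution has lower entropy
    (i.e. the more confident branch). Inputs: (B, C, H, W) logits."""
    prob_a = torch.softmax(logits_a, dim=1)
    prob_b = torch.softmax(logits_b, dim=1)
    # Per-pixel Shannon entropy of each branch's class distribution.
    ent_a = -(prob_a * torch.log(prob_a + 1e-8)).sum(dim=1)  # (B, H, W)
    ent_b = -(prob_b * torch.log(prob_b + 1e-8)).sum(dim=1)
    # At each pixel, keep the prediction with the lower entropy.
    mask = (ent_a <= ent_b).unsqueeze(1).float()  # (B, 1, H, W)
    fused_prob = mask * prob_a + (1.0 - mask) * prob_b
    # Hard pseudo label for supervising both branches.
    return fused_prob.argmax(dim=1)  # (B, H, W)
```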
Environment
- NVIDIA RTX3090
- Python 3.8
- PyTorch 1.10
- Check environment.yml for more dependencies.
Usage
Dataset
- SUN-SEG: Download from SUN-SEG, then follow the JSON files in the data/polyp folder for the dataset splits.
- Kvasir-SEG: Download from Kvasir-SEG.
- CVC-ClinicDB: Download from CVC-ClinicDB.
- PolypGen: Download from PolypGen.
Training and Testing
Command:
CUDA_VISIBLE_DEVICES=1 python train_s2me.py --model1 unet --model2 ynet_ffc --sup Scribble --exp s2me-ent-css_5.0_25k --mps True --mps_type entropy --cps True
Some essential hyperparameters:
- mps and mps_type: whether to apply Mutual Teaching and which fusion type to use (entropy is our entropy-guided fusion);
- cps: whether to apply Ensemble Learning;
- Refer to train_s2me.py for explanations of the other hyperparameters.
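As a rough illustration of what the cps (Ensemble Learning) option involves, the sketch below shows a cross-pseudo-supervision-style loss in which each branch is supervised by the other branch's hard pseudo labels. This is a hedged, minimal example for exposition — the function name and details are assumptions, not the repository's exact implementation.

```python
import torch
import torch.nn.functional as F

def cross_pseudo_supervision_loss(logits_a, logits_b):
    """Each branch is trained against the hard pseudo labels produced
    by the other branch; detach() stops gradients flowing through the
    labels themselves. Inputs: (B, C, H, W) logits from two branches."""
    pseudo_a = logits_a.argmax(dim=1).detach()  # (B, H, W) hard labels
    pseudo_b = logits_b.argmax(dim=1).detach()
    loss_a = F.cross_entropy(logits_a, pseudo_b)  # branch A learns from B
    loss_b = F.cross_entropy(logits_b, pseudo_a)  # branch B learns from A
    return loss_a + loss_b
```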
Trained model and training log
- A trained model that yields our best result on the SUN-SEG dataset is available in the model folder.
Test Result
- In-domain quantitative performance
- In-domain qualitative performance
- Generalization performance
- Ablation Studies
Citation
@InProceedings{Wang2023s2me,
author="Wang, An
and Xu, Mengya
and Zhang, Yang
and Islam, Mobarakol
and Ren, Hongliang",
title="S{$^2$}ME: Spatial-Spectral Mutual Teaching and Ensemble Learning for Scribble-Supervised Polyp Segmentation",
booktitle="Medical Image Computing and Computer Assisted Intervention -- MICCAI 2023",
year="2023",
publisher="Springer Nature Switzerland",
address="Cham",
pages="35--45",
}
Acknowledgement
Some of the code is borrowed or adapted from the repositories below: