Swin-UMamba: Mamba-based UNet with ImageNet-based pretraining

Official repository for: Swin-UMamba: Mamba-based UNet with ImageNet-based pretraining

[Figure: Swin-UMamba network architecture]

Main Results

Installation

Step-1: Create a new conda environment & install requirements

conda create -n swin_umamba python=3.10
conda activate swin_umamba

pip install torch==2.0.1 torchvision==0.15.2
pip install causal-conv1d==1.1.1
pip install mamba-ssm
pip install torchinfo timm numba
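
To confirm that the CUDA kernels built correctly, a quick import check should succeed (a minimal sanity check, assuming a CUDA build of PyTorch; it only touches the packages installed above):

python -c "import torch, causal_conv1d, mamba_ssm; print(torch.__version__, torch.cuda.is_available())"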

Step-2: Install Swin-UMamba

git clone https://github.com/JiarunLiu/Swin-UMamba
cd Swin-UMamba/swin_umamba
pip install -e .

Prepare data & pretrained model

Dataset:

We follow the same data & preprocessing strategy as U-Mamba. Download the datasets from U-Mamba, put them into the data folder, and preprocess them with the following command:

nnUNetv2_plan_and_preprocess -d DATASET_ID --verify_dataset_integrity
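
nnU-Net v2 locates datasets through environment variables, so export them before preprocessing (the paths below are assumptions; point them at wherever you placed the data folder):

export nnUNet_raw="./data/nnUNet_raw"
export nnUNet_preprocessed="./data/nnUNet_preprocessed"
export nnUNet_results="./data/nnUNet_results"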

ImageNet pretrained model:

We use the ImageNet-pretrained VMamba-Tiny model from VMamba. Download the checkpoint and place it at data/pretrained/vmamba/vmamba_tiny_e292.pth:

mkdir -p data/pretrained/vmamba
wget https://github.com/MzeroMiko/VMamba/releases/download/%2320240218/vssmtiny_dp01_ckpt_epoch_292.pth
mv vssmtiny_dp01_ckpt_epoch_292.pth data/pretrained/vmamba/vmamba_tiny_e292.pth
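
As a quick integrity check, the downloaded checkpoint should load with PyTorch (a minimal sketch; the exact key layout is an assumption based on typical VMamba releases):

python - <<'EOF'
import torch
# Load on CPU; we only inspect the keys, not the weights
ckpt = torch.load("data/pretrained/vmamba/vmamba_tiny_e292.pth", map_location="cpu")
print(list(ckpt.keys()))  # typically includes a 'model' state dict
EOF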

Training

Use the following commands to train & evaluate Swin-UMamba:

# AbdomenMR dataset
bash scripts/train_AbdomenMR.sh MODEL_NAME
# Endoscopy dataset
bash scripts/train_Endoscopy.sh MODEL_NAME
# Microscopy dataset 
bash scripts/train_Microscopy.sh MODEL_NAME
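
For example (SwinUMamba below is a hypothetical placeholder; substitute a supported model name):

bash scripts/train_AbdomenMR.sh SwinUMamba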

Here MODEL_NAME can be:

You can download our model checkpoints here.

Acknowledgements

We thank the authors of nnU-Net, Mamba, U-Mamba, VMamba, and Swin-Unet for making their valuable code & data publicly available.

Citation

@article{Swin-UMamba,
    title={Swin-UMamba: Mamba-based UNet with ImageNet-based pretraining},
    author={Jiarun Liu and Hao Yang and Hong-Yu Zhou and Yan Xi and Lequan Yu and Yizhou Yu and Yong Liang and Guangming Shi and Shaoting Zhang and Hairong Zheng and Shanshan Wang},
    journal={arXiv preprint arXiv:2402.03302},
    year={2024}
}