MT-UNet

Update 2024/04/15

We have uploaded the lists_Synapse.zip file to ./dataset/ in response to recently raised issues. Please unzip it and use it together with the Synapse dataset you downloaded.
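As a quick sanity check before training, here is a minimal sketch that verifies both pieces are in place. The directory names below are assumptions based on the repo's default ./dataset/ path; adjust them to your local layout.

```python
import os

def synapse_data_ready(root="./dataset"):
    """Heuristic check that the unpacked list files and the downloaded
    Synapse dataset both exist under the given root.
    The subdirectory names are assumptions; rename to match your setup."""
    lists_dir = os.path.join(root, "lists_Synapse")  # from lists_Synapse.zip
    data_dir = os.path.join(root, "Synapse")         # your downloaded dataset
    return os.path.isdir(lists_dir) and os.path.isdir(data_dir)
```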

Update 2022/03/05

The paper has been accepted by ICASSP 2022. The complete code is released today.

Please note that if you requested our code before, that code is now deprecated, and you are encouraged to use the newest version in this repo.


1. Prepare your dataset.

2. Clone the code

git clone git@github.com:Dootmaan/MT-UNet.git
cd MT-UNet

3. Start training

CUDA_VISIBLE_DEVICES=0 nohup python3 -u train_mtunet_ACDC.py >train_mtunet_ACDC.log 2>&1 &

The weights will be saved to "./checkpoint/ACDC/mtunet" and the predictions to "./predictions" by default. You can also load our pretrained weights before training via the script's command-line arguments (argparse).
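The exact argument names in train_mtunet_ACDC.py may differ, so the following is only a hedged sketch of the usual argparse pattern for overriding the default checkpoint and prediction paths; all flag names here are hypothetical, and only the default paths come from this README.

```python
import argparse

# Hypothetical flag names -- check train_mtunet_ACDC.py's own parser for
# the real ones; only the default paths below are taken from this README.
def build_parser():
    parser = argparse.ArgumentParser(description="MT-UNet training (sketch)")
    parser.add_argument("--checkpoint_dir", default="./checkpoint/ACDC/mtunet",
                        help="where trained weights are written")
    parser.add_argument("--prediction_dir", default="./predictions",
                        help="where predictions are written")
    parser.add_argument("--pretrained", default=None,
                        help="optional path to released weights to load first")
    return parser

args = build_parser().parse_args([])  # no overrides: keep the README defaults
```

Passing `--pretrained path/to/weights.pth` (or whatever the real flag is called) would then let you resume from the released ACDC or Synapse weights instead of training from scratch.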

ACDC weights

Synapse weights


We have tested the code to make sure it works. However, if you find any bugs, feel free to open a pull request or simply raise an issue.

You are also encouraged to read the update log below to learn more about this repo.

Update 2022/01/05

After another round of training based on the previous weights, our model achieved better performance on ACDC (91.61% DSC). We have updated the ACDC weights to this newest version, and you can check it out for yourself. Previous versions of the weights are still available on Google Drive and can be accessed via previous commits.

Update 2022/01/04

We have further trained our MT-UNet, and it achieves a better result on Synapse (79.20% DSC). We have updated the public Synapse weights to this version and will also update the results in our paper.

Update 2022/01/03

It should be mentioned that we are currently conducting some statistical evaluations of our model; these results will also be made public here.

Update 2021/11/19


This is the official implementation of our ICASSP 2022 paper, MIXED TRANSFORMER UNET FOR MEDICAL IMAGE SEGMENTATION.

The entire code will be released upon paper publication.