
Winner of MyoPS 2020 Challenge

This repository provides the source code for our solution to the Myocardial Pathology Segmentation (MyoPS) 2020 challenge. The method is detailed in the paper below, and it won first place in MyoPS 2020. Our code is based on PyMIC, a lightweight and easy-to-use PyTorch-based toolkit for medical image computing with deep learning, and nnUNet, a self-adapting segmentation method for medical images.

@inproceedings{zhai2020myocardial,
  title={Myocardial edema and scar segmentation using a coarse-to-fine framework with weighted ensemble},
  author={Zhai, Shuwei and Gu, Ran and Lei, Wenhui and Wang, Guotai},
  booktitle={Myocardial Pathology Segmentation Combining Multi-Sequence CMR Challenge},
  pages={49--59},
  year={2020},
  organization={Springer}
}
<img src='./picture/method.png' width="400">

Method overview

Our solution is a coarse-to-fine method. PyMIC and nnUNet are used in the coarse and fine stages, respectively.

In the coarse segmentation stage, we use a 2D U-Net to segment three foreground classes: the complete ring-shaped myocardium, the left ventricular (LV) blood pool and the right ventricular (RV) blood pool. The network is trained with a combination of Dice loss and cross-entropy loss.
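The combined loss can be sketched as follows. This is an illustrative pure-Python version of soft Dice plus cross-entropy for a single class on flattened probability maps; PyMIC provides the actual loss implementations used for training, and the equal weighting here is an assumption:

```python
import math

def soft_dice_loss(probs, targets, eps=1e-5):
    """Soft Dice loss for one class; probs and targets are flat lists of
    per-pixel foreground probabilities and binary labels."""
    inter = sum(p * t for p, t in zip(probs, targets))
    denom = sum(probs) + sum(targets)
    return 1.0 - (2.0 * inter + eps) / (denom + eps)

def cross_entropy_loss(probs, targets, eps=1e-12):
    """Binary cross-entropy averaged over pixels."""
    n = len(probs)
    return -sum(t * math.log(p + eps) + (1 - t) * math.log(1 - p + eps)
                for p, t in zip(probs, targets)) / n

def combined_loss(probs, targets, w_dice=0.5, w_ce=0.5):
    """Weighted sum of Dice and cross-entropy losses (weights illustrative)."""
    return (w_dice * soft_dice_loss(probs, targets)
            + w_ce * cross_entropy_loss(probs, targets))
```

A perfect prediction drives both terms toward zero, while the Dice term keeps the loss informative even when the foreground class is small relative to the background.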

In the fine stage, we use nnUNet to segment all five foreground classes: LV blood pool, RV blood pool, LV normal myocardium, LV myocardial edema and LV myocardial scar. The coarse segmentation result serves as an extra input channel of the network: the first three modalities are C0 (_0000), DE (_0001) and T2 (_0002), and the 4th modality (_0003) is the coarse segmentation result.
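The channel arrangement can be sketched as below. Paths, the case ID and the sequence suffixes are hypothetical; in practice crop_for_fine_stage.py prepares the fine-stage data:

```python
import os
import shutil

# nnU-Net expects each input channel as a separate file with a 4-digit suffix.
# Here the coarse segmentation is appended as a 4th "modality" channel.
MODALITY_SUFFIX = {"C0": "_0000", "DE": "_0001", "T2": "_0002", "coarse": "_0003"}

def stage_case(case_id, src_dir, coarse_dir, dst_dir):
    """Copy one case's three CMR sequences plus its coarse segmentation into
    dst_dir using nnU-Net's channel-suffix naming convention."""
    os.makedirs(dst_dir, exist_ok=True)
    for seq in ("C0", "DE", "T2"):
        src = os.path.join(src_dir, f"{case_id}_{seq}.nii.gz")
        dst = os.path.join(dst_dir, f"{case_id}{MODALITY_SUFFIX[seq]}.nii.gz")
        shutil.copy(src, dst)
    # the coarse result becomes the extra input channel (_0003)
    shutil.copy(os.path.join(coarse_dir, f"{case_id}.nii.gz"),
                os.path.join(dst_dir, f"{case_id}{MODALITY_SUFFIX['coarse']}.nii.gz"))
```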

Requirements

This code depends on PyTorch, PyMIC, GeodisTK and nnUNet. To install PyMIC and GeodisTK, run:

pip install PYMIC==0.2.4
pip install GeodisTK

To use nnUNet, download nnUNet and put it in the project directory (ProjectDir), e.g., /mnt/data1/swzhai/projects/MyoPS2020. Other requirements can be found in requirements.txt.

Configure data directories and environment variables

path_dict['MyoPS_data_dir'] = "/mnt/data1/swzhai/dataset/MyoPS"
path_dict['nnunet_raw_data_dir'] = "/mnt/data1/swzhai/dataset/MyoPS/nnUNet_raw_data_base/nnUNet_raw_data"

where MyoPS_data_dir is the path of the MyoPS dataset, and nnunet_raw_data_dir is the path of raw data used by nnU-Net in the second stage of our method.

cd nnUNet
pip install -e .
export nnUNet_raw_data_base="MyoPS_data_dir/nnUNet_raw_data_base"
export nnUNet_preprocessed="MyoPS_data_dir/nnUNet_preprocessed"
export RESULTS_FOLDER="ProjectDir/result/nnunet"

Dataset and Preprocessing

python crop_for_coarse_stage.py

This will crop the images with the maximal bounding box computed over the training set, and the cropped results are saved in MyoPS_data_dir/data_preprocessed/imagesTr, MyoPS_data_dir/data_preprocessed/labelsTr and MyoPS_data_dir/data_preprocessed/imagesTs, respectively. crop_information.json in each folder contains the bounding box coordinates that are used later to map the final segmentation results back to the original image space.
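The maximal-bounding-box idea can be sketched as follows for 2D binary masks given as nested lists. This is illustrative only; crop_for_coarse_stage.py operates on the actual NIfTI volumes:

```python
def bounding_box(mask):
    """Axis-aligned bounding box [rmin, rmax, cmin, cmax] of the nonzero
    pixels in a 2D mask."""
    rows = [i for i, row in enumerate(mask) if any(row)]
    cols = [j for j in range(len(mask[0])) if any(row[j] for row in mask)]
    return [min(rows), max(rows), min(cols), max(cols)]

def maximal_bounding_box(masks):
    """Smallest single box that covers the foreground of every mask;
    cropping all images with it guarantees no foreground is cut off."""
    boxes = [bounding_box(m) for m in masks]
    return [min(b[0] for b in boxes), max(b[1] for b in boxes),
            min(b[2] for b in boxes), max(b[3] for b in boxes)]
```

The resulting box (plus any margin the script adds) is what gets written to crop_information.json.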

Coarse segmentation model

We use five-fold cross validation for training and validation of the coarse model.

Training and cross validation

python write_csv_files.py
python myops_run.py train config/train_val.cfg 1
python myops_run.py test  config/train_val.cfg 1
pymic_evaluate_seg config/evaluation.cfg
python postprocess.py result/unet2d result/unet2d_post
|                  | class_1 | class_2 | class_3 | average |
|------------------|---------|---------|---------|---------|
| No postprocess   | 0.8780  | 0.9067  | 0.9180  | 0.9009  |
| With postprocess | 0.8785  | 0.9095  | 0.9234  | 0.9038  |
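A common choice for this kind of postprocessing is keeping only the largest connected component per class; whether postprocess.py does exactly this is an assumption, but a 2D sketch of the idea looks like:

```python
from collections import deque

def largest_component(mask):
    """Keep only the largest 4-connected component of a binary 2D mask,
    removing small spurious islands from a segmentation."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for si in range(h):
        for sj in range(w):
            if mask[si][sj] and not seen[si][sj]:
                # flood-fill one component with BFS
                comp, queue = [], deque([(si, sj)])
                seen[si][sj] = True
                while queue:
                    i, j = queue.popleft()
                    comp.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < h and 0 <= nj < w
                                and mask[ni][nj] and not seen[ni][nj]):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                if len(comp) > len(best):
                    best = comp
    out = [[0] * w for _ in range(h)]
    for i, j in best:
        out[i][j] = 1
    return out
```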

Inference for testing data

python myops_test.py test config/test.cfg
python postprocess.py result/unet2d_test result/unet2d_test_post

Fine segmentation

In the fine segmentation stage, we use nnUNet to segment all the classes. This section depends heavily on nnUNet, so make sure you have some basic experience with nnUNet before performing the following steps.

Tip: to save training time, you can change self.max_num_epochs = 1000 to self.max_num_epochs = 300 in nnUNet/nnunet/training/network_training/nnUNetTrainerV2.py.

Data preparation

python crop_for_fine_stage.py
python create_dataset_json.py

Training

nnUNet_plan_and_preprocess -t 112 --verify_dataset_integrity
nnUNet_train 2d nnUNetTrainerV2 Task112_MyoPS FOLD --npz
nnUNet_train 3d_fullres nnUNetTrainerV2 Task112_MyoPS FOLD --npz

Inference

nnUNet_find_best_configuration -m 2d 3d_fullres -t 112
nnUNet_predict -i FOLDER_WITH_TEST_CASES -o OUTPUT_FOLDER_MODEL1 -tr nnUNetTrainerV2 -ctr nnUNetTrainerV2CascadeFullRes -m 2d -p nnUNetPlansv2.1 -t Task112_MyoPS

nnUNet_predict -i FOLDER_WITH_TEST_CASES -o OUTPUT_FOLDER_MODEL2 -tr nnUNetTrainerV2 -ctr nnUNetTrainerV2CascadeFullRes -m 3d_fullres -p nnUNetPlansv2.1 -t Task112_MyoPS

nnUNet_ensemble -f OUTPUT_FOLDER_MODEL1 OUTPUT_FOLDER_MODEL2 -o OUTPUT_FOLDER -pp result/nnunet/nnUNet/ensembles/Task112_MyoPS/ensemble_2d__nnUNetTrainerV2__nnUNetPlansv2.1--3d_fullres__nnUNetTrainerV2__nnUNetPlansv2.1/postprocessing.json
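Conceptually, nnUNet_ensemble averages the two models' softmax outputs per voxel and then takes the argmax. A minimal per-pixel sketch, with illustrative (not the tool's actual) weights:

```python
def ensemble_argmax(probs_a, probs_b, w_a=0.5, w_b=0.5):
    """Weighted average of two models' per-pixel class-probability vectors,
    followed by argmax to produce the final label per pixel."""
    labels = []
    for pa, pb in zip(probs_a, probs_b):
        avg = [w_a * a + w_b * b for a, b in zip(pa, pb)]
        labels.append(max(range(len(avg)), key=avg.__getitem__))
    return labels
```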
python get_final_test.py result/nnunet/test_ensemble result/nnunet/test_ensemble_original
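Mapping the cropped results back to the original image space uses the bounding boxes saved in crop_information.json. A minimal 2D sketch (the exact JSON field layout is an assumption):

```python
def restore_to_original(seg_crop, crop_box, orig_shape):
    """Paste a cropped 2D segmentation back into a zero-filled array of the
    original image shape, at the position recorded by its bounding box."""
    rmin, rmax, cmin, cmax = crop_box  # as stored in crop_information.json
    out = [[0] * orig_shape[1] for _ in range(orig_shape[0])]
    for i, row in enumerate(seg_crop):
        for j, value in enumerate(row):
            out[rmin + i][cmin + j] = value
    return out
```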