<p align="center"> <img width="150" alt="logo" src="https://i.imgur.com/0OaOlKO.png"> </p>

Progressive Semantic Segmentation (MagNet)

MagNet is a multi-scale framework that resolves local ambiguity by looking at the image at multiple magnification levels. It consists of multiple processing stages, one per magnification level, and the output of each stage is fed into the next for coarse-to-fine information propagation. Experiments on three high-resolution datasets of urban views, aerial scenes, and medical images show that MagNet consistently outperforms state-of-the-art methods by a significant margin.
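As a purely conceptual illustration of this coarse-to-fine flow, the Python sketch below shows how one stage's output feeds the next; the helper functions are hypothetical placeholders, not the API of this repository.

# Conceptual sketch of MagNet's stage-by-stage, coarse-to-fine processing.
# `segment_at_scale` and `refine` are hypothetical placeholders used only to
# illustrate the data flow; they are not functions from this repository.
def segment_at_scale(image, scale):
    return f"coarse({image} @ {scale})"

def refine(previous, current):
    return f"refined({previous}, {current})"

def progressive_segmentation(image, scales):
    prediction = None
    for scale in scales:  # from the coarsest to the finest magnification level
        current = segment_at_scale(image, scale)
        prediction = current if prediction is None else refine(prediction, current)
    return prediction  # final full-resolution prediction

print(progressive_segmentation("demo.png", ["256x128", "512x256", "1024x512", "2048x1024"]))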

Details of the MagNet architecture and experimental results can be found in the following paper:

@inproceedings{m_Huynh-etal-CVPR21,
  author = {Chuong Huynh and Anh Tran and Khoa Luu and Minh Hoai},
  title = {Progressive Semantic Segmentation},
  year = {2021},
  booktitle = {Proceedings of the {IEEE} Conference on Computer Vision and Pattern Recognition (CVPR)},
}

Please CITE our paper when MagNet is used to help produce published results or incorporated into other software.

Datasets

The code currently provides configurations to train and evaluate on two datasets: Cityscapes and DeepGlobe. To prepare the datasets, go to the ./data directory and follow the steps below:

For Cityscapes

  1. Register an account on this page and log in.
  2. Download leftImg8bit_trainvaltest.zip and gtFine_trainvaltest.zip.
  3. Run the script below to extract zip files to correct locations:
sh ./prepare_cityscapes.sh

For DeepGlobe

  1. Register an account on this page and log in.
  2. Go to this page and download Starting Kit of the #1 Development Phase.
  3. Run the script below to extract zip files to correct locations:
sh ./prepare_deepglobe.sh

If you want to train or evaluate with your own dataset, follow the steps in this document.
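As a rough, hypothetical sketch only (the linked document is authoritative), a datalist is a .txt file in which each line pairs an image path with its label path; something like the script below could generate one for a custom dataset. The delimiter and directory layout are assumptions, so compare against the provided files under data/list/ before using it.

# Hypothetical sketch for building a datalist .txt for a custom dataset.
# The tab delimiter, directory names, and matching file names are assumptions;
# check the provided lists under data/list/ for the exact format.
import os

image_dir = "data/my_dataset/images"   # assumed location of your images
label_dir = "data/my_dataset/labels"   # assumed location of the matching label maps
os.makedirs("data/list/my_dataset", exist_ok=True)

with open("data/list/my_dataset/train.txt", "w") as f:
    for name in sorted(os.listdir(image_dir)):
        f.write(os.path.join(image_dir, name) + "\t" + os.path.join(label_dir, name) + "\n")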

Getting started

Requirements

The framework is tested on machines with the following environment:

To install dependencies, please run the following command:

pip install -r requirements.txt

Pretrained models

Performance of pre-trained models on datasets:

| Dataset | Backbone | Baseline IoU (%) | MagNet IoU (%) | MagNet-Fast IoU (%) | Download |
| --- | --- | --- | --- | --- | --- |
| Cityscapes | HRNetW18+OCR | 63.24 | 68.20 | 67.37 | backbone<br>refine_512x256<br>refine_1024x512<br>refine_2048x1024 |
| DeepGlobe | Resnet50-FPN | 67.22 | 72.10 | 68.22 | backbone<br>refine |

Please manually download pre-trained models to ./checkpoints or run the script below:

cd checkpoints
sh ./download_cityscapes.sh # for Cityscapes
# or
sh ./download_deepglobe.sh # for DeepGlobe

Usage

You can run this Google Colab Notebook to test our pre-trained models on street-view images. Please follow the instructions in the notebook to try out the network.

If you want to test our framework on your local machine:

  1. To test with a Cityscapes image, e.g. data/frankfurt_000001_003056_leftImg8bit.png:
python demo.py --dataset cityscapes \
               --image data/frankfurt_000001_003056_leftImg8bit.png \
               --scales 256-128,512-256,1024-512,2048-1024 \
               --crop_size 256 128 \
               --input_size 256 128 \
               --model hrnet18+ocr \
               --pretrained checkpoints/cityscapes_hrnet.pth \
               --pretrained_refinement checkpoints/cityscapes_refinement_512.pth checkpoints/cityscapes_refinement_1024.pth checkpoints/cityscapes_refinement_2048.pth \
               --num_classes 19 \
               --n_points 32768 \
               --n_patches -1 \
               --smooth_kernel 5 \
               --save_pred \
               --save_dir test_results/demo

# or in short, you can run
sh scripts/cityscapes/demo_magnet.sh data/frankfurt_000001_003056_leftImg8bit.png

To run with MagNet-Fast refinement instead:

python demo.py --dataset cityscapes \
               --image data/frankfurt_000001_003056_leftImg8bit.png \
               --scales 256-128,512-256,1024-512,2048-1024 \
               --crop_size 256 128 \
               --input_size 256 128 \
               --model hrnet18+ocr \
               --pretrained checkpoints/cityscapes_hrnet.pth \
               --pretrained_refinement checkpoints/cityscapes_refinement_512.pth checkpoints/cityscapes_refinement_1024.pth checkpoints/cityscapes_refinement_2048.pth \
               --num_classes 19 \
               --n_points 0.9 \
               --n_patches 4 \
               --smooth_kernel 5 \
               --save_pred \
               --save_dir test_results/demo

# or in short, you can run
sh scripts/cityscapes/demo_magnet_fast.sh data/frankfurt_000001_003056_leftImg8bit.png

All results will be stored at test_results/demo/frankfurt_000001_003056_leftImg8bit.

  2. To test with a DeepGlobe image, e.g. data/639004_sat.jpg:
python demo.py --dataset deepglobe \
               --image data/639004_sat.jpg \
               --scales 612-612,1224-1224,2448-2448 \
               --crop_size 612 612 \
               --input_size 508 508 \
               --model fpn \
               --pretrained checkpoints/deepglobe_fpn.pth \
               --pretrained_refinement checkpoints/deepglobe_refinement.pth \
               --num_classes 7 \
               --n_points 0.75 \
               --n_patches -1 \
               --smooth_kernel 11 \
               --save_pred \
               --save_dir test_results/demo

# or in short, you can run
sh scripts/deepglobe/demo_magnet.sh data/639004_sat.jpg

To run with MagNet-Fast refinement instead:

python demo.py --dataset deepglobe \
               --image data/639004_sat.jpg \
               --scales 612-612,1224-1224,2448-2448 \
               --crop_size 612 612 \
               --input_size 508 508 \
               --model fpn \
               --pretrained checkpoints/deepglobe_fpn.pth \
               --pretrained_refinement checkpoints/deepglobe_refinement.pth \
               --num_classes 7 \
               --n_points 0.9 \
               --n_patches 3 \
               --smooth_kernel 11 \
               --save_pred \
               --save_dir test_results/demo

# or in short, you can run
sh scripts/deepglobe/demo_magnet_fast.sh data/639004_sat.jpg

All results will be stored at test_results/demo/639004_sat.
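With --save_pred, each saved visualization contains the input image, the ground truth, the coarse prediction, and the fine prediction. The minimal sketch below simply opens whatever image files end up in a result folder; the folder contents and file extensions are assumptions, so adjust the patterns to what you actually find there.

# Minimal sketch for browsing demo outputs; file names and extensions inside the
# result folder are assumptions -- adjust the glob patterns as needed.
import glob
from PIL import Image

result_dir = "test_results/demo/639004_sat"  # or test_results/demo/frankfurt_000001_003056_leftImg8bit
for path in sorted(glob.glob(f"{result_dir}/*.png") + glob.glob(f"{result_dir}/*.jpg")):
    print(path)
    Image.open(path).show()  # opens each saved visualization with the default viewer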

Training

Training backbone networks

We customized the training script from the HRNet repository to train our backbones. Please first go to the ./backbone directory and run the following scripts:

HRNetW18V2+OCR for Cityscapes

Download pre-trained weights on ImageNet:

# In ./backbone
cd pretrained_weights
wget https://public.vinai.io/chuonghm/hrnet_w18_v2_imagenet.pth

Training the model:

# In ./backbone
python train.py --cfg experiments/cityscapes/hrnet_ocr_w18_train_256x128_sgd_lr1e-2_wd5e-4_bs_12_epoch484.yaml

Training logs are stored in ./log/cityscapes/HRNetW18_OCR.

The backbone checkpoint after training is stored at ./output/cityscapes/hrnet_ocr_w18_train_256x128_sgd_lr1e-2_wd5e-4_bs_12_epoch484/best.pth. This checkpoint is used to train the refinement modules.

Resnet50-FPN for Deepglobe

Training the model:

# In ./backbone
python train.py --cfg experiments/deepglobe/resnet_fpn_train_612x612_sgd_lr1e-2_wd5e-4_bs_12_epoch484.yaml

Training logs are stored in ./log/deepglobe/ResnetFPN.

The backbone checkpoint after training is stored at ./output/deepglobe/resnet_fpn_train_612x612_sgd_lr1e-2_wd5e-4_bs_12_epoch484/best.pth. This checkpoint is used to train the refinement modules.

Training refinement modules

Available arguments for training:

train.py [-h] --dataset DATASET [--root ROOT] [--datalist DATALIST]
                --scales SCALES --crop_size N [N ...] --input_size N [N ...]
                [--num_workers NUM_WORKERS] --model MODEL --num_classes
                NUM_CLASSES --pretrained PRETRAINED
                [--pretrained_refinement PRETRAINED_REFINEMENT [PRETRAINED_REFINEMENT ...]]
                --batch_size BATCH_SIZE [--log_dir LOG_DIR] --task_name
                TASK_NAME [--lr LR] [--momentum MOMENTUM] [--decay DECAY]
                [--gamma GAMMA] [--milestones N [N ...]] [--epochs EPOCHS]

optional arguments:
  -h, --help            show this help message and exit
  --dataset DATASET     dataset name: cityscapes, deepglobe (default: None)
  --root ROOT           path to images for training and testing (default: )
  --datalist DATALIST   path to .txt containing image and label path (default:
                        )
  --scales SCALES       scales: w1-h1,w2-h2,... , e.g.
                        512-512,1024-1024,2048-2048 (default: None)
  --crop_size N [N ...]
                        crop size, e.g. 256 128 (default: None)
  --input_size N [N ...]
                        input size, e.g. 256 128 (default: None)
  --num_workers NUM_WORKERS
                        number of workers for dataloader (default: 1)
  --model MODEL         model name. One of: fpn, psp, hrnet18+ocr, hrnet48+ocr
                        (default: None)
  --num_classes NUM_CLASSES
                        number of classes (default: None)
  --pretrained PRETRAINED
                        pretrained weight (default: None)
  --pretrained_refinement PRETRAINED_REFINEMENT [PRETRAINED_REFINEMENT ...]
                        pretrained weight (s) refinement module (default:
                        [''])
  --batch_size BATCH_SIZE
                        batch size for training (default: None)
  --log_dir LOG_DIR     directory to store log file (default: runs)
  --task_name TASK_NAME
                        task name, experiment name. The final path of your
                        logs is <log_dir>/<task_name>/<timestamp> (default:
                        None)
  --lr LR               learning rate (default: 0.001)
  --momentum MOMENTUM   momentum for optimizer (default: 0.9)
  --decay DECAY         weight decay for optimizer (default: 0.0005)
  --gamma GAMMA         gamma for lr scheduler (default: 0.1)
  --milestones N [N ...]
                        milestones to reduce learning rate (default: [10, 20,
                        30, 40, 45])
  --epochs EPOCHS       number of epochs for training (default: 50)
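The --lr, --milestones, and --gamma options describe a step learning-rate schedule: the learning rate is multiplied by gamma at each milestone epoch. The sketch below only illustrates how the default values behave, assuming a standard PyTorch MultiStepLR-style schedule; the actual scheduler used in train.py may differ.

# Illustration of the default schedule (lr=0.001, gamma=0.1, milestones=[10, 20, 30, 40, 45]),
# assuming a standard step schedule such as torch.optim.lr_scheduler.MultiStepLR;
# this is a sketch, not the exact code from train.py.
import torch

model = torch.nn.Linear(4, 2)  # dummy model, only needed to build an optimizer
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9, weight_decay=0.0005)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[10, 20, 30, 40, 45], gamma=0.1)

for epoch in range(50):
    # ... one training epoch would run here ...
    scheduler.step()  # learning rate shrinks by 10x after epochs 10, 20, 30, 40 and 45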

Cityscapes

To train MagNet on the Cityscapes dataset, run this sample script:

python train.py --dataset cityscapes \
                --root data/cityscapes \
                --datalist data/list/cityscapes/train.txt \
                --scales 256-128,512-256,1024-512,2048-1024 \
                --crop_size 256 128 \
                --input_size 256 128 \
                --num_workers 8 \
                --model hrnet18+ocr \
                --pretrained checkpoints/cityscapes_hrnet.pth \
                --num_classes 19 \
                --batch_size 8 \
                --task_name cityscapes_refinement \
                --lr 0.001

# or in short, run the script below
sh scripts/cityscapes/train_magnet.sh

Deepglobe

To train MagNet on the DeepGlobe dataset, run this sample script:

python train.py --dataset deepglobe \
                --root data/deepglobe \
                --datalist data/list/deepglobe/train.txt \
                --scales 612-612,1224-1224,2448-2448 \
                --crop_size 612 612 \
                --input_size 508 508 \
                --num_workers 8 \
                --model fpn \
                --pretrained checkpoints/deepglobe_fpn.pth \
                --num_classes 7 \
                --batch_size 8 \
                --task_name deepglobe_refinement \
                --lr 0.001

# or in short, run the script below
sh scripts/deepglobe/train_magnet.sh

Evaluation

Available arguments for testing:

test.py [-h] --dataset DATASET [--root ROOT] [--datalist DATALIST]
               --scales SCALES --crop_size N [N ...] --input_size N [N ...]
               [--num_workers NUM_WORKERS] --model MODEL --num_classes
               NUM_CLASSES --pretrained PRETRAINED
               [--pretrained_refinement PRETRAINED_REFINEMENT [PRETRAINED_REFINEMENT ...]]
               [--image IMAGE] --sub_batch_size SUB_BATCH_SIZE
               [--n_patches N_PATCHES] --n_points N_POINTS
               [--smooth_kernel SMOOTH_KERNEL] [--save_pred]
               [--save_dir SAVE_DIR]

optional arguments:
  -h, --help            show this help message and exit
  --dataset DATASET     dataset name: cityscapes, deepglobe (default: None)
  --root ROOT           path to images for training and testing (default: )
  --datalist DATALIST   path to .txt containing image and label path (default:
                        )
  --scales SCALES       scales: w1-h1,w2-h2,... , e.g.
                        512-512,1024-1024,2048-2048 (default: None)
  --crop_size N [N ...]
                        crop size, e.g. 256 128 (default: None)
  --input_size N [N ...]
                        input size, e.g. 256 128 (default: None)
  --num_workers NUM_WORKERS
                        number of workers for dataloader (default: 1)
  --model MODEL         model name. One of: fpn, psp, hrnet18+ocr, hrnet48+ocr
                        (default: None)
  --num_classes NUM_CLASSES
                        number of classes (default: None)
  --pretrained PRETRAINED
                        pretrained weight (default: None)
  --pretrained_refinement PRETRAINED_REFINEMENT [PRETRAINED_REFINEMENT ...]
                        pretrained weight (s) refinement module (default:
                        [''])
  --image IMAGE         image path to test (demo only) (default: None)
  --sub_batch_size SUB_BATCH_SIZE
                        batch size for patch processing (default: None)
  --n_patches N_PATCHES
                        number of patches to be refined at each stage. if
                        n_patches=-1, all patches will be refined (default:
                        -1)
  --n_points N_POINTS   number of points to be refined at each stage. If
                        n_points < 1.0, it will be the proportion of total
                        points (default: None)
  --smooth_kernel SMOOTH_KERNEL
                        kernel size of blur operation applied to error scores
                        (default: 16)
  --save_pred           save predictions or not, each image will contains:
                        image, ground-truth, coarse pred, fine pred (default:
                        False)
  --save_dir SAVE_DIR   saved directory (default: test_results)

Alternatively, you can use the sample scripts below to test with our pre-trained models.

Cityscapes

Full MagNet refinement:

python test.py --dataset cityscapes \
               --root data/cityscapes \
               --datalist data/list/cityscapes/val.txt \
               --scales 256-128,512-256,1024-512,2048-1024 \
               --crop_size 256 128 \
               --input_size 256 128 \
               --num_workers 8 \
               --model hrnet18+ocr \
               --pretrained checkpoints/cityscapes_hrnet.pth \
               --pretrained_refinement checkpoints/cityscapes_refinement_512.pth checkpoints/cityscapes_refinement_1024.pth checkpoints/cityscapes_refinement_2048.pth \
               --num_classes 19 \
               --sub_batch_size 1 \
               --n_points 32768 \
               --n_patches -1 \
               --smooth_kernel 5 \
               --save_pred \
               --save_dir test_results/cityscapes

# or in short, run the script below
sh scripts/cityscapes/test_magnet.sh

MagNet-Fast refinement:

python test.py --dataset cityscapes \
               --root data/cityscapes \
               --datalist data/list/cityscapes/val.txt \
               --scales 256-128,512-256,1024-512,2048-1024 \
               --crop_size 256 128 \
               --input_size 256 128 \
               --num_workers 8 \
               --model hrnet18+ocr \
               --pretrained checkpoints/cityscapes_hrnet.pth \
               --pretrained_refinement checkpoints/cityscapes_refinement_512.pth checkpoints/cityscapes_refinement_1024.pth checkpoints/cityscapes_refinement_2048.pth \
               --num_classes 19 \
               --sub_batch_size 1 \
               --n_points 0.9 \
               --n_patches 4 \
               --smooth_kernel 5 \
               --save_pred \
               --save_dir test_results/cityscapes_fast

# or in short, run the script below
sh scripts/cityscapes/test_magnet_fast.sh

Deepglobe

Full MagNet refinement:

python test.py --dataset deepglobe \
               --root data/deepglobe \
               --datalist data/list/deepglobe/test.txt \
               --scales 612-612,1224-1224,2448-2448 \
               --crop_size 612 612 \
               --input_size 508 508 \
               --num_workers 8 \
               --model fpn \
               --pretrained checkpoints/deepglobe_fpn.pth \
               --pretrained_refinement checkpoints/deepglobe_refinement.pth \
               --num_classes 7 \
               --sub_batch_size 1 \
               --n_points 0.75 \
               --n_patches -1 \
               --smooth_kernel 11 \
               --save_pred \
               --save_dir test_results/deepglobe

# or in short, run the script below
sh scripts/deepglobe/test_magnet.sh

MagNet-Fast refinement:

python test.py --dataset deepglobe \
               --root data/deepglobe \
               --datalist data/list/deepglobe/test.txt \
               --scales 612-612,1224-1224,2448-2448 \
               --crop_size 612 612 \
               --input_size 508 508 \
               --num_workers 8 \
               --model fpn \
               --pretrained checkpoints/deepglobe_fpn.pth \
               --pretrained_refinement checkpoints/deepglobe_refinement.pth \
               --num_classes 7 \
               --sub_batch_size 1 \
               --n_points 0.9 \
               --n_patches 3 \
               --smooth_kernel 11 \
               --save_pred \
               --save_dir test_results/deepglobe_fast

# or in short, run the script below
sh scripts/deepglobe/test_magnet_fast.sh

Acknowledgments

Thanks to High-resolution networks and Segmentation Transformer for Semantic Segmentation for the backbone training script.

Contact

If you have any questions, please send an email to minhchuong.itus@gmail.com or create an issue on this repository.