Unsupervised Domain Adaptation for Remote Sensing Semantic Segmentation with Transformer

Our article has been published in Remote Sensing as part of the Special Issue Deep Learning for Satellite Image Segmentation!

HTML Version | PDF Version

If you find this project useful in your research, please consider citing:

Li, W.; Gao, H.; Su, Y.; Momanyi, B.M. Unsupervised Domain Adaptation for Remote Sensing Semantic Segmentation with Transformer. Remote Sens. 2022, 14, 4942. https://doi.org/10.3390/rs14194942

Preparation

Pre-requisites

Pretrained weights & Checkpoints

  1. Please download the MiT weights pretrained on ImageNet-1K from the official SegFormer repository and put them in a folder pretrained/ within this project. Only mit_b5.pth is used in our experiments.

  2. We provide three checkpoints for each of the two domain adaptation tasks, POT→VAI and VAI→POT. Please put them in work_dirs/.

POT→VAI:

| ID | Road | Building | Vegetation | Tree | Car | Clutter | mIoU | url |
|---|---|---|---|---|---|---|---|---|
| 20220804_231646 | 72.64 | 90.96 | 57.82 | 72.76 | 44.21 | 39.09 | 62.91 | link |
| 20220804_215704 | 76.90 | 89.04 | 63.92 | 76.30 | 41.93 | 34.53 | 63.77 | link |
| 20220809_094520 | 76.11 | 90.95 | 62.58 | 74.53 | 38.74 | 35.17 | 63.01 | link |
| Mean | 75.22 | 90.32 | 61.44 | 74.53 | 41.63 | 36.26 | 63.23 | |

VAI→POT:

| ID | Road | Building | Vegetation | Tree | Car | Clutter | mIoU | url |
|---|---|---|---|---|---|---|---|---|
| 20220805_164511 | 61.16 | 73.33 | 56.34 | 61.72 | 66.12 | 1.16 | 53.30 | link |
| 20220805_180439 | 76.83 | 87.17 | 61.38 | 55.94 | 65.24 | 1.16 | 57.81 | link |
| 20220808_143609 | 75.47 | 87.54 | 60.86 | 52.24 | 65.23 | 0.11 | 56.91 | link |
| Mean | 71.15 | 82.68 | 59.53 | 56.63 | 65.53 | 0.81 | 56.01 | |
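
The Mean rows in the tables above are the per-column averages of the three runs, rounded to two decimals. A quick sanity check (values copied from the VAI→POT table):

```python
# Per-run scores: Road, Building, Vegetation, Tree, Car, Clutter, mIoU
runs = [
    [61.16, 73.33, 56.34, 61.72, 66.12, 1.16, 53.30],  # 20220805_164511
    [76.83, 87.17, 61.38, 55.94, 65.24, 1.16, 57.81],  # 20220805_180439
    [75.47, 87.54, 60.86, 52.24, 65.23, 0.11, 56.91],  # 20220808_143609
]

# Average each column across the three runs and round to two decimals
mean_row = [round(sum(col) / len(col), 2) for col in zip(*runs)]
print(mean_row)  # [71.15, 82.68, 59.53, 56.63, 65.53, 0.81, 56.01]
```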

Setup Datasets

ISPRS Potsdam and ISPRS Vaihingen can be requested at link.

For the ISPRS Potsdam dataset, ‘2_Ortho_RGB.zip’ and ‘5_Labels_all_noBoundary.zip’ are required. Please run the following command to re-organize the dataset.

python tools/convert_datasets/potsdam.py /path/to/potsdam

For the ISPRS Vaihingen dataset, ‘ISPRS_semantic_labeling_Vaihingen.zip’ and ‘ISPRS_semantic_labeling_Vaihingen_ground_truth_eroded_COMPLETE.zip’ are required. Please run the following command to re-organize the dataset.

python tools/convert_datasets/vaihingen.py /path/to/vaihingen

After re-organizing, the datasets have the following structure:

.
├── ...
├── data
│   ├── potsdam
│   │   ├── ann_dir
│   │   │   ├── train
│   │   │   └── val
│   │   └── img_dir
│   │       ├── train
│   │       └── val
│   └── vaihingen
│       ├── ann_dir
│       │   ├── train
│       │   └── val
│       └── img_dir
│           ├── train
│           └── val
├── ...

For more information about the datasets, please refer to Prepare datasets provided by MMSegmentation.
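
To verify that the conversion scripts produced the layout above, a minimal sketch (the `data/` root relative to the repository is taken from the tree shown; adjust the path to your setup):

```python
from pathlib import Path

# Expected sub-directories after running the conversion scripts,
# relative to the repository root (per the tree shown above).
EXPECTED = [
    f"data/{dataset}/{kind}/{split}"
    for dataset in ("potsdam", "vaihingen")
    for kind in ("ann_dir", "img_dir")
    for split in ("train", "val")
]

def missing_dirs(root="."):
    """Return the expected dataset directories that do not exist under root."""
    return [p for p in EXPECTED if not (Path(root) / p).is_dir()]

print(missing_dirs())  # an empty list means the datasets are in place
```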

Training

To ensure reproducibility, the random seed is fixed in the code. Still, you may need to train a few times to reach comparable performance. A training job can be launched using:

python -m tools.train "configs/uda_rs/potsdam2isprs_uda_pt7_local7_label_warm_daformer_mitb5.py" # POT to VAI
python -m tools.train "configs/uda_rs/isprs2potsdam_uda_pt7_local7_label_warm_daformer_mitb5.py" # VAI to POT

By default, logs and checkpoints are stored in work_dirs/<experiments> with this structure:

work_dirs/<experiments>/<config_name>.py  # config file
work_dirs/<experiments>/latest.pth        # checkpoint 
work_dirs/<experiments>/<log_time>.log    # log
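
Following that layout, the artifacts of an experiment can be collected with a small helper (an illustrative sketch, not part of the repository):

```python
from pathlib import Path

def experiment_files(exp_dir):
    """Gather config, checkpoint, and log files from one work_dirs
    experiment folder, following the layout shown above."""
    exp = Path(exp_dir)
    return {
        "config": sorted(exp.glob("*.py")),       # <config_name>.py
        "checkpoint": exp / "latest.pth",         # latest checkpoint
        "logs": sorted(exp.glob("*.log")),        # <log_time>.log files
    }
```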

Testing & Predictions

Testing on the validation set can be launched using:

python -m tools.test <CONFIG_FILE> <CHECKPOINT_FILE> --eval mIoU mFscore --show-dir <SHOW_DIR> --opacity 1 --gpu-id <GPU_ID>

For convenience, we provide a script to simplify the arguments:

test.sh work_dirs/<experiments> [iteration_num, [GPU_ID]] # By default, iteration 4000 and GPU 0 are used.
test.sh work_dirs/20220804_231646_potsdam2isprs_uda_pt7_dw_local7_label_warm_daformer_mitb5

The predictions are saved to work_dirs/<experiments>/preds for inspection, and the model's mIoU is printed to the console.
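
For reference, both metrics requested via --eval are derived from per-class confusion counts: IoU = TP / (TP + FP + FN) and F-score = 2·TP / (2·TP + FP + FN), each averaged over classes. A minimal sketch (the three-class counts are made-up illustration values):

```python
def iou(tp, fp, fn):
    # Intersection over union for one class
    return tp / (tp + fp + fn)

def fscore(tp, fp, fn):
    # F1 score (harmonic mean of precision and recall) for one class
    return 2 * tp / (2 * tp + fp + fn)

# Illustrative per-class counts: (TP, FP, FN)
counts = [(80, 10, 10), (60, 20, 20), (90, 5, 5)]
miou = sum(iou(*c) for c in counts) / len(counts)
mfscore = sum(fscore(*c) for c in counts) / len(counts)
print(round(miou, 4), round(mfscore, 4))  # 0.7667 0.8621
```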

Acknowledgements

This project is heavily based on the following open-source projects. We thank their authors for making the source code publicly available.