
📷 Introduction

rschange is an open-source change detection toolbox dedicated to reproducing and developing advanced methods for change detection in remote sensing images.

🔥 News

๐Ÿ” Preparation

📒 Folder Structure

Prepare the following folders to organize this repo:

    rschangedetection
    ├── rscd (code)
    ├── work_dirs (saved model weights and training logs)
    │   └── CLCD_BS4_epoch200 (dataset)
    │       └── stnet (model)
    │           └── version_0 (version)
    │               ├── ckpts
    │               │   ├── test (the best ckpts on the test set)
    │               │   └── val (the best ckpts on the validation set)
    │               ├── log (tensorboard logs)
    │               ├── train_metrics.txt (train & val results per epoch)
    │               ├── test_metrics_max.txt (the best test results)
    │               └── test_metrics_rest.txt (other test results)
    └── data
        ├── LEVIR_CD
        │   ├── train
        │   │   ├── A
        │   │   │   └── images1.png
        │   │   ├── B
        │   │   │   └── images2.png
        │   │   └── label
        │   │       └── label.png
        │   ├── val (the same as train)
        │   └── test (the same as train)
        ├── DSIFN
        │   ├── train
        │   │   ├── t1
        │   │   │   └── images1.jpg
        │   │   ├── t2
        │   │   │   └── images2.jpg
        │   │   └── mask
        │   │       └── mask.png
        │   ├── val (the same as train)
        │   └── test
        │       ├── t1
        │       │   └── images1.jpg
        │       ├── t2
        │       │   └── images2.jpg
        │       └── mask
        │           └── mask.tif
        ├── WHU_CD
        │   ├── train
        │   │   ├── image1
        │   │   │   └── images1.png
        │   │   ├── image2
        │   │   │   └── images2.png
        │   │   └── label
        │   │       └── label.png
        │   ├── val (the same as train)
        │   └── test (the same as train)
        ├── CLCD (the same as WHU_CD)
        └── SYSU_CD
            ├── train
            │   ├── time1
            │   │   └── images1.png
            │   ├── time2
            │   │   └── images2.png
            │   └── label
            │       └── label.png
            ├── val (the same as train)
            └── test (the same as train)
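The dataset skeleton above can be created in one step. The following is a minimal sketch (this helper script is not part of rscd itself; the dataset, split, and subfolder names are taken from the tree above):

```python
import os

# Per-dataset bi-temporal image / label subfolders, as listed in the tree above.
DATASETS = {
    "LEVIR_CD": ["A", "B", "label"],
    "DSIFN": ["t1", "t2", "mask"],
    "WHU_CD": ["image1", "image2", "label"],
    "CLCD": ["image1", "image2", "label"],
    "SYSU_CD": ["time1", "time2", "label"],
}
SPLITS = ["train", "val", "test"]

def make_skeleton(root="data"):
    """Create data/<dataset>/<split>/<subdir> directories if missing."""
    for dataset, subdirs in DATASETS.items():
        for split in SPLITS:
            for sub in subdirs:
                os.makedirs(os.path.join(root, dataset, split, sub), exist_ok=True)

if __name__ == "__main__":
    make_skeleton()
```

After running it, copy each dataset's image pairs and labels into the matching subfolders.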

📚 Usage example

🌟 Citation

If you are interested in our work, please consider giving it a 🌟 and citing the papers below. We will update rschange regularly.

@inproceedings{stnet,
  title={STNet: Spatial and Temporal feature fusion network for change detection in remote sensing images},
  author={Ma, Xiaowen and Yang, Jiawei and Hong, Tingfeng and Ma, Mengting and Zhao, Ziyan and Feng, Tian and Zhang, Wei},
  booktitle={2023 IEEE International Conference on Multimedia and Expo (ICME)},
  pages={2195--2200},
  year={2023},
  organization={IEEE}
}

@inproceedings{ddlnet,
  title={DDLNet: Boosting Remote Sensing Change Detection with Dual-Domain Learning},
  author={Ma, Xiaowen and Yang, Jiawei and Che, Rui and Zhang, Huanting and Zhang, Wei},
  booktitle={2024 IEEE International Conference on Multimedia and Expo (ICME)},
  pages={1--6},
  year={2024},
  doi={10.1109/ICME57554.2024.10688140},
  organization={IEEE}
}

@article{cdmask,
  title={Rethinking Remote Sensing Change Detection With A Mask View},
  author={Ma, Xiaowen and Wu, Zhenkai and Lian, Rongrong and Zhang, Wei and Song, Siyang},
  journal={arXiv preprint arXiv:2406.15320},
  year={2024}
}

📮 Contact

If you have questions about our papers or are interested in further academic exchange and cooperation, please do not hesitate to contact us at xwma@zju.edu.cn. We look forward to hearing from you!

💡 Acknowledgement

Thanks to the previous open-source repos:

Thanks to the main contributor, Zhenkai Wu.