
CGNet-CD: https://chengxihan.github.io/

You can also run CGNet in the open-cd repository: https://github.com/likyoo/open-cd

The PyTorch implementation of :gift::gift::gift: "Change Guiding Network: Incorporating Change Prior to Guide Change Detection in Remote Sensing Imagery," C. Han, C. Wu, H. Guo, M. Hu, J. Li, and H. Chen, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., pp. 1–17, 2023, doi: 10.1109/JSTARS.2023.3310208. :yum::yum::yum:


[2 Sep. 2023] Released the first version of CGNet.

Requirements

- PyTorch 1.8.0
- torchvision 0.9.0
- Python 3.8
- opencv-python 4.5.3.56
- tensorboardX 2.4
- CUDA 11.3.1
- cuDNN 11.3

Training, Test, and Visualization Process

python train_CGNet.py --epoch 50 --batchsize 8 --gpu_id '1' --data_name 'WHU' --model_name 'CGNet'

python test.py --gpu_id '1' --data_name 'WHU' --model_name 'CGNet'

You can change data_name to train or test on different datasets: "LEVIR", "WHU", "SYSU", "S2Looking", "CDD", and "DSIFN".
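If you want to evaluate on all datasets in one go, here is a minimal sketch that loops over the test command above via subprocess; the dataset list and gpu_id are assumptions to adapt to your setup.

# Minimal sketch: run test.py for every dataset in sequence.
# The dataset list and gpu_id are assumptions; adapt to your setup.
import subprocess

DATASETS = ["LEVIR", "WHU", "SYSU", "S2Looking", "CDD", "DSIFN"]

for name in DATASETS:
    subprocess.run(
        ["python", "test.py",
         "--gpu_id", "1",
         "--data_name", name,
         "--model_name", "CGNet"],
        check=True,  # stop early if any run fails
    )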

Test Our Trained Model Results

You can directly test our model with the provided CGNet weights in output/ for WHU, LEVIR, SYSU, S2Looking, CDD, and DSIFN. Download from Baidu Disk, pwd: 2023. :yum::yum::yum:

We also provide all test results of CGNet in CGNetTestResult! Download from CGNetTestResult or Baidu Disk, pwd: 2023. :yum::yum::yum:

Dataset Download

LEVIR-CD: https://justchenhao.github.io/LEVIR/ ; the split used in our paper is in Baidu Disk, pwd: 2023

WHU-CD: http://gpcv.whu.edu.cn/data/building_dataset.html ; the split used in our paper is in Baidu Disk, pwd: 2023

SYSU-CD: the split used in our paper is in Baidu Disk, pwd: 2023

S2Looking-CD: the split used in our paper is in Baidu Disk, pwd: 2023

CDD-CD: our split is in Baidu Disk, pwd: 2023

DSIFN-CD: our split is in Baidu Disk, pwd: 2023

Note: We crop all datasets into 256×256 patches before training.
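For reference, here is a minimal cropping sketch using opencv-python (already in the requirements above); the paths, file naming, and non-overlapping stride are assumptions, not the exact script we used.

# Minimal sketch: cut a full-size tile into non-overlapping 256x256 patches.
# Paths and the file-naming scheme are illustrative assumptions.
import os
import cv2

def crop_to_patches(src_path, dst_dir, size=256):
    img = cv2.imread(src_path)          # BGR array of shape (H, W, 3)
    h, w = img.shape[:2]
    os.makedirs(dst_dir, exist_ok=True)
    stem = os.path.splitext(os.path.basename(src_path))[0]
    for top in range(0, h - size + 1, size):
        for left in range(0, w - size + 1, size):
            patch = img[top:top + size, left:left + size]
            cv2.imwrite(os.path.join(dst_dir, f"{stem}_{top}_{left}.png"), patch)

# Example: crop one pre-change tile (hypothetical path)
crop_to_patches("LEVIR/train/A/1.png", "LEVIR_256/train/A")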

Dataset Path Setting

LEVIR-CD, WHU-CD, SYSU-CD, or S2Looking-CD
|—train
|   |—A
|   |—B
|   |—label
|—val
|   |—A
|   |—B
|   |—label
|—test
|   |—A
|   |—B
|   |—label

Here, A contains the images from the first time point, B contains the images from the second time point, and label contains the ground-truth change maps.
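To catch path mistakes before training, a small sanity check of the layout above may help; the dataset root passed in is an assumption.

# Minimal sketch: verify the train/val/test x A/B/label layout.
from pathlib import Path

def check_layout(root):
    root = Path(root)
    for split in ("train", "val", "test"):
        for sub in ("A", "B", "label"):
            d = root / split / sub
            if d.is_dir():
                print(f"{d}: {len(list(d.iterdir()))} files")
            else:
                print(f"missing: {d}")

check_layout("WHU")  # e.g. the WHU-CD root directory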


Although CGNet does not reach state-of-the-art performance on the CDD-CD and DSIFN-CD datasets, we still provide our results here for the convenience of peer comparison experiments.


Acknowledgments

Thanks to all my co-authors, Haonan Guo, Meiqi Hu, Jiepan Li, and Hongruixuan Chen, for their great work!

Citation

If you use this code for your research, please cite our papers:

@ARTICLE{10093022,
  author={Han, Chengxi and Wu, Chen and Guo, Haonan and Hu, Meiqi and Li, Jiepan and Chen, Hongruixuan},
  journal={IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing}, 
  title={Change Guiding Network: Incorporating Change Prior to Guide Change Detection in Remote Sensing Imagery}, 
  year={2023},
  volume={},
  number={},
  pages={1-17},
  doi={10.1109/JSTARS.2023.3310208}}

Reference

[1] C. Han, C. Wu, H. Guo, M. Hu, J. Li, and H. Chen, "Change Guiding Network: Incorporating Change Prior to Guide Change Detection in Remote Sensing Imagery," IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., pp. 1–17, 2023, doi: 10.1109/JSTARS.2023.3310208.

[2] C. Han, C. Wu, H. Guo, M. Hu, and H. Chen, "HANet: A Hierarchical Attention Network for Change Detection With Bi-Temporal Very-High-Resolution Remote Sensing Images," IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., pp. 1–17, 2023, doi: 10.1109/JSTARS.2023.3264802.

[3] "HCGMNet: A Hierarchical Change Guiding Map Network for Change Detection."

[4] C. Wu et al., "Traffic Density Reduction Caused by City Lockdowns Across the World During the COVID-19 Epidemic: From the View of High-Resolution Remote Sensing Imagery," IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., vol. 14, pp. 5180–5193, 2021, doi: 10.1109/JSTARS.2021.3078611.