DeSRA (ICML 2023)
🚩 Updates
- ✅ The collected datasets, the artifact detection code, and the metric calculation code are released.
- ✅ Release the MSE-based and GAN-based models (Real-ESRGAN, LDL, SwinIR) and the SegFormer checkpoint and configuration.
- ✅ Release the GAN-DeSRA models (RealESRGAN-DeSRA, LDL-DeSRA and SwinIR-DeSRA).
This paper addresses GAN-inference artifacts. <br>
We design a method to effectively detect regions with GAN-inference artifacts, and further propose a fine-tuning strategy that requires only a small number of artifact images to eliminate the same kinds of artifacts, bridging the gap in applying SR algorithms to practical scenarios.
:book: DeSRA: Detect and Delete the Artifacts of GAN-based Real-World Super-Resolution Models
<p align="center"> <img src="./assets/DeSRA_teasor.jpg" height="400"> </p>

[Paper] | [Project Page] | [Video] | [B站] | [Poster] | [PPT slides]<br>
Liangbin Xie*, Xintao Wang*, Xiangyu Chen*, Gen Li, Ying Shan, Jiantao Zhou, Chao Dong <br>
Tencent ARC Lab; University of Macau; Shenzhen Institutes of Advanced Technology; Shanghai AI Lab
🔧 Dependencies and Installation
- Python >= 3.7 (Anaconda or Miniconda is recommended)
- PyTorch >= 1.7
- Optional: NVIDIA GPU + CUDA
- Optional: Linux
Installation (The version of mmsegmentation utilized in this project is 0.29.0.)
- Install the mmsegmentation package and its dependent packages. Note: the versions of mmsegmentation and mmcv-full used in the experiments are <b>0.29.0</b> and <b>1.6.1</b>, respectively. Setting up the environment might take some time.

      git clone https://github.com/open-mmlab/mmsegmentation.git
      cd mmsegmentation
      pip install -r requirements.txt
- Clone this repo and move the provided scripts into the demo folder (a subfolder of the mmsegmentation folder).

      git clone https://github.com/TencentARC/DeSRA
      cd DeSRA
      mv scripts/* mmsegmentation/demo  # you need to modify the path
If you encounter problems, we also provide the environment used in the experiments; you can refer to the requirements.txt.
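Before moving on, it can save time to confirm that the environment actually matches the versions above. The following is a minimal sanity-check sketch (not part of the released scripts); the expected version strings are taken from the note in this section.

```python
# sanity_check.py: verify the pinned mmsegmentation / mmcv-full versions (illustrative helper, not part of this repo).
import torch
import mmcv
import mmseg

print(f"PyTorch: {torch.__version__}, CUDA available: {torch.cuda.is_available()}")
print(f"mmcv-full: {mmcv.__version__} (expected 1.6.1)")
print(f"mmsegmentation: {mmseg.__version__} (expected 0.29.0)")

# Warn if the versions drift from the ones used in the experiments.
assert mmcv.__version__.startswith("1.6"), "mmcv-full version differs from the one used in the experiments"
assert mmseg.__version__.startswith("0.29"), "mmsegmentation version differs from the one used in the experiments"
```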
📦 Testing datasets
For three representative methods, RealESRGAN, LDL, and SwinIR, we select nearly 200 representative images with GAN-inference artifacts to construct this GAN-SR artifact dataset. You can download it from GoogleDrive and BaiduDisk. (For each method, we provide the MSE-SR, GAN-SR, DeSRA-Mask, LR, and human-labeled GT-Mask.)
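The folder names below (MSE-SR, GAN-SR, DeSRA-Mask, LR, GT-Mask) come from the description above; the ./LDL root is an assumed unpacking location that mirrors the paths used in the Quick Inference section. A quick check like this confirms a downloaded method folder is complete:

```python
# check_dataset.py: confirm that an unpacked method folder contains the five provided data types.
# The root path is an assumption about where you unpack the download.
import os

method_root = "./LDL"  # or ./RealESRGAN, ./SwinIR
expected = ["MSE-SR", "GAN-SR", "DeSRA-Mask", "LR", "GT-Mask"]

for name in expected:
    folder = os.path.join(method_root, name)
    num_files = len(os.listdir(folder)) if os.path.isdir(folder) else 0
    status = "OK" if num_files > 0 else "MISSING"
    print(f"{name:>10s}: {status} ({num_files} files)")
```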
🏰 Pre-trained Models
We provide the MSE-based and GAN-based models used to detect artifacts generated by the three methods (Real-ESRGAN, LDL, and SwinIR), as well as the corresponding checkpoint and configuration files of the SegFormer used in the experiments. You can download them from GoogleDrive.
The GAN-DeSRA models are also released on GoogleDrive. For each method, we release three checkpoints. For some methods, the model fine-tuned for 1000 iterations may not bring a significant improvement; in that case, try the checkpoints that have been fine-tuned for longer.
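If you want to inspect a downloaded checkpoint before plugging it into your own pipeline, a quick look at its keys is usually enough. The filename below is a placeholder, and the 'params_ema'/'params' keys are an assumption based on common BasicSR-style checkpoints rather than something guaranteed by this repo:

```python
# inspect_checkpoint.py: print the top-level keys and a few weight shapes of a downloaded model.
import torch

ckpt_path = "RealESRGAN-DeSRA.pth"  # placeholder filename; use the file you actually downloaded
state = torch.load(ckpt_path, map_location="cpu")

print("top-level keys:", list(state.keys()))
# BasicSR-style checkpoints often nest the weights under 'params_ema' or 'params' (assumption).
weights = state.get("params_ema", state.get("params", state)) if isinstance(state, dict) else state
for i, (name, tensor) in enumerate(weights.items()):
    print(name, tuple(getattr(tensor, "shape", ())))
    if i >= 4:
        break
```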
⚡ Quick Inference
- Detect the artifacts between the MSE-SR results and the GAN-SR results. Many intermediate results are saved, and the final detected binary artifact maps are stored in the Final_Artifact_Map folder. The config file and checkpoint of SegFormer can be found in the mmsegmentation package.

      python demo/artifact_detection.py --mse_root="./LDL/MSE-SR" --gan_root="./LDL/GAN-SR" --save_root="./results/LDL/DeSRA-Mask"
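For orientation, the detection step compares local texture statistics of the GAN-SR and MSE-SR results and turns the relative difference into a binary artifact map; the released script additionally uses the SegFormer segmentation to adjust the decision per semantic region. The snippet below is a deliberately simplified sketch of that comparison, not a reproduction of artifact_detection.py; the window size, threshold, and filenames are illustrative assumptions.

```python
# simplified_detection.py: a toy MSE-SR vs GAN-SR texture comparison (illustration only).
# It omits the SegFormer-based semantic adjustment and the post-processing of the released pipeline.
import cv2
import numpy as np

def local_std(img_gray, ksize=11):
    """Local standard deviation in a ksize x ksize window, used as a rough texture measure."""
    img = img_gray.astype(np.float64)
    mean = cv2.blur(img, (ksize, ksize))
    sq_mean = cv2.blur(img * img, (ksize, ksize))
    return np.sqrt(np.maximum(sq_mean - mean * mean, 0))

# Placeholder filenames: any aligned MSE-SR / GAN-SR pair produced by the same method.
mse_sr = cv2.imread("./LDL/MSE-SR/0001.png", cv2.IMREAD_GRAYSCALE)
gan_sr = cv2.imread("./LDL/GAN-SR/0001.png", cv2.IMREAD_GRAYSCALE)

std_mse = local_std(mse_sr)
std_gan = local_std(gan_sr)

# Regions where the GAN result is much "busier" than the MSE result are flagged as suspicious.
rel_diff = std_gan / (std_mse + 1e-6)
artifact_map = (rel_diff > 2.0).astype(np.uint8) * 255  # illustrative threshold

cv2.imwrite("artifact_map.png", artifact_map)
```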
- Evaluate the performance. As mentioned in our paper, we provide three scripts to calculate IoU, Precision, and Recall, respectively. You can find these scripts in the metrics folder.

      python metrics/calc_iou.py
      python metrics/calc_precision.py
      python metrics/calc_recall.py
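For reference, the sketch below computes the three metrics for a single pair of binary masks; the released scripts iterate over the whole dataset and may differ in details such as binarization thresholds and averaging. The filenames are placeholders matching the paths used above.

```python
# metric_sketch.py: IoU / Precision / Recall between one detected mask and one ground-truth mask.
import cv2
import numpy as np

pred = cv2.imread("./results/LDL/DeSRA-Mask/0001.png", cv2.IMREAD_GRAYSCALE) > 127
gt = cv2.imread("./LDL/GT-Mask/0001.png", cv2.IMREAD_GRAYSCALE) > 127

tp = np.logical_and(pred, gt).sum()    # detected artifact pixels that are real artifacts
fp = np.logical_and(pred, ~gt).sum()   # detected pixels that are not artifacts
fn = np.logical_and(~pred, gt).sum()   # missed artifact pixels

iou = tp / (tp + fp + fn + 1e-8)
precision = tp / (tp + fp + 1e-8)
recall = tp / (tp + fn + 1e-8)
print(f"IoU: {iou:.4f}  Precision: {precision:.4f}  Recall: {recall:.4f}")
```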
📜 License and Acknowledgement
DeSRA is released under Apache License Version 2.0.
BibTeX
@inproceedings{xie2023desra,
  title={DeSRA: Detect and Delete the Artifacts of GAN-based Real-World Super-Resolution Models},
  author={Xie, Liangbin and Wang, Xintao and Chen, Xiangyu and Li, Gen and Shan, Ying and Zhou, Jiantao and Dong, Chao},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2023}
}
📧 Contact
If you have any questions, please email lb.xie@siat.ac.cn.