Dual Regression Compression (DRC)

Yong Guo, Jingdong Wang, Qi Chen, Jiezhang Cao, Zeshuai Deng, Yanwu Xu, Jian Chen, Mingkui Tan

This repository contains the official PyTorch implementation and pretrained models of Towards Lightweight Super-Resolution with Dual Regression Learning.


Contents

- Datasets
- Models
- Evaluating and Training
- Results
- Citation

Datasets

The training and testing sets used in this work can be downloaded as follows:

| Training Set | Non-Blind Testing Set | Blind Testing Set |
| --- | --- | --- |
| DIV2K (800 training images, 100 validation images) + Flickr2K (2650 images) | You can evaluate our non-blind models on several widely used benchmark datasets, including Set5, Set14, B100, Urban100, and Manga109. | You can evaluate our blind models on the DIV2KRK dataset. |

Please organize the datasets using the following hierarchy.

- datasets/
    - DIV2K
        - DIV2K_train_HR
        - DIV2K_train_LR_bicubic

    - DF2K
        - DF2K_HR
        - DF2K_LR_bicubic
    
    - benchmark
        - Set5
        - Set14
        - B100
        - Urban100
        - Manga109
    
    - DIV2KRK
        - gt
        - lr_x2 
        - lr_x4
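
If it helps, the following minimal sketch (not part of the official code) checks that the folders above are in place before you start training. The only assumption is that the `datasets/` root sits next to where you run the script; adjust `DATA_ROOT` otherwise.

```python
from pathlib import Path

# Expected layout taken from the hierarchy above; change the root if your
# datasets live elsewhere.
DATA_ROOT = Path("datasets")
EXPECTED = [
    "DIV2K/DIV2K_train_HR",
    "DIV2K/DIV2K_train_LR_bicubic",
    "DF2K/DF2K_HR",
    "DF2K/DF2K_LR_bicubic",
    "benchmark/Set5",
    "benchmark/Set14",
    "benchmark/B100",
    "benchmark/Urban100",
    "benchmark/Manga109",
    "DIV2KRK/gt",
    "DIV2KRK/lr_x2",
    "DIV2KRK/lr_x4",
]

missing = [d for d in EXPECTED if not (DATA_ROOT / d).is_dir()]
if missing:
    print("Missing dataset folders:")
    for d in missing:
        print("  -", d)
else:
    print("All expected dataset folders are in place.")
```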

Models

You can download the pretrained large SR models enhanced by our Dual Regression (DR) scheme for 4x SR. More pretrained models can be found in the released assets of this repository.

| Method | Params | FLOPs (G) | Dataset | PSNR (dB) | SSIM | Model Zoo |
| --- | --- | --- | --- | --- | --- | --- |
| DRN-S | 4.8M | 109.9 | Set5 | 32.68 | 0.901 | Download |
| DRN-L | 9.8M | 224.8 | Set5 | 32.74 | 0.902 | Download |
| SwinIR-DR | 11.9M | 121.1 | Set5 | 33.03 | 0.904 | Download |
| DAT-DR | 14.8M | 155.1 | Set5 | 33.17 | 0.906 | Download |
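
As a rough illustration of how to inspect a downloaded checkpoint, the sketch below loads it on CPU and counts its parameters. The file name and the exact layout of the saved dictionary are assumptions; adjust them to match the checkpoint you actually downloaded.

```python
import torch

# Hypothetical path -- use whichever checkpoint you downloaded from the release assets.
CKPT_PATH = "pretrained/DRN-S_4x.pt"

# map_location="cpu" lets you inspect the file without a GPU.
ckpt = torch.load(CKPT_PATH, map_location="cpu")

# Checkpoints are commonly either a plain state dict or a dict wrapping one
# under a key such as "state_dict"; handle both cases (an assumption here).
state_dict = ckpt["state_dict"] if isinstance(ckpt, dict) and "state_dict" in ckpt else ckpt

# Print a few parameter names and shapes, then the total parameter count.
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))

total = sum(t.numel() for t in state_dict.values() if torch.is_tensor(t))
print(f"Total parameters: {total / 1e6:.1f}M")
```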

You can download the compressed non-blind SR models (with 30% of the parameters removed) obtained by our Dual Regression Compression (DRC) approach for 4x SR.

| Method | Params | FLOPs (G) | Dataset | PSNR (dB) | SSIM | Model Zoo |
| --- | --- | --- | --- | --- | --- | --- |
| DRN-S30 | 3.1M | 72.3 | Set5 | 32.66 | 0.900 | Download |
| SwinIR-light-DRC | 635K | 6.8 | Set5 | 32.44 | 0.896 | Download |

You can download the compressed blind SR models (with 30% of the parameters removed) obtained by our DRC approach for 4x SR.

| Method | Params | FLOPs (G) | Dataset | PSNR (dB) | SSIM | Model Zoo |
| --- | --- | --- | --- | --- | --- | --- |
| DCLS-DRC | 14.2M | 57.1 | DIV2KRK | 29.01 | 0.798 | Download |
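
To sanity-check the compression ratio yourself, you can compare the parameter counts of an original and a compressed checkpoint, as sketched below. The file names are hypothetical and the state-dict layout is an assumption; substitute the checkpoints you actually downloaded.

```python
import torch

def count_params(ckpt_path):
    """Sum tensor elements in a checkpoint file to estimate its parameter count."""
    ckpt = torch.load(ckpt_path, map_location="cpu")
    # Assumes either a plain state dict or a dict wrapping one under "state_dict".
    state_dict = ckpt["state_dict"] if isinstance(ckpt, dict) and "state_dict" in ckpt else ckpt
    return sum(t.numel() for t in state_dict.values() if torch.is_tensor(t))

# Hypothetical filenames -- substitute the original and compressed checkpoints you downloaded.
original = count_params("pretrained/DRN-S_4x.pt")
compressed = count_params("pretrained/DRN-S30_4x.pt")
print(f"Original:   {original / 1e6:.2f}M parameters")
print(f"Compressed: {compressed / 1e6:.2f}M parameters")
print(f"Parameters removed: {100 * (1 - compressed / original):.1f}%")
```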

Evaluating and Training

Detailed instructions for evaluation and training are provided in the corresponding folders. Please refer to the README.md file within each folder for more details.
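
The evaluation scripts in those folders are the reference implementation. For orientation only, the sketch below illustrates the common SR evaluation protocol that reported benchmark numbers typically follow: PSNR computed on the Y channel (ITU-R BT.601) with a border crop equal to the scale factor, assuming 8-bit RGB inputs of the same size.

```python
import numpy as np

def rgb_to_y(img):
    """Y (luma) channel via ITU-R BT.601, the usual choice in SR benchmarks."""
    img = img.astype(np.float64) / 255.0
    return 65.481 * img[..., 0] + 128.553 * img[..., 1] + 24.966 * img[..., 2] + 16.0

def psnr_y(sr, hr, scale=4):
    """PSNR on the Y channel, cropping `scale` border pixels before comparison."""
    sr_y = rgb_to_y(sr)[scale:-scale, scale:-scale]
    hr_y = rgb_to_y(hr)[scale:-scale, scale:-scale]
    mse = np.mean((sr_y - hr_y) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)
```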

Results

Our models achieve competitive performance on both non-blind and blind SR. Detailed results can be found in the paper and in the comparisons below.

<details> <summary>Click to expand</summary> <p align="center"> <img width="900" src="figures/Table-1.png"> </p> <p align="center"> <img width="900" src="figures/Table-2.png"> </p> <p align="center"> <img width="450" src="figures/Table-3.png"> </p> <p align="center"> <img width="900" src="figures/img4x_compare.jpg"> </p> <p align="center"> <img width="900" src="figures/img4x_compression_compare.jpg"> </p> </details>

Citation

If you find this repository helpful, please consider citing:

@article{guo2022towards,
  title={Towards lightweight super-resolution with dual regression learning},
  author={Guo, Yong and Wang, Jingdong and Chen, Qi and Cao, Jiezhang and Deng, Zeshuai and Xu, Yanwu and Chen, Jian and Tan, Mingkui},
  journal={arXiv preprint arXiv:2207.07929},
  year={2022}
}