LKDN

Large Kernel Distillation Network for Efficient Single Image Super-Resolution

Chengxing Xie, Xiaoming Zhang, Linze Li, Haiteng Meng, Tianlin Zhang, Tianrui Li and Xiaole Zhao

Environment

Installation

pip install -r requirements.txt
python setup.py develop

How To Test

python basicsr/test.py -opt options/test/LKDN/test_LKDN_x4.yml

The testing results will be saved in the ./results folder.

How To Train

python basicsr/train.py -opt options/train/LKDN/train_LKDN_x4.yml

For more training commands, refer to this page.

The training logs and weights will be saved in the ./experiments folder.

How To Re-parameterize

Refer to the ./pth directory for the validation and use of re-parameterization.

conv1x1_3x3.py, conv1x1.py, and shortcut.py verify the three re-parameterization methods, respectively.

del_params_ema.py simplifies the .pth file by removing the additional parameters retained when using EMA.
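As a rough illustration of that cleanup step (not the repo's actual script), the sketch below builds a dummy checkpoint in the BasicSR layout, where EMA training keeps both a `params` and a `params_ema` copy of the weights, and keeps only the EMA copy:

```python
import torch

# Dummy checkpoint in the BasicSR layout: EMA training stores two
# copies of the weights, 'params' and 'params_ema'.
ckpt = {
    "params": {"conv.weight": torch.zeros(3, 3, 3, 3)},
    "params_ema": {"conv.weight": torch.ones(3, 3, 3, 3)},
}

# Keep only the EMA weights (the ones used at inference time) under
# the plain 'params' key, roughly halving the file size on disk.
if "params_ema" in ckpt:
    ckpt = {"params": ckpt["params_ema"]}

torch.save(ckpt, "cleaned.pth")
```

The exact key names in your checkpoint may differ; print them first (e.g. with print_pth.py) before stripping anything.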

print_pth.py prints the content of the .pth file.

reparm.py re-parameterizes the model.

Note that LKDN-S_del_rep_x4.pth is the model after re-parameterization, while LKDN-S_x4.pth is the model before re-parameterization.
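To give an idea of what one such re-parameterization looks like (this is a generic sketch of the conv1x1-then-conv3x3 merge, not the repo's own code), a bias-free 1x1 conv followed by a 3x3 conv can be fused into a single 3x3 conv by convolving the two weight tensors:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
c_in, c_mid, c_out = 4, 8, 4

# A 1x1 conv followed by a 3x3 conv (both bias-free).
w1 = torch.randn(c_mid, c_in, 1, 1)
w3 = torch.randn(c_out, c_mid, 3, 3)

# Fuse: convolving the 3x3 weights with the transposed 1x1 weights
# composes the two linear maps into one (c_out, c_in, 3, 3) kernel.
w_merged = F.conv2d(w3, w1.permute(1, 0, 2, 3))

x = torch.randn(1, c_in, 16, 16)
y_two_convs = F.conv2d(F.conv2d(x, w1), w3, padding=1)
y_merged = F.conv2d(x, w_merged, padding=1)

print(torch.allclose(y_two_convs, y_merged, atol=1e-4))  # True
```

After fusion the network runs a single convolution at inference time with identical outputs (up to floating-point error), which is the point of re-parameterization.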

Results

Benchmark results on SR ×4. Multi-Adds is calculated with a 1280 × 720 GT image.

| Method | Params [K] | Multi-Adds [G] | Set5 PSNR/SSIM | Set14 PSNR/SSIM | BSD100 PSNR/SSIM | Urban100 PSNR/SSIM | Manga109 PSNR/SSIM |
| --- | --- | --- | --- | --- | --- | --- | --- |
| BSRN | 352 | 19.4 | 32.35/0.8966 | 28.73/0.7847 | 27.65/0.7387 | 26.27/0.7908 | 30.84/0.9123 |
| VapSR | 342 | 19.5 | 32.38/0.8978 | 28.77/0.7852 | 27.68/0.7398 | 26.35/0.7941 | 30.89/0.9132 |
| LKDN | 322 | 18.3 | 32.39/0.8979 | 28.79/0.7859 | 27.69/0.7402 | 26.42/0.7965 | 30.97/0.9140 |
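For reference, the PSNR values in the table follow the standard formula; the toy sketch below shows only the formula itself (benchmark numbers are typically computed on the Y channel with border cropping, which is omitted here):

```python
import numpy as np

def psnr(img1: np.ndarray, img2: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two images (higher is better)."""
    mse = np.mean((img1.astype(np.float64) - img2.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: an 8-bit image vs. a copy that is off by 1 everywhere.
gt = np.full((720, 1280), 128, dtype=np.uint8)
sr = gt + 1
print(round(psnr(gt, sr), 2))  # 48.13
```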

The inference results on benchmark datasets are available at Google Drive or Baidu Netdisk.

Contact

If you have any questions, please email zxc0074869@gmail.com.