[CVPR 2023] Learning Semantic-Aware Knowledge Guidance for Low-Light Image Enhancement (Paper)

Yuhui Wu, Chen Pan, Guoqing Wang*, Yang Yang, Jiwei Wei, Chongyi Li, Heng Tao Shen (*Corresponding Author)

University of Electronic Science and Technology of China (UESTC)

Introduction

This repository is the official implementation of the paper "Learning Semantic-Aware Knowledge Guidance for Low-Light Image Enhancement" and provides further implementation details.

Motivation and Superiority

(Figure: motivation and superiority of the proposed method.)

Overall

(Figure: overall framework of the proposed method.)

Dataset

You can refer to the following links to download the datasets:

- LOL: Chen Wei, Wenjing Wang, Wenhan Yang, and Jiaying Liu. "Deep Retinex Decomposition for Low-Light Enhancement", BMVC, 2018. [Baiduyun (extracted code: sdd0)] [Google Drive]
- LOL-v2 (the extension work): Wenhan Yang, Haofeng Huang, Wenjing Wang, Shiqi Wang, and Jiaying Liu. "Sparse Gradient Regularized Deep Retinex Network for Robust Low-Light Image Enhancement", TIP, 2021. [Baiduyun (extracted code: l9xm)] [Google Drive]
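
Both datasets ship as paired low-/normal-light images. As a rough illustration only, the sketch below shows how such pairs could be loaded; the class name `PairedLowLightDataset` and the assumed `low/` / `high/` folder layout with matching filenames are hypothetical and not part of this repository, so adapt the paths to the split you actually download (e.g. the training and evaluation folders of LOL / LOL-v2).

```python
# Minimal sketch of a paired low-/normal-light loader for LOL-style data.
# Assumes (hypothetically) the layout <root>/low/*.png and <root>/high/*.png
# with matching filenames in both folders.
import os
from PIL import Image
from torch.utils.data import Dataset
from torchvision.transforms.functional import to_tensor


class PairedLowLightDataset(Dataset):
    def __init__(self, root):
        self.low_dir = os.path.join(root, "low")
        self.high_dir = os.path.join(root, "high")
        # Pair images by filename; both folders are expected to match one-to-one.
        self.names = sorted(
            f for f in os.listdir(self.low_dir)
            if f.lower().endswith((".png", ".jpg", ".bmp"))
        )

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        low = to_tensor(Image.open(os.path.join(self.low_dir, name)).convert("RGB"))
        high = to_tensor(Image.open(os.path.join(self.high_dir, name)).convert("RGB"))
        return {"low": low, "high": high, "name": name}
```

Wrapping this in a standard `torch.utils.data.DataLoader` then yields batches of `(low, high)` tensors in `[0, 1]` for training or evaluation.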

Results

The evaluation results on the LOL dataset are as follows:

| Method | PSNR | SSIM | LPIPS | Method | PSNR | SSIM | LPIPS |
| --- | --- | --- | --- | --- | --- | --- | --- |
| RetinexNet | 16.770 | 0.462 | 0.474 | HWMNet | 24.240 | 0.852 | 0.114 |
| RetinexNet-SKF | 20.418 | 0.711 | 0.216 | HWMNet-SKF | 25.086 | 0.860 | 0.108 |
| KinD | 20.870 | 0.799 | 0.207 | SNR-LLIE-Net | 24.608 | 0.840 | 0.151 |
| KinD-SKF | 21.913 | 0.835 | 0.143 | SNR-LLIE-Net-SKF | 25.031 | 0.855 | 0.113 |
| DRBN | 19.860 | 0.834 | 0.155 | LLFlow-S | 24.060 | 0.860 | 0.136 |
| DRBN-SKF | 22.837 | 0.841 | 0.138 | LLFlow-S-SKF | 25.942 | 0.865 | 0.125 |
| KinD++ | 18.970 | 0.804 | 0.175 | LLFlow-L | 24.999 | 0.870 | 0.117 |
| KinD++-SKF | 20.363 | 0.805 | 0.201 | LLFlow-L-SKF | 26.798 | 0.879 | 0.105 |
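
The numbers above are PSNR / SSIM / LPIPS scores of each enhanced result against its ground truth. As a minimal sketch of how such metrics can be computed (not the evaluation script used in this repository), the snippet below uses the `scikit-image` and `lpips` packages; the choices of RGB inputs in `[0, 1]` and an AlexNet LPIPS backbone are assumptions, and the exact protocol of each baseline follows its own code.

```python
# Sketch: compute PSNR / SSIM / LPIPS for one enhanced image vs. its ground truth.
# Assumes both images are H x W x 3 float arrays in [0, 1].
import numpy as np
import torch
import lpips  # pip install lpips
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

lpips_fn = lpips.LPIPS(net="alex")  # LPIPS with an AlexNet backbone (assumed choice)


def evaluate_pair(enhanced: np.ndarray, gt: np.ndarray):
    psnr = peak_signal_noise_ratio(gt, enhanced, data_range=1.0)
    ssim = structural_similarity(gt, enhanced, channel_axis=-1, data_range=1.0)
    # LPIPS expects N x 3 x H x W tensors scaled to [-1, 1].
    to_t = lambda x: torch.from_numpy(x).permute(2, 0, 1).unsqueeze(0).float() * 2 - 1
    with torch.no_grad():
        lp = lpips_fn(to_t(enhanced), to_t(gt)).item()
    return psnr, ssim, lp
```

Averaging these three values over all test pairs of a split gives table entries of the form reported above.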

Test and Train

More details can be found in the subfolder of each baseline.

Citation

If you find our work useful for your research, please cite our paper:

@inproceedings{wu2023skf,
  title={Learning Semantic-Aware Knowledge Guidance for Low-Light Image Enhancement},
  author={Wu, Yuhui and Pan, Chen and Wang, Guoqing and Yang, Yang and Wei, Jiwei and Li, Chongyi and Shen, Heng Tao},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={},
  year={2023}
}

Contact

If you have any questions, please feel free to contact us via wuyuhui132@gmail.com or panchen0103@gmail.com.