# Low-light Image Enhancement via Breaking Down the Darkness

by Xiaojie Guo, Qiming Hu.
:boom: **Update**: Online Replicate Demo:
<!-- ![figure_tease](https://github.com/mingcv/Bread/blob/main/figures/figure_tease.png) -->

## 1. Dependencies
- Python3
- PyTorch>=1.0
- OpenCV-Python, TensorboardX
- NVIDIA GPU+CUDA
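
A quick, optional way to verify the environment before training (this check is not part of the repo):

```python
# Optional environment check (not part of this repo): confirm the main
# dependencies import correctly and that PyTorch can see a CUDA device.
import cv2
import tensorboardX
import torch

print("PyTorch:", torch.__version__)
print("OpenCV:", cv2.__version__)
print("TensorboardX:", tensorboardX.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```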
## 2. Network Architecture
## 3. Data Preparation
### 3.1. Training dataset
- 485 low/high-light image pairs from `our485` of the LOL dataset; each low-light image is augmented by our `exposure_augment.py` to generate 8 images under different exposures (Download Link for Augmented LOL). A conceptual sketch of this augmentation is given right after this list.
- To train the MECAN (if desired), 559 randomly selected multi-exposure sequences from SICE are adopted (Download Link for a resized version).
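
For intuition, the multi-exposure augmentation can be thought of as re-exposing each low-light image with several gain values. The sketch below only illustrates that idea with a simple gamma-domain gain; it is not the actual logic of `exposure_augment.py`, and the exposure ratios and paths are placeholders.

```python
# Illustrative sketch only: synthesize several exposures of one low-light
# image by scaling it in an approximately linear domain. The real
# exposure_augment.py in this repo may use a different formulation.
import cv2
import numpy as np

def synthesize_exposures(img_bgr, ratios=(1, 2, 4, 8, 16, 32, 64, 128), gamma=2.2):
    img = img_bgr.astype(np.float32) / 255.0
    outputs = []
    for ratio in ratios:
        linear = np.power(img, gamma) * ratio                 # brighten in linear space
        srgb = np.power(np.clip(linear, 0.0, 1.0), 1.0 / gamma)
        outputs.append((srgb * 255.0).astype(np.uint8))
    return outputs

# Hypothetical usage (paths are placeholders):
# low = cv2.imread("LOL/our485/low/1.png")
# for i, exp in enumerate(synthesize_exposures(low)):
#     cv2.imwrite(f"augmented/1_exposure{i}.png", exp)
```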
### 3.2. Testing dataset
The images for testing can be downloaded via this link.
<!--
* 15 low/high-light image pairs from eval15 of [LOL dataset](https://daooshee.github.io/BMVC2018website/).
* 44 low-light images from DICM.
* 8 low-light images from NPE.
* 24 low-light images from VV.
-->

## 4. Usage
### 4.1. Training
- Multi-exposure data synthesis:

```shell
python exposure_augment.py
```

- Train IAN:

```shell
python train_IAN.py -m IAN --comment IAN_train --batch_size 1 --val_interval 1 --num_epochs 500 --lr 0.001 --no_sche
```

- Train ANSN:

```shell
python train_ANSN.py -m1 IAN -m2 ANSN --comment ANSN_train --batch_size 1 --val_interval 1 --num_epochs 500 --lr 0.001 --no_sche -m1w ./checkpoints/IAN_335.pth
```

- Train CAN:

```shell
python train_CAN.py -m1 IAN -m3 FuseNet --comment CAN_train --batch_size 1 --val_interval 1 --num_epochs 500 --lr 0.001 --no_sche -m1w ./checkpoints/IAN_335.pth
```

- Train MECAN on SICE:

```shell
python train_MECAN.py -m FuseNet --comment MECAN_train --batch_size 1 --val_interval 1 --num_epochs 500 --lr 0.001 --no_sche
```

- Finetune MECAN on SICE and LOL datasets:

```shell
python train_MECAN_finetune.py -m FuseNet --comment MECAN_finetune --batch_size 1 --val_interval 1 --num_epochs 500 --lr 1e-4 --no_sche -mw ./checkpoints/FuseNet_MECAN_for_Finetuning_404.pth
```
### 4.2. Testing
- [Tips]: Use `--gc` to apply gamma correction during evaluation, and `--save_extra` to save extra intermediate outputs (an illustrative sketch of the gamma-correction step follows the commands below).
- Evaluation:

```shell
python eval_Bread.py -m1 IAN -m2 ANSN -m3 FuseNet -m4 FuseNet --mef --comment Bread+NFM+ME[eval] --batch_size 1 -m1w ./checkpoints/IAN_335.pth -m2w ./checkpoints/ANSN_422.pth -m3w ./checkpoints/FuseNet_MECAN_251.pth -m4w ./checkpoints/FuseNet_NFM_297.pth
```

- Testing:

```shell
python test_Bread.py -m1 IAN -m2 ANSN -m3 FuseNet -m4 FuseNet --mef --comment Bread+NFM+ME[test] --batch_size 1 -m1w ./checkpoints/IAN_335.pth -m2w ./checkpoints/ANSN_422.pth -m3w ./checkpoints/FuseNet_MECAN_251.pth -m4w ./checkpoints/FuseNet_NFM_297.pth
```

- Remove NFM:

```shell
python test_Bread_NoNFM.py -m1 IAN -m2 ANSN -m3 FuseNet --mef -a 0.10 --comment Bread+ME[test] --batch_size 1 -m1w ./checkpoints/IAN_335.pth -m2w ./checkpoints/ANSN_422.pth -m3w ./checkpoints/FuseNet_MECAN_251.pth
```
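
For reference, the snippet below is a rough sketch of what a gamma-corrected PSNR evaluation could look like. The gamma-matching rule is only an assumption for illustration; the exact behavior of `--gc` is defined in `eval_Bread.py`.

```python
# Illustrative sketch only: fit a single global gamma from image means so the
# prediction's brightness roughly matches the ground truth before PSNR.
# The actual --gc behavior is implemented in eval_Bread.py and may differ.
import numpy as np

def gamma_match(pred, gt, eps=1e-6):
    """pred, gt: float arrays in [0, 1]; returns the gamma-corrected prediction."""
    gamma = np.log(gt.mean() + eps) / np.log(pred.mean() + eps)
    return np.clip(pred ** gamma, 0.0, 1.0)

def psnr(pred, gt):
    mse = np.mean((pred - gt) ** 2)
    return 10.0 * np.log10(1.0 / (mse + 1e-12))

# Hypothetical usage with float images in [0, 1]:
# score = psnr(gamma_match(pred, gt), gt)
```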
### 4.3. Trained weights
Please refer to our release.