# RAW-to-sRGB (ICCV 2021)

PyTorch implementation of *Learning RAW-to-sRGB Mappings with Inaccurately Aligned Supervision*.
## 1. Framework
<p align="center"><img src="./figs/framework.png" width="95%"></p> <p align="center">Figure 1: Illustration of the proposed joint learning framework.</p>

## 2. Results
<p align="center"><img src="./figs/results.png" width="95%"></p> <p align="center">Figure 2: Example data pairs from the ZRR and SR-RAW datasets, where clear spatial misalignment can be observed against the reference line. With such inaccurately aligned training data, PyNet [22] and Zhang et al. [62] are prone to producing blurry results with spatial misalignment, while our results are well aligned with the input.</p>

## 3. Preparation
- Prerequisites
  - Python 3.x and PyTorch 1.6.
  - OpenCV, NumPy, Pillow, CuPy, colour_demosaicing, tqdm, lpips, scikit-image, and tensorboardX.
- Dataset
  - Zurich RAW to RGB dataset. It can also be downloaded from Baidu Netdisk.
  - Preprocessed SR-RAW dataset. Note that we preprocessed the original SR-RAW dataset according to the code. You can also download the original SR-RAW dataset here.
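Both datasets provide RAW Bayer mosaics as input. A common preprocessing step in RAW-to-sRGB pipelines is to pack the single-channel mosaic into a 4-channel (RGGB) half-resolution array before feeding it to the network; the sketch below illustrates this idea with NumPy and is not taken from the repo's own preprocessing code (the function name and the assumed RGGB pattern are ours):

```python
import numpy as np

def pack_raw_bayer(raw):
    """Pack an (H, W) Bayer mosaic (assumed RGGB pattern) into an
    (H/2, W/2, 4) array with one channel per CFA position."""
    r  = raw[0::2, 0::2]  # red sites
    g1 = raw[0::2, 1::2]  # green sites on red rows
    g2 = raw[1::2, 0::2]  # green sites on blue rows
    b  = raw[1::2, 1::2]  # blue sites
    return np.stack([r, g1, g2, b], axis=-1)

# Tiny example: a 4x4 mosaic packs into a 2x2x4 array.
mosaic = np.arange(16, dtype=np.float32).reshape(4, 4)
packed = pack_raw_bayer(mosaic)
print(packed.shape)  # (2, 2, 4)
```

The actual channel ordering depends on the camera's CFA pattern, so check the dataset documentation before reusing this.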
## 4. Quick Start
### 4.1 Pre-trained models
- The pre-trained models can be downloaded. You need to put them in the `RAW-to-sRGB/ckpt/` folder.
### 4.2 Training
- Zurich RAW to RGB dataset
- SR-RAW Dataset
### 4.3 Testing
- Zurich RAW to RGB dataset
- SR-RAW Dataset
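Work in this area typically reports PSNR between the network output and the target sRGB image. As a minimal NumPy sketch of the metric (not the repo's evaluation code, which also uses lpips and scikit-image):

```python
import numpy as np

def psnr(pred, target, max_val=1.0):
    """Peak signal-to-noise ratio (dB) between two images in [0, max_val]."""
    mse = np.mean((pred - target) ** 2)
    if mse == 0:
        return float('inf')
    return 10.0 * np.log10(max_val ** 2 / mse)

# A constant offset of 0.1 gives MSE = 0.01, i.e. 20 dB.
a = np.zeros((8, 8, 3))
b = np.full((8, 8, 3), 0.1)
print(round(psnr(a, b), 2))  # 20.0
```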
### 4.4 Note
- You can specify which GPU to use via `--gpu_ids`, e.g., `--gpu_ids 0,1`, `--gpu_ids 3`, or `--gpu_ids -1` (for CPU mode). In the default setting, all GPUs are used.
- You can refer to the options for more arguments.
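Frameworks derived from CycleGAN typically turn the comma-separated `--gpu_ids` string into a list of device indices, treating `-1` as CPU mode. The sketch below illustrates that convention; the function name is ours and it is not the repo's actual option parser:

```python
def parse_gpu_ids(gpu_ids_str):
    """Convert a comma-separated --gpu_ids string into a list of ints.
    Negative ids (CPU mode) are dropped, so '-1' yields an empty list."""
    ids = []
    for tok in gpu_ids_str.split(','):
        tok = tok.strip()
        if tok and int(tok) >= 0:
            ids.append(int(tok))
    return ids

print(parse_gpu_ids('0,1'))  # [0, 1]
print(parse_gpu_ids('-1'))   # []
```

An empty list then signals the training code to keep all tensors on the CPU.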
## 5. Citation
If you find this work useful in your research, please consider citing:
```
@inproceedings{RAW-to-sRGB,
  title={Learning RAW-to-sRGB Mappings with Inaccurately Aligned Supervision},
  author={Zhang, Zhilu and Wang, Haolin and Liu, Ming and Wang, Ruohao and Zuo, Wangmeng and Zhang, Jiawei},
  booktitle={ICCV},
  year={2021}
}
```
## 6. Acknowledgement
This repo is built upon the framework of CycleGAN, and we borrow some code from PyNet, Zoom-Learn-Zoom, PWC-Net, and AdaDSR. Thanks for their excellent work!