TRG-Net: An Interpretable and Controllable Rain Generator
Zhiqiang Pang, Hong Wang, Qi Xie, Deyu Meng, Zongben Xu
[arXiv]
Abstract
Exploring and modeling the rain generation mechanism is critical for augmenting paired data to ease the training of rainy image processing models. Most conventional methods handle this task in an artificial physical rendering manner, by elaborately designing the fundamental elements that constitute rain. Such methods, however, are over-dependent on human subjectivity, which limits their adaptability to real rain. In contrast, recent deep learning methods have achieved great success by training a neural-network-based generator on pre-collected rainy image data. However, current methods usually design the generator in a "black-box" manner, increasing the learning difficulty and data requirements. To address these issues, this study proposes a novel deep-learning-based rain generator, which fully takes the physical generation mechanism underlying rain into consideration and explicitly encodes the learning of the fundamental rain factors (i.e., shape, orientation, length, width and sparsity) into the deep network. Its significance lies in that the generator not only elaborately designs the essential elements of rain to simulate expected rains, like conventional artificial strategies, but also finely adapts to complicated and diverse practical rainy images, like deep learning methods. By rationally adopting a filter parameterization technique, the proposed rain generator is finely controllable with respect to rain factors and able to learn the distribution of these factors purely from data, without the need for rain factor labels. Our unpaired generation experiments demonstrate that the rain generated by the proposed generator is not only of higher quality but also more effective for deraining and downstream tasks than that of current state-of-the-art rain generation methods. Besides, the paired data augmentation experiments, covering both in-distribution and out-of-distribution (OOD) settings, further validate the diversity of the samples generated by our model for in-distribution deraining and OOD generalization tasks.
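For intuition only, the rain factors named above (orientation, length, width, sparsity) can be pictured as parameters of a simple oriented-streak renderer. The sketch below is an illustrative toy, not the TRG-Net architecture or the filter parameterization used in the paper; all function and parameter names are made up for this example.

# Illustrative toy renderer (not the paper's implementation): builds a rain
# layer from interpretable factors and adds it to a clean image (O = B + R).
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

def rain_kernel(length=15, width=1.0, angle_deg=75.0):
    """Oriented line kernel: `length` sets streak length, `width` its
    thickness (via Gaussian blur), `angle_deg` its orientation."""
    size = length if length % 2 == 1 else length + 1
    k = np.zeros((size, size), dtype=np.float32)
    c = size // 2
    theta = np.deg2rad(angle_deg)
    for t in np.linspace(-length / 2, length / 2, 4 * length):
        r = int(round(c - t * np.sin(theta)))
        col = int(round(c + t * np.cos(theta)))
        if 0 <= r < size and 0 <= col < size:
            k[r, col] = 1.0
    k = gaussian_filter(k, sigma=width)
    return k / k.sum()

def synthesize_rainy(clean, sparsity=0.002, intensity=0.8, **kernel_kwargs):
    """clean: HxW float image in [0, 1]; `sparsity` sets the density of rain seeds."""
    seeds = (np.random.rand(*clean.shape) < sparsity).astype(np.float32)
    seeds *= np.random.uniform(0.5, 1.0, clean.shape).astype(np.float32)
    rain = convolve(seeds, rain_kernel(**kernel_kwargs), mode="constant")
    rain = intensity * rain / (rain.max() + 1e-8)
    return np.clip(clean + rain, 0.0, 1.0)  # additive rain model: O = B + R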
Controllable Rain Generation
Controlling Rain with Respect to Orientation
More controllable generation results can be found in the gifs folder.
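For a rough feel of what "controlling orientation" means, the toy renderer sketched under the abstract can be swept over its angle parameter to produce a short animation. The gifs in this repository come from the trained TRG-Net generator itself; the snippet below is only a stand-in illustration.

# Sweep the orientation factor of the toy renderer and save frames as a GIF.
import imageio.v2 as imageio
import numpy as np

clean = np.full((128, 128), 0.4, dtype=np.float32)  # flat gray placeholder image
frames = []
for angle in np.linspace(45, 135, 10):
    rainy = synthesize_rainy(clean, angle_deg=float(angle))
    frames.append((rainy * 255).astype(np.uint8))
imageio.mimsave("orientation_sweep.gif", frames, duration=0.2)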
Usage
Datasets
Rain100L, Rain100H and SPA-Data can be downloaded from here.
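A minimal PyTorch loader sketch for such paired rainy/clean data is given below. The rain/ and norain/ folder names are assumptions made for illustration; adjust them to match the actual layout of the downloaded datasets.

# Minimal paired-dataset sketch (assumed folder layout, not the repository's loader).
import os
from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms

class PairedRainDataset(Dataset):
    def __init__(self, root):
        self.rain_dir = os.path.join(root, "rain")     # rainy inputs (assumed name)
        self.clean_dir = os.path.join(root, "norain")  # clean targets (assumed name)
        self.names = sorted(os.listdir(self.rain_dir))
        self.to_tensor = transforms.ToTensor()

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        rainy = Image.open(os.path.join(self.rain_dir, name)).convert("RGB")
        clean = Image.open(os.path.join(self.clean_dir, name)).convert("RGB")
        return self.to_tensor(rainy), self.to_tensor(clean)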
Clone this repository
git clone https://github.com/pzq-xjtu/TRG-Net.git
cd TRG-Net
Create conda environment and install dependencies
TRG-Net is implemented with PyTorch >= 1.7.1.
conda create -n TRGNet
conda activate TRGNet
conda install pytorch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2 pytorch-cuda=11.8 -c pytorch -c nvidia
pip install -r requirements.txt
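A quick sanity check that the installed versions match the ones pinned above and that CUDA is visible to PyTorch:

# Verify the environment before training.
import torch, torchvision
print(torch.__version__, torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())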
Paired training using SAGAN
# take Rain100L as an example
python train_sagan.py -opt train_sagan.yml
Paired training using PatchGAN
python train_patchgan.py -opt train_patchgan.yml
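For paired data augmentation, a generated rain layer can be composited with clean images under the standard additive model O = B + R. The sketch below uses a random placeholder tensor in place of the trained generator's output, since the generator's loading interface is not shown here.

# Compose augmented (rainy, clean) pairs from a rain layer and clean images.
import torch

clean = torch.rand(4, 3, 128, 128)             # batch of clean images B in [0, 1]
rain_layer = torch.rand(4, 3, 128, 128) * 0.3  # placeholder for generated rain R
rainy = (clean + rain_layer).clamp(0.0, 1.0)   # augmented rainy inputs O
pairs = list(zip(rainy, clean))                # (rainy, clean) pairs for deraining training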
Unpaired training
Coming soon
Acknowledgement
Our code borrows from VRGNet.
Citation
@article{pang2024trg,
title={TRG-Net: An Interpretable and Controllable Rain Generator},
author={Pang, Zhiqiang and Wang, Hong and Xie, Qi and Meng, Deyu and Xu, Zongben},
journal={arXiv preprint arXiv:2403.09993},
year={2024}
}
Contact
If you have any questions, please feel free to contact Zhiqiang Pang (E-mail: xjtupzq@gmail.com).