# ATTSF: Attention! Stay Focus!

Solution to the NTIRE 2021 Defocus Deblurring Challenge - Attention! Stay Focus! (ATTSF) - Paper

by Tu Vo
## Content

- Getting Started
- Running
- Usage
- Result
- License
- References
- Citation
- Acknowledgments
## Getting Started

- Clone the repository
### Prerequisites

- Tensorflow 2.2.0+
- Tensorflow_addons
- Python 3.6+
- Keras 2.3.0
- PIL
- numpy
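Assuming the standard PyPI package names, the prerequisites above could be captured in a `requirements.txt` (the exact version pins are illustrative, not taken from the repository):

```
tensorflow>=2.2.0
tensorflow-addons
keras==2.3.0
Pillow
numpy
```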
## Running

### Training

- Preprocess
    - Download the training data
    - Unzip the file
- Train ATTSF
    - change `op_phase='train'` in `config.py`
    - run `python main.py`
- Test ATTSF
    - change `op_phase='valid'` in `config.py`
    - run `python main.py`
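The train and test modes differ only in the `op_phase` value set in `config.py`. A minimal sketch of how `main.py` might dispatch on that flag (the returned routine names are hypothetical placeholders, not the repository's actual functions):

```python
# Hypothetical sketch: dispatch on op_phase as configured in config.py.
# "train_model" / "validate_model" stand in for the real routines.
def dispatch(op_phase):
    if op_phase == "train":
        return "train_model"      # would build ATTSF and start training
    elif op_phase == "valid":
        return "validate_model"   # would load weights and run inference
    raise ValueError(f"unsupported op_phase: {op_phase!r}")
```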
## Usage

### Training

```
usage: main.py [-h] [--filter FILTER] [--attention_filter ATTENTION_FILTER]
               [--kernel KERNEL] [--encoder_kernel ENCODER_KERNEL]
               [--decoder_kernel DECODER_KERNEL]
               [--triple_pass_filter TRIPLE_PASS_FILTER] [--num_rrg NUM_RRG]
               [--num_mrb NUM_MRB]

optional arguments:
  -h, --help            show this help message and exit
  --filter FILTER
  --attention_filter ATTENTION_FILTER
  --kernel KERNEL
  --encoder_kernel ENCODER_KERNEL
  --decoder_kernel DECODER_KERNEL
  --triple_pass_filter TRIPLE_PASS_FILTER
  --num_rrg NUM_RRG
  --num_mrb NUM_MRB
```
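The usage string above corresponds to a standard `argparse` parser. A sketch of how it could be built (the default values here are assumptions for illustration, not the repository's actual defaults):

```python
import argparse

def build_parser():
    # Mirrors the CLI options shown in the usage string above;
    # all defaults are assumed, not taken from the repository.
    p = argparse.ArgumentParser(description="ATTSF options")
    p.add_argument("--filter", type=int, default=64)
    p.add_argument("--attention_filter", type=int, default=64)
    p.add_argument("--kernel", type=int, default=3)
    p.add_argument("--encoder_kernel", type=int, default=3)
    p.add_argument("--decoder_kernel", type=int, default=3)
    p.add_argument("--triple_pass_filter", type=int, default=256)
    p.add_argument("--num_rrg", type=int, default=3)
    p.add_argument("--num_mrb", type=int, default=2)
    return p

# Example: override one option, keep assumed defaults for the rest.
args = build_parser().parse_args(["--num_rrg", "4"])
```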
### Testing

- Download the weight here and put it in the `ModelCheckpoints` folder
- Note: as part of our ongoing research, the weight file has been hidden.
## Result

| Left image | Right image | Output |
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## References

[1] Defocus Deblurring Challenge - NTIRE 2021
## Citation

```
@InProceedings{Vo_2021_CVPR,
    author    = {Vo, Tu},
    title     = {Attention! Stay Focus!},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2021},
    pages     = {479-486}
}
```
## Acknowledgments

- This work is heavily based on the code provided by the challenge host. Thank you for the hard work.