BurstM: Deep Burst Multi-scale SR using Fourier Space with Optical Flow (ECCV 2024)
EungGu Kang, Byeonghun Lee, Sunghoon Im, Kyong Hwan Jin
This repository contains the official implementation of BurstM, introduced in the paper above.
News
- July 01, 2024: Paper accepted at ECCV 2024 :tada:
- Sep 30, 2024: Paper link updated :tada:
Multi-frame super-resolution (MFSR) achieves higher performance than single image super-resolution (SISR), because MFSR leverages abundant information from multiple frames. Recent MFSR approaches adapt the deformable convolution network (DCN) to align the frames. However, existing MFSR suffers from misalignments between the reference and source frames due to the limitations of DCN, such as small receptive fields and the predefined number of kernels. Because of these problems, existing MFSR approaches struggle to represent high-frequency information. To this end, we propose Deep Burst Multi-scale SR using Fourier Space with Optical Flow (BurstM). The proposed method estimates the optical flow offset for accurate alignment and predicts the continuous Fourier coefficients of each frame to represent high-frequency textures. In addition, we enhance the network's flexibility by supporting various super-resolution (SR) scale factors with a single model. We demonstrate that our method achieves higher performance and greater flexibility than existing MFSR methods.
Overall architecture of BurstM
Quantitative comparison
x4 inference results on the BurstSR dataset (real-world dataset)
Multi-scale inference results on the BurstSR dataset (real-world dataset)
Dependencies
- OS: Ubuntu 22.04
- NVIDIA CUDA: 12.4
- Python: 3.10.14
- PyTorch: 2.3.0
We used an NVIDIA RTX 3090 24GB (sm86).
We recommend using conda for installation:
conda env create --file environment.yaml
conda activate BurstM
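If environment.yaml is unavailable, the sketch below sets up an equivalent environment manually, assuming the versions listed under Dependencies; the cu121 wheel index and the exact package set are assumptions, and environment.yaml remains the authoritative spec.

# Manual setup sketch (assumes versions from the Dependencies section)
conda create -n BurstM python=3.10.14 -y
conda activate BurstM
# PyTorch 2.3.0 ships cu118/cu121 wheels; the cu121 build runs on a CUDA 12.4 driver
pip install torch==2.3.0 torchvision==0.18.0 --index-url https://download.pytorch.org/whl/cu121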
Training
SyntheticBurst
- Download the dataset (Zurich RAW to RGB dataset): Download
- Train
# Please modify the path of the input directory
CUDA_VISIBLE_DEVICES=0,1,2,3 python BurstM_Track_1_training.py --input_dir=<Input DIR> --log_dir=<Log DIR> --model_dir=<Model save DIR> --result_dir=<tensorboard dir>
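Training curves land in the directory passed via --result_dir; assuming standard TensorBoard event files are written there, you can monitor training with:

# Monitor training (assumes --result_dir holds TensorBoard event files)
tensorboard --logdir=<tensorboard dir>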
BurstSR(Real-world data)
- Download the dataset (BurstSR for real-world data): Download
- Train
# Please modify the path of the input directory
CUDA_VISIBLE_DEVICES=0,1,2,3 python BurstM_Track_2_training.py --input_dir=<Input DIR> --pre_trained=<Pretrained model of SyntheticBurst> --log_dir=<Log DIR> --model_dir=<Model save DIR> --result_dir=<tensorboard dir>
Test
SyntheticBurst
- Download the pre-trained model for SyntheticBurst: Download
- Test
If you want to change the super-resolution scale, change --scale. Both integer and floating-point scales are supported, but the quality at floating-point scales such as x2.5 and x3.5 is not guaranteed.
# Please modify the paths of the image directory for inputs and the pre-trained model (weights).
CUDA_VISIBLE_DEVICES=0 python BurstM_Track_1_evaluation.py --input_dir=<Input DIR> --scale=4 --weights=<Pretrained model of SyntheticBurst> --result_dir=<Result DIR> --result_gt_dir=<GT Result DIR>
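Because --scale accepts arbitrary values, a small shell loop can sweep several factors in one run; this is a sketch reusing the flags above, and the per-scale output directory names are hypothetical.

# Sweep several SR scales with the same weights (sketch)
for s in 2 2.5 3 3.5 4; do
  CUDA_VISIBLE_DEVICES=0 python BurstM_Track_1_evaluation.py --input_dir=<Input DIR> --scale=$s \
    --weights=<Pretrained model of SyntheticBurst> --result_dir=results_x$s --result_gt_dir=results_gt_x$s
done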
BurstSR(Real-world data)
- Download the pre-trained model for BurstSR: Download
- Test
If you want to change the super-resolution scale, change --scale. Both integer and floating-point scales are supported, but the quality at floating-point scales such as x2.5 and x3.5 is not guaranteed.
# Please modify the paths of the image directory for inputs and the pre-trained model (weights).
CUDA_VISIBLE_DEVICES=0 python BurstM_Track_2_evaluation.py --input_dir=<Input DIR> --scale=4 --weights=<Pretrained model of BurstSR> --result_dir=<Result DIR> --result_gt_dir=<GT Result DIR>
Citations
If our code helps your research or work, please consider citing our paper. The BibTeX reference is below.
Will be updated
Acknowledgement
This work is mainly based on NIS and Burstormer; we thank the authors for their contributions.