
Deep Atrous Guided Filter

Our submission to the Under Display Camera Challenge (UDC) at ECCV 2020. We placed 2nd and 5th on the POLED and TOLED tracks respectively!

Project Page | Paper | Open In Colab

Method Diagram

Official implementation of our ECCVW 2020 paper, "Deep Atrous Guided Filter for Image Restoration in Under Display Cameras", by Varun Sundar<sup>*</sup>, Sumanth Hegde<sup>*</sup>, Divya K Raman, Kaushik Mitra. Indian Institute of Technology Madras, * denotes equal contribution.

Quick Colab Demo

If you want to experiment with Deep Atrous Guided Filter (DAGF), we recommend you get started with the Colab notebook. It exposes the core aspects of our method, while abstracting away minor details and helper functions.

It requires no prior setup, and contains a demo for both POLED and TOLED measurements.

If you're unfamiliar with Under Display Cameras, they are a new imaging system for smartphones, where the camera is mounted right under the display. This makes truly bezel-free displays possible, and opens up a bunch of other applications. You can read more here.

Get Started

If you would like to reproduce all our experiments presented in the paper, head over to the experiments branch. For a concise version with just our final models, you may continue here.

You'll need to install the following:

Data

| Dataset | Train Folder | Val Folder | Test Folder |
| --- | --- | --- | --- |
| POLED | POLED_train | POLED_val | POLED_test |
| TOLED | TOLED_train | TOLED_val | TOLED_test |
| Simulated POLED | Sim_train | Sim_val | NA |
| Simulated TOLED | Sim_train | Sim_val | NA |

Download the required folder and place it under the data/ directory. The train and val splits contain both low-quality measurements (LQ folder) and high-quality ground truth (HQ folder). The test set currently contains only measurements.

We also provide our simulated dataset, based on training a shallow version of DAGF with Contextual Bilateral (CoBi) loss. For simulation specific details (procedure etc.) take a look at the experiments branch.

Configs and Checkpoints

We use sacred to handle config parsing, with the following command-line invocation:

```shell
python train{val}.py with config_name {other flags} -p
```

For example: `python train.py with ours_poled -p`.

Various configs available:

| Model | Dataset | Config Name | Checkpoints |
| --- | --- | --- | --- |
| DAGF | POLED | ours_poled | ours-poled |
| DAGF-sim | Simulated POLED | ours_poled_sim | ours-poled-sim |
| DAGF-PreTr | POLED (fine-tuned from DAGF-sim) | ours_poled_PreTr | ours-poled-PreTr |
| DAGF | TOLED | ours_toled | ours-toled |
| DAGF-sim | Simulated TOLED | ours_toled_sim | ours-toled-sim |
| DAGF-PreTr | TOLED (fine-tuned from DAGF-sim) | ours_toled_PreTr | ours-toled-PreTr |

Download the required checkpoint folder and place it under ckpts/.

DAGF-sim networks are first trained on simulated data. To obtain this data, we trained a shallow version of our final model to transform clean images to Glass / POLED / TOLED measurements. You can find the checkpoints and code for these networks in our experiments branch.

Further, see `config.py` for an exhaustive set of config options. To add a config, create a new function in `config.py` and add it to `named_configs`.
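With sacred, a named config is just a Python function whose local assignments become config values. A minimal sketch of the pattern described above — `ours_toled_big` and its values are hypothetical, not a real config in this repo:

```python
# Hypothetical new config, mirroring the pattern in config.py.
def ours_toled_big():
    exp_name = "ours-toled-big"  # folder name used under ckpts/, outputs/, runs/
    batch_size = 4               # local assignments become config entries
    num_epochs = 960

# config.py keeps a registry of such functions; append new configs here.
named_configs = [ours_toled_big]
```

Once registered, the config can be selected on the command line as `python train.py with ours_toled_big`.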

Directory Setup

Create the following symbolic links (assuming `path_to_root_folder/` is `~/udc_net`):
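A minimal sketch of such links, under the assumption that the bulky folders live outside the repo — the `$STORE` location is hypothetical, point it wherever you keep datasets and checkpoints:

```shell
# ~/udc_net comes from the doc; $STORE is an assumed storage location.
REPO="$HOME/udc_net"
STORE="$HOME/udc_store"

mkdir -p "$REPO" "$STORE/data" "$STORE/ckpts" "$STORE/outputs" "$STORE/runs"
for d in data ckpts outputs runs; do
    ln -sfn "$STORE/$d" "$REPO/$d"   # -n replaces an existing link safely
done
```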

High Level Organisation

Data folder: Each subfolder contains a data split.

```
|-- Poled_train
|   |-- HQ
|   |   |-- 101.png
|   |   |-- 102.png
|   |   |-- 103.png
|   `-- LQ
|-- Poled_val
|   `-- LQ
```
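Given this layout, LQ inputs and HQ ground truth can be matched by shared filename. A small sketch (not from the repo) assuming the structure above:

```python
import os

def paired_files(split_dir):
    """Pair low-quality inputs with ground truth by shared filename.

    Assumes split_dir/LQ/*.png and, for train splits, split_dir/HQ/*.png.
    Splits without ground truth yield (lq_path, None).
    """
    lq_dir = os.path.join(split_dir, "LQ")
    hq_dir = os.path.join(split_dir, "HQ")
    pairs = []
    for name in sorted(os.listdir(lq_dir)):
        hq_path = os.path.join(hq_dir, name)
        pairs.append((os.path.join(lq_dir, name),
                      hq_path if os.path.exists(hq_path) else None))
    return pairs
```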

Splits:

Outputs folder: Val, test dumps under various experiment names.

```
outputs
|-- ours-poled
|   |-- test_latest
|   `-- val_latest
|       |-- 99.png
|       |-- 9.png
|       `-- metrics.txt
```

Ckpts folder: checkpoints under various experiment names. For model snapshots, we store every 64th epoch, and every 5th epoch prior to that. This schedule can be changed in `config.py`.
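One reading of that schedule, sketched as a predicate — the function and parameter names are mine, and the interpretation (rolling save every 5 epochs, permanent snapshot every 64) is an assumption, not the repo's actual logic:

```python
def checkpoint_actions(epoch, save_every=5, snapshot_every=64):
    """Return (save, keep) flags for a given epoch.

    save: overwrite the rolling model_latest.pth
    keep: additionally keep a permanent numbered snapshot
    """
    save = epoch % save_every == 0
    keep = epoch % snapshot_every == 0
    return save, keep
```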

```
ckpts
|-- ours-poled
|   `-- model_latest.pth
```

Runs folder: Tensorboard event files under various experiment names.

```
runs
|-- ours-poled
|   |-- events.out.tfevents.1592369530.genesis.26208.0
```

Train Script

Run as:

```shell
python train.py with xyz_config {other flags}
```

For a multi-GPU version (we use PyTorch's DistributedDataParallel):

```shell
python -m torch.distributed.launch --nproc_per_node=3 --use_env train.py with xyz_config distdataparallel=True {other flags}
```

Val Script

Run as:

```shell
python val.py with xyz_config {other flags}
```

Useful Flags:

See `config.py` for an exhaustive set of arguments (under `base_config`).

Citation

If you find our work useful in your research, please cite:

```
@InProceedings{10.1007/978-3-030-68238-5_29,
  author="Sundar, Varun
    and Hegde, Sumanth
    and Kothandaraman, Divya
    and Mitra, Kaushik",
  title="Deep Atrous Guided Filter for Image Restoration in Under Display Cameras",
  booktitle="Computer Vision -- ECCV 2020 Workshops",
  year="2020",
  publisher="Springer International Publishing",
  pages="379--397",
}
```

Contact

Feel free to mail us if you have any questions!