# Deep Generative Reflectance Fusion

Achieving Landsat-like reflectance at any date by fusing Landsat and MODIS surface reflectance with deep generative models.
## Getting Started

<p align="center"> <img src="https://github.com/Cervest/ds-generative-reflectance-fusion/blob/master/docs/source/img/deep_reflectance_fusion.png" alt="Reflectance Fusion Drawing" width="800"/> </p>

### Running experiments
Set up a YAML configuration file specifying the experiment: dataset, model, optimizer and experiment settings. See here for examples.
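As a rough illustration only, such a file might look like the sketch below; every key name here is a hypothetical assumption, not the repository's actual schema — refer to the example configurations for the real fields.

```yaml
# Hypothetical illustration of an experiment configuration file.
# Key names are assumptions, not the repository's actual schema.
experiment:
  name: cgan_fusion_l1
  seed: 42
dataset:
  root: data/patches          # extracted Landsat-MODIS reflectance patches
  batch_size: 16
model:
  name: cgan
  checkpoint: null            # point to a trained checkpoint before running run_testing.py
optimizer:
  name: adam
  lr: 0.0002
```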
Execute training on, say, GPU 0 as:

```bash
$ python run_training.py --cfg=path/to/config.yaml --o=output/directory --device=0
```
Once training has completed, specify the model checkpoint to evaluate in the previously defined YAML configuration file and run evaluation as:

```bash
$ python run_testing.py --cfg=path/to/config.yaml --o=output/directory --device=0
```
### Preimplemented experiments
Experiment | Mean Absolute Error | PSNR | SSIM | SAM |
---|---|---|---|---|
ESTARFM | - | 21.0 | 0.645 | 0.0488 |
cGAN + L1 | 218 | 22.8 | 0.717 | 0.0275 |
cGAN + L1 + SSIM | 215 | 23.0 | 0.732 | 0.0270 |
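For reference, the reported metrics are standard image-quality measures: PSNR and SSIM compare predicted and true Landsat patches, while SAM is the mean spectral angle between predicted and true per-pixel spectra. Below is a minimal sketch of how these could be computed with NumPy and scikit-image; it illustrates the standard definitions and is not the repository's own evaluation code.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity


def spectral_angle_mapper(pred, target, eps=1e-8):
    """Mean spectral angle (radians) between per-pixel spectra of (H, W, C) arrays."""
    dot = np.sum(pred * target, axis=-1)
    norms = np.linalg.norm(pred, axis=-1) * np.linalg.norm(target, axis=-1)
    angles = np.arccos(np.clip(dot / (norms + eps), -1.0, 1.0))
    return float(angles.mean())


# Example usage on random stand-in data (height x width x bands)
pred = np.random.rand(256, 256, 4).astype(np.float32)
target = np.random.rand(256, 256, 4).astype(np.float32)

mae = float(np.abs(pred - target).mean())
psnr = peak_signal_noise_ratio(target, pred, data_range=1.0)
ssim = structural_similarity(target, pred, channel_axis=-1, data_range=1.0)
sam = spectral_angle_mapper(pred, target)
print(f"MAE={mae:.4f}  PSNR={psnr:.2f}  SSIM={ssim:.3f}  SAM={sam:.4f}")
```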
### Compile ESTARFM

To compile ESTARFM, please follow the guidelines from the official repository.
## Project Structure

```
├── data/
├── repro/
├── src/
│   ├── cuESTARFM
│   ├── deep_reflectance_fusion
│   ├── prepare_data
│   └── utils
├── tests
├── run_training.py
├── run_testing.py
├── run_ESTARFM.py
└── run_ESTARFM_evaluation.py
```
Directories:

- `data/`: Landsat-MODIS reflectance time series dataset and experiment outputs
- `repro/`: bash scripts to run data version control pipelines
- `src/`: modules to run reflectance patch extraction and deep reflectance fusion experiments
- `tests/`: unit testing
- `utils/`: miscellaneous utilities
## Installation

Code is implemented in Python 3.8.

### Setting up environment
Clone the repository and go to its root directory:

```bash
$ git clone https://github.com/Cervest/ds-generative-reflectance-fusion.git
$ cd ds-generative-reflectance-fusion
```
Create and activate the environment:

```bash
$ pyenv virtualenv 3.8.2 fusion
$ pyenv activate fusion
$ (fusion)
```
Install dependencies:

```bash
$ (fusion) pip install -r requirements.txt
```
### Setting up dvc
From the environment and root project directory, you first need to build symlinks to data directories as:

```bash
$ (fusion) dvc init -q
$ (fusion) python repro/dvc.py --link=where/data/stored --cache=where/cache/stored
```

If no `--link` is specified, data will be stored by default in the `data/` directory and the default cache is `.dvc/cache`.
To reproduce a pipeline stage, execute:

```bash
$ (fusion) dvc repro -s stage_name
```

In case the pipeline is broken, hidden bash files are provided under the `repro/` directory.
## References

```bibtex
@misc{bouabid2020predicting,
      title={Predicting Landsat Reflectance with Deep Generative Fusion},
      author={Shahine Bouabid and Maxim Chernetskiy and Maxime Rischard and Jevgenij Gamper},
      year={2020},
      eprint={2011.04762},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```