WaterGAN

<p align="center"> <img src="https://github.com/kskin/WaterGAN/blob/master/watergan.PNG?raw=true"/> </p>

Usage

Download data:

  1. MHL test tank dataset: MHL.tar.gz
  2. Jamaica field dataset: Jamaica.tar.gz
  3. In-air data: any RGB-D dataset, e.g. Microsoft 7-Scenes, NYU Depth, UW RGB-D Object, or B3DO. Note: the current configuration expects 640x480 PNG images for in-air data (a preparation sketch follows this list).
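
Since the in-air RGB images from these datasets come in various sizes and formats, here is a minimal preparation sketch, assuming Pillow is installed; raw_air_images is a hypothetical folder holding the downloaded source images. It converts everything to the 640x480 PNG format the training scripts expect.

import os
from PIL import Image

def prepare_air_images(src_dir, dst_dir, size=(640, 480)):
    # Convert every source image to a 640x480 RGB PNG in dst_dir.
    os.makedirs(dst_dir, exist_ok=True)
    for name in sorted(os.listdir(src_dir)):
        if not name.lower().endswith((".png", ".jpg", ".jpeg")):
            continue
        img = Image.open(os.path.join(src_dir, name)).convert("RGB")
        img = img.resize(size, Image.BILINEAR)  # size is (width, height)
        img.save(os.path.join(dst_dir, os.path.splitext(name)[0] + ".png"))

# "raw_air_images" is a placeholder for wherever the downloaded RGB-D images live.
prepare_air_images("raw_air_images", "data/air_images")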

Directory structure:

.
├── ...
├── data                    
│   ├── air_images
│   │   └── *.png
│   ├── air_depth  
│   │   └── *.mat
│   └── water_images 
│       └── *.png
└── ...
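
A quick way to confirm the layout matches what the training scripts expect is to count the files in each folder. This is only a sanity-check sketch, not part of the repository:

import glob
import os

def check_layout(root="data"):
    # Count the files each subfolder is expected to hold.
    counts = {
        "air_images": len(glob.glob(os.path.join(root, "air_images", "*.png"))),
        "air_depth": len(glob.glob(os.path.join(root, "air_depth", "*.mat"))),
        "water_images": len(glob.glob(os.path.join(root, "water_images", "*.png"))),
    }
    for folder, n in counts.items():
        print(f"{folder}: {n} files")
    # In-air images and depth maps should come in matched pairs.
    if counts["air_images"] != counts["air_depth"]:
        print("Warning: air_images and air_depth counts differ.")

check_layout()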

Train a model with the MHL dataset:

python mainmhl.py --water_dataset water_images --air_dataset air_images --depth_dataset air_depth

Train a model with the Jamaica dataset:

python mainjamaica.py --water_dataset water_images --air_dataset air_images --depth_dataset air_depth
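
Both scripts consume in-air RGB images paired with depth maps stored as .mat files. The snippet below only illustrates that pairing, assuming NumPy and SciPy are available; the file name 0001 and the variable name inside the .mat file are hypothetical, so check the keys in your own dataset.

import numpy as np
import scipy.io
from PIL import Image

# "0001" is a hypothetical file name used only for illustration.
rgb = np.asarray(Image.open("data/air_images/0001.png"))   # expected shape (480, 640, 3)
mat = scipy.io.loadmat("data/air_depth/0001.mat")
# Take the first non-metadata variable; the actual key name depends on the dataset.
depth_key = [k for k in mat if not k.startswith("__")][0]
depth = mat[depth_key].astype(np.float32)
print(rgb.shape, depth.shape)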

Color Correction Network

WaterGAN outputs a dataset of paired true-color, depth, and synthetic underwater images. We can use these pairs to train an end-to-end network for underwater image restoration. Source code and pretrained models for the end-to-end network are available here. For more details, see the paper.
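
As a rough illustration of how such paired data could feed a restoration network, the sketch below matches each synthetic underwater image to its true-color source by file name. The output directory name is an assumption, not the repository's actual output path.

import glob
import os

def build_pairs(synthetic_dir, true_color_dir):
    # Pair (synthetic underwater input, true-color ground truth) by shared file name.
    pairs = []
    for path in sorted(glob.glob(os.path.join(synthetic_dir, "*.png"))):
        target = os.path.join(true_color_dir, os.path.basename(path))
        if os.path.exists(target):
            pairs.append((path, target))
    return pairs

# "output/synthetic_underwater" is a placeholder; point it at wherever WaterGAN wrote its samples.
pairs = build_pairs("output/synthetic_underwater", "data/air_images")
print(f"{len(pairs)} training pairs found")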

Citations

If you find this work useful for your research, please cite WaterGAN in your publications:

@article{Li:2017aa,
	Author = {Jie Li and Katherine A. Skinner and Ryan Eustice and M. Johnson-Roberson},
	Journal = {IEEE Robotics and Automation Letters (RA-L)},
	Note = {accepted},
	Title = {WaterGAN: Unsupervised Generative Network to Enable Real-time Color Correction of Monocular Underwater Images},
	Year = {2017}}