# SRGAN (Super-Resolution Generative Adversarial Network)
A TensorFlow implementation of Ledig et al.'s "Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network" paper. (See: https://arxiv.org/abs/1609.04802) This implementation differs from the original paper in the following ways:
- The MNIST dataset is used for convenience. (It should be straightforward to apply this scheme to a larger image dataset such as Urban 100.)
- I've completely replaced the MSE loss with a GAN loss, using a tuple input for the discriminator. (See the training source code.)
- I've used ESPCN (sub-pixel CNN) instead of deconvolution; a minimal sketch of the sub-pixel shuffle appears after this list. (See: http://www.cv-foundation.org/openaccess/content_cvpr_2016/papers/Shi_Real-Time_Single_Image_CVPR_2016_paper.pdf)
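To make the sub-pixel step concrete, below is a minimal sketch of ESPCN-style upscaling in recent TensorFlow. This is illustrative only, not this repository's sugartensor code; <code>subpixel_upscale</code> and its parameters are invented names, and <code>tf.nn.depth_to_space</code> stands in for the periodic shuffle.
<pre><code>
# Illustrative sketch of ESPCN's sub-pixel upscaling (not this repo's code).
# A plain convolution produces scale^2 * out_channels feature channels, and
# a periodic shuffle rearranges them into an image `scale` times larger.
import tensorflow as tf

def subpixel_upscale(x, scale=2, out_channels=1):
    # x: [batch, height, width, channels] feature map
    conv = tf.keras.layers.Conv2D(
        filters=scale * scale * out_channels, kernel_size=3, padding='same')
    y = conv(x)                            # [B, H, W, scale^2 * out_channels]
    return tf.nn.depth_to_space(y, scale)  # [B, scale*H, scale*W, out_channels]

# Example: upscale a batch of 14x14 feature maps to 28x28.
lr_features = tf.random.normal([8, 14, 14, 64])
sr_image = subpixel_upscale(lr_features, scale=2, out_channels=1)
print(sr_image.shape)  # (8, 28, 28, 1)
</code></pre>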
Existing CNN-based super-resolution methods mainly use an MSE loss, which makes super-resolved images look blurry. If we replace the MSE loss with gradients from a GAN, we may avoid these blurry artifacts; this is the key idea of the paper. I think this idea looks promising, and my experimental results on the MNIST dataset look good.
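To make the contrast concrete, here is a hedged sketch of the two training signals; the function and tensor names are illustrative assumptions, not this repository's API.
<pre><code>
# Illustrative sketch of MSE vs. adversarial signals (not this repo's code).
import tensorflow as tf

def mse_loss(sr, hr):
    # Pixel-wise MSE averages over all plausible high-frequency details,
    # which is why purely MSE-trained outputs tend to look blurry.
    return tf.reduce_mean(tf.square(sr - hr))

def generator_adversarial_loss(d_logits_on_sr):
    # Standard GAN generator loss: reward the generator when the
    # discriminator classifies the super-resolved image as real,
    # pushing outputs toward sharp, natural-looking images.
    return tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            labels=tf.ones_like(d_logits_on_sr), logits=d_logits_on_sr))
</code></pre>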
## Dependencies
- tensorflow >= rc0.11
- sugartensor >= 0.0.1.7
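Assuming both packages are published on PyPI, installation is typically:
<pre><code>
pip install sugartensor
</code></pre>
Note that a TensorFlow release candidate such as rc0.11 may need to be installed separately, following the TensorFlow installation instructions for that version.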
## Training the network
Execute
<pre><code>
python train.py
</code></pre>
to train the network. The resulting checkpoint (ckpt) files and log files will appear in the 'asset/train' directory. Launch <code>tensorboard --logdir asset/train/log</code> to monitor the training process.
## Generating image
Execute
<pre><code>
python generate.py
</code></pre>
to generate a sample image. The 'sample.png' file will be created in the 'asset/train' directory.
## Super-resolution image sample
This image was generated by SRGAN.
<p align="center"> <img src="https://raw.githubusercontent.com/buriburisuri/SRGAN/master/png/sample.png" width="350"/> </p>

## Other resources
- Original GAN tensorflow implementation
- InfoGAN tensorflow implementation
- Supervised InfoGAN tensorflow implementation
- EBGAN tensorflow implementation
- Time-series InfoGAN tensorflow implementation
## Authors
Namju Kim (buriburisuri@gmail.com) at Jamonglabs Co., Ltd.