Lightweight Generative Adversarial Networks for Text-Guided Image Manipulation

PyTorch implementation of Lightweight Generative Adversarial Networks for Text-Guided Image Manipulation. The goal is to introduce a lightweight generative adversarial network for efficient image manipulation using natural language descriptions.

Overview

<img src="archi.jpg" width="940" height="230"/>

Lightweight Generative Adversarial Networks for Text-Guided Image Manipulation.
Bowen Li, Xiaojuan Qi, Philip H. S. Torr, Thomas Lukasiewicz.<br> University of Oxford, University of Hong Kong <br> NeurIPS 2020 <br>

Data

  1. Download the preprocessed metadata for birds and coco, and save both into data/
  2. Download the birds dataset and extract the images to data/birds/
  3. Download the coco dataset and extract the images to data/coco/ (a quick layout check is sketched below)
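
A minimal sketch (not part of the repo) for checking that the datasets ended up where the paths above expect them:

```python
# Minimal sketch (not part of the repo): verify the expected data layout.
from pathlib import Path

for d in ["data/birds", "data/coco"]:
    p = Path(d)
    status = "ok" if p.is_dir() and any(p.iterdir()) else "missing or empty"
    print(f"{d}: {status}")
```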

Training

All code was developed and tested on CentOS 7 with Python 3.7 (Anaconda) and PyTorch 1.1.
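
An optional, minimal environment check (nothing repo-specific) to confirm the versions match:

```python
# Optional environment check; the code targets Python 3.7 and PyTorch 1.1.
import sys
import torch

print("python :", sys.version.split()[0])      # expect 3.7.x
print("pytorch:", torch.__version__)           # expect 1.1.x
print("cuda   :", torch.cuda.is_available())   # training assumes a CUDA GPU
```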

Pre-train the DAMSM model, which includes a text encoder and an image encoder:

```bash
python pretrain_DAMSM.py --cfg cfg/DAMSM/bird.yml --gpu 0
python pretrain_DAMSM.py --cfg cfg/DAMSM/coco.yml --gpu 1
```
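
For orientation, below is a minimal, hedged sketch of what a DAMSM-style encoder pair looks like in PyTorch. This is not the code in pretrain_DAMSM.py; the class names, dimensions, and the toy matching loss are illustrative assumptions (the actual DAMSM uses a bidirectional LSTM text encoder, an Inception-v3 image encoder, and attention-driven word-region matching losses).

```python
# Hedged sketch of a DAMSM-style encoder pair (not the repo's exact code).
# A text encoder maps a caption to word- and sentence-level features, an
# image encoder maps an image to a global feature, and a simple
# cosine-similarity matching loss aligns the two modalities.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional LSTM; concatenating directions gives hidden_dim * 2 features.
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)

    def forward(self, captions):
        words, _ = self.rnn(self.embed(captions))   # (B, T, hidden_dim * 2)
        sentence = words.mean(dim=1)                # (B, hidden_dim * 2)
        return words, sentence

class ImageEncoder(nn.Module):
    def __init__(self, out_dim=512):
        super().__init__()
        # A small CNN stands in for the Inception-v3 backbone used in DAMSM.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 64, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, 2, 1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(128, out_dim)

    def forward(self, images):
        return self.fc(self.cnn(images).flatten(1))  # (B, out_dim)

# Toy forward pass with random data, just to show that the shapes line up.
text_enc, img_enc = TextEncoder(vocab_size=5000), ImageEncoder(out_dim=512)
captions = torch.randint(0, 5000, (4, 18))
images = torch.randn(4, 3, 256, 256)
_, sent = text_enc(captions)
img = img_enc(images)
# Sentence features are 512-d here (256 * 2), matching the image features,
# so a cosine-similarity matching loss can be computed directly.
loss = 1 - F.cosine_similarity(sent, img).mean()
print(loss.item())
```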

Our Model

```bash
python main.py --cfg cfg/train_bird.yml --gpu 2
python main.py --cfg cfg/train_coco.yml --gpu 3
```

The *.yml files contain the configuration for training and testing. To reduce the number of parameters in the model, edit the DF_DIM and/or GF_DIM values in the corresponding *.yml files.
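
A hedged example of making that edit programmatically is sketched below. The GAN/DF_DIM and GAN/GF_DIM key layout and the example values are assumptions based on AttnGAN/ManiGAN-style configs; check the actual *.yml files before overriding anything.

```python
# Hedged sketch: shrink the model by lowering DF_DIM / GF_DIM in a *.yml file.
# The key layout (assumed here to sit under a GAN section) and the example
# values are assumptions, not values prescribed by the repo.
import yaml

with open("cfg/train_bird.yml") as f:
    cfg = yaml.safe_load(f)

# Halve the discriminator and generator base channel widths (assumed keys).
cfg.setdefault("GAN", {})
cfg["GAN"]["DF_DIM"] = 32   # e.g. down from 64
cfg["GAN"]["GF_DIM"] = 16   # e.g. down from 32

with open("cfg/train_bird_small.yml", "w") as f:
    yaml.safe_dump(cfg, f)
# Then train with: python main.py --cfg cfg/train_bird_small.yml --gpu 2
```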

Pretrained DAMSM Model

Pretrained Lightweight Model

Testing

```bash
python main.py --cfg cfg/eval_bird.yml --gpu 4
python main.py --cfg cfg/eval_coco.yml --gpu 5
```

Evaluation

Code Structure

Citation

If you find this work useful for your research, please cite:

```
@article{li2020lightweight,
  title={Lightweight Generative Adversarial Networks for Text-Guided Image Manipulation},
  author={Li, Bowen and Qi, Xiaojuan and Torr, Philip and Lukasiewicz, Thomas},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}
```

Acknowledgements

This code borrows heavily from the ManiGAN and ControlGAN repositories. Many thanks.