fast-neural-style :city_sunrise: :rocket:

NOTICE: This codebase is no longer maintained; please use the codebase from the pytorch examples repository, available at pytorch/examples/fast_neural_style.

This repository contains a pytorch implementation of an algorithm for artistic style transfer. The algorithm can be used to mix the content of an image with the style of another image. For example, here is a photograph of a door arch rendered in the style of a stained glass painting.

The model uses the method described in Perceptual Losses for Real-Time Style Transfer and Super-Resolution along with Instance Normalization. The saved-models for examples shown in the README can be downloaded from here.
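For intuition, here is a minimal sketch of the losses that method optimizes: content is matched through VGG feature maps and style through Gram matrices of those feature maps. The function and variable names below are illustrative only and do not come from this codebase.

```python
import torch

def gram_matrix(features):
    # features: (batch, channels, height, width) activations from one VGG layer
    b, c, h, w = features.size()
    flat = features.view(b, c, h * w)
    # Channel-wise correlations of the feature maps capture the "style".
    return flat.bmm(flat.transpose(1, 2)) / (c * h * w)

def perceptual_loss(feats_output, feats_content, feats_style,
                    content_weight=1.0, style_weight=1.0):
    # Each argument is a list of feature maps taken from the same VGG layers.
    mse = torch.nn.functional.mse_loss
    content_loss = mse(feats_output[1], feats_content[1])  # e.g. a relu2_2-like layer
    style_loss = sum(mse(gram_matrix(o), gram_matrix(s))
                     for o, s in zip(feats_output, feats_style))
    # --content-weight and --style-weight trade these two terms off against each other.
    return content_weight * content_loss + style_weight * style_loss
```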

DISCLAIMER: This implementation is also a part of the pytorch examples repository. The implementation in this repository uses a pretrained Caffe2 VGG, whereas the pytorch examples implementation uses a pretrained Pytorch VGG. The two VGGs use different preprocessing, which results in different --content-weight and --style-weight parameters. The styled output images also look slightly different.
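To illustrate why the weights differ, the two VGG variants expect differently scaled inputs. The snippet below is a hedged sketch of the two common conventions (Caffe-style BGR mean subtraction on 0-255 inputs versus torchvision's 0-1 normalization); it is not code taken from either repository.

```python
import torch

def caffe_vgg_preprocess(img):
    # img: (3, H, W) RGB tensor with values in [0, 255].
    # Caffe-trained VGG expects BGR channel order and per-channel mean subtraction,
    # with values left on the 0-255 scale.
    mean = torch.tensor([103.939, 116.779, 123.680]).view(3, 1, 1)
    bgr = img[[2, 1, 0], :, :]
    return bgr - mean

def torchvision_vgg_preprocess(img):
    # img: (3, H, W) RGB tensor with values in [0, 255].
    # torchvision's pretrained VGG expects RGB in [0, 1], normalized with the
    # standard ImageNet mean and std below.
    mean = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
    std = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)
    return (img / 255.0 - mean) / std
```

Because the resulting feature magnitudes differ, useful values of --content-weight and --style-weight differ between the two implementations as well.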

<p align="center"> <img src="images/style-images/mosaic.jpg" height="200px"> <img src="images/content-images/amber.jpg" height="200px"> <img src="images/output-images/amber-mosaic.jpg" height="440px"> </p>

Requirements

The program is written in Python and uses pytorch and scipy. A GPU is not necessary, but it can provide a significant speed-up, especially for training a new model. Regular-sized images can be styled on a laptop or desktop using the saved models.

Set up the environment

Run with virtualenv

Create a virtualenv with python3.5 or python3.6. Older versions are not supported due to a lack of compatibility with pytorch.

python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
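After installing the requirements, a quick sanity check (not part of the repository) confirms the imports work and shows whether a GPU is available for the --cuda 1 option:

```python
import scipy
import torch

print("torch", torch.__version__)
print("scipy", scipy.__version__)
# --cuda 1 only makes sense if this prints True; otherwise use --cuda 0.
print("CUDA available:", torch.cuda.is_available())
```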

Run with Docker

Build the image:

docker build . -t fast-neural-style

Run the container:

docker run --rm --volume "$(pwd)/:/data" fast-neural-style eval --content-image /data/image.jpg --model /app/saved-models/mosaic.pth --output-image /data/output.jpg --cuda 0

Usage

Stylize image

python neural_style/neural_style.py eval --content-image </path/to/content/image> --model </path/to/saved/model> --output-image </path/to/output/image> --cuda 0
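To stylize many images with one saved model, a small wrapper around the eval command works. This is a convenience sketch, not part of the repository; the folder and model paths are assumptions for illustration.

```python
import subprocess
from pathlib import Path

content_dir = Path("content-images")      # assumed folder of input images
output_dir = Path("stylized")             # assumed output folder
model_path = "saved-models/mosaic.pth"    # assumed path to a downloaded model
output_dir.mkdir(exist_ok=True)

for img in sorted(content_dir.glob("*.jpg")):
    # Invokes the documented eval command once per image.
    subprocess.run([
        "python", "neural_style/neural_style.py", "eval",
        "--content-image", str(img),
        "--model", model_path,
        "--output-image", str(output_dir / img.name),
        "--cuda", "0",
    ], check=True)
```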

Train model

python neural_style/neural_style.py train --dataset </path/to/train-dataset> --style-image </path/to/style/image> --vgg-model-dir </path/to/vgg/folder> --save-model-dir </path/to/save-model/folder> --epochs 2 --cuda 1

There are several command line arguments; the important ones are listed below:

- --content-image: path to the content image to stylize (eval).
- --model: path to the saved model used for stylization (eval).
- --output-image: path where the stylized image is written (eval).
- --dataset: path to the folder of training images (train).
- --style-image: path to the style image (train).
- --vgg-model-dir: path to the folder for the pretrained VGG model (train).
- --save-model-dir: path to the folder where the trained model will be saved (train).
- --epochs: number of training epochs (train).
- --content-weight / --style-weight: relative weights of the content and style losses (train).
- --cuda: set to 1 to run on GPU, 0 to run on CPU.

Refer to neural_style/neural_style.py for other command line arguments.

Models

Models for the examples shown below can be downloaded from here or by running the script download_styling_models.sh.
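As a quick check after downloading, the saved models can be opened with torch.load. The snippet below assumes they are ordinary PyTorch checkpoints stored under saved-models/ (a guess based on the Docker command above); adjust the path to wherever the download script places them.

```python
import torch

# Assumed location after running download_styling_models.sh
state = torch.load("saved-models/mosaic.pth", map_location="cpu")

# If the file holds a plain state_dict, print a few parameter names and shapes.
if isinstance(state, dict):
    for name, value in list(state.items())[:5]:
        shape = tuple(value.shape) if hasattr(value, "shape") else type(value)
        print(name, shape)
else:
    print(type(state))
```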

<div align='center'> <img src='images/content-images/amber.jpg' height="174px"> </div> <div align='center'> <img src='images/style-images/mosaic.jpg' height="174px"> <img src='images/output-images/amber-mosaic.jpg' height="174px"> <img src='images/output-images/amber-candy.jpg' height="174px"> <img src='images/style-images/candy.jpg' height="174px"> <br> <img src='images/style-images/starry-night-cropped.jpg' height="174px"> <img src='images/output-images/amber-starry-night.jpg' height="174px"> <img src='images/output-images/amber-udnie.jpg' height="174px"> <img src='images/style-images/udnie.jpg' height="174px"> </div>