An implementation of the arXiv preprint A Neural Algorithm of Artistic Style [1] and the paper Image Style Transfer Using Convolutional Neural Networks [2].
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
To install the required dependencies:
pip3 install -r requirements.txt
neural_stylization contains Python modules with utility methods and classes for the project.
This project relies on the VGG19 architecture. VGG19-classification.ipynb outlines some basic image classification using the network with a weight set W pre-trained on the ImageNet dataset. The implementation of VGG19 can be found in neural_stylization/vgg19.py. Utility methods for loading, manipulating, and normalizing images can be found in neural_stylization/img_util.py.
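The sketch below shows roughly what such a classification looks like using the stock Keras VGG19 application rather than neural_stylization/vgg19.py; the filename elephant.jpg and the 224x224 input size are illustrative assumptions, not the notebook's exact code.

```python
# Minimal sketch: classify an image with VGG19 and ImageNet weights.
import numpy as np
from keras.applications.vgg19 import VGG19, preprocess_input, decode_predictions
from keras.preprocessing import image

model = VGG19(weights='imagenet')                       # weight set W pre-trained on ImageNet
img = image.load_img('elephant.jpg', target_size=(224, 224))  # hypothetical input image
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
print(decode_predictions(model.predict(x), top=3)[0])   # top-3 ImageNet labels
```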
content-reconstruction.ipynb describes the content reconstruction process from white noise. Performing gradient descent of the content loss on a white noise input x for a given content image p yields a representation of the network's activations at a given layer l.
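As a hedged sketch of that objective (not the project's exact API), the content loss from Gatys et al. is half the squared error between the layer-l activations of x and p; the tensor names below are illustrative:

```python
# Content loss: 1/2 * sum((F - P)^2) over the feature maps of a single layer l.
from keras import backend as K

def content_loss(P, F):
    """P: layer-l activations of the content image p; F: activations of the input x."""
    return 0.5 * K.sum(K.square(F - P))
```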
style-reconstruction.ipynb describes the style reconstruction process from white noise. Performing gradient descent of the style loss on a white noise input x for a given artwork a yields a representation of the network's activations for a given set of layers L.
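A minimal sketch of the style loss follows, assuming feature tensors of shape (height, width, channels) without a batch dimension; the helper names are illustrative and may differ from the notebook's implementation:

```python
# Style loss for one layer: compare Gram matrices of the artwork's and the
# input's activations, scaled as in Gatys et al.
from keras import backend as K

def gram_matrix(F):
    """Gram matrix G = F F^T of the (channels x positions) feature matrix."""
    features = K.batch_flatten(K.permute_dimensions(F, (2, 0, 1)))
    return K.dot(features, K.transpose(features))

def style_loss_layer(A, F, height, width, channels):
    """Squared Frobenius distance between Gram matrices of A (artwork) and F (input)."""
    size = height * width
    return K.sum(K.square(gram_matrix(F) - gram_matrix(A))) / (4.0 * channels**2 * size**2)
```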
style-transfer.ipynb describes the style transfer process between a white noise image x, a content image p, and a style representation a. Performing gradient descent of the combined content and style losses with respect to x impresses the content of p onto x while borrowing local styles and colors from a.
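Concretely, the two terms are weighted and summed before taking gradients with respect to x. The sketch below reuses the illustrative content_loss and style_loss_layer helpers from above; the alpha/beta defaults and the layer bookkeeping are assumptions, not the notebook's exact values:

```python
# Combined objective: alpha * L_content + beta * L_style, summed over the style layers.
def total_loss(P, F, A_list, F_list, shapes, alpha=1.0, beta=1e4):
    """P/F: content-layer activations of p and x; A_list/F_list: per-style-layer
    activations of a and x; shapes: (height, width, channels) per style layer."""
    loss = alpha * content_loss(P, F)
    for (A, Fl), (h, w, c) in zip(zip(A_list, F_list), shapes):
        loss += (beta / len(A_list)) * style_loss_layer(A, Fl, h, w, c)
    return loss
```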
photo-realistic-style-transfer.ipynb describes the photo-realistic style transfer process. Rather than transferring style from an artwork, this notebook explores transferring a nighttime theme from a picture of one city to a daytime picture of another city, with mixed results.
effect-of-content-layer.ipynb visualizes how the style transfer is affected by using different layers for content loss.
effect-of-style-layers.ipynb visualizes how the style transfer is affected by using different sets of layers for style loss.
optimizers.ipynb employs gradient descent, Adam, and L-BFGS to understand the effect of different black-box optimizers. Gatys et al. use L-BFGS, but Adam appears to produce competitive results too.
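As a rough illustration of how L-BFGS is typically driven in this setting, scipy.optimize.fmin_l_bfgs_b optimizes a flat float64 vector given a callable that returns the loss and its gradient; the toy quadratic loss below stands in for the real Keras loss/gradient evaluation and is not the notebook's code:

```python
# Sketch: run L-BFGS on a flattened image, starting from white noise.
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

target = np.random.uniform(0, 255, (224, 224, 3))        # stand-in "content" image

def loss_and_grads(flat_x):
    """Return (loss, gradient) for the flattened image, as fmin_l_bfgs_b expects."""
    x = flat_x.reshape(target.shape)
    diff = x - target
    loss = 0.5 * np.sum(diff ** 2)                        # toy quadratic loss
    return loss, diff.ravel()                             # gradient w.r.t. x

x0 = np.random.uniform(0, 255, target.shape).ravel()      # white noise start
x_opt, final_loss, info = fmin_l_bfgs_b(loss_and_grads, x0, maxiter=20)
```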
- keras-team provides Keras, a high-level neural network framework. They also provide the pre-trained ImageNet weights and some tutorials that helped build this project.