
Style Neural Networks

See the accompanying blog post, where I explain in more detail what I am doing.

In this folder, I explore descriptive style neural networks, in which an image is iteratively optimized to minimize a loss function describing the style of one input image and the content of another.

My approach was to implement papers on style neural networks. I implemented three:

  1. A Neural Algorithm of Artistic Style
  2. Incorporating Long Range Consistency in CNN based Texture Generation
  3. Stable and Controllable Neural Texture Synthesis and Style Transfer Using Histogram Losses

These notebooks require Keras with a TensorFlow backend, as I use some TensorFlow-specific methods to manipulate the tensors.

I slightly modify the VGG model so that average pooling, rather than max pooling, occurs between layers. The modified model is in VGG16.py.
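To illustrate why this modification matters (the Keras model itself lives in VGG16.py), here is a minimal numpy sketch of the two pooling operations; the function names are my own, not from the repo. Average pooling retains information from every activation in a window, which tends to give smoother gradients for image optimization:

```python
import numpy as np

def max_pool_2x2(x):
    # Non-overlapping 2x2 max pooling over a 2D feature map.
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def avg_pool_2x2(x):
    # Non-overlapping 2x2 average pooling, as used in the modified VGG.
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

fmap = np.array([[1.,  2.,  3.,  4.],
                 [5.,  6.,  7.,  8.],
                 [9., 10., 11., 12.],
                 [13., 14., 15., 16.]])
print(max_pool_2x2(fmap))  # keeps only the largest activation per window
print(avg_pool_2x2(fmap))  # blends all four activations per window
```

In Keras terms, the change amounts to swapping each MaxPooling2D layer for an AveragePooling2D layer when rebuilding VGG.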

A Neural Algorithm of Artistic Style

As this paper is the basis of style neural networks, it has already been implemented in Keras and TensorFlow; I used this blog's implementation.
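The core of that paper is the Gram-matrix style loss: feature correlations of the generated image are matched to those of the style image. A minimal numpy sketch, assuming activations flattened to (positions, channels) and the normalization used by Gatys et al.:

```python
import numpy as np

def gram_matrix(features):
    # features: (height * width, channels) activations from one VGG layer.
    # The Gram matrix holds the correlations between channel activations.
    return features.T @ features

def style_loss(gen_features, style_features):
    # Squared Frobenius distance between Gram matrices, normalized by
    # 4 * channels^2 * positions^2 as in the original paper.
    n_positions, n_channels = gen_features.shape
    G = gram_matrix(gen_features)
    A = gram_matrix(style_features)
    return np.sum((G - A) ** 2) / (4.0 * n_channels ** 2 * n_positions ** 2)
```

The total objective is a weighted sum of this style term over several VGG layers plus a content term (a plain squared difference of activations at one layer), minimized with respect to the generated image's pixels.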

Long Range Consistency

I implement this paper across two IPython notebooks:

  1. In Spatial_Co_Occurences.ipynb, I write the loss functions describing the spatially transformed outputs, and their Gramian matrices.
  2. In Style_Network_w_CoOccurence.ipynb, I add the loss function to the already existing style and content loss functions.
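The co-occurrence idea is to compute Gram matrices not of a feature map with itself, but of the feature map with a spatially shifted copy of itself, so that long-range structure is captured. A numpy sketch of the horizontally shifted version (the paper also uses vertical shifts); the function name is illustrative, not the notebook's:

```python
import numpy as np

def shifted_gram(features, delta):
    # features: (height, width, channels) activations from one VGG layer.
    # Correlate each activation with the activation `delta` columns to its
    # right, capturing co-occurrences at that horizontal offset.
    n_channels = features.shape[2]
    f = features[:, :-delta, :].reshape(-1, n_channels)
    g = features[:, delta:, :].reshape(-1, n_channels)
    return f.T @ g  # (channels, channels) co-occurrence matrix

fmap = np.arange(6.0).reshape(2, 3, 1)
print(shifted_gram(fmap, 1))
```

The loss term is then the squared difference between these shifted Gram matrices for the generated and style images, added to the standard style and content losses.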

This yields the following style output (compared to a normal style loss function):

Histogram Losses

I implement this paper across two IPython notebooks:

  1. In Histogram Loss.ipynb, I implement histogram mapping for the outputs of the VGG model, and generate a loss function from them.
  2. In Style Transfer with Histogram Loss.ipynb, I add this to the original style and content loss functions.
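The histogram loss penalizes the distance between the generated image's activations and a histogram-matched target built from the style image's activations. A simplified 1-D numpy sketch, assuming flattened single-channel activations of equal length (the notebooks work on full VGG feature maps); function names are mine:

```python
import numpy as np

def match_histogram(source, template):
    # Remap `source` so its values, kept in their original rank order,
    # take on the sorted values of `template` (1-D histogram matching).
    s_idx = np.argsort(source)
    matched = np.empty_like(source)
    matched[s_idx] = np.sort(template)
    return matched

def histogram_loss(activations, style_activations):
    # Squared distance between activations and their histogram-matched
    # target; gradients pull the activation histogram toward the style's.
    target = match_histogram(activations, style_activations)
    return np.sum((activations - target) ** 2)
```

When the generated activations are already a permutation of the style activations, the matched target equals the activations and the loss is zero, which is the stability property the paper is after.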

Combining it all

I add it all together in Final_Style_Network.ipynb to yield the following image: