Source: Deep Learning on Medium
This is the first part of a series of posts exploring artistic stylization using neural networks. Ever since the use of Convolutional Neural Networks to achieve artistic stylization, commonly known as style transfer, was proposed by Gatys et al., many papers have been written proposing new methods to broaden the application of the original concept.
In this series of posts I will go over each paper, implement it, and then discuss the advances its method has brought to the field as well as its possible limitations.
Just a couple of days ago I completed all the requirements of the PyTorch Scholarship Challenge from Facebook. I started the course, aptly titled Introduction to Deep Learning with PyTorch, about two months ago knowing absolutely nothing about Deep Learning.
I have learned a lot since then and fallen head over heels in love with Convolutional Neural Networks (CNNs). As a side project, I wish to explore in more depth one of the uses of CNNs: style transfer.
My ultimate goal, besides getting a more comprehensive understanding of the concept of style transfer, is to build a real-time arbitrary style transfer model that I will integrate into an Android app.
By doing so I hope to build a bridge between this particular application of Deep Learning and the common man.
Wait, what is Style Transfer?
Style transfer is actually pretty much what it sounds like. It is applying the style — colors and texture — of one image to another. The image whose style is being used for the transfer is called the Style Image and the image the style is being transferred or applied to is called the Content Image.
How exactly is it done using Neural Networks?
Allow me to answer this a bit later. In the following posts I will be exploring different answers to this question given by different research teams. For now, just know that it’s done using the ever awesome Convolutional Neural Networks (CNNs).
I will start with the very first paper to suggest the use of Convolutional Neural Networks to achieve style transfer. I will then go through a few papers that tried to solve some issues the first paper left pending, and the last paper we will look at will be the one that seems closest to solving the remaining core problem: real-time arbitrary style transfer.
Once all that is done and I have defined and trained a model based on the technique outlined in the last paper, I will build an Android app and integrate the model into it.
I will be using the PyTorch framework.
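Since the whole series will use PyTorch, here is a small, hedged preview of one building block the later posts will rely on: the Gram matrix, which Gatys et al. use to summarize the “style” (texture and color statistics) of an image from a CNN feature map. The function name and tensor shapes below are illustrative, not taken from any paper’s official code:

```python
import torch

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise correlation matrix of a CNN feature map.

    feat has shape (batch, channels, height, width); the result
    captures which feature channels tend to fire together, which is
    how Gatys et al. characterize style.
    """
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    # Correlate every channel with every other, normalized by spatial size
    return f @ f.transpose(1, 2) / (h * w)

# A dummy tensor standing in for a real VGG activation
feat = torch.randn(1, 64, 32, 32)
g = gram_matrix(feat)
print(g.shape)  # torch.Size([1, 64, 64])
```

In practice `feat` would come from intermediate layers of a pretrained network such as VGG; we will get to that when implementing the first paper.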
These will become links as I write posts about them.
- Implementing the optimization method in Gatys et al.’s paper.
- Implementing the feedforward method in Johnson et al.’s paper.
- Exploring the use of Instance Normalization in stylization, proposed in Ulyanov et al.’s paper.
- Implementing the Adaptive Instance Normalization method in the paper by Huang et al.
- Implementing the method described by the team at Google Brain.
- Reflections on all these papers and techniques.
- Building the Android App.
- Hooking up the model to the Android App.
Let’s dive in! Wish me luck!
These are all the research papers mentioned above that I will be reading and working on over the coming days or weeks:
- Here is the first paper to propose the use of Convolutional Neural Networks for style transfer.
- Here is the paper that pioneered the feedforward method and took a step closer towards arbitrary real-time style transfer.
- Here you will find the paper pioneering the use of Instance Normalization in stylization.
- Here is the paper that proposed the use of Adaptive Instance Normalization to achieve arbitrary neural artistic transfer.
- Here is the paper by the team at Google Brain.
- Here you will find a tutorial about transferring a model from PyTorch to Caffe2 and mobile.