Original article was published by dhwani mehta on Artificial Intelligence on Medium
Computing the identity loss on the same input image, where no style gap exists, encourages the network to preserve both the content structure and the style characteristics simultaneously.
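The idea above can be sketched as follows. This is a minimal, pixel-level version of the identity loss, assuming a hypothetical `stylize(content, style)` function standing in for the style-transfer network; the full loss also adds a feature-level term over VGG activations, which is omitted here for brevity.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two arrays."""
    return float(np.mean((a - b) ** 2))

def identity_loss(content, style, stylize, weight=50.0):
    """Pixel-level identity loss (sketch).

    `stylize(c, s)` is a placeholder for the style-transfer network.
    Feeding the same image as both content and style should reproduce
    that image, so any deviation is penalized. The feature-level term
    of the full loss (computed on VGG activations) is omitted here.
    """
    i_cc = stylize(content, content)  # content image styled with itself
    i_ss = stylize(style, style)      # style image styled with itself
    return weight * (mse(i_cc, content) + mse(i_ss, style))
```

With a network that perfectly reproduces its input, the loss is zero; any drift in color or structure raises it.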
Conclusion and Results
Experiments show that style transfer with SANet captures diverse style patterns, such as the global color distribution, texture, and local style patterns, while maintaining the structure of the content. SANet also distinguishes content structures and transfers the appropriate style to each semantic region. It can therefore be inferred that SANet is not only efficient but also effective: it retains the content structure while blending style features that enrich both global and local style statistics.