Insight on Style Attentional Networks for Arbitrary Style Transfer

The identity loss is computed from the same input image, fed simultaneously as both the content and the style input. Since there is no style gap between the two, minimizing this loss lets the network maintain the content structure and the style characteristics at the same time.
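
As a rough illustration, here is a minimal PyTorch sketch of such an identity loss, in the spirit of the SANet paper [1]. The `vgg_feats` helper (returning a list of encoder feature maps) and the loss weights are illustrative assumptions, not the reference implementation:

```python
import torch.nn.functional as F

def identity_loss(out_cc, out_ss, img_c, img_s, vgg_feats,
                  lambda_id1=1.0, lambda_id2=1.0):
    """Identity loss sketch: out_cc / out_ss are the network outputs when
    the same image is fed as both content and style input. With no style
    gap, the network should reproduce its input exactly, which preserves
    content structure without weakening the style representation.

    vgg_feats is an assumed helper returning a list of encoder feature
    maps (e.g. relu1_1 ... relu5_1); the lambda weights are illustrative.
    """
    # Pixel-level reconstruction term: output should match the input image
    loss_pixel = F.mse_loss(out_cc, img_c) + F.mse_loss(out_ss, img_s)

    # Feature-level reconstruction term over the encoder layers
    loss_feat = 0.0
    for f_out, f_in in zip(vgg_feats(out_cc), vgg_feats(img_c)):
        loss_feat = loss_feat + F.mse_loss(f_out, f_in)
    for f_out, f_in in zip(vgg_feats(out_ss), vgg_feats(img_s)):
        loss_feat = loss_feat + F.mse_loss(f_out, f_in)

    return lambda_id1 * loss_pixel + lambda_id2 * loss_feat
```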

Conclusion and Results

Figure 7: User preference results for five style transfer algorithms [image source]

Experiments clearly show that style transfer with SANet parses diverse style patterns, such as global color distribution, texture, and local style patterns, while maintaining the structure of the content. SANet is also adept at distinguishing content structures and transferring the style corresponding to each semantic content. It can hence be inferred that SANet is not only efficient but also effective: it retains the content structure while blending style features that enrich both global and local style statistics.

Figure 8: Experimental results comparing SANet with other style transfer mechanisms [image source]
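
For readers who want to see how this per-region style blending works mechanically, below is a minimal PyTorch sketch of a style-attentional block in the spirit of SANet [1]. The 1×1 convolutions `f`, `g`, and `h` follow the paper's notation; the normalization helper and the exact wiring are simplifying assumptions rather than the reference implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def mean_variance_norm(feat, eps=1e-5):
    # Normalize each channel of the feature map to zero mean, unit variance
    mean = feat.mean(dim=(2, 3), keepdim=True)
    std = feat.std(dim=(2, 3), keepdim=True) + eps
    return (feat - mean) / std

class StyleAttentionBlock(nn.Module):
    """Each content position attends over all style positions, so style
    statistics are blended per semantic region rather than globally."""

    def __init__(self, channels):
        super().__init__()
        self.f = nn.Conv2d(channels, channels, 1)   # query from content
        self.g = nn.Conv2d(channels, channels, 1)   # key from style
        self.h = nn.Conv2d(channels, channels, 1)   # value from style
        self.out = nn.Conv2d(channels, channels, 1)

    def forward(self, content, style):
        b, c, hc, wc = content.shape
        q = self.f(mean_variance_norm(content)).flatten(2)  # B x C x Nc
        k = self.g(mean_variance_norm(style)).flatten(2)    # B x C x Ns
        v = self.h(style).flatten(2)                        # B x C x Ns
        # Attention map: similarity of every content position to every
        # style position, softmax-normalized over the style positions
        attn = F.softmax(torch.bmm(q.transpose(1, 2), k), dim=-1)  # B x Nc x Ns
        # Weighted sum of style values, reshaped back into a feature map
        fcs = torch.bmm(v, attn.transpose(1, 2)).view(b, c, hc, wc)
        # Residual connection keeps the content structure intact
        return content + self.out(fcs)
```

The paper combines two such blocks, applied to the relu4_1 and relu5_1 features of a pretrained VGG encoder, which is what lets it capture both global and local style statistics at once.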

References

[1] Park, Dae Young, and Kwang Hee Lee. “Arbitrary style transfer with style-attentional networks.” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2019.

[2] Gatys, Leon A., Alexander S. Ecker, and Matthias Bethge. “Image style transfer using convolutional neural networks.” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016.

[3] Huang, Xun, and Serge Belongie. “Arbitrary style transfer in real-time with adaptive instance normalization.” Proceedings of the IEEE International Conference on Computer Vision. 2017.

[4] Li, Yijun, et al. “Universal style transfer via feature transforms.” Advances in Neural Information Processing Systems. 2017.

[5] Sheng, Lu, et al. “Avatar-Net: Multi-scale zero-shot style transfer by feature decoration.” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2018.