Characterizing and Improving Stability in Neural Style Transfer

Recent progress in style transfer on images has focused on improving the quality of stylized images and the speed of the methods. However, real-time methods are highly unstable, resulting in visible flickering when applied to videos. In this work we characterize the instability of these methods by examining the solution set of the style transfer objective. We show that the trace of the Gram matrix representing style is inversely related to the stability of the method. We then present a recurrent convolutional network for real-time video style transfer that incorporates a temporal consistency loss and overcomes the instability of prior methods. Our networks can be applied at any resolution, do not require optical flow at test time, and produce high-quality, temporally consistent stylized videos in real time.
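The style representation whose trace the abstract relates to stability is the Gram matrix of convolutional feature maps, the standard style descriptor in neural style transfer. A minimal NumPy sketch of that matrix and its trace follows; the normalization by the number of elements is one common convention and an assumption here, not taken from the paper:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map of shape (C, H, W).

    Flattens the spatial dimensions and returns the C x C matrix of
    inner products between channels, normalized by the total number
    of elements (normalization convention is an assumption).
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

# The trace of G is the sum of the normalized squared norms of the
# channels, i.e. the total "energy" of the style representation.
feats = np.random.default_rng(0).standard_normal((4, 8, 8))
G = gram_matrix(feats)
print(G.shape, np.trace(G))
```

Because each diagonal entry of G is a squared norm, the trace is always non-negative; the paper's claim is that styles with a larger trace yield more stable real-time stylization.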


Presented at:
International Conference on Computer Vision (ICCV), Venice, Italy, October 22-29, 2017
Year:
2017
 Record created 2017-08-18, last modified 2018-09-13
