Parameter-Free Style Projection for Arbitrary Style Transfer (2003.07694v2)
Abstract: Arbitrary image style transfer is a challenging task that aims to stylize a content image conditioned on arbitrary style images. In this task, the feature-level content-style transformation plays a vital role in properly fusing content and style features. Existing feature transformation algorithms often suffer from loss of content or style details, unnatural stroke patterns, and unstable training. To mitigate these issues, this paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation. The paper further presents a real-time feed-forward model that leverages Style Projection for arbitrary image style transfer and includes a regularization term for matching the semantics between the input content and the stylized output. Extensive qualitative analysis, quantitative evaluation, and a user study demonstrate the effectiveness and efficiency of the proposed methods.
- Siyu Huang (50 papers)
- Haoyi Xiong (98 papers)
- Tianyang Wang (80 papers)
- Bihan Wen (86 papers)
- Qingzhong Wang (26 papers)
- Zeyu Chen (48 papers)
- Jun Huan (31 papers)
- Dejing Dou (112 papers)
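
The sketch below is not the authors' released code; it illustrates one plausible, parameter-free realization of a feature-level content-style transformation in the spirit described by the abstract, assuming a per-channel rank-matching scheme in which content feature values are replaced by style feature values of the same rank. All names, shapes, and the interpolation step are illustrative assumptions, and the paper's actual Style Projection operation may differ.

```python
# Illustrative sketch (assumed rank-matching variant, not the paper's exact method):
# the output keeps the content's spatial ordering per channel while adopting
# the style feature's value distribution, with no learnable parameters.
import torch


def style_projection(content_feat: torch.Tensor,
                     style_feat: torch.Tensor) -> torch.Tensor:
    """Fuse content and style encoder features without learnable parameters.

    Args:
        content_feat: content-image features, shape (B, C, Hc, Wc).
        style_feat:   style-image features,   shape (B, C, Hs, Ws).
    Returns:
        Tensor of shape (B, C, Hc, Wc) whose per-channel values come from the
        style features, arranged by the content's per-channel rank order.
    """
    b, c, hc, wc = content_feat.shape
    content_flat = content_feat.view(b, c, -1)   # (B, C, Hc*Wc)
    style_flat = style_feat.view(b, c, -1)       # (B, C, Hs*Ws)

    # Resample style values so both tensors have the same length per channel
    # (an implementation convenience here, not necessarily what the paper does).
    if style_flat.shape[-1] != content_flat.shape[-1]:
        style_flat = torch.nn.functional.interpolate(
            style_flat, size=content_flat.shape[-1], mode="nearest")

    # Ascending style values and the rank of each content position per channel.
    style_sorted, _ = style_flat.sort(dim=-1)
    content_ranks = content_flat.argsort(dim=-1).argsort(dim=-1)

    # Place the k-th smallest style value at the position holding the
    # k-th smallest content value.
    out = torch.gather(style_sorted, dim=-1, index=content_ranks)
    return out.view(b, c, hc, wc)


if __name__ == "__main__":
    fc = torch.randn(1, 512, 32, 32)   # toy content features
    fs = torch.randn(1, 512, 24, 24)   # toy style features
    print(style_projection(fc, fs).shape)  # torch.Size([1, 512, 32, 32])
```

Because the operation involves only sorting and gathering, it adds no trainable parameters and runs in a single forward pass, which is consistent with the "parameter-free, fast" framing in the abstract; in the paper's full model such a transformed feature would then be decoded by a feed-forward network trained with the described semantic regularization.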