A non-definitive auto-transfer mechanism for arbitrary style transfers
File version
Author(s)
Xu, Libo
Yu, Xin
Lu, Huanda
Huang, Zhenrui
Wang, Huanhuan
Pang, Chaoyi
Griffith University Author(s)
Primary Supervisor
Other Supervisors
Editor(s)
Date
Size
File type(s)
Location
License
Abstract
In recent years, image style transfer has become an increasingly active research topic in computer vision. Existing CNN-based style transfer methods form the basis of a substantial body of research on content–style fusion. In these methods, content–style fusion is artificially controlled through deterministic computations. Because such constraints limit the model's ability to learn automatically, the resulting style transfer effects are unstable. To address this problem, we propose a non-definitive style auto-transfer module. The module is built on an attention submodule that guides the model across channel and spatial dimensions during content–style fusion, letting the model learn the fusion on its own rather than relying on artificially defined fusion rules. We also propose a feature shuffle operation that reduces the influence of the style image on the content of the result. In addition, to better preserve both high-level and low-level image information, our loss function adopts a multi-scale content–style loss and an edge detection loss. All experiments are conducted on the WikiArt and Microsoft COCO datasets. The results show that our method achieves more stable and better visual effects than existing methods.
Journal Title
Knowledge-Based Systems
Conference Title
Book Title
Edition
Volume
260
Issue
Thesis Type
Degree Program
School
Publisher link
Patent number
Funder(s)
Grant identifier(s)
Rights Statement
Item Access Status
Note
Access the data
Related item(s)
Subject
Artificial intelligence
Data management and data science
Machine learning
Persistent link to this record
Citation
Wang, J; Xu, L; Yu, X; Lu, H; Huang, Z; Wang, H; Pang, C, A non-definitive auto-transfer mechanism for arbitrary style transfers, Knowledge-Based Systems, 2023, 260, pp. 110171