GCSANet: Arbitrary style transfer with global context self-attentional network

Z Bai, H Xu, X Zhang, Q Ding - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Arbitrary style transfer is attracting increasing attention in the computer vision community
due to its application flexibility. Existing approaches directly fuse deep style features with …
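
The snippet above mentions fusing deep style features with content features, which most arbitrary style transfer pipelines do at the feature level. As a hedged illustration only, the PyTorch sketch below shows the widely used adaptive instance normalization (AdaIN) fusion; it is a generic example of statistics-based feature fusion, not the global context self-attentional fusion proposed by GCSANet, and the tensor shapes are assumed.

```python
# Minimal sketch of fusing deep style features with content features via
# adaptive instance normalization (AdaIN). Generic illustration only; this
# is not the GCSANet module.
import torch

def adain(content_feat: torch.Tensor, style_feat: torch.Tensor,
          eps: float = 1e-5) -> torch.Tensor:
    """Align channel-wise mean/std of content features to those of the style.

    Both inputs are assumed to be encoder feature maps of shape (N, C, H, W).
    """
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    # Whiten the content statistics, then re-color them with style statistics.
    return s_std * (content_feat - c_mean) / c_std + s_mean

if __name__ == "__main__":
    content = torch.randn(1, 512, 32, 32)   # hypothetical encoder features
    style = torch.randn(1, 512, 32, 32)
    fused = adain(content, style)
    print(fused.shape)  # torch.Size([1, 512, 32, 32])
```
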

Facerefiner: High-fidelity facial texture refinement with differentiable rendering-based style transfer

C Li, B Cheng, Y Cheng, H Zhang, R Liu… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
Recent facial texture generation methods typically use deep networks to synthesize image
content and then fill in the UV map, thus generating a compelling full texture from a single …
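
Since this entry describes refining a facial texture with differentiable rendering-based style transfer, the sketch below shows one generic way such a loop can be set up: a UV texture is optimized so that the rendered image matches a photograph under a Gram-matrix style loss. The `render` function and the `features` extractor are hypothetical stand-ins for whatever differentiable renderer and backbone a real pipeline uses; this is not the FaceRefiner implementation.

```python
# Hedged sketch: refine a UV texture map with a style-transfer loss computed
# through a differentiable renderer. `render` and `features` are stand-ins.
import torch
import torch.nn.functional as F

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise Gram matrix of a (N, C, H, W) feature map."""
    n, c, h, w = feat.shape
    f = feat.view(n, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def refine_texture(uv_texture, render, features, photo, steps=200, lr=0.01):
    """Optimize the UV texture so the rendered face matches the photo's style.

    uv_texture : (1, 3, H, W) tensor with requires_grad=True
    render     : differentiable function mapping a UV texture to an image
    features   : frozen feature extractor (e.g., a truncated VGG)
    photo      : (1, 3, H, W) reference photograph
    """
    target_style = gram_matrix(features(photo)).detach()
    opt = torch.optim.Adam([uv_texture], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        rendered = render(uv_texture)
        style_loss = F.mse_loss(gram_matrix(features(rendered)), target_style)
        content_loss = F.mse_loss(rendered, photo)
        (style_loss + 0.1 * content_loss).backward()
        opt.step()
    return uv_texture.detach()

if __name__ == "__main__":
    # Toy stand-ins so the sketch runs end to end: an identity "renderer"
    # and a single frozen conv layer as the feature extractor.
    render = lambda tex: tex
    features = torch.nn.Conv2d(3, 8, 3, padding=1)
    for p in features.parameters():
        p.requires_grad_(False)
    photo = torch.rand(1, 3, 64, 64)
    uv = torch.rand(1, 3, 64, 64, requires_grad=True)
    refined = refine_texture(uv, render, features, photo, steps=10)
    print(refined.shape)  # torch.Size([1, 3, 64, 64])
```
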

MNCAA: Balanced Style Transfer Based on Multi-level Normalized Cross Attention Alignment

L Heng, C Feng, J Han - Authorea Preprints, 2023 - techrxiv.org
Given a content image and an artistic style image, style transfer usually refers to applying the
patterns learned from the style image to the content image to generate a new stylized image …
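
The title of this entry refers to cross-attention alignment between content and style features. As a generic, hedged sketch of that idea (not the specific MNCAA module), the block below computes cross-attention where queries come from instance-normalized content features and keys/values come from style features, so style patterns are rearranged to follow the content layout; the projection layers and residual connection are assumptions.

```python
# Generic sketch of content-to-style cross-attention alignment. Queries are
# derived from normalized content features, keys/values from style features.
# Illustration only; not the MNCAA architecture.
import torch
import torch.nn as nn

class StyleCrossAttention(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.q = nn.Conv2d(channels, channels, 1)
        self.k = nn.Conv2d(channels, channels, 1)
        self.v = nn.Conv2d(channels, channels, 1)
        self.norm = nn.InstanceNorm2d(channels)

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        n, c, h, w = content.shape
        q = self.q(self.norm(content)).flatten(2)   # (N, C, Hc*Wc)
        k = self.k(self.norm(style)).flatten(2)     # (N, C, Hs*Ws)
        v = self.v(style).flatten(2)                # (N, C, Hs*Ws)
        attn = torch.softmax(q.transpose(1, 2) @ k / c ** 0.5, dim=-1)
        out = (attn @ v.transpose(1, 2)).transpose(1, 2).view(n, c, h, w)
        return content + out   # residual keeps the content structure

if __name__ == "__main__":
    block = StyleCrossAttention(64)
    content = torch.randn(1, 64, 32, 32)
    style = torch.randn(1, 64, 24, 24)
    print(block(content, style).shape)  # torch.Size([1, 64, 32, 32])
```
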