StyleDiffusion: Controllable disentangled style transfer via diffusion models

Z Wang, L Zhao, W Xing - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Content and style (CS) disentanglement is a fundamental problem and critical challenge of
style transfer. Existing approaches based on explicit definitions (e.g., Gram matrix) or implicit …

CCPL: Contrastive coherence preserving loss for versatile style transfer

Z Wu, Z Zhu, J Du, X Bai - European Conference on Computer Vision, 2022 - Springer
In this paper, we aim to devise a universally versatile style transfer method capable of
performing artistic, photo-realistic, and video style transfer jointly, without seeing videos …

Cross-domain correlation distillation for unsupervised domain adaptation in nighttime semantic segmentation

H Gao, J Guo, G Wang… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
The performance of nighttime semantic segmentation is restricted by the poor illumination
and a lack of pixel-wise annotation, which severely limit its application in autonomous …

Wavelet knowledge distillation: Towards efficient image-to-image translation

L Zhang, X Chen, X Tu, P Wan… - Proceedings of the …, 2022 - openaccess.thecvf.com
Remarkable achievements have been attained with Generative Adversarial Networks
(GANs) in image-to-image translation. However, due to a tremendous amount of parameters …

Accelerating histopathology workflows with generative AI-based virtually multiplexed tumour profiling

P Pati, S Karkampouna, F Bonollo… - Nature machine …, 2024 - nature.com
Understanding the spatial heterogeneity of tumours and its links to disease initiation and
progression is a cornerstone of cancer biology. Presently, histopathology workflows heavily …

Domain-aware universal style transfer

K Hong, S Jeon, H Yang, J Fu… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Style transfer aims to reproduce content images with the styles from reference images.
Existing universal style transfer methods successfully deliver arbitrary styles to original …

AesPA-Net: Aesthetic pattern-aware style transfer networks

K Hong, S Jeon, J Lee, N Ahn, K Kim… - Proceedings of the …, 2023 - openaccess.thecvf.com
To deliver the artistic expression of the target style, recent studies exploit the attention
mechanism owing to its ability to map the local patches of the style image to the …

AesUST: towards aesthetic-enhanced universal style transfer

Z Wang, Z Zhang, L Zhao, Z Zuo, A Li, W Xing… - Proceedings of the 30th …, 2022 - dl.acm.org
Recent studies have shown remarkable success in universal style transfer which transfers
arbitrary visual styles to content images. However, existing approaches suffer from the …

Neural preset for color style transfer

Z Ke, Y Liu, L Zhu, N Zhao… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
In this paper, we present a Neural Preset technique to address the limitations of existing
color style transfer methods, including visual artifacts, vast memory requirement, and slow …

Unsupervised monocular depth estimation in highly complex environments

C Zhao, Y Tang, Q Sun - IEEE Transactions on Emerging …, 2022 - ieeexplore.ieee.org
With the development of computational intelligence algorithms, unsupervised monocular
depth and pose estimation framework, which is driven by warped photometric consistency …