Core-attributes enhanced generative adversarial networks for robust image enhancement

S Liu, G Xiao, MS Lew, X Gao, S Wu - Engineering Applications of Artificial …, 2024 - Elsevier
Automated image enhancement algorithms have a profound impact on human life today. To
solve the problems of luminance, lack of detail information, and overall color tone bias of …

Multi-knowledge-driven enhanced module for visible-infrared cross-modal person Re-identification

S Shan, P Sun, G Xiao, S Wu - International Journal of Multimedia …, 2024 - Springer
Visible-Infrared Person Re-identification (VI-ReID) is challenging in social security
surveillance because the semantic gap between cross-modal data significantly reduces VI …

Implicit Modality Knowledge Alignment and Uncertainty Estimation for visible-infrared person re-identification

S Wu, S Shan, G Xiao, MS Lew, X Gao - Expert Systems with Applications, 2025 - Elsevier
Visible-infrared person re-identification (VI-ReID) is a crucial cross-modal matching task in
computer vision. Existing research typically relies on a single auxiliary modality to address …

[PDF][PDF] Core-attributes enhanced generative adversarial networks for robust image enhancement. doi: 10.1016/j.engappai.2023.107799 Version: Publisher's Version …

S Liu, G Xiao, MSK Lew, X Gao… - Law …, 2024 - scholarlypublications …
Automated image enhancement algorithms have a profound impact on human life today. To
solve the problems of luminance, lack of detail information, and overall color tone bias of …

Portrait Matting Network with Essential Feature Mining and Fusion

H Jiang, S Wu, D He, G Xiao - International Conference on Neural …, 2022 - Springer
We propose an end-to-end portrait matting algorithm that emphasizes the mining and fusion
of critical features to achieve higher accuracy. Previous best-performing portrait matting …

[CITATION][C] Multi Knowledge-Driven Enhancement Learning for Visible-Infrared Cross-Modal Person Re-Identification

S Shan, G Xiao, MS Lew, X Gao, S Wu - Available at SSRN 4477832, 2023