Sampling to distill: Knowledge transfer from open-world data

Y Wang, Z Chen, J Zhang, D Yang, Z Ge, Y Liu… - Proceedings of the …, 2024 - dl.acm.org
Data-Free Knowledge Distillation (DFKD) is a novel task that aims to train high-performance
student models using only the pre-trained teacher network without original training data …

Toward Robust Incomplete Multimodal Sentiment Analysis via Hierarchical Representation Learning

M Li, D Yang, Y Liu, S Wang, J Chen, S Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
Multimodal Sentiment Analysis (MSA) is an important research area that aims to understand
and recognize human sentiment through multiple modalities. The complementary …

UniEmoX: Cross-modal Semantic-Guided Large-Scale Pretraining for Universal Scene Emotion Perception

C Chen, X Sun, Z Liu - arXiv preprint arXiv:2409.18877, 2024 - arxiv.org
Visual emotion analysis holds significant research value in both computer vision and
psychology. However, existing methods for visual emotion analysis suffer from limited …