A survey of text classification based on pre-trained language model
Text classification is widely used in Natural Language Processing (NLP). In recent
years, pre-trained language models (PLMs) based on the …
CLIPMulti: Explore the performance of multimodal enhanced CLIP for zero-shot text classification
P Wang, D Li, X Hu, Y Wang, Y Zhang - Computer Speech & Language, 2025 - Elsevier
Zero-shot text classification does not require large amounts of labeled data and is designed
to handle text classification tasks that lack annotated training data. Existing zero-shot text …
RMIT-IR at EXIST lab at CLEF 2024
This paper describes RMIT-IR team's participation in the EXIST Lab at CLEF 2024. The
proposed approaches aim to address sexism characterization on microblog posts (Tasks 1 …
LabCLIP: Label-Enhanced Clip for Improving Zero-Shot Text Classification
Zero-shot text classification aims to handle the text classification task without any annotated
training data, which can greatly alleviate the data scarcity problem. Current dominant …
Improving Text Classification Performance Through Multimodal Representation
Y Wu, X Zhang, H Ren - Chinese Conference on Pattern Recognition and …, 2024 - Springer
Traditionally, text classification research has predominantly focused on extracting single text
features, with limited exploration of integrating other modal information (such as speech and …
Learning Word Embeddings by Incorporating Latent Meanings of Chinese Characters, Radicals and Sub-characters
X Su, S Zhao, C Cai, W Zhao, Y Wang… - Proceedings of the …, 2024 - dl.acm.org
Unlike traditional word embeddings, which ignore implicit information within words,
Chinese words usually consist of characters, most of which can be broken down into …