Contrastive learning models for sentence representations

L Xu, H Xie, Z Li, FL Wang, W Wang, Q Li - ACM Transactions on …, 2023 - dl.acm.org
Sentence representation learning is a crucial task in natural language processing, as the
quality of learned representations directly influences downstream tasks, such as sentence …

Contrastive sentence representation learning with adaptive false negative cancellation

L Xu, H Xie, FL Wang, X Tao, W Wang, Q Li - Information Fusion, 2024 - Elsevier
Contrastive sentence representation learning has made great progress thanks to a range of
text augmentation strategies and hard negative sampling techniques. However, most studies …
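As background for the two contrastive sentence representation entries above: such methods typically train with an in-batch InfoNCE objective, where each sentence's augmented view is its positive and the other sentences in the batch act as negatives. A minimal NumPy sketch follows; the function name `info_nce_loss` and the temperature value are illustrative, not taken from the cited papers.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    """InfoNCE loss with in-batch negatives.

    anchors, positives: (N, d) arrays; row i of `positives` is the
    augmented view (positive pair) of row i of `anchors`, and all
    other rows serve as negatives.
    """
    # L2-normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sim = (a @ p.T) / temperature  # (N, N) similarity matrix
    # Diagonal entries are positive pairs; off-diagonal entries are negatives.
    # Cross-entropy of each row against its own index:
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))
```

When anchor and positive embeddings already coincide and are well separated from the rest of the batch, the loss approaches zero; hard negatives and false-negative cancellation (the focus of the second entry) modify which off-diagonal terms contribute and how strongly.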

CPL: Counterfactual prompt learning for vision and language models

X He, D Yang, W Feng, TJ Fu, A Akula… - arXiv preprint arXiv …, 2022 - arxiv.org
Prompt tuning is a new few-shot transfer learning technique that only tunes the learnable
prompt for pre-trained vision and language models such as CLIP. However, existing prompt …

A multi-level supervised contrastive learning framework for low-resource natural language inference

X Hu, L Lin, A Liu, L Wen… - IEEE/ACM Transactions …, 2023 - ieeexplore.ieee.org
Natural Language Inference (NLI) is an increasingly essential task in natural language
understanding, which requires inferring the relationship between sentence pairs …

Facilitating contrastive learning of discourse relational senses by exploiting the hierarchy of sense relations

W Long, B Webber - arXiv preprint arXiv:2301.02724, 2023 - arxiv.org
Implicit discourse relation recognition is a challenging task that involves identifying the
sense or senses that hold between two adjacent spans of text, in the absence of an explicit …

A survey of methods for addressing class imbalance in deep-learning based natural language processing

S Henning, W Beluch, A Fraser, A Friedrich - arXiv preprint arXiv …, 2022 - arxiv.org
Many natural language processing (NLP) tasks are naturally imbalanced, as some target
categories occur much more frequently than others in the real world. In such scenarios …