Does negative sampling matter? A review with insights into its theory and applications
Negative sampling has swiftly risen to prominence as a focal point of research, with wide-
ranging applications spanning machine learning, computer vision, natural language …
Contrastive learning models for sentence representations
Sentence representation learning is a crucial task in natural language processing, as the
quality of learned representations directly influences downstream tasks, such as sentence …
Heterogeneous contrastive learning for foundation models and beyond
In the era of big data and Artificial Intelligence, an emerging paradigm is to utilize contrastive
self-supervised learning to model large-scale heterogeneous data. Many existing foundation …
WhitenedCSE: Whitening-based contrastive learning of sentence embeddings
This paper presents a whitening-based contrastive learning method for sentence
embedding learning (WhitenedCSE), which combines contrastive learning with a novel …
Contrastive sentence representation learning with adaptive false negative cancellation
Contrastive sentence representation learning has made great progress thanks to a range of
text augmentation strategies and hard negative sampling techniques. However, most studies …
Contrastive pre-training with adversarial perturbations for check-in sequence representation learning
A core step of mining human mobility data is to learn accurate representations for user-
generated check-in sequences. The learned representations should be able to fully describe …
miCSE: Mutual information contrastive learning for low-shot sentence embeddings
This paper presents miCSE, a mutual information-based contrastive learning framework that
significantly advances the state-of-the-art in few-shot sentence embedding. The proposed …
Detective: Detecting AI-generated text via multi-level contrastive learning
X Guo, S Zhang, Y He, T Zhang, W Feng… - arXiv preprint arXiv…, 2024 - arxiv.org
Current techniques for detecting AI-generated text are largely confined to manual feature
crafting and supervised binary classification paradigms. These methodologies typically lead …
ContraCLM: Contrastive learning for causal language model
Despite exciting progress in causal language models, the expressiveness of the
representations is largely limited due to poor discrimination ability. To remedy this issue, we …
EMMA-X: An EM-like multilingual pre-training algorithm for cross-lingual representation learning
P Guo, X Wei, Y Hu, B Yang, D Liu… - Advances in Neural …, 2023 - proceedings.neurips.cc
Expressing universal semantics common to all languages is helpful to understand the
meanings of complex and culture-specific sentences. The research theme underlying this …