The rise and potential of large language model based agents: A survey
For a long time, researchers have sought artificial intelligence (AI) that matches or exceeds
human intelligence. AI agents, which are artificial entities capable of sensing the …
Deep Learning applications for COVID-19
This survey explores how Deep Learning has battled the COVID-19 pandemic and provides
directions for future research on COVID-19. We cover Deep Learning applications in Natural …
Angle-optimized text embeddings
High-quality text embedding is pivotal in improving semantic textual similarity (STS) tasks,
which are crucial components in Large Language Model (LLM) applications. However, a …
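As a rough illustration of the STS setting this entry refers to, the sketch below scores two sentence embeddings with cosine similarity, the standard STS metric; the random vectors are placeholders and do not come from the angle-optimized model itself.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity, the usual scoring function for STS benchmarks."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder embeddings standing in for encoder output (e.g. 768-dim vectors).
emb_a = np.random.rand(768)
emb_b = np.random.rand(768)
print(f"STS score: {cosine_similarity(emb_a, emb_b):.3f}")
```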
An introduction to deep learning in natural language processing: Models, techniques, and tools
Abstract Natural Language Processing (NLP) is a branch of artificial intelligence that
involves the design and implementation of systems and algorithms able to interact through …
Simcse: Simple contrastive learning of sentence embeddings
This paper presents SimCSE, a simple contrastive learning framework that greatly advances
state-of-the-art sentence embeddings. We first describe an unsupervised approach, which …
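A minimal sketch of the unsupervised contrastive objective this abstract describes: each sentence is encoded twice with dropout as the only noise, the two encodings form a positive pair, and the other sentences in the batch serve as negatives. The encoder is omitted here and random tensors stand in for its output.

```python
import torch
import torch.nn.functional as F

def simcse_unsup_loss(z1: torch.Tensor, z2: torch.Tensor,
                      temperature: float = 0.05) -> torch.Tensor:
    """In-batch InfoNCE loss: z1[i] and z2[i] are two dropout-noised encodings
    of the same sentence; every other row in the batch acts as a negative."""
    sim = F.cosine_similarity(z1.unsqueeze(1), z2.unsqueeze(0), dim=-1) / temperature
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(sim, labels)

# Toy batch: 8 sentences encoded twice (dropout would give slightly different vectors).
z1, z2 = torch.randn(8, 768), torch.randn(8, 768)
print(simcse_unsup_loss(z1, z2))
```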
Consert: A contrastive framework for self-supervised sentence representation transfer
Learning high-quality sentence representations benefits a wide range of natural language
processing tasks. Though BERT-based pre-trained language models achieve high …
DiffCSE: Difference-based contrastive learning for sentence embeddings
We propose DiffCSE, an unsupervised contrastive learning framework for learning sentence
embeddings. DiffCSE learns sentence embeddings that are sensitive to the difference …
Declutr: Deep contrastive learning for unsupervised textual representations
Sentence embeddings are an important component of many natural language processing
(NLP) systems. Like word embeddings, sentence embeddings are typically learned on large …
Albert: A lite bert for self-supervised learning of language representations
Increasing model size when pretraining natural language representations often results in
improved performance on downstream tasks. However, at some point further model …
Sentence-bert: Sentence embeddings using siamese bert-networks
BERT (Devlin et al., 2018) and RoBERTa (Liu et al., 2019) have set a new state-of-the-art
performance on sentence-pair regression tasks like semantic textual similarity (STS) …
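If the sentence-transformers package that accompanies this paper is installed, a typical usage pattern looks roughly like the sketch below; the checkpoint name is one commonly available model, not necessarily the one evaluated in the paper.

```python
from sentence_transformers import SentenceTransformer, util

# Load a pretrained siamese-style sentence encoder
# ('all-MiniLM-L6-v2' is a common lightweight checkpoint).
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = ["A man is playing a guitar.", "Someone is playing an instrument."]
embeddings = model.encode(sentences, convert_to_tensor=True)

# Each sentence is encoded independently and similarity is scored afterwards,
# which is what makes large-scale semantic comparison tractable.
print(util.cos_sim(embeddings[0], embeddings[1]))
```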