The rise and potential of large language model based agents: A survey

Z Xi, W Chen, X Guo, W He, Y Ding, B Hong… - Science China …, 2025 - Springer
For a long time, researchers have sought artificial intelligence (AI) that matches or exceeds
human intelligence. AI agents, which are artificial entities capable of sensing the …

Deep Learning applications for COVID-19

C Shorten, TM Khoshgoftaar, B Furht - Journal of Big Data, 2021 - Springer
This survey explores how Deep Learning has battled the COVID-19 pandemic and provides
directions for future research on COVID-19. We cover Deep Learning applications in Natural …

AnglE-optimized text embeddings

X Li, J Li - arXiv preprint arXiv:2309.12871, 2023 - arxiv.org
High-quality text embedding is pivotal in improving semantic textual similarity (STS) tasks,
which are crucial components in Large Language Model (LLM) applications. However, a …

An introduction to deep learning in natural language processing: Models, techniques, and tools

I Lauriola, A Lavelli, F Aiolli - Neurocomputing, 2022 - Elsevier
Natural Language Processing (NLP) is a branch of artificial intelligence that
involves the design and implementation of systems and algorithms able to interact through …

SimCSE: Simple contrastive learning of sentence embeddings

T Gao, X Yao, D Chen - arXiv preprint arXiv:2104.08821, 2021 - arxiv.org
This paper presents SimCSE, a simple contrastive learning framework that greatly advances
state-of-the-art sentence embeddings. We first describe an unsupervised approach, which …

ConSERT: A contrastive framework for self-supervised sentence representation transfer

Y Yan, R Li, S Wang, F Zhang, W Wu, W Xu - arXiv preprint arXiv …, 2021 - arxiv.org
Learning high-quality sentence representations benefits a wide range of natural language
processing tasks. Though BERT-based pre-trained language models achieve high …

DiffCSE: Difference-based contrastive learning for sentence embeddings

YS Chuang, R Dangovski, H Luo, Y Zhang… - arXiv preprint arXiv …, 2022 - arxiv.org
We propose DiffCSE, an unsupervised contrastive learning framework for learning sentence
embeddings. DiffCSE learns sentence embeddings that are sensitive to the difference …

DeCLUTR: Deep contrastive learning for unsupervised textual representations

J Giorgi, O Nitski, B Wang, G Bader - arXiv preprint arXiv:2006.03659, 2020 - arxiv.org
Sentence embeddings are an important component of many natural language processing
(NLP) systems. Like word embeddings, sentence embeddings are typically learned on large …

ALBERT: A lite BERT for self-supervised learning of language representations

Z Lan, M Chen, S Goodman, K Gimpel… - arXiv preprint arXiv …, 2019 - arxiv.org
Increasing model size when pretraining natural language representations often results in
improved performance on downstream tasks. However, at some point further model …

Sentence-BERT: Sentence embeddings using siamese BERT-networks

N Reimers, I Gurevych - arXiv preprint arXiv:1908.10084, 2019 - arxiv.org
BERT (Devlin et al., 2018) and RoBERTa (Liu et al., 2019) have set a new state-of-the-art
performance on sentence-pair regression tasks like semantic textual similarity (STS) …