Empathetic conversational systems: A review of current advances, gaps, and opportunities
Empathy is a vital factor that contributes to mutual understanding and joint problem-solving.
In recent years, a growing number of studies have recognized the benefits of empathy and …
Sentence-T5: Scalable sentence encoders from pre-trained text-to-text models
We provide the first exploration of sentence embeddings from text-to-text transformers (T5).
Sentence embeddings are broadly useful for language processing tasks. While T5 achieves …
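The approach the abstract points at, deriving fixed-size sentence vectors from a T5 encoder, can be sketched roughly as follows. This is a minimal illustration assuming the Hugging Face transformers API, a generic t5-base checkpoint, and mean pooling; the paper's released Sentence-T5 encoders are fine-tuned variants, so treat the model choice and pooling strategy here as assumptions.

```python
# Rough sketch: sentence embeddings from a T5 encoder via mean pooling.
# The t5-base checkpoint and the pooling choice are illustrative assumptions,
# not the paper's fine-tuned Sentence-T5 encoders.
import torch
from transformers import AutoTokenizer, T5EncoderModel

tokenizer = AutoTokenizer.from_pretrained("t5-base")
encoder = T5EncoderModel.from_pretrained("t5-base")

def embed(sentences):
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        states = encoder(**batch).last_hidden_state       # (batch, seq_len, hidden)
    mask = batch["attention_mask"].unsqueeze(-1)          # zero out padding positions
    pooled = (states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
    return torch.nn.functional.normalize(pooled, dim=-1)  # unit-length vectors

vecs = embed(["A cat sits on the mat.", "A kitten rests on a rug."])
print(vecs @ vecs.T)  # cosine similarities, since rows are L2-normalized
```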
Language-agnostic BERT sentence embedding
While BERT is an effective method for learning monolingual sentence embeddings for
semantic similarity and embedding based transfer learning (Reimers and Gurevych, 2019) …
Whitening sentence representations for better semantics and faster retrieval
Pre-training models such as BERT have achieved great success in many natural language
processing tasks. However, how to obtain better sentence representation through these pre …
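The whitening operation named in the title can be illustrated with a short sketch: center the embedding matrix, then apply a linear map so the covariance of the transformed vectors becomes the identity, which tends to make cosine and dot-product retrieval behave better. The SVD-based kernel below follows the common formulation of this idea; the paper's exact procedure (including its dimensionality-reduction step) is not reproduced here.

```python
# Sketch of embedding whitening: subtract the mean, then rotate/scale so the
# covariance of the transformed vectors is (approximately) the identity.
# SVD-based kernel; the paper's dimensionality-reduction details are omitted.
import numpy as np

def fit_whitening(embeddings):
    mu = embeddings.mean(axis=0, keepdims=True)          # (1, d) mean vector
    cov = np.cov((embeddings - mu).T)                    # (d, d) covariance
    u, s, _ = np.linalg.svd(cov)
    kernel = u @ np.diag(1.0 / np.sqrt(s))               # whitening matrix W
    return kernel, mu

def whiten(embeddings, kernel, mu):
    return (embeddings - mu) @ kernel                    # whitened embeddings

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 64)) @ rng.normal(size=(64, 64))   # correlated toy data
w, mu = fit_whitening(x)
print(np.allclose(np.cov(whiten(x, w, mu).T), np.eye(64), atol=1e-6))  # True
```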
Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks
N Reimers - arXiv preprint arXiv:1908.10084, 2019
BERT (Devlin et al., 2018) and RoBERTa (Liu et al., 2019) have set a new state-of-the-art
performance on sentence-pair regression tasks like semantic textual similarity (STS) …
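The siamese setup the abstract refers to, encoding each sentence independently and comparing the resulting vectors, is what the sentence-transformers library packages up. A minimal usage sketch follows; the checkpoint name is an illustrative assumption, not the paper's original training configuration.

```python
# Minimal sketch of the siamese usage pattern: encode sentences independently,
# then score pairs with cosine similarity. The checkpoint name is an
# illustrative assumption.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(
    ["A man is playing a guitar.", "Someone performs music on a guitar."],
    convert_to_tensor=True,
)
print(util.cos_sim(embeddings[0], embeddings[1]))  # STS-style score in [-1, 1]
```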
ERNIE: Enhanced representation through knowledge integration
We present a novel language representation model enhanced by knowledge called ERNIE
(Enhanced Representation through kNowledge IntEgration). Inspired by the masking …
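The masking idea the (truncated) abstract alludes to is that whole entities and phrases are masked as units rather than as isolated subwords, so the model must recover the full unit from context. The toy sketch below illustrates span-level masking; the span annotations and masking probability are hypothetical, not ERNIE's actual pre-training pipeline.

```python
# Toy sketch of entity/phrase-level masking: whole spans are masked together so
# the model must reconstruct the full unit from context. Span boundaries and the
# masking probability are hypothetical, not ERNIE's actual pipeline.
import random

def knowledge_mask(tokens, spans, mask_token="[MASK]", prob=0.5):
    """spans: list of (start, end) token ranges marking entities or phrases."""
    masked = list(tokens)
    for start, end in spans:
        if random.random() < prob:                            # decide once per span...
            masked[start:end] = [mask_token] * (end - start)  # ...then mask it whole
    return masked

tokens = ["Harry", "Potter", "was", "written", "by", "J.", "K.", "Rowling"]
spans = [(0, 2), (5, 8)]   # "Harry Potter", "J. K. Rowling"
print(knowledge_mask(tokens, spans))
```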
Universal sentence encoder for English
We present easy-to-use TensorFlow Hub sentence embedding models having good task
transfer performance. Model variants allow for trade-offs between accuracy and compute …
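A minimal usage sketch of the TensorFlow Hub models the abstract describes is below; the specific module handle (the v4 release) is the one publicly listed on TF Hub and should be read as an assumption about which variant to load.

```python
# Minimal sketch: load a Universal Sentence Encoder module from TensorFlow Hub
# and embed a batch of sentences. The module handle is an assumption about
# which released variant to use.
import tensorflow_hub as hub

embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")
vectors = embed(["The quick brown fox jumps over the lazy dog.", "An energetic puppy."])
print(vectors.shape)  # (2, 512): one 512-dimensional vector per sentence
```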
Language models that seek for knowledge: Modular search & generation for dialogue and prompt completion
Language models (LMs) have recently been shown to generate more factual responses by
employing modularity (Zhou et al., 2021) in combination with retrieval (Adolphs et al., 2021) …
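The modular flow the abstract refers to, generating a search query, retrieving documents, distilling a knowledge sentence, and then generating a response conditioned on it, can be sketched as below. The generate/retrieve callables are toy placeholders, not the paper's actual models or API.

```python
# Toy sketch of a modular search-then-generate pipeline: query generation,
# retrieval, knowledge distillation, then a knowledge-grounded response.
# The generate/retrieve callables are placeholders, not the paper's API.
def modular_response(history, generate, retrieve):
    query = generate("search query for: " + history)               # search module
    passages = retrieve(query)                                     # external retrieval
    knowledge = generate("key fact from: " + " ".join(passages))   # knowledge module
    return generate(history + " [fact] " + knowledge)              # response module

# trivial stand-ins so the sketch runs end to end
reply = modular_response(
    "Who wrote Hamlet?",
    generate=lambda prompt: prompt.split(": ", 1)[-1],
    retrieve=lambda q: ["Hamlet was written by William Shakespeare."],
)
print(reply)
```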
Multilingual universal sentence encoder for semantic retrieval
We introduce two pre-trained retrieval focused multilingual sentence encoding models,
respectively based on the Transformer and CNN model architectures. The models embed …
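A sketch of the semantic-retrieval use these multilingual encoders target, scoring a query against candidates written in other languages, is below. The hub handle and the tensorflow_text import (needed to register the module's SentencePiece ops) reflect the public TF Hub release and are stated as assumptions.

```python
# Sketch of cross-lingual retrieval: embed a query and candidate sentences in
# different languages into one space and rank by cosine similarity. The hub
# handle and the tensorflow_text import are assumptions based on the public release.
import numpy as np
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401  (registers ops the module needs)

embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual/3")

def normalize(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

query = normalize(embed(["How do I reset my password?"]).numpy())
candidates = normalize(embed([
    "Wie setze ich mein Passwort zurück?",   # German paraphrase of the query
    "El clima está soleado hoy.",            # unrelated Spanish sentence
]).numpy())
print(query @ candidates.T)  # the German paraphrase should score highest
```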
Right to be forgotten in the era of large language models: Implications, challenges, and solutions
The Right to be Forgotten (RTBF) was first established as the result of the ruling of
Google Spain SL, Google Inc. v AEPD, Mario Costeja González, and was later included as …