Empathetic conversational systems: A review of current advances, gaps, and opportunities

AS Raamkumar, Y Yang - IEEE Transactions on Affective …, 2022 - ieeexplore.ieee.org
Empathy is a vital factor that contributes to mutual understanding and joint problem-solving.
In recent years, a growing number of studies have recognized the benefits of empathy and …

Sentence-T5: Scalable sentence encoders from pre-trained text-to-text models

J Ni, GH Abrego, N Constant, J Ma, KB Hall… - arXiv preprint arXiv …, 2021 - arxiv.org
We provide the first exploration of sentence embeddings from text-to-text transformers (T5).
Sentence embeddings are broadly useful for language processing tasks. While T5 achieves …

Language-agnostic BERT sentence embedding

F Feng, Y Yang, D Cer, N Arivazhagan… - arXiv preprint arXiv …, 2020 - arxiv.org
While BERT is an effective method for learning monolingual sentence embeddings for
semantic similarity and embedding based transfer learning (Reimers and Gurevych, 2019) …

Whitening sentence representations for better semantics and faster retrieval

J Su, J Cao, W Liu, Y Ou - arXiv preprint arXiv:2103.15316, 2021 - arxiv.org
Pre-training models such as BERT have achieved great success in many natural language
processing tasks. However, how to obtain better sentence representation through these pre …

Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks

N Reimers - arXiv preprint arXiv:1908.10084, 2019 - fq.pkwyx.com
BERT (Devlin et al., 2018) and RoBERTa (Liu et al., 2019) have set a new state-of-the-art
performance on sentence-pair regression tasks like semantic textual similarity (STS) …

ERNIE: Enhanced representation through knowledge integration

Y Sun, S Wang, Y Li, S Feng, X Chen, H Zhang… - arXiv preprint arXiv …, 2019 - arxiv.org
We present a novel language representation model enhanced by knowledge called ERNIE
(Enhanced Representation through kNowledge IntEgration). Inspired by the masking …

Universal sentence encoder for English

D Cer, Y Yang, S Kong, N Hua, N Limtiaco… - Proceedings of the …, 2018 - aclanthology.org
We present easy-to-use TensorFlow Hub sentence embedding models with good task-transfer
performance. Model variants allow for trade-offs between accuracy and compute …

Language models that seek for knowledge: Modular search & generation for dialogue and prompt completion

K Shuster, M Komeili, L Adolphs, S Roller… - arXiv preprint arXiv …, 2022 - arxiv.org
Language models (LMs) have recently been shown to generate more factual responses by
employing modularity (Zhou et al., 2021) in combination with retrieval (Adolphs et al., 2021) …

Multilingual universal sentence encoder for semantic retrieval

Y Yang, D Cer, A Ahmad, M Guo, J Law… - arXiv preprint arXiv …, 2019 - arxiv.org
We introduce two pre-trained retrieval-focused multilingual sentence encoding models,
respectively based on the Transformer and CNN model architectures. The models embed …

Right to be forgotten in the era of large language models: Implications, challenges, and solutions

D Zhang, P Finckenberg-Broman, T Hoang, S Pan… - AI and Ethics, 2024 - Springer
Abstract The Right to be Forgotten (RTBF) was first established as the result of the ruling of
Google Spain SL, Google Inc. v AEPD, Mario Costeja González, and was later included as …