Extracting sentence embeddings from pretrained transformer models
L Stankevičius, M Lukoševičius - Applied Sciences, 2024 - mdpi.com
Pre-trained transformer models shine in many natural language processing tasks and
therefore are expected to bear the representation of the input sentence or text meaning …
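This survey compares many pooling and post-processing strategies for turning token vectors into a sentence vector. As a point of reference only, a minimal sketch of the plain mean-pooling baseline (the checkpoint name is an illustrative choice, not the paper's setup) might look like this:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative checkpoint; any BERT-like encoder exposes the same interface.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def mean_pooled_embedding(sentence: str) -> torch.Tensor:
    """Average the last-layer token vectors, ignoring padding positions."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state      # (1, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)        # (1, seq_len, 1)
    summed = (hidden * mask).sum(dim=1)
    return summed / mask.sum(dim=1)                      # (1, dim)

print(mean_pooled_embedding("Sentence embeddings from a pretrained encoder.").shape)
```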
Context-aware cross-lingual mapping
H Aldarmaki, M Diab - arXiv preprint arXiv:1903.03243, 2019 - arxiv.org
Cross-lingual word vectors are typically obtained by fitting an orthogonal matrix that maps
the entries of a bilingual dictionary from a source to a target vector space. Word vectors …
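The snippet describes the standard recipe of fitting an orthogonal matrix over a bilingual dictionary. A minimal sketch of that fitting step (the orthogonal Procrustes solution via SVD, with toy data standing in for real dictionary entries) could be:

```python
import numpy as np

def fit_orthogonal_map(X_src: np.ndarray, Y_tgt: np.ndarray) -> np.ndarray:
    """Solve min ||X W - Y||_F over orthogonal W (orthogonal Procrustes): W = U V^T."""
    U, _, Vt = np.linalg.svd(X_src.T @ Y_tgt)
    return U @ Vt

# Toy "dictionary" of 5 word pairs in 4-dimensional embedding spaces.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))                       # source-language vectors
W_true = np.linalg.qr(rng.normal(size=(4, 4)))[0]  # a random orthogonal map
Y = X @ W_true                                     # target-language vectors
W = fit_orthogonal_map(X, Y)
print(np.allclose(X @ W, Y))                       # True: the mapping is recovered
```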
Efficient sentence embedding using discrete cosine transform
Vector averaging remains one of the most popular sentence embedding methods in spite of
its obvious disregard for syntactic structure. While more complex sequential or convolutional …
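This entry contrasts vector averaging with a DCT-based alternative. A rough sketch of the general idea, assuming the embedding keeps the first k DCT coefficients along the word sequence (the paper's exact recipe may differ), is shown below; with an orthonormal DCT, the first coefficient is the mean up to scaling, so plain averaging is essentially the k=1 case.

```python
import numpy as np
from scipy.fft import dct

def dct_sentence_embedding(word_vectors: np.ndarray, k: int = 4) -> np.ndarray:
    """DCT over the word sequence per embedding dimension; keep and concatenate
    the first k coefficient vectors to obtain a fixed-size sentence embedding."""
    coeffs = dct(word_vectors, axis=0, norm="ortho")   # (seq_len, dim)
    k = min(k, coeffs.shape[0])
    return coeffs[:k].ravel()                          # (k * dim,)

words = np.random.default_rng(1).normal(size=(7, 300))   # 7 word vectors, 300-d
print(dct_sentence_embedding(words, k=4).shape)           # (1200,)
```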
A qualitative evaluation framework for paraphrase identification
In this paper, we present a new approach for the evaluation, error analysis, and
interpretation of supervised and unsupervised Paraphrase Identification (PI) systems. Our …
Unsupervised sentence-embeddings by manifold approximation and projection
S Kayal - arXiv preprint arXiv:2102.03795, 2021 - arxiv.org
The concept of unsupervised universal sentence encoders has gained traction recently,
wherein pre-trained models generate effective task-agnostic fixed-dimensional …
Decomposing and comparing meaning relations: Paraphrasing, textual entailment, contradiction, and specificity
In this paper, we present a methodology for decomposing and comparing multiple meaning
relations (paraphrasing, textual entailment, contradiction, and specificity). The methodology …
Compressing Sentence Representation with Maximum Coding Rate Reduction
In most natural language inference problems, sentence representation is needed for
semantic retrieval tasks. In recent years, pre-trained large language models have been quite …
Scalable cross-lingual transfer of neural sentence embeddings
H Aldarmaki, M Diab - arXiv preprint arXiv:1904.05542, 2019 - arxiv.org
We develop and investigate several cross-lingual alignment approaches for neural sentence
embedding models, such as the supervised inference classifier, InferSent, and sequential …
Cross-Lingual Alignment of Word & Sentence Embeddings
H Aldarmaki - 2019 - search.proquest.com
One of the notable developments in current natural language processing is the practical
efficacy of probabilistic word representations, where words are embedded in high …
Automatic Summarization of Financial Reports
M Alikhani - 2021 - search.proquest.com
The field of Natural Language Processing (NLP) has witnessed substantial
advancements due to both the development of new algorithms and the increase of …