Inducing relational knowledge from BERT

Z Bouraoui, J Camacho-Collados… - Proceedings of the AAAI …, 2020 - ojs.aaai.org
One of the most remarkable properties of word embeddings is the fact that they capture
certain types of semantic and syntactic relationships. Recently, pre-trained language models …

On the systematicity of probing contextualized word representations: The case of hypernymy in BERT

A Ravichander, E Hovy, K Suleman… - Proceedings of the …, 2020 - aclanthology.org
Contextualized word representations have become a driving force in NLP, motivating
widespread interest in understanding their capabilities and the mechanisms by which they …

[BOOK][B] Distributional semantics

A Lenci, M Sahlgren - 2023 - books.google.com
Distributional semantics develops theories and methods to represent the meaning of natural
language expressions, with vectors encoding their statistical distribution in linguistic …

Relational word embeddings

J Camacho-Collados, L Espinosa-Anke… - arXiv preprint arXiv …, 2019 - arxiv.org
While word embeddings have been shown to implicitly encode various forms of attributional
knowledge, the extent to which they capture relational information is far more limited. In …

RelBERT: Embedding Relations with Language Models

A Ushio, J Camacho-Collados, S Schockaert - arXiv preprint arXiv …, 2023 - arxiv.org
Many applications need access to background knowledge about how different concepts and
entities are related. Although Knowledge Graphs (KG) and Large Language Models (LLM) …

Combining vision and language representations for patch-based identification of lexico-semantic relations

P Jha, G Dias, A Lechervy, JG Moreno… - Proceedings of the 30th …, 2022 - dl.acm.org
Although a wide range of applications have been proposed in the field of multimodal natural
language processing, very few works have tackled multimodal relational lexical …

[PDF][PDF] A latent variable model for learning distributional relation vectors

J Camacho-Collados, L Espinosa-Anke, S Jameel… - 2019 - orca.cardiff.ac.uk
Recently a number of unsupervised approaches have been proposed for learning vectors
that capture the relationship between two words. Inspired by word embedding models, these …

TiFi: Taxonomy induction for fictional domains

CX Chu, S Razniewski, G Weikum - The World Wide Web Conference, 2019 - dl.acm.org
Taxonomies are important building blocks of structured knowledge bases, and their
construction from text sources and Wikipedia has received much attention. In this paper we …

Minimally-supervised relation induction from pre-trained language model

L Sun, Y Shen, W Lu - Findings of the Association for …, 2022 - aclanthology.org
Relation induction is a very practical task in the Natural Language Processing (NLP) area. In
practical application scenarios, people want to induce more entity pairs having the same …

[PDF][PDF] Understanding feature focus in multitask settings for lexico-semantic relation identification

H Akhmouch, G Dias, JG Moreno - Findings of the Association for …, 2021 - aclanthology.org
Discovering whether words are semantically related and identifying the specific semantic
relation that holds between them is of crucial importance for automatic reasoning on text …