Multi-scale attributed node embedding

B Rozemberczki, C Allen… - Journal of Complex …, 2021 - academic.oup.com
We present network embedding algorithms that capture information about a node from the
local distribution over node attributes around it, as observed over random walks following an …

On the origins of linear representations in large language models

Y Jiang, G Rajendran, P Ravikumar, B Aragam… - arXiv preprint arXiv …, 2024 - arxiv.org
Recent works have argued that high-level semantic concepts are encoded "linearly" in the
representation space of large language models. In this work, we study the origins of such …

InfiniteWalk: Deep network embeddings as Laplacian embeddings with a nonlinearity

S Chanpuriya, C Musco - Proceedings of the 26th ACM SIGKDD …, 2020 - dl.acm.org
The skip-gram model for learning word embeddings (Mikolov et al. 2013) has been widely
popular, and DeepWalk (Perozzi et al. 2014), among other methods, has extended the …

Theoretical understandings of product embedding for e-commerce machine learning

D Xu, C Ruan, E Korpeoglu, S Kumar… - Proceedings of the 14th …, 2021 - dl.acm.org
Product embeddings have been heavily investigated in the past few years, serving as the
cornerstone for a broad range of machine learning applications in e-commerce. Despite the …

Understanding the effects of negative (and positive) pointwise mutual information on word vectors

A Salle, A Villavicencio - Journal of Experimental & Theoretical …, 2023 - Taylor & Francis
Despite the recent popularity of contextual word embeddings, static word embeddings still
dominate lexical semantic tasks, making their study of continued relevance. A widely …

Interpreting knowledge graph relation representation from word embeddings

C Allen, I Balažević, T Hospedales - arXiv preprint arXiv:1909.11611, 2019 - arxiv.org
Many models learn representations of knowledge graph data by exploiting its low-rank latent
structure, encoding known relations between entities and enabling unknown facts to be …

GraRep++: flexible learning graph representations with weighted global structural information

M Ouyang, Y Zhang, X Xia, X Xu - IEEE Access, 2023 - ieeexplore.ieee.org
The key to vertex embedding is to learn low-dimensional representations of global graph
information, and integrating information from multiple steps is an effective strategy. Existing …

Understanding and inferring units in spreadsheets

J Williams, C Negreanu, AD Gordon… - 2020 IEEE Symposium …, 2020 - ieeexplore.ieee.org
Numbers in spreadsheets often have units: metres, grams, dollars, etc. Spreadsheet cells
typically cannot carry unit information, and even where they can, users may not be motivated …

On understanding knowledge graph representation

C Allen, I Balazevic, TM Hospedales - arXiv preprint arXiv …, 2019 - researchgate.net
Many methods have been developed to represent knowledge graph data, which implicitly
exploit low-rank latent structure in the data to encode known information and enable …

Dual Word Embedding for Robust Unsupervised Bilingual Lexicon Induction

H Cao, L Li, C Zhu, M Yang… - IEEE/ACM Transactions on …, 2023 - ieeexplore.ieee.org
Word embedding models such as Word2vec and FastText simultaneously learn dual
representations of input vectors and output vectors. In contrast, almost all existing …