Graph Mamba: Towards learning on graphs with state space models
Graph Neural Networks (GNNs) have shown promising potential in graph representation learning. The majority of GNNs define a local message-passing mechanism, propagating …
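A minimal sketch of the local message-passing scheme this snippet refers to, with mean aggregation and an adjacency-list graph format chosen purely for illustration (not this paper's specific architecture):

# Minimal message-passing layer: each node aggregates its neighbors' feature
# vectors (mean here) and combines them with its own features via a linear map.
import numpy as np

def message_passing_layer(adj, X, W_self, W_neigh):
    """adj: list of neighbor lists; X: (n, d) node feature matrix."""
    agg = np.zeros_like(X)
    for v, neighbors in enumerate(adj):
        if neighbors:
            agg[v] = X[neighbors].mean(axis=0)           # aggregate incoming messages
    return np.maximum(0.0, X @ W_self + agg @ W_neigh)   # combine + ReLU

# Toy graph: a path 0-1-2, with one-hot node features
adj = [[1], [0, 2], [1]]
X = np.eye(3)
rng = np.random.default_rng(0)
W_self, W_neigh = rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
print(message_passing_layer(adj, X, W_self, W_neigh).shape)  # (3, 4)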
Weisfeiler and Leman go machine learning: The story so far
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a …
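The Weisfeiler-Leman heuristic mentioned above can be sketched in a few lines: repeatedly recolor each node by the pair (its own color, the multiset of its neighbors' colors), then compare color histograms. The failure case below (a 6-cycle versus two disjoint triangles) is a standard textbook example, not taken from this survey:

# 1-dimensional Weisfeiler-Leman (color refinement).
def wl_colors(adj, rounds=3):
    colors = [0] * len(adj)                      # uniform initial coloring
    for _ in range(rounds):
        sigs = [(colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in range(len(adj))]
        relabel = {s: i for i, s in enumerate(sorted(set(sigs)))}
        colors = [relabel[s] for s in sigs]
    return sorted(colors)                        # color histogram as a graph signature

# Two graphs 1-WL cannot distinguish: a 6-cycle vs. two disjoint triangles
cycle6 = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [4, 0]]
two_triangles = [[1, 2], [0, 2], [0, 1], [4, 5], [3, 5], [3, 4]]
print(wl_colors(cycle6) == wl_colors(two_triangles))  # True: identical signatures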
Facilitating graph neural networks with random walk on simplicial complexes
Node-level random walks have been widely used to improve Graph Neural Networks. However, there has been limited attention to random walks on edges and, more generally, on $k$ …
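For reference, a node-level random walk of the kind the snippet mentions simply hops to a uniformly random neighbor at each step; the toy graph and seeding below are illustrative:

# A node-level random walk: starting from a node, repeatedly move to a
# uniformly random neighbor. Such walks are commonly used to derive
# positional/structural features for GNNs.
import random

def random_walk(adj, start, length, seed=0):
    rng = random.Random(seed)
    walk = [start]
    for _ in range(length):
        neighbors = adj[walk[-1]]
        if not neighbors:            # stop at an isolated node
            break
        walk.append(rng.choice(neighbors))
    return walk

adj = [[1, 2], [0, 2], [0, 1, 3], [2]]   # small toy graph
print(random_walk(adj, start=0, length=5))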
Rethinking tokenizer and decoder in masked graph modeling for molecules
Masked graph modeling excels in the self-supervised representation learning of molecular graphs. Scrutinizing previous studies, we can reveal a common scheme consisting of three …
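The general recipe behind masked graph modeling is to hide a subset of node (or edge) attributes and train a model to reconstruct them. The sketch below is a generic version of that objective with an assumed 50% mask ratio and squared-error loss; it is not the specific three-part scheme the truncated snippet refers to:

# Generic masked-attribute objective: corrupt some node features, encode the
# corrupted graph, and score reconstruction of the hidden features.
import numpy as np

def masked_feature_loss(X, encode, mask_ratio=0.5, seed=0):
    rng = np.random.default_rng(seed)
    mask = rng.random(X.shape[0]) < mask_ratio       # which nodes to hide
    X_corrupt = X.copy()
    X_corrupt[mask] = 0.0                            # stand-in [MASK] value
    X_hat = encode(X_corrupt)                        # encoder/decoder stand-in
    return float(((X_hat[mask] - X[mask]) ** 2).mean())

X = np.random.default_rng(1).normal(size=(6, 4))
print(masked_feature_loss(X, encode=lambda Z: Z))    # identity encoder, for illustration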
Approximately equivariant graph networks
Graph neural networks (GNNs) are commonly described as being permutation equivariant with respect to node relabeling in the graph. This symmetry of GNNs is often compared to …
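Permutation equivariance, as used here, means that relabeling the nodes and then applying a layer gives the same output as applying the layer and then relabeling. A quick numerical check on a simple mean-aggregation layer (illustrative, not this paper's construction):

# Check f(P.graph) == P.f(graph) for a mean-over-neighbors layer.
import numpy as np

def layer(A, X):
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    return (A @ X) / deg                      # mean over neighbors

rng = np.random.default_rng(0)
n, d = 5, 3
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T                # undirected, no self-loops
X = rng.normal(size=(n, d))

perm = rng.permutation(n)
P = np.eye(n)[perm]                           # permutation matrix
lhs = layer(P @ A @ P.T, P @ X)               # relabel first, then apply the layer
rhs = P @ layer(A, X)                         # apply the layer, then relabel
print(np.allclose(lhs, rhs))                  # True: the layer is equivariant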
WL meet VC
Recently, many works studied the expressive power of graph neural networks (GNNs) by linking it to the $1$-dimensional Weisfeiler-Leman algorithm ($1\text{-}\mathsf{WL}$) …
From relational pooling to subgraph GNNs: A universal framework for more expressive graph neural networks
Relational pooling is a framework for building more expressive and permutation-invariant graph neural networks. However, there is limited understanding of the exact enhancement in …
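In its basic form, relational pooling restores permutation invariance by averaging an order-sensitive model over node orderings. A minimal sketch with a deliberately order-sensitive toy readout; exact averaging over all orderings is only feasible for tiny graphs like this one:

# Relational pooling: average an order-sensitive readout over all node orderings.
import itertools
import numpy as np

def readout(A, X):
    # deliberately order-sensitive: weights each node by its position in the ordering
    pos = np.arange(1, X.shape[0] + 1)
    return float(pos @ X.sum(axis=1) + pos @ A @ pos)

def relational_pooling(A, X):
    n = X.shape[0]
    vals = [readout(A[np.ix_(p, p)], X[list(p)])
            for p in itertools.permutations(range(n))]
    return sum(vals) / len(vals)              # exact average over all orderings

rng = np.random.default_rng(0)
n = 4
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1); A = A + A.T
X = rng.normal(size=(n, 2))
perm = [2, 0, 3, 1]
P = np.eye(n)[perm]
A_p, X_p = P @ A @ P.T, P @ X
print(np.isclose(readout(A, X), readout(A_p, X_p)))                        # typically False
print(np.isclose(relational_pooling(A, X), relational_pooling(A_p, X_p)))  # True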
Unifying generation and prediction on graphs with latent graph diffusion
In this paper, we propose the first framework that enables solving graph learning tasks of all levels (node, edge and graph) and all types (generation, regression and classification) using …
Distance-restricted folklore Weisfeiler-Leman GNNs with provable cycle counting power
The ability of graph neural networks (GNNs) to count certain graph substructures, especially cycles, is important for the success of GNNs on a wide range of tasks. It has been recently …
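As a reference point for the cycle-counting ability discussed here, triangle counts can be read directly off powers of the adjacency matrix: each triangle contributes six closed walks of length 3, so the count is trace(A^3)/6. This is a standard linear-algebra fact, not this paper's method:

# Count triangles from the adjacency matrix via closed walks of length 3.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # one triangle (0,1,2) plus a pendant node

num_triangles = int(round(np.trace(np.linalg.matrix_power(A, 3)) / 6))
print(num_triangles)                          # 1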
Efficient subgraph GNNs by learning effective selection policies
Subgraph GNNs are provably expressive neural architectures that learn graph representations from sets of subgraphs. Unfortunately, their applicability is hampered by the …
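Subgraph GNNs operate on a bag of subgraphs derived from the input graph; a common generation policy is node deletion (one subgraph per removed node), and selection policies like those in the title subsample this bag to keep the method tractable. A minimal sketch of building such a bag under the node-deletion policy (an illustrative choice):

# Build a bag of subgraphs under the node-deletion policy: one subgraph per
# node, obtained by removing that node and its incident edges. A subgraph GNN
# would encode each subgraph and pool the results.
def node_deletion_bag(adj):
    bag = []
    for removed in range(len(adj)):
        keep = [v for v in range(len(adj)) if v != removed]
        relabel = {v: i for i, v in enumerate(keep)}
        sub = [[relabel[u] for u in adj[v] if u != removed] for v in keep]
        bag.append(sub)
    return bag

adj = [[1, 2], [0, 2], [0, 1, 3], [2]]        # toy graph
for i, sub in enumerate(node_deletion_bag(adj)):
    print(f"delete node {i}: {sub}")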