Unbalanced optimal transport, from theory to numerics
Optimal Transport (OT) has recently emerged as a central tool in data sciences for comparing point clouds and, more generally, probability distributions in a geometrically faithful way. The …
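As a hedged illustration of the kind of solver this line of work studies, the sketch below computes an entropic unbalanced transport plan with the POT library; the point clouds, masses, and regularization strengths are made-up placeholders, not values from the paper.

```python
# A minimal sketch of unbalanced OT between two point clouds using POT.
# All numbers below are illustrative assumptions.
import numpy as np
import ot

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))           # source point cloud
y = rng.normal(loc=1.0, size=(60, 2))  # target point cloud

# Unnormalized masses: in the unbalanced setting the two sides may carry
# different total mass.
a = np.full(50, 1.0 / 50)
b = np.full(60, 1.2 / 60)

M = ot.dist(x, y)  # squared Euclidean cost matrix
# Entropic unbalanced OT: reg is the entropic term, reg_m penalizes
# deviations from the prescribed marginals.
plan = ot.sinkhorn_unbalanced(a, b, M, reg=0.05, reg_m=1.0)
print(plan.shape, plan.sum())  # (50, 60), total transported mass
```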
Space-time correspondence as a contrastive random walk
This paper proposes a simple self-supervised approach for learning a representation for
visual correspondence from raw video. We cast correspondence as prediction of links in a …
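The sketch below illustrates the underlying idea of a cycle-consistent random walk over per-frame node features: walk forward through time, then back, and require each node to return to itself. The feature shapes, temperature, and loss form are assumptions for illustration, not the paper's training code.

```python
# Cycle-consistency loss for a random walk on a space-time graph (sketch).
import torch
import torch.nn.functional as F

def transition(f_a, f_b, tau=0.07):
    """Row-stochastic transition matrix between node features of two frames."""
    a = F.normalize(f_a, dim=-1)
    b = F.normalize(f_b, dim=-1)
    return F.softmax(a @ b.t() / tau, dim=-1)  # (N, N)

def cycle_walk_loss(frames):
    """frames: list of (N, D) node feature tensors from consecutive frames."""
    n = frames[0].shape[0]
    walk = torch.eye(n)
    # Walk forward through time, then backward to the starting frame.
    path = frames + frames[-2::-1]
    for f_a, f_b in zip(path[:-1], path[1:]):
        walk = walk @ transition(f_a, f_b)
    # Each node should walk back to itself: cross-entropy against the identity.
    target = torch.arange(n)
    return F.nll_loss(torch.log(walk + 1e-8), target)

frames = [torch.randn(16, 64) for _ in range(4)]  # 4 frames, 16 patches each
print(cycle_walk_loss(frames))
```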
POT: Python Optimal Transport
Optimal transport has recently been reintroduced to the machine learning community thanks
in part to novel efficient optimization procedures allowing for medium to large scale …
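A minimal usage sketch of the library, comparing the exact EMD solver with entropic Sinkhorn on a toy problem; the data and regularization value are illustrative.

```python
# Exact and entropic OT between two small empirical distributions with POT.
import numpy as np
import ot

rng = np.random.default_rng(0)
x = rng.normal(size=(30, 2))
y = rng.normal(loc=2.0, size=(30, 2))
a = ot.unif(30)  # uniform weights on the source samples
b = ot.unif(30)  # uniform weights on the target samples

M = ot.dist(x, y)                       # squared Euclidean cost matrix
G_exact = ot.emd(a, b, M)               # exact linear-program solution
G_sink = ot.sinkhorn(a, b, M, reg=0.1)  # entropic approximation, faster at scale
cost = np.sum(G_exact * M)
print(G_exact.shape, G_sink.shape, cost)
```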
Auto-regressive image synthesis with integrated quantization
Deep generative models have achieved conspicuous progress in realistic image synthesis
with multifarious conditional inputs, while generating diverse yet high-fidelity images …
Graph optimal transport for cross-domain alignment
Cross-domain alignment between two sets of entities (e.g., objects in an image, words in a
sentence) is fundamental to both computer vision and natural language processing. Existing …
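As a rough stand-in for this kind of structure-aware alignment (not the paper's own solver), the sketch below couples two entity sets with POT's fused Gromov-Wasserstein, which mixes a cross-domain feature cost with intra-domain structure costs; the sizes and trade-off weight are assumptions.

```python
# Fused Gromov-Wasserstein alignment of two entity graphs (illustrative sketch).
import numpy as np
import ot

rng = np.random.default_rng(0)
X1 = rng.normal(size=(8, 16))   # entity features in domain 1 (e.g. image regions)
X2 = rng.normal(size=(10, 16))  # entity features in domain 2 (e.g. words)

M = ot.dist(X1, X2)   # cross-domain feature cost
C1 = ot.dist(X1, X1)  # intra-domain structure costs
C2 = ot.dist(X2, X2)
p, q = ot.unif(8), ot.unif(10)

# alpha trades off structure (GW) against features (Wasserstein); 0.5 is arbitrary.
T = ot.gromov.fused_gromov_wasserstein(M, C1, C2, p, q,
                                        loss_fun='square_loss', alpha=0.5)
print(T.shape)  # (8, 10) soft alignment between the two entity sets
```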
Deep graph matching consensus
This work presents a two-stage neural architecture for learning and refining structural
correspondences between graphs. First, we use localized node embeddings computed by a …
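The sketch below mimics only the first stage of such a pipeline: node embeddings from a stand-in aggregation step turned into a soft correspondence matrix. The consensus refinement stage is omitted, and the aggregation rule and shapes are assumptions.

```python
# Stage-one soft correspondences between two graphs (illustrative sketch).
import torch
import torch.nn.functional as F

def embed(x, adj):
    """One mean-aggregation step as a stand-in for a learned GNN."""
    deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
    return torch.tanh(x + adj @ x / deg)

n_s, n_t, d = 6, 7, 8
x_s, x_t = torch.randn(n_s, d), torch.randn(n_t, d)
adj_s = (torch.rand(n_s, n_s) > 0.5).float()
adj_t = (torch.rand(n_t, n_t) > 0.5).float()

h_s, h_t = embed(x_s, adj_s), embed(x_t, adj_t)
S = F.softmax(h_s @ h_t.t(), dim=-1)  # initial soft correspondences
print(S.shape)                        # (6, 7), rows sum to 1
```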
Prompt learning with optimal transport for vision-language models
With the increasing attention to large vision-language models such as CLIP, there has been
a significant amount of effort dedicated to building efficient prompts. Unlike conventional …
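The sketch below scores one class by an optimal-transport cost between several prompt embeddings and local visual features, in the spirit of this line of work; the shapes, cosine cost, and regularization are assumptions rather than the paper's implementation.

```python
# Scoring one class via an OT cost between prompt and patch features (sketch).
import numpy as np
import ot

rng = np.random.default_rng(0)
prompts = rng.normal(size=(4, 512))   # K learned prompt embeddings for a class
patches = rng.normal(size=(49, 512))  # M local visual features from the image

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Cosine-distance cost between prompts and patches.
C = 1.0 - normalize(prompts) @ normalize(patches).T
a, b = ot.unif(4), ot.unif(49)

# Sinkhorn OT cost; a lower cost means a better prompt-image match.
score = -ot.sinkhorn2(a, b, C, reg=0.1)
print(float(score))
```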
Wasserstein Weisfeiler-Lehman graph kernels
Most graph kernels are an instance of the class of R-Convolution kernels, which measure
the similarity of objects by comparing their substructures. Despite their empirical success …
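A hedged sketch of the continuous-attribute variant of this idea: propagate node features for a few WL-style iterations, then compare the resulting node embedding sets of two graphs with a Wasserstein distance and turn that distance into a kernel. The iteration count and bandwidth are assumptions.

```python
# Wasserstein Weisfeiler-Lehman-style kernel for attributed graphs (sketch).
import numpy as np
import ot

def wl_embeddings(x, adj, iters=2):
    """Concatenate node features across WL-like averaging iterations."""
    feats = [x]
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    for _ in range(iters):
        x = 0.5 * (x + adj @ x / deg)
        feats.append(x)
    return np.concatenate(feats, axis=1)

def wwl_kernel(x1, adj1, x2, adj2, gamma=1.0):
    h1, h2 = wl_embeddings(x1, adj1), wl_embeddings(x2, adj2)
    M = ot.dist(h1, h2, metric='euclidean')
    w = ot.emd2(ot.unif(len(h1)), ot.unif(len(h2)), M)  # Wasserstein distance
    return np.exp(-gamma * w)

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(5, 3)), rng.normal(size=(7, 3))
adj1 = (rng.random((5, 5)) > 0.5).astype(float)
adj2 = (rng.random((7, 7)) > 0.5).astype(float)
print(wwl_kernel(x1, adj1, x2, adj2))
```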
On dyadic fairness: Exploring and mitigating bias in graph connections
Disparate impact has raised serious concerns about machine learning applications and their societal impacts. In response to the need to mitigate discrimination, fairness has been …
Scalable Gromov-Wasserstein learning for graph partitioning and matching
We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a
novel and theoretically-supported paradigm for large-scale graph analysis. The proposed …
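The sketch below shows plain Gromov-Wasserstein matching between two graphs with POT, the building block that such methods scale up; it is not the recursive partitioning scheme of S-GWL itself, and the toy graphs are assumptions.

```python
# Gromov-Wasserstein matching between a graph and a permuted copy (sketch).
import numpy as np
import ot

rng = np.random.default_rng(0)
n = 12
C1 = (rng.random((n, n)) > 0.6).astype(float)
C1 = np.maximum(C1, C1.T)        # symmetric adjacency as structure matrix
perm = rng.permutation(n)
C2 = C1[perm][:, perm]           # the same graph with permuted nodes

p, q = ot.unif(n), ot.unif(n)
T = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')

# Read off a node correspondence from the coupling; GW is non-convex,
# so perfect recovery is not guaranteed.
matching = T.argmax(axis=1)
print((matching == np.argsort(perm)).mean())  # fraction of correctly matched nodes
```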