Unbalanced optimal transport, from theory to numerics

T Séjourné, G Peyré, FX Vialard - Handbook of Numerical Analysis, 2023 - Elsevier
Optimal Transport (OT) has recently emerged as a central tool in data sciences to compare, in a geometrically faithful way, point clouds and, more generally, probability distributions. The …
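
As a rough illustration of the unbalanced setting this chapter surveys, the sketch below runs entropy-regularized unbalanced OT between two point clouds whose weights have different total mass. It assumes POT's ot.sinkhorn_unbalanced solver and uses arbitrary illustrative parameters; it is not the chapter's own numerical scheme.

```python
# Minimal sketch of unbalanced entropic OT, assuming POT's ot.sinkhorn_unbalanced.
# Marginal constraints are relaxed, so the two measures need not have equal mass.
import numpy as np
import ot

rng = np.random.default_rng(0)
x = rng.normal(size=(30, 2))          # source point cloud
y = rng.normal(size=(40, 2)) + 1.0    # target point cloud, shifted

a = np.full(30, 1.0 / 30)             # source weights, total mass 1.0
b = np.full(40, 1.5 / 40)             # target weights, total mass 1.5 (deliberately unequal)

M = ot.dist(x, y)                     # squared Euclidean cost matrix

# reg: entropic regularization strength, reg_m: marginal relaxation strength.
G = ot.sinkhorn_unbalanced(a, b, M, reg=0.05, reg_m=1.0)
print(G.shape, G.sum())               # total transported mass is a compromise between the two totals
```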

Space-time correspondence as a contrastive random walk

A Jabri, A Owens, A Efros - Advances in neural information …, 2020 - proceedings.neurips.cc
This paper proposes a simple self-supervised approach for learning a representation for
visual correspondence from raw video. We cast correspondence as prediction of links in a …

POT: Python Optimal Transport

R Flamary, N Courty, A Gramfort, MZ Alaya… - Journal of Machine …, 2021 - jmlr.org
Optimal transport has recently been reintroduced to the machine learning community thanks
in part to novel efficient optimization procedures allowing for medium to large scale …
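
A minimal usage sketch of the library's core interface (ot.unif, ot.dist, ot.emd, ot.sinkhorn) on made-up data; the regularization value is an arbitrary choice for illustration.

```python
# Basic POT usage: exact and entropy-regularized OT between two empirical distributions.
import numpy as np
import ot

rng = np.random.default_rng(0)
xs = rng.normal(size=(50, 2))            # source samples
xt = rng.normal(size=(60, 2)) + 2.0      # target samples

a = ot.unif(50)                          # uniform weights on the source
b = ot.unif(60)                          # uniform weights on the target
M = ot.dist(xs, xt)                      # pairwise squared Euclidean costs

G_exact = ot.emd(a, b, M)                # exact OT plan (network simplex)
G_ent = ot.sinkhorn(a, b, M, reg=0.1)    # entropic OT plan (Sinkhorn iterations)

cost = np.sum(G_exact * M)               # transport cost under the exact plan
print(cost, G_exact.shape, G_ent.shape)
```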

Auto-regressive image synthesis with integrated quantization

F Zhan, Y Yu, R Wu, J Zhang, K Cui, C Zhang… - European Conference on …, 2022 - Springer
Deep generative models have achieved conspicuous progress in realistic image synthesis
with multifarious conditional inputs, while generating diverse yet high-fidelity images …

Graph optimal transport for cross-domain alignment

L Chen, Z Gan, Y Cheng, L Li… - … on Machine Learning, 2020 - proceedings.mlr.press
Cross-domain alignment between two sets of entities (e.g., objects in an image, words in a
sentence) is fundamental to both computer vision and natural language processing. Existing …

Deep graph matching consensus

M Fey, JE Lenssen, C Morris, J Masci… - arXiv preprint arXiv …, 2020 - arxiv.org
This work presents a two-stage neural architecture for learning and refining structural
correspondences between graphs. First, we use localized node embeddings computed by a …

Prompt learning with optimal transport for vision-language models

G Chen, W Yao, X Song, X Li, Y Rao, K Zhang - 2022 - openreview.net
With the increasing attention to large vision-language models such as CLIP, there has been
a significant amount of effort dedicated to building efficient prompts. Unlike conventional …

Wasserstein Weisfeiler-Lehman graph kernels

M Togninalli, E Ghisu… - Advances in neural …, 2019 - proceedings.neurips.cc
Most graph kernels are an instance of the class of R-Convolution kernels, which measure
the similarity of objects by comparing their substructures. Despite their empirical success …
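
To make the idea concrete, here is a minimal sketch of a Wasserstein-based kernel value between two graphs. It assumes node-feature matrices (e.g., produced by Weisfeiler-Lehman iterations) are already available; the features below are random placeholders, and the exponential form only mirrors the spirit of the paper's Laplacian-kernel construction, not its exact embedding scheme.

```python
# Sketch of a Wasserstein-based graph kernel value, assuming per-graph node features
# (e.g., from Weisfeiler-Lehman iterations) are already computed.
import numpy as np
import ot

def wasserstein_graph_kernel(F1, F2, gamma=0.1):
    """F1, F2: (n_nodes, d) node-feature matrices of two graphs."""
    a = ot.unif(F1.shape[0])                 # uniform measure over nodes of graph 1
    b = ot.unif(F2.shape[0])                 # uniform measure over nodes of graph 2
    M = ot.dist(F1, F2, metric='euclidean')  # ground cost between node features
    d_w = ot.emd2(a, b, M)                   # exact Wasserstein distance between the node sets
    return np.exp(-gamma * d_w)              # Laplacian-style kernel from the distance

rng = np.random.default_rng(0)
F1 = rng.normal(size=(12, 4))                # placeholder features for graph 1
F2 = rng.normal(size=(15, 4))                # placeholder features for graph 2
print(wasserstein_graph_kernel(F1, F2))
```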

On dyadic fairness: Exploring and mitigating bias in graph connections

P Li, Y Wang, H Zhao, P Hong, H Liu - International Conference on …, 2021 - par.nsf.gov
Disparate impact has raised serious concerns about machine learning applications and their
societal impact. In response to the need to mitigate discrimination, fairness has been …

Scalable Gromov-Wasserstein learning for graph partitioning and matching

H Xu, D Luo, L Carin - Advances in neural information …, 2019 - proceedings.neurips.cc
We propose a scalable Gromov-Wasserstein learning (S-GWL) method and establish a
novel and theoretically-supported paradigm for large-scale graph analysis. The proposed …
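
As a rough sketch of what Gromov-Wasserstein graph matching looks like in code, the snippet below couples two random graphs through their adjacency matrices using POT's gromov_wasserstein solver. This is plain GW matching for illustration, not the paper's recursive, scalable S-GWL scheme.

```python
# Sketch of Gromov-Wasserstein graph matching via POT; plain GW on two
# intra-graph structure matrices, not the recursive S-GWL method itself.
import numpy as np
from ot import unif
from ot.gromov import gromov_wasserstein

rng = np.random.default_rng(0)

def random_adjacency(n, p=0.3):
    A = (rng.random((n, n)) < p).astype(float)
    A = np.triu(A, 1)
    return A + A.T                            # symmetric adjacency, no self-loops

C1 = random_adjacency(20)                     # structure matrix of graph 1
C2 = random_adjacency(25)                     # structure matrix of graph 2
p = unif(20)                                  # node weights of graph 1
q = unif(25)                                  # node weights of graph 2

# GW coupling: a soft correspondence between the two node sets.
T = gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')
matching = T.argmax(axis=1)                   # hard assignment: best match per node of graph 1
print(T.shape, matching[:5])
```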